In our last case study, on L’express, we demonstrated how one of the largest publishers in Africa used AdNgin to increase their ad earnings. But not all of our readers have large websites with hundreds of thousands of daily page views. In fact, many of you operate small but successful niche blogs, review websites, content curation operations, forums, or other web properties that don’t have the kind of traffic that L’express brings in on a daily basis. If that sounds like you, read on. If it doesn’t, read on anyway.
What is carz.co.il?
Carz is an Israeli niche website that provides unique information and reviews on specific car models.
Most of their visitors are potential car buyers, which, you guessed it, pulls in nice CPCs from premium advertisers like car manufacturers and insurance companies. But a niche website doesn’t just attract premium advertisers; it also makes it far more likely that visitors will be interested in the ads and click on them.
Everybody gets something. Visitors receive a superior user experience with ads they are interested in, the publisher makes more ad revenue, and the advertiser sees higher conversion rates. It’s Christmas in July.
So when carz.co.il started to see their AdSense earnings plateau they naturally decided to take action and approached AdNgin. With a strong focus on improving ad revenue while also nurturing a comfortable user experience, AdNgin’s multi-arm bandit testing platform was a perfect fit for carz.co.il.
The guys at carz.co.il wasted no time in setting up around 20 different experiments using AdNgin’s testing platform. The experiments ran on various pages and tested things like device-specific setups, ad placements, ad sizes, color palettes, and more.
Over the period of the experiments, carz.co.il reported a 42% lift in Google AdSense revenue.
What’s multi-arm bandit testing and how is it different from A/B testing?
Before we jump into what we tested and what the results were, it’s important to understand that AdNgin runs on a multi-arm bandit algorithm.
You’ve probably heard of A/B testing but, for most, bandit tests are still a mystery.
Bandit tests are similar to, but different from, A/B tests. For instance, let’s say you decided to create two versions of your homepage, one control and one variation.
A/B tests will usually split traffic equally between the versions. You then wait to achieve statistical significance before choosing the best performing version. Bandits, on the other hand, work in two phases but are actually much faster.
In the first phase, called the exploration phase, a small proportion of the traffic is split equally between all versions. This allows the multi-arm bandit algorithm to identify the best performing version quickly.
In the second phase, called the exploitation phase, most of the traffic receives the version that performed better in the exploration phase. The other version is still allocated a small proportion of the traffic, just to make sure the conclusions from the first phase were correct.
Bandit tests are popular when testing elements directly related to conversions because they display the better performing version quickly and prevent losing too many conversions or clicks in the case of advertising.
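To make the exploration and exploitation phases concrete, here is a minimal sketch of one of the simplest bandit strategies, epsilon-greedy, in Python. AdNgin has not published its exact algorithm, so the strategy, the CTR values, and the function names below are illustrative assumptions, not their implementation.

```python
import random

def epsilon_greedy(true_ctrs, impressions, epsilon=0.1, seed=42):
    """Serve `impressions` ad views across variations with an
    epsilon-greedy bandit; return impressions served per variation.

    true_ctrs are the hidden click-through rates used only to
    simulate clicks; the algorithm never sees them directly.
    """
    rng = random.Random(seed)
    clicks = [0] * len(true_ctrs)
    shows = [0] * len(true_ctrs)
    for _ in range(impressions):
        if rng.random() < epsilon:
            # Exploration: serve a random variation.
            arm = rng.randrange(len(true_ctrs))
        else:
            # Exploitation: serve the variation with the best observed CTR so far.
            arm = max(range(len(true_ctrs)),
                      key=lambda i: clicks[i] / shows[i] if shows[i] else 0.0)
        shows[arm] += 1
        if rng.random() < true_ctrs[arm]:  # simulate whether this view is clicked
            clicks[arm] += 1
    return shows

# Two hypothetical ad variations; the second has a much higher true CTR,
# so the bandit quickly shifts most of the traffic toward it.
shows = epsilon_greedy([0.01, 0.10], impressions=10_000)
```

Unlike a 50/50 A/B split, the weaker variation here ends up with only a small share of the 10,000 impressions, which is precisely why bandits lose fewer clicks while a test is still running.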
Bandit tests are very effective when testing banner ads. There are bandit testing platforms, such as AdNgin, specifically dedicated to testing banner sizes, locations, color palettes, mobile vs. desktop, and many other parameters.
When every impression equals money, using bandit tests can save publishers a lot of it. Despite this very simple logic, few webmasters bandit test their banner ads.
Since few publishers optimize their ads, those that do test are ahead of the curve and therefore even more likely to see positive results. Below are three of the experiments run by carz.co.il.
Experiment 1: Homepage Mobile vs. Desktop
Carz approached their AdNgin experiments very strategically. They created various experiments to run across their website. One of those experiments consisted of creating separate setups of their homepage for mobile and desktop users.
The desktop variations consisted of one above the fold (ATF) ad unit that displayed a 728 x 90 banner, and a below the fold (BTF) ad unit which rotated between a 300 x 250 box banner, a 728 x 90 leaderboard, and another box banner sized 336 x 280 pixels.
For the below the fold ad unit that rotated between three ad sizes, we saw that the largest banners yielded the highest CTR. In fact, CTR increased by 81% between the leaderboard and the 336 x 280 box banner. Nothing too jaw-dropping here. However, things get a little more interesting when we compare mobile vs. desktop.
Before we get into the results, let me describe the mobile setup that was tested. For mobile, the AdNgin experiment consisted of a 320 x 100 above the fold leaderboard, a 320 x 50 sticky ad, also known as a mobile anchor ad, and a below the fold ad unit that rotated between three different box banners: 300 x 250, 250 x 250, and 200 x 200 pixels.
Experiment 1: Results
The experiment that ran on the below the fold ad unit demonstrated that, similarly to the desktop ad unit, bigger is better. Some of you might think that there isn’t much of a difference between a 300 x 250 banner and a 250 x 250 banner, but there’s actually a 63% improvement in CTR for the larger version, and a 95% improvement between the largest and smallest units. Sometimes a small change can make a big difference.
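As an aside, the percentage lifts quoted throughout this post are relative CTR changes, not absolute ones. A quick sketch with made-up numbers (not the actual carz.co.il data) shows how such a lift is computed:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks per impression."""
    return clicks / impressions

def lift(variant_ctr, baseline_ctr):
    """Relative CTR improvement of a variant over a baseline, in percent."""
    return (variant_ctr - baseline_ctr) / baseline_ctr * 100

# Hypothetical figures: a 300 x 250 unit earning 163 clicks per 10,000
# impressions vs. a 250 x 250 unit earning 100 clicks per 10,000.
larger = ctr(163, 10_000)   # 1.63% CTR
smaller = ctr(100, 10_000)  # 1.00% CTR
print(round(lift(larger, smaller)))  # prints 63
```

So a small absolute difference (0.63 percentage points here) can still be a large relative lift, which is why these numbers look dramatic.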
Desktop vs. Mobile
The desktop and mobile setups may not be identical, but when we compare them we still come up with some interesting insights:
- The above the fold leaderboard produced 83% more click-throughs on mobile than it did on desktop devices.
- A similar thing happened with the below the fold ad units. Mobile ads produced 39% more click-throughs than their desktop counterparts.
- The mobile anchor ad produced exactly the same CTR as the average of all the below the fold box banners, which says a lot about the effectiveness of mobile anchor ads.
Experiment 2: Car Page
The second experiment that we’re going to share here is a little more complicated. It has a smorgasbord of smaller experiments going on at the same time, so try to keep up. The experiment was set up as follows:
- The guys at Carz created three different page setups.
- Within every page setup they placed three ad units.
- Every ad unit rotated between 2-3 different ads.
- All three setups ran on desktop devices only.
Each setup was different in placement and size.
Control (Desktop):
- The first ad unit was placed above the fold and rotated between leaderboard style banners.
- Another ad unit was placed on the right hand side of the page and rotated between two different skyscraper banners, a 300 x 600 and a 160 x 600.
- A third ad unit was placed below the fold and above the query box. It rotated between 300 x 250 box banners with different color palettes.
Variation 1 (Desktop):
- Same as the control, an ad unit located above the fold which rotated between leaderboard style banners.
- The skyscraper ad unit from the control was placed on the left hand side instead of the right and only ran 160 x 600 sizes.
- The same ad unit as the control, placed below the fold and above the query box. The ad unit rotated between 300 x 250 box banners with different color palettes.
Variation 2 (Desktop):
- Unlike the control and variation 1, the leaderboard was above the fold but under the car image instead of above it.
- A right side skyscraper ad unit rotated between 300 x 600 skyscrapers with various color palettes (same as the control, except no 160 x 600 sizes).
- Here again ran the same ad unit as the control and variation 1: placed below the fold and above the query box, rotating between 300 x 250 sizes with various color palettes.
Experiment 2: Results
I’m sure you’re having a hard time keeping track of all these variations, and this is all starting to look a bit complicated. But actually, it isn’t. The AdNgin editor allows you to drag and drop ad units on a WYSIWYG (what-you-see-is-what-you-get) interface, similar to the one Wix uses to create quick and easy websites. So it’s actually pretty easy to do. Once the experiment was set up, it was activated and the variations were displayed over 50,000 times to produce the following insights for carz.co.il:
- For overall page CTR, the control demonstrated the highest average CTR of all page variations. It produced a 78% lift in CTR over variation 1 and a 64% lift over variation 2.
- The control for the above the fold ad unit also performed better than its competitors: by 82% over variation 1 and by 41% over variation 2.
- For the skyscraper, the control dominated again, with an 89% increase in CTR over variation 1 and a 67% improvement over variation 2.
- Finally, our below the fold ad unit did not behave any differently from the other ad units, with a 75% improvement for the control over variation 1 and a 59% improvement over variation 2.
What does this all mean?
So now that we’ve crunched the numbers, let’s get down to analyzing the data. Because data without insights is pretty pointless.
- Placing a large skyscraper on the right hand side produces higher CTR than the same skyscraper on the left hand side. You may be thinking, “Wait, this is a Hebrew website, so visitors read from right to left. Does that mean that for left-to-right websites the opposite would be true?” Not necessarily. In the previous case study we conducted for L’express, a French language website, we also found that a page setup heavy on banners on the right got higher CTRs than one heavy on the left. Obviously, this does not mean that the right side will always perform better than the left, but it’s definitely something worth testing on your website, especially because of the second takeaway below.
- We also saw a possible correlation between placing the skyscraper unit on the right hand side and increased CTR and AdSense RPM for the entire page. In other words, always consider how making a change to a certain ad unit can change the flow and user experience of the entire page, for better or worse.
- An ad located high on the page does not guarantee higher CTR than an ad located lower on the page. Though we did not go into detail in these experiments, it’s important to realize that you can place an ad too far above your content. Where did we see this? Variation 2’s above the fold unit was placed lower on the page than in the control and in variation 1, yet it still managed to deliver a 10% increase in CTR over variation 1’s ad unit. But variation 2 still got a lower CTR than the control, which goes to show that when analyzing your AdSense performance you need to analyze it at the page level, not the ad unit level, and that leads to the fourth and final takeaway.
- When analyzing AdSense performance it is best to have a holistic approach. Analyzing performance at the ad unit level can leave you blind to the page’s user experience and will hurt you over the long run. Also, ad units don’t exist in a vacuum. They are placed on a page and have a symbiotic relationship with the other ad units on the page, not to mention the content. Therefore, if you want to have actionable insight from your data and improve your AdSense revenue you need to analyze your entire page setup and observe the effect it has on your RPM.
Putting all this data together can be exhausting. I know I could use a drink right now. But nowadays, with ad blockers, mobile monetization woes, publisher coalitions, and programmatic media buying, it’s more necessary than ever to use your data and start testing your AdSense setup to increase CTR and revenue. In fact, if there is one single metric to optimize for to increase your AdSense revenue, it’s CTR. It can actually also help increase your AdSense CPCs.
For Carz, their experiments provided an overall increase in revenue of 22% and CTR went up by 17%. They continue to see increased revenue by using AdNgin’s platform to test their page setup. If you’d like to see it in action, you can visit their website and refresh your browser to observe the change in ad location, sizes, color palettes (for text ads), and much more.