
Bounce Rates

A while ago newbie (http://www.acorndomains.co.uk/new-domainers/47778-uk-hosting-multiple-sites-2.html#post170998) posted about my bounce rates, saying that 20% is hard to achieve and 12% is amazing. I started doing SEO on said site, and my bounce rate has now consistently (for the last few months, anyway) been 2-3%, so under 3%.

What I have noticed is that the more I tweak my demo site, the less user loyalty I seem to have. I've had up to 95% new users, but it averages around 50%, where before it used to be about 20% new users, so user loyalty seems to be the price of a low bounce rate.

What I'm wondering is, what is considered 'normal'? When I go to the "Benchmark" set, my site is mostly positive, but new users, time on site and total page views are all negative.

Is it possible to fix one problem without creating another? I seem to have fixed my bounce rate but created more problems than I fixed.

Here is a screenshot from Analytics - does that look normal? I don't normally use Analytics, but figured it may help.

Has anyone else got a site with similar uniques to compare ?
 

Attachment: Clip0001.jpg (77.2 KB)
That's quite amazing to see. It's usually thought that a high percentage of regular visitors can raise your bounce rate on the basis of 'bookmark visiting' (a favourite page to get what they expect, then they're off again). However, your site seems to go to the other extreme. Has the total number of visitors been fairly level, but with a changed split between regular and new visitors? Also, are they all entering through the same page, or is it random?

I wouldn't knock the low bounce though - your site appears to get the visitors' attention based on those figures!
 
Spread over half a dozen or so entry pages.

Only one page has a high bounce rate, which is the index page. I think the large header (300px) is the cause of that, though. At a resolution lower than 800, a lot of the page is lost to the header images, and the actual content sits halfway down the page with not much above the fold, but I'll work on that.
 

Attachment: enter.jpg (33.1 KB)
Oh, the overall visitors to the site are increasing by about 100 uniques a week, maybe more. I haven't looked much at it, but bandwidth use is slowly growing. Let me log in to AWStats and look at the numbers.

I'm just a little concerned that I'm sending my visitors off elsewhere and they ain't coming back.
 
Right, I've had Analytics running for the six weeks since I started to tweak the site.

In taking my bounce rate down to 1.5% of 4,000 uniques, I have reduced the 'on-site time' from about 2:40 to about 1:10, and new users went from 30% up to 98% at first, but this last week have settled around 65%.

Revenue from ads has gone down despite traffic growing 1,000%, and affiliate revenue has also suffered.

This is making me think my bounce rate was high before because people found the affiliate item they wanted and buggered off to my affiliates, whereas now they are liking my site so much they don't want to go :p

Amazon used to get about 200 clicks a week, and about 10 sales from that; now it's about 75 clicks and 0 sales.

Anyway, I'm stopping work on this site because I seem to plug one hole only to expose a bigger one :(

I just thought my experiences may help someone concerned with a high bounce rate :)
 
Bounce rate should be measuring visitors that come into the site and immediately go back whence they came, rather than those that find what they want and leave via an alternative route.

I'd be concerned if the bounce rate was in the 70%+ range, but I see many good sites that are consistently at a 40-50% bounce rate.

The 2.21% bounce rate coupled with a 1:10 time on site really doesn't make a lot of sense to me; a lower bounce rate should imply higher times on site.
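For reference, the way most analytics packages derive these two figures can be sketched roughly like this (an illustrative example only - the session timestamps are made up, and real packages differ in the details):

```python
# Illustrative bounce-rate / time-on-site calculation.
# Each session is the list of page-view timestamps (seconds) for one visit.
sessions = [
    [0],            # one page viewed -> counts as a bounce
    [0, 40, 95],    # three pages over 95 seconds
    [0, 70],        # two pages over 70 seconds
]

bounces = sum(1 for s in sessions if len(s) == 1)
bounce_rate = 100 * bounces / len(sessions)

# Time on site is usually measured first hit to last hit, so a bounce
# contributes 0 seconds -- which is why a *lower* bounce rate normally
# drags the average time on site *up*, not down.
avg_time = sum(s[-1] - s[0] for s in sessions) / len(sessions)

print(round(bounce_rate, 1))  # 33.3
print(avg_time)               # 55.0
```

That mechanism is why the combination of a 2% bounce rate and a falling time on site looks odd: the sessions being added back in are multi-page, so they should be pulling the average up.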
 
I don't normally use Analytics - the data returned by Analytics doesn't match my server stats. Server stats show 500-ish uniques per day; Analytics currently shows 200-ish.

I hadn't touched the site for a few days before that last post, and as of today the stats look like this:
14th Sept - 14th Oct
3,812 Visits
3,577 Absolute Unique Visitors
18,417 Pageviews
4.83 Average Pageviews
00:01:08 Time on Site
1.55% Bounce Rate
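As a quick sanity check, the figures quoted above are at least internally consistent - the average pageviews line follows from the visits and pageviews lines:

```python
# Check the Analytics figures quoted above (14th Sept - 14th Oct).
visits = 3812
pageviews = 18417

avg_pageviews = pageviews / visits
print(round(avg_pageviews, 2))  # 4.83, matching the reported average
```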

AWStats again shows different numbers of uniques, visitors and page views for the same period, so I'm not sure if Google doesn't count every hit, or if my Google stats are messed up, or what.

I've given up on Analytics - the numbers seem wacky.
 
AWStats will show all visitors to the site - Google Analytics will only show those with javascript enabled, so all search engine spiders, email scraper bots and other bots won't show on the GA numbers.

GA is a fairly good indicator of actual real people looking at the site; server-side stats are very useful for analysing bot traffic and server errors.
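You can approximate that split yourself by filtering the raw server log on user agent - a rough sketch (the bot substrings below are just common examples; GA's real filter is whether the client executes JavaScript, not a string match):

```python
# Rough split of raw log hits into "human-ish" and "bot" traffic by
# user-agent string. Only an approximation of what GA implicitly does.
BOT_MARKERS = ("bot", "spider", "crawler", "slurp")

def looks_like_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

hits = [
    "Mozilla/5.0 (Windows NT 5.1) Firefox/3.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 Yahoo! Slurp",
]

humans = [ua for ua in hits if not looks_like_bot(ua)]
print(len(humans))  # 1
```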
 
Thanks Ian, only just saw this.

So Google will not see mobile phone users for the most part (according to AWStats, 50+ views a day), and won't track the RSS readers either, which are about 200 a day according to AWStats.

Any suggestions anyone for tracking RSS/Mobile users ?
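One common approach for JS-less clients (not something suggested in this thread, so treat it as an assumption) is a server-side tracking pixel: embed a tiny image in the feed/mobile pages and count requests for it, since fetching an image needs no JavaScript. A minimal sketch using only Python's standard library - the path `/pixel.gif` is a made-up name:

```python
# Minimal tracking-pixel handler: logs each request for /pixel.gif, so
# JS-less clients (RSS readers, old mobile browsers) still get counted.
from http.server import BaseHTTPRequestHandler

# A tiny 1x1 transparent GIF payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

hits = []  # in real use, append to a log file or database instead

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/pixel.gif"):
            # Record who fetched the pixel, then serve the image.
            hits.append(self.headers.get("User-Agent", "unknown"))
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Content-Length", str(len(PIXEL)))
            self.end_headers()
            self.wfile.write(PIXEL)
        else:
            self.send_response(404)
            self.end_headers()
```

Served with `http.server.HTTPServer`, each pixel request in the log is one feed/mobile view that JS-based analytics would miss.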

Just an update on this little SEO project, 2 months after I started :)

I can't seem to make users stay on the site longer than 1:XX minutes, but I can make them view more pages. I have taken the total page views as high as 10, but settled for 6-7, which allows my interstitial ads to kick in twice, on pages 3 and 5.

The bounce rate just seems to keep falling.

Here is a current snapshot, with the interesting point that Firefox has now beaten IE.
 

Attachment: spots.jpg (48.9 KB)
AWStats will show all visitors to the site - Google Analytics will only show those with javascript enabled, so all search engine spiders, email scraper bots and other bots won't show on the GA numbers.

Depending on the version of AWStats this is true; however, I use:

Advanced Web Statistics 6.9 (build 1.925) - Created by awstats (plugins: geoipfree)

Which splits Viewed traffic and Not viewed traffic.

Definition of the latter is "Not viewed traffic includes traffic generated by robots, worms, or replies with special HTTP status codes."

There is an even later one - 6.95 - out now.
 
Advanced Web Statistics 6.9 (build 1.925) - Created by awstats (plugins: geoipfree)

This is the version I have, but the stats clearly don't match Google's.

When looking deeper into my stats, I have hundreds of daily hits on the RSS feed and by mobile phones included in AWStats, but would the RSS/mobiles account for a 60% discrepancy?
 

Attachment: spots2.jpg (56.9 KB)
Does it show you the split on the overview at the top?

Likewise mobile use... that is still web browsers?! ;)
 
Yeah, the breakdown only breaks down pages/hits/bandwidth, not uniques.

So I assumed that AWStats discarded this data and didn't include it elsewhere?

If I remove the not-viewed numbers and compare the result to Google, it leaves a 6-7k page difference, which may be the number of mobile devices and RSS feeds that Google is missing because of JavaScript.

I know both my mobiles don't do JavaScript, which makes browsing Acorn a nightmare due to the dropdown menus, so if Google doesn't work without JS, as above, that may explain the difference, made up of mobile devices/RSS loads.
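Putting rough numbers on that reasoning (the 500 vs 200 daily uniques and the RSS/mobile counts are figures quoted earlier in the thread; attributing the whole gap to JS-less clients is an assumption):

```python
# Rough check of whether JS-less traffic could explain the AWStats/GA gap.
server_uniques = 500   # daily uniques per AWStats (earlier in the thread)
ga_uniques = 200       # daily uniques per Google Analytics

rss_hits = 200         # daily RSS loads per AWStats (quoted above)
mobile_hits = 50       # daily mobile hits per AWStats (quoted above)

gap = server_uniques - ga_uniques   # uniques GA never saw
js_less = rss_hits + mobile_hits    # hits GA can't track without JS

print(gap, js_less)  # 300 250 -- most, but not all, of the gap
```

So JS-less traffic could plausibly account for the bulk of the discrepancy, with the remainder down to bots the two tools classify differently.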
 

Attachment: spots3.jpg (45.9 KB)
Pages in your first screen grab are 67,565 and in the second one pages viewed are 67,604 - so I guess that's the same data, just taken at different times :)

i.e. the day-by-day data is the 'viewed' traffic, and does not count the 'not viewed'.

This logic matches the site stats I am looking at for one of mine as well, so it's not a fluke!

As for the 6-7k page difference, I think there has to be a margin of error due to the two software products not sharing a definition of what counts as a bot and the like.

Likewise, the two stats programs are by their nature different beasts: by definition Google can only see what you let it see, whereas AWStats 'sees' the lot. Perhaps this is best shown with 404/500 pages, where the server shows an error page but there is no Google JS on it.

Likewise, AWStats needs updating, whereas Google knows things much quicker and can identify (and bin) bots sooner.

10% as an error margin just sounds high as a gut feeling, though - is the site non-standard in any manner?
 
This site has WordPress and some custom mods on it, that's about it; afaik every page served, including errors etc., has the code on it.

I reloaded the page for the second screenshot, so it updated the data by a few minutes, hence the page increase.

So is it safe to assume Google's stats are accurate, as is the bounce rate etc.? I'm really not liking the amount of my time I'm devoting to this because of Analytics.
 