
Site being penalised by Google

My foldablebikes .co.uk site is being penalised by Google.

It's a shopping site for folding bikes that uses a product feed.

I think the number of dynamically generated pages may be to blame.

I've tried blocking the /shop directory in the robots.txt, but then I get a 'severe health warning' in Google Webmaster Tools.
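For reference, the block was just a straight disallow on the shop directory, something along these lines:

User-agent: *
Disallow: /shop/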

I've submitted some articles and a few blog posts, but no luck getting it moved up in Google's SERPs.

I'm considering moving the /shop directory to a subdomain to see if that helps (shop.foldablebikes...).

Searching 'foldable bikes' on Bing, I'm number 1 in the UK.

Any opinions or suggestions?

Phil
 
Most datafeed sites have been killed off by the big one.

If you want to get it back, it's a redesign, new content, new concept.

But they do come back well, so worth a try.
 
Bin it. Turn it into a proper website. Get a whole bunch of proper content written by ContentNow, then take the top 10 bikes you know were selling and do a proper page of content for each one.

Grow it from there.

End goal should be dropshipping them with one of the shops you were feeding with affiliate sales.

You need terms and conditions, privacy policy and similar pages.
 
You could always try the method that I chose to get my site back into Google...

Although it's probably best not to ask :D

For the first few weeks it involved trying to change my site to please Google; then, after being declined for the third time, I wrote them a strongly worded email with plenty of swearing, a rant, and finally told them I was putting my site back to what it was before it was banned...

Three days later they sent me an email apologising, and now I feel guilty as hell. But they do state that their reinclusion requests are all read by real people.

With yours though, Johnny and JMOT are spot on and I'd say that is the way to go.
 
Cheers Guys

I'm a bit disappointed; I've only recently taught myself how to use datafeeds, PHP and MySQL, too.

I have ordered a few articles and will do a quick re-design like you say.

Cheers

Phil
 
Phil,

Welcome to heartbreak hotel... owned by Google.
 
Here's what I had ...

  • 111 pages.
  • Dynamic page for each bike.
  • Blocked the dynamic and shop pages in the robots.txt file

This resulted in a severe health warning in Google and the site being penalised.

To try and resolve this I un-blocked the dynamic pages and the shop directory, which cleared the severe health warning message in Google Webmaster Tools.

I've removed the dynamic pages, making the 'read more' button go directly to the merchant's site. This has reduced the site from 111 pages to 45.

I will add an articles page when I receive them and remove the table of bikes from the main page.

If that doesn't resolve the big G issues then it will have to be re-design and removal of the feeds.

Thanks

Phil
 
Doesn't it usually give you more information on what the severe health warning is?

Have you changed URLs? (some possible 404s)
Have you got a non-www duplicate?
Have you got unique content on category pages?

As for the robots.txt 'disallowing important pages' warning, I usually just ignore it, as G doesn't seem to understand that I want external links blocked.
 
It was 'blocking important pages'.

I did have some unique content about the brand of each bike.

I'll see what a few more articles bring along with the reduced number of pages.

I may also drop the cheaper bikes and focus on the more expensive models to reduce pages even more.

This would improve the ratio of unique content / datafeed content.
 
It's a shame 'cos I was really chuffed with myself for cracking the datafeed and PHP stuff.
 
I did mention in your 'affiliate data feed' thread that you just can't get away with automating more than 30-40 product pages to begin with; you need to either slowly drip-feed product pages or create them with some unique information.

I had 30 sites running on my own data feed system and had over 60k product pages in total. Ranked well for 6-8 weeks then got a colossal slapping down to 700th / last page of G for every site - site wide penalty for everything. All recovered now, but had to try several things to finally crack it.

What I've learnt - write useful content and reduce data feed content. Your sites will recover, but you need to write lots of good content (which it seems is available in your sector - product information) that is useful for the customer.
 
Are you downloading a datafeed straight into an SQL table, then displaying the content from there.....

....or are you dynamically loading the datafeed straight to your site?
 
Well done for getting the MySQL and datafeeds working. It is an issue with dupe content.

For your theme I'd say go with static pages for each product with unique product descriptions, as these products are less likely to change or go out of fashion like seasonal products.

Or, if you still wish to use your datafeed stuff, set up a separate table and use the same product_id and product_description fields to join to the existing table. Have unique content created and stored in the new table; you will then still be able to run updates of prices etc. on the original table. This should get rid of the dupe content filter/penalty.
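As a rough sketch of the two-table idea (table and column names here are just examples, not your actual schema):

<?php
// Rough sketch only - 'products' is the feed-driven table refreshed from the
// datafeed, 'product_content' holds the hand-written unique descriptions.
// Table/column names and credentials are assumptions, adjust to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=bikes', 'username', 'password');

$sql = "SELECT p.product_id, p.product_name, p.price, c.unique_description
        FROM products p
        INNER JOIN product_content c ON c.product_id = p.product_id";

foreach ($pdo->query($sql) as $row) {
    echo '<h2>' . htmlspecialchars($row['product_name']) . '</h2>';
    echo '<p>' . $row['unique_description'] . '</p>'; // your own copy, not the feed text
    echo '<p>Price: £' . number_format($row['price'], 2) . '</p>';
}
?>

That way the price updates only ever touch the products table and never overwrite the unique copy.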

Or you can try adding the following code to make sure the product landing pages are not indexed, but you'll most likely want to rank for product landing pages too...
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

You may have to wait 60 days for reinclusion.

I've found my feed driven sites are ok if there is a higher percentage of unique content across the site.

Hope this helps.
 
Blocking pages in robots is a sure fire way of telling G that you are larking around.

Produce sites people find interesting and useful. Then sell off the back of that. Not the other way round.
 
Blocking pages in robots is a sure fire way of telling G that you are larking around.

Yes & no.

I block pages that are duplicated across many sites, such as T's n C's, privacy policy and other pages that there's simply no relevance or use in having indexed.

However, I've also never personally blocked dozens of pages from a site.

Again, however, I know several prominent SEO consultants who advise clients to block out their database-driven search results pages so as not to fill Google full of crap... I think if you view it from Google's viewpoint they'd prefer you not to fill their index with crap, so I can't see doing this being a problem if the pages that ARE allowed are very good quality, unique and authoritative.
 
I did mention in your 'affiliate data feed' thread that you just can't get away with automating more than 30-40 product pages to begin with; you need to either slowly drip-feed product pages or create them with some unique information.

What I've learnt - write useful content and reduce data feed content. Your sites will recover, but you need to write lots of good content (which it seems is available in your sector - product information) that is useful for the customer.

Cheers Nick

I had hoped blocking the shop directory via the robots.txt would help with the dynamic content.

I'm waiting for some articles to be written; when I receive them I'll remove the feeds and create a new homepage and articles page.

Once G has indexed that, I'll introduce a few bikes via a feed.
 
Are you downloading a datafeed straight into an SQL table, then displaying the content from there.....

....or are you dynamically loading the datafeed straight to your site?

The only way I know to do it...

Upload a feed to my hosting, then run a PHP script on a webpage to insert the data into my database.

If there's a simpler way I'd be keen to find out.
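In case it helps anyone, the import script is basically just this sort of thing (the feed filename, table and column order are only examples, every network lays its feed out differently):

<?php
// Rough version of the import script - names here are examples only,
// and it assumes product_id is the primary key on the products table.
$pdo = new PDO('mysql:host=localhost;dbname=bikes', 'username', 'password');

$insert = $pdo->prepare(
    "INSERT INTO products (product_id, product_name, price, merchant_url)
     VALUES (?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE product_name = VALUES(product_name),
                             price        = VALUES(price),
                             merchant_url = VALUES(merchant_url)"
);

$handle = fopen('feed.csv', 'r'); // the feed file uploaded to the hosting
fgetcsv($handle);                 // skip the header row
while (($row = fgetcsv($handle)) !== false) {
    // assuming the feed columns are: id, name, price, deeplink
    $insert->execute(array($row[0], $row[1], $row[2], $row[3]));
}
fclose($handle);
?>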
 
The only way I know to do it...

Upload a feed to my hosting, then run a PHP script on a webpage to insert the data into my database.

If there's a simpler way I'd be keen to find out.

That's what we do - it also then allows you to pull in data from other providers and offer price comparisons across a few providers with very little extra coding.
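For example, if each provider's feed is imported into the same products table with a merchant column to say where it came from, a basic comparison is little more than this (names are assumptions again):

<?php
// Sketch only - assumes every provider's feed goes into one 'products'
// table with a 'merchant' column identifying the provider.
$pdo = new PDO('mysql:host=localhost;dbname=bikes', 'username', 'password');

$stmt = $pdo->prepare(
    "SELECT merchant, price, merchant_url
     FROM products
     WHERE product_name LIKE ?
     ORDER BY price ASC"
);
$stmt->execute(array('%Brompton M3L%')); // example product only

foreach ($stmt as $row) {
    echo htmlspecialchars($row['merchant']) . ' - £' . number_format($row['price'], 2) . '<br>';
}
?>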
 
Why would G penalise you for dynamic pages? Makes no sense.

You won't find many stores creating a static page for their products - almost all would be creating some dynamic page (or thousands of dynamic pages) without penalisation.

Again many store owners would be displaying content from a database - again no penalisation should exist for this either.

For product feeds it's a case of removing anything that could be duplicated across other affiliates (same dynamically called image, same title, same description etc.); it's this you need to rewrite. It works OK if your products are rarely changing.

You can still use your dynamic feed for the prices - add it into a second table and do an inner join in your PHP command.

The tricky part is hiding the affiliate link... this I think is probably the most difficult part. Keep it in and G penalises you. Hide it and G penalises you. Direct the link to an inner page, add a nofollow to the link and use .htaccess to redirect to your affiliate code, and G becomes suspicious and penalises you.
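As a rough illustration of the 'inner page' approach - this version uses a small PHP redirect script instead of .htaccess, and the file, table and column names are made up for the example:

<?php
// go.php - sketch of an outbound redirect page, linked to as
// <a href="/go.php?id=123" rel="nofollow">Buy now</a>
// and disallowed in robots.txt so G doesn't crawl the redirects.
// Table and column names are examples only.
$pdo = new PDO('mysql:host=localhost;dbname=bikes', 'username', 'password');

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

$stmt = $pdo->prepare("SELECT merchant_url FROM products WHERE product_id = ?");
$stmt->execute(array($id));
$url = $stmt->fetchColumn();

if ($url) {
    header('Location: ' . $url, true, 302); // off to the merchant's affiliate deeplink
} else {
    header('Location: /', true, 302);       // unknown id, fall back to the homepage
}
exit;
?>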

Damned if you do and damned if you don't
 