22-11-2008, 08:48:40 PM | #1
How Many IPs on a Class C
Over in a 'Contextual Advertising' thread the subject digressed and a question was posed as to how many sites are OK on the same IP/subnet.
In a video by Matt Cutts (http://video.google.com/videoplay?do...60678227172395) he says that 5 is OK, but with 2,000 you might get dropped.
We have 400, but we used experts to create content that was very expensive to produce, is highly ranked, and is well linked to from great sources. All of the links were voluntary, without exception; nothing was paid for. We aimed to make really great content.
In the contextual advertising thread another publisher had 5 sites and got dropped out of the index.
I am unsure how many sites might be safe. On one side, you could argue that if there are enough votes (links) then Google sees a trusted source of quality content. On the other, you could argue that Google might be asking 'why is there a need for 400+ sites on one IP?'
I was going to build up to 1,000 sites, but to be honest I am worried, and I think we might stop where we are and concentrate on making what we have much better. Anyone have any thoughts?
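For anyone who wants to check their own setup against the subnet question above, here is a minimal sketch (the domain names are placeholders, not from this thread): it resolves each domain and buckets it by the first three octets of its IP address, i.e. the /24 block that 'Class C' loosely refers to.
[CODE]
# Minimal sketch: resolve each domain and group by its /24 ("Class C") block.
# The domain list is a placeholder for illustration only.
import socket
from collections import defaultdict

domains = ["example-site-1.co.uk", "example-site-2.co.uk", "example-site-3.co.uk"]

by_subnet = defaultdict(list)
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)           # resolve to an IPv4 address
    except socket.gaierror:
        continue                                    # skip domains that do not resolve
    subnet = ".".join(ip.split(".")[:3]) + ".0/24"  # first three octets = the /24 block
    by_subnet[subnet].append(domain)

for subnet, sites in sorted(by_subnet.items(), key=lambda kv: -len(kv[1])):
    print(f"{subnet}: {len(sites)} site(s): {', '.join(sites)}")
[/CODE]
A few hundred of your own domains landing in one bucket is the situation being discussed in this thread.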
22-11-2008, 09:20:14 PM | #2
'Experts' and content can cost a lot, but that is zero guidance on quality.
One thing I noted from clicking through on another thread and a Google search is that you have a lot of similar text, i.e. contact pages etc., so it might not just be hosting locations that cause issues (a rough way of checking for this is sketched after this post).
It will not help with ease of scale, but why not treat them all as separate sites, or at the very least as smaller groupings?
You mentioned that if Google linked the lot together you would be out of the game, and to be honest that could happen tomorrow with the current setup, IMHO.
The current risk does seem higher than it needs to be, but you are in a lucky position, with the resources and awareness to be proactive rather than reactive to any Google slap.
Google likes good 'proper' sites, but you are creating lots of high-quality MFA-style sites. You need to convince Google that is not the case, and things like 400 sites on one IP, similar text, and similar styles will not help with that.
Not sure if this is better suited to the other thread, but it's all related.
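On the similar contact-page text mentioned in the reply above, here is a minimal sketch of one way to check for shared boilerplate, assuming you have already fetched the page text for each site (the sample strings below are made up): it compares word 5-gram shingles with a Jaccard score, so pages that share most of their wording score close to 1.0.
[CODE]
# Minimal sketch: Jaccard similarity over word 5-gram shingles.
# page_a and page_b are placeholder strings; in practice use the fetched page text.
def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

page_a = "Contact us by email or phone. Our office is open Monday to Friday nine to five."
page_b = "Contact us by email or phone. Our office is open Monday to Friday ten to four."

print(f"similarity: {jaccard(page_a, page_b):.2f}")  # high scores suggest shared boilerplate
[/CODE]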
22-11-2008, 09:32:21 PM | #3
An average shared server will host 300+ sites, and a well-built cluster will have 10,000+ (owned by different people).
I know Servage has tens of thousands that are all interlinked (porn) and have not been de-indexed; I have a dozen or so sites on there.
I think mini-sites, sites that offer no real benefit to the index, and stuff like that may suffer, but real sites shouldn't have many problems.
Last edited by Skinner; 22-11-2008 at 09:37:52 PM.
22-11-2008, 10:13:52 PM | #4
This can come down to WHOIS records, site structures, repetitive content, etc.
The question is what the difference is between an MFA-style site and a 'real' site, and solving that problem is Google's product.
23-11-2008, 01:16:27 AM | #5
I think WHOIS is the key here: they compare the WHOIS records across the sites, and if they are very, very similar and the content is fodder, the sites get the chop.
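As a rough illustration of the WHOIS overlap being described, here is a minimal sketch; it assumes the standard Unix whois command-line tool is installed and that the registries involved expose contact details in the raw record (the domain names are placeholders). It groups domains by the email addresses found in their WHOIS output as a crude proxy for 'same registrant'.
[CODE]
# Minimal sketch: group domains by email addresses found in their raw WHOIS records.
# Assumes the Unix `whois` CLI is available; domain names are placeholders.
import re
import subprocess
from collections import defaultdict

domains = ["example-one.co.uk", "example-two.co.uk"]

by_contact = defaultdict(set)
for domain in domains:
    raw = subprocess.run(["whois", domain], capture_output=True, text=True).stdout
    for email in set(re.findall(r"[\w.+-]+@[\w.-]+\.\w+", raw)):
        by_contact[email.lower()].add(domain)       # same contact email across domains

for email, sites in by_contact.items():
    if len(sites) > 1:
        print(f"{email} appears in the WHOIS of {len(sites)} domains: {sorted(sites)}")
[/CODE]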
23-11-2008, 09:35:09 AM | #7
In my case I think Google might also ask: why 1,000 sites with 100 pages each, and not one site with 100,000 pages? My reason was to create 'expert' sites. The problem is that the Google algorithm might not think like me, as per that Matt Cutts video.
23-11-2008, 12:40:49 PM | #8
I am facing this problem at the moment. My issue is that I want to create local geo sites with useful information that people feel "ownership" of. Many others do the one URL/town approach, but I want the actual town URL for mine. It's just a different way of doing it. However, I want to avoid being dumped by G.
My view is that the sites need to be different in many ways: structure, etc. I am planning on using a few different hosts (eggs in a few baskets, etc.). I won't generally be linking between them, and where I do, I'll consider nofollow.
John, I like what you have done, and it's a similar project to mine in many respects. Like you say, you have to ask why you didn't just spend your 400 x .co.uk budget on one mega generic... and your answer is that you want that "expert" status reflected in the URL. I think that's a very honest way of thinking. The issue I have found here is that it would be much easier to build inlinks to a one-URL site than to one with 400; that's a 400x dilution. Maybe that's the issue: our competitors therefore have a 400x advantage on votes... just a thought. Good job though, especially the girl in the sewing video. Wow!
23-11-2008, 12:50:01 PM | #9
It would be interesting to see how the two models would stand up, and which would be preferable in the end, i.e.:
400 x mini sites with content
one 'information.co.uk' site with several hundred subdomains or the like.
As above, there is no chance of getting info.co.uk or the like for four figures, but it could be worth a think: if you could migrate to such a setup, you would instantly have a pile of sites to help you boost it.