
How Many IPs on a Class C

Status
Not open for further replies.
Over in a 'Contextual Advertising' thread the subject digressed, and a question was posed: how many sites are OK on the same IP/subnet?

In a video (http://video.google.com/videoplay?docid=3583760678227172395), Matt Cutts says that 5 is OK, but with 2,000 you might get dropped.

We have 400, but we used experts to create content that was very expensive to produce, is highly ranked, and is well linked to from great sources. All of the links were voluntary, without exception: nothing paid, etc. We aimed to make really great content.

In the contextual thread another publisher had 5 sites and got dropped out of the index.

I am unsure about how many sites might be safe. On one side of the argument, one could say that if there are enough votes (links), then Google finds a trusted source of quality content. On the other, one could argue that Google might ask: 'why is there a need for 400+ sites on one IP?'
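For anyone curious how 'same IP/subnet' would even be counted, here is a minimal sketch. The domain names and addresses below are invented for illustration; a real check would first resolve each name via DNS (e.g. `socket.gethostbyname`) and then group by the /24 network, which is what people usually mean by a "Class C":

```python
import ipaddress
from collections import defaultdict

def group_by_subnet(hosts, prefix=24):
    """Group hostname -> IPv4 address pairs by their /prefix subnet."""
    groups = defaultdict(list)
    for name, addr in hosts.items():
        # strict=False lets us pass a host address and get its containing network
        net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
        groups[net].append(name)
    return dict(groups)

# Hypothetical resolved addresses; in practice use socket.gethostbyname(name).
hosts = {
    "sewing-example.co.uk": "203.0.113.10",
    "childsafety-example.co.uk": "203.0.113.11",
    "unrelated-example.com": "198.51.100.7",
}

for net, names in group_by_subnet(hosts).items():
    print(net, "->", sorted(names))
```

Two of the three invented hosts land in the same 203.0.113.0/24 group, which is the situation the thread is worried about at a 400-site scale.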

I was going to get up to 1,000 sites, but to be honest I am worried, and I think we might stop where we are and concentrate on making what we have much better. Anyone have any thoughts?
 
'Experts' and content can cost a lot, but that is zero guidance on quality.

One thing I noted from clicking through on another thread and a Google search is that you have a lot of similar text (e.g. contact pages), so it might not just be hosting locations that cause issues.
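For what it's worth, near-duplicate text of the kind described (boilerplate contact pages repeated across sites) is easy to detect mechanically. One common technique is word-shingle Jaccard similarity; a rough sketch follows, with invented sample strings:

```python
def shingles(text, k=3):
    """Set of k-word shingles from the text, lower-cased."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Invented example pages: two near-identical contact pages and one unrelated page.
page_a = "Contact us by email or phone and we will respond within one working day"
page_b = "Contact us by email or phone and we will reply within one working day"
page_c = "Our sewing tutorials cover hems seams and buttonholes for beginners"

print(round(jaccard(page_a, page_b), 2))  # high: near-duplicate
print(round(jaccard(page_a, page_c), 2))  # near zero: unrelated
```

A search engine's duplicate detection is far more sophisticated (e.g. MinHash over shingles), but even this crude score separates boilerplate from genuinely distinct pages.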

It will not help with ease of scale, but why not treat them all as separate sites, or at the very least as smaller groupings?

You mentioned that if Google linked the lot you would be out of the game, and to be honest that could happen tomorrow with the current setup, IMHO.

It does seem a higher-than-needed risk currently, but you are in a lucky position, with the resources and awareness to be proactive rather than reactive to any Google slap.

Google likes good 'proper' sites, and you are creating lots of high-quality MFA-style sites. You need to convince Google that is not the case, and things like the 400 on one IP, similar text, and similar styles will not help.

Not sure if this is better on the other thread, but it's all related :)
 
An average shared server will have 300+ sites, and a well-built cluster will have 10,000+ (owned by different people).

I know Servage has tens of thousands that are all interlinked (porn) and they have not been de-indexed; I have a dozen or so on there.

I think minisites, and sites that offer no real benefit to the index, stuff like that, may suffer, but real sites shouldn't have many problems.
 
An average shared server will have 300+ sites, and a well-built cluster will have 10,000+ (owned by different people).

I know Servage has tens of thousands that are all interlinked (porn) and they have not been de-indexed; I have a dozen or so on there.

I think minisites, and sites that offer no real benefit to the index, stuff like that, may suffer, but real sites shouldn't have many problems.

Agreed - Google has the tech to 'see' the difference between hosting companies and mini networks of sites.

This can come down to WHOIS records, site structures, repetitive content, etc.

The question is the difference between an MFA-style site and a 'real' site, and the solution to that problem is Google's product :)
 
I think WHOIS is the key here: they compare the WHOIS to the sites, and if they are very, very similar and the content is fodder, they get the chop.
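Purely to illustrate how mechanical a 'compare the WHOIS' check could be: the standard library's `difflib` gives a crude similarity score between two record texts. The records below are invented; real ones would come from a port-43 WHOIS query (RFC 3912) or a registrar API:

```python
import difflib

def whois_similarity(record_a, record_b):
    """Crude 0.0-1.0 similarity score between two WHOIS record texts."""
    return difflib.SequenceMatcher(None, record_a, record_b).ratio()

# Invented example records: two sharing a registrant, one unrelated.
rec1 = ("Registrant: Example Media Ltd\n"
        "Address: 1 High St, London\n"
        "Nameserver: ns1.example-host.net")
rec2 = ("Registrant: Example Media Ltd\n"
        "Address: 1 High St, London\n"
        "Nameserver: ns2.example-host.net")
rec3 = ("Registrant: Unrelated Co\n"
        "Address: 99 Ocean Ave, Sydney\n"
        "Nameserver: ns1.other-dns.org")

print(round(whois_similarity(rec1, rec2), 2))  # near-identical registrant details
print(round(whois_similarity(rec1, rec3), 2))  # clearly different registrant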
 
So if google has the tech to compare the whois - if you stuck up 100 interlink sites all on different servers would you still get penalised?
 
The question is the difference between a MFA style site and a 'real' site, and the solution to that problem is google's product :)

I am sure that Google is fine with MFA sites - what about about.com which could be defined as a MFA site. I guess the issue is the quality threshold of the content.

In my case I think Google might also ask - why 1000 sites with 100 pages and not one site with 100,000 pages. My reason was to create 'expert' sites. The problem is that the Google algorithm might not think like me - as per that Matt Cutts video.
 
I am facing this problem at the moment. My issue if that I want to create local geo sites with useful information that people feel "ownership" of. Many others do the one URL/town but I want the actual town URL for mine. It's just a different way of doing it. However, I want to avoid being dumped my G.

My view is that the sites need to be different in many ways. Structure etc. I am planning on using a few different hosts (eggs in a few baskets etc). I won't be linking between them, and where I do I'll consider nofollow.

John, I like what you have done. And it's a similar project to mine in many respects. Like you say you have to ask why didn't you just spend your 400 x .co.uk on one mega generic... and your answer being that you want that "expert" status reflected in the URL. I think thats a very honest way of thinking. The issue I have found here is that it would be much easier to build inlinks to a one URL site, than one with 400, thats a 400 x dillution. maybe thats the issue, that our competitiors therefore have a 400 x advantage on votes... just a thought. Good job though, especially the girl in the sewing video. Wooow!
 
My view is that the sites need to be different in many ways. Structure etc. I am planning on using a few different hosts (eggs in a few baskets etc). I won't be linking between them, and where I do I'll consider nofollow.

Good way of doing it, and juggling the WHOIS records help. I am sure Google does not break Nom's t&c's but somehow they know whats what on .uk as well :)

Like you say you have to ask why didn't you just spend your 400 x .co.uk on one mega generic... and your answer being that you want that "expert" status reflected in the URL. I think thats a very honest way of thinking. The issue I have found here is that it would be much easier to build inlinks to a one URL site, than one with 400, thats a 400 x dillution.

Interesting point, however 400 x £5 = £2k and that wont get you far in mega generic style names.

It would be intersting to see how the two models would stand up, and at the end which would be preferable ie.

400 x mini sites with content
one 'information.co.uk' site with several hundred subdomains or the like.

As above no chance of getting info.co.uk or the like for four figures, but it could be worth a think as if you could migrate to such a setup you would instantly have a pile of sites to help you boost it.
 
As above no chance of getting info.co.uk or the like for four figures, but it could be worth a think as if you could migrate to such a setup you would instantly have a pile of sites to help you boost it.

Remember the age effect too. Buying a 1997 name for example could jump you beyond the G MFA trigger.
 
John, I like what you have done. And it's a similar project to mine in many respects. Like you say you have to ask why didn't you just spend your 400 x .co.uk on one mega generic... and your answer being that you want that "expert" status reflected in the URL. I think thats a very honest way of thinking. The issue I have found here is that it would be much easier to build inlinks to a one URL site, than one with 400, thats a 400 x dillution. maybe thats the issue, that our competitiors therefore have a 400 x advantage on votes... just a thought. Good job though, especially the girl in the sewing video. Wooow!

It's not clear that building links into one site is easier. It's also probable that having a tight subject expert site is less confusing for the visitor and links well with related sites.

On the subject of domain name I don't believe age is any factor. We bought a couple of older domains and put new content on - no difference to free reg. The search engines will know ownership has changed hands and content changed. I figure both of those and you're out. Best way would be to buy an old site, keep the content and change it over time. We never did this but hear it's worked well for lots of people.

We always use free reg. Wordtracker the keywords and make sure they are in the domain name. Costs 2.50 a year and works a treat. Unless you're building a brand etc far better to spend the cash on content.
 
I feel your better to build authority with one site rather than 400,

Age of domain is not the factor but age of site in the index along with and more importantly age of & quality of backlinks carries plenty of weight. If you had great content on one site you would be more than the sum of your parts imo and especially when it came to ranking and rolling out new parts of the site it would be a breeze.

I'd buy a quality generic with age that already ranked for it's name and suited an umbrella site like about or howto. Ranking 400 sites against using one massive vehicle would seem less risk as you have lots of sites but I'd argue the reverse and say running one larger site carries less risk if you do it well with good intentions and play by the rules and the ability to pass the link love through one site works better than interlinking and thus avoiding the linkfarm label.
 
John,
i love what you have done in terms of committing to having quality content written. I think with an advertising budget that it is a great idea to commit more to quality content than to buying links. If the content is good enough to get organic links then its so much better than having mediocre content and buying links, longer term.

However, i would be think hard about the amount of different sites you have on the same server, whois etc. This would concern me more than one large authority site. The reasons are the following:-

1) your sites run the risk of being tarred with the same brush, that means that if at any time you offend the Google gods you run the risk of all the sites going the same way.

2) you argue that you would have unique sites as the content is so specific. however sites like about.com, wikipedia and indeed any authority content sites manage fine with one larger site.I would argue that amazing content is more wasted on 400 sites than it wuld be on one umbrella site.

3) it is a bigger headache to run and market 400 sites than one large one.Tracking, updating, hacking, monitoring etc.

4) will any of these sites become enough of an authority on its own to rank for the cream of the terms that you are after?

5) would say that to do amazingly with seo you do most often have to push the boundaries of what is acceptable to Google. Having all your sites asscocitated will always stop you from anything other than the purest white hat marketing, anything else would pose to much risk to the network.

6) another advantage of having alotof sites is the ability to split test marketing methods and find what works, somtimes this may mean being a little naughty with some. Again having them all together is a bit restrcitive IMO.

7) funny thing is if you have too much success with those 400 sites, that can pose a problem on its own as you come very much on the Google radar and not even the purest site owner wants that.

8) algorithmically exact match domains at the moment are very favorable and it s possible to buy an exact match name, market it and rank it top. What if this trend changes as they invariably do towards the 1000 page sites who have these exact match terms in their sub pages?

9) if the end plan is a sale, its alot harder for so many reasons to sell 400 sites than it is one authority site. I know as once had 150 and sold them for alot less than probably 1 site would have made.Reason was i was handing the company a business that only i really knew how to manage and it was a big headache for them to take them all on.

Its all horses for courses but the industries i work in there is no single site in the top 15 that is just an amazing content site and doesnt aggressively market their sites by buying links and similar. Barring wikipedia, and thats the only exception. If your sites are in the industries where you will want the massive keywords it will be hard for any one of those sites to get it without more risk.

There are probably 10 better reasons i could think of with a few more minutes but the bottom line is, you have spent alot of money on content, sites etc. Why take an unecessary risk? At least take steps to reduce your risk substantially now or maybe wish you had before.

Rob
 
Last edited:
--> What he said..

Damn, you put it so much better than I did Rob
 
Hi Rob,

Thanks for the well considered reply. I'd just like to say that our reason for minisites was to create tightly focussed sites. I could argue:

1. How can one large site be expert? Is it not better to have a site about each subject? As an example the tech stuff on about.com is pants. Absolutely no use whatsoever unless you're clueless in the first place. If I'm struggling with Javascript then I want to go to a site that's called something like 'myjavacriptexpert.com' and not 'geteverythinghere.com'. I must say from a readers point of view this seems to work. From a Searchengine point of view there are worries.

2. I have never hidden anything from Google and doubt I ever could. We have all our sites on analytics with the same tracking code, in Webmsater Central etc. There is no way that I believe anyone could ever hide anything. If you did you'd have to use separate machines in separate locations etc to access the sites. I believe they would just know and I'd hate to work under some threat of getting caught - imagine doing something like that and one day forgetting to turn off Pagerank view in the Google toolbar. It may be a way to make a few quid but it's not a business.

3. We rank really well on all our sites. Take say the word sewing sewing - Google Search where we are number one. Child safety child safety - Google Search where we are number three. All our sites rank like this. Our sites also get fantastic feedback consistently as can be seen from our readers comments on each site. I think it's the links that are given to use and references by .gov's, police, bbc etc that give us the credibility. Again this is down to a tightly focussed site.

4. We have never been on the edge of SEO - not ever. I have spent money on recruiting the best writers to create the best content we could. When we have written bad content and our readers have told us we have had it rewritten.

5. I believe Google wants great content to rise to the top and poor content to be deleted or demoted. We work on that principle.

6. We have never ever purchased or sold a link and never will. We've been offered large sums of money to host pages written by large multinational companies containing a text link. If we did and Google caught us (and they would) it's a clear case of us helping another company manipulate rankings.

7. I think it might be time to stop building sites and start building each one into something major. We will try and make each site a major site.

In summary from a business point of view I am happy with the decision I made 3 years ago. I just worry that I might breach some Google algorithm. I do believe though that our site template etc will have had a human site review by Google as we get over 100,000 page views per day from their search vistors.

Thanks for your help and thoughts.



John
 
Status
Not open for further replies.

The Rule #1

Do not insult any other member. Be polite and do business. Thank you!

Featured Services

Sedo - it.com Premiums

IT.com

Premium Members

AucDom
UKBackorder
Be a Squirrel
Acorn Domains Merch
MariaBuy Marketplace

New Threads

Domain Forum Friends

Other domain-related communities we can recommend.

Our Mods' Businesses

Perfect
Service
Laskos
URL Shortener
*the exceptional businesses of our esteemed moderators
Top Bottom