
An up-to-date domain list of all .co.uk and .org.uk names?

Status
Not open for further replies.
Hello everyone - this is my first post here.

Does anyone know how I can obtain a full list of all REGISTERED .co.uk and .org.uk domains that I can update every day? Or does anyone know of a script that does this, and where the info is pulled from?

I could then scan this with my own script to find domains that are due to expire on a certain date... Is this normally how one would compile a full, accurate, up-to-date list of domains due to expire on a given date?

Any advice would be appreciated.


Thanks and Regards,
Andy
 
This must be one of the most frequently asked questions.

It does not exist.

That's right, there literally is NO list of all registered UK domains.

Why? Unlike .com, .co.uk has no "zone files" or equivalent available to the public (not for about 7 years now, I believe) so there's literally no way to find out what domains are registered or not.

That's why domainers have by and large each independently had to make their own lists to try and find out (a subset of) what has gone or not. And yes, that means of course that NOBODY has a fully accurate picture. Which is an opportunity as well as a problem, in that you may be able to find domains about to drop that nobody else has noticed yet...
 


Good post.
 
Thanks for the reply Edwin.

So, based on what you have said, how do the several companies that offer advance drop lists manage it?

How are they pulling / scraping the names?

I guess once they get a list of domains, they have a script that queries WHOIS to see if each one is suspended etc., then works out the exact drop date from the domain's registration date.

But where / how are they getting the actual domain names to check to begin with?

What sort of script etc. are they working with?

Any help would be appreciated.

Regards,

Andy
 
I believe some people are starting with the zone files for .com/net/org (yes, 100 million plus names!) and checking those against .co.uk.

Or at least historically they may have done so - Nominet has tightened its T&C about how many lookups you can carry out, and how much of the result you can STORE for reuse later.

Personally, I just make my own private keyword lists from scratch by researching various niches/topics - I couldn't care less about all the millions of junk/brandable domains, I'm only interested in the availability of "true generics" so by starting with "known good" keyphrases my whole list is at least relevant... So far, I have over 100,000 keyphrases on the various lists I monitor, and they're growing all the time.

I try and add 100-150 keyphrases per day, every day, so in a year I'm able to grow my lists by about 40,000 useful keyphrases (I only put on the list true generic expressions with a "useful" number of search results in Google Keyword Tool).

Note: one advantage of doing it my way, i.e. starting from "known good" keyphrases rather than some essentially random list of words and expressions (e.g. the zone files), is that anything I turn up that's still "available" is worth considering; I probably register 50% of what I find still available...
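For anyone wanting to script the lookup side of this, here's a rough sketch assuming a standard `whois` command-line client is installed. The "not registered" marker phrases are guesses of mine, not taken from Nominet's documentation, so check Nominet's actual WHOIS output (and their query limits / DAC terms) before relying on anything like this:

```python
# Sketch: check candidate .co.uk names with the system `whois` client.
# Assumptions: a `whois` binary is on PATH, and Nominet's reply contains
# a recognisable "no such domain" phrase -- verify both before real use.
import subprocess

# Phrases that *might* indicate an unregistered name -- assumptions,
# not taken from Nominet documentation.
AVAILABLE_MARKERS = ("no match", "not been registered")

def looks_unregistered(whois_output: str) -> bool:
    """Heuristic: does the WHOIS reply look like 'no such domain'?"""
    text = whois_output.lower()
    return any(marker in text for marker in AVAILABLE_MARKERS)

def check(domain: str) -> bool:
    """Run one WHOIS lookup (network call -- rate-limit in real use)."""
    result = subprocess.run(["whois", domain],
                            capture_output=True, text=True)
    return looks_unregistered(result.stdout)

# Example (makes a network call, so commented out):
# print(check("runningshoes.co.uk"))
```

Keeping the parsing in its own function means you can swap the subprocess call for Nominet's DAC later without touching the rest of your script.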
 
But where / how are they getting the actual domain names to check to begin with?

One way is to take the .com zone file, change the domains to .co.uk/.org.uk and scan that, which will give you a pretty good list of registered .uk domains. It will take you forever to compile, and you'll need to be a Nominet member to access the systems needed to do a high volume of WHOIS lookups.

Edit............Edwin beat me to it :)

Grant
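Grant's zone-file approach could be sketched like this. The sample lines below are made up for illustration; real zone files are DNS master files obtained from Verisign under an access agreement and are far messier than this parsing assumes:

```python
# Sketch: turn .com zone-file entries into candidate .co.uk names.
# Real zone files look roughly like "EXAMPLE. NS NS1.HOSTINGCO.COM."
# (one NS record per line, labels repeated) -- this is a simplification.

def com_zone_to_couk(lines):
    """Extract second-level labels and re-suffix them as .co.uk."""
    seen = set()
    out = []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        label = parts[0].lower().rstrip(".")
        label = label.removesuffix(".com")  # drop the TLD if present
        # Skip sub-domain records and empty labels
        if "." in label or not label:
            continue
        if label not in seen:
            seen.add(label)
            out.append(label + ".co.uk")
    return out

sample = [
    "EXAMPLE. NS NS1.HOSTINGCO.COM.",
    "EXAMPLE. NS NS2.HOSTINGCO.COM.",   # duplicate label -> deduped
    "SHOESHOP.COM. NS NS1.HOSTINGCO.COM.",
]
print(com_zone_to_couk(sample))  # ['example.co.uk', 'shoeshop.co.uk']
```

At 100 million+ input lines you'd stream the file rather than hold a list in memory, but the dedup-and-resuffix logic is the same.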
 
I suppose the only problem with this is that your list is subjective, based only on what you think is a valuable keyword or phrase, so you must potentially be missing a large portion of what is actually valuable.


 
I suppose the only problem with this is that your list is subjective, based only on what you think is a valuable keyword or phrase, so you must potentially be missing a large portion of what is actually valuable.

For sure, for sure - which is why I keep growing the list every day.

But the alternative would be to wade through 6 million or so names looking for the gold "needles" in a haystack of junk. Not remotely appealing...
 
Even with things as competitive as they are these days, I don't see why anyone would flog, let alone give away, a list of domain names. There must have been quite a lot of blood, sweat & tears gone into some highly tweaked Excel spreadsheets I reckon. Prized assets surely!
 
Bear in mind that once you then have this list and know what is dropping on the day - the harder part of actually getting a dropped domain name of value begins.
 
Many thanks for all your responses.

Edwin - you mention that even if you had a list of 6 million, you wouldn't want to wade through them all picking the ones you want - but firstly, how would you go about getting the initial list of 6 or so million registered domains?
 
You either build your own lists using dictionaries, word lists etc. or like I already said, use the .com zone file.

Grant
 
Hi,

OK, so

1. we use the old .co.uk zone files
2. we use the current .com, .org etc and change to .co.uk and .org.uk
3. dictionary / thesaurus and add .co.uk / .org.uk
4. could scan directories / DMOZ etc
5. your own keyword list - using this method, say I was targeting 'shoes' - would I enter the word 'shoes' into a suggestion tool, export all of the suggestions, add .co.uk / .org.uk etc. and run a WHOIS search to see which come back as registered?

Would you say this is about it for compiling a list? Out of the millions of .co.uk / .org.uk and .me.uk domains registered, what percentage of them do you think we would find using the above system?

Once this is done, do you enter these into a bulk WHOIS to see which of the domains are already registered? And then separate the 'suspended' domains, calculate the drop date and put them into another txt file?
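The drop-date arithmetic described above could be sketched as below. The 92-day figure (expiry to deletion) is an assumption about how Nominet's renewal/suspension cycle is commonly described, not an official number - verify the current policy before relying on it:

```python
# Sketch of the drop-date sum: expiry date + assumed lifecycle length.
# DAYS_EXPIRY_TO_DROP is an assumption -- check Nominet's published
# domain lifecycle (grace, suspension, deletion) for the real figure.
from datetime import date, timedelta

DAYS_EXPIRY_TO_DROP = 92  # assumed; not from Nominet documentation

def estimated_drop_date(expiry: date) -> date:
    """Estimate when a lapsed .uk domain becomes available again."""
    return expiry + timedelta(days=DAYS_EXPIRY_TO_DROP)

print(estimated_drop_date(date(2010, 6, 1)))  # 2010-09-01
```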

Based on your above methods - how do you think domainlore.co.uk found these names that are due to drop -

i-there.co.uk
tjmmail.co.uk
mfbuk.co.uk


Any advice would be appreciated

Thanks Again,

Andy
 
A txt file? I think you're gonna need a bigger boat...

I think a list encompassing points 1-5 would be beyond excessive. Having a list is a start; you then have to maintain it and keep it updated. You then have to evaluate, on a daily basis, which of the domains you have calculated to drop you want to go for. You then have to bear in mind that any half-decent domain is re-regged within fractions of a second of dropping (and that side of it is something you would have to learn entirely on your own).
 
You're still missing the point. Most of the domains in .co.uk are what I would call "not commercially useful". In other words, they're either junk, or they're meaningful to exactly one person/company in the whole world and not worth registering for anyone else.

Probably 7,000,000+ of the 8 million .co.uk domains that have already been registered fall into this category. And actually I would suggest it's more like 19 out of every 20 domains (and probably nearer 49 out of every 50!).

So the more you automate things with bigger and bigger lists, the more you're setting yourself up for a massive task! You will have to wade through tens of thousands of meaningless, worthless domains in order to find that rare gold nugget.

And the irony is that the "better" your starting lists are (zone files, old zone files, word lists, random phrase generators, whatever) the harder you're making the task.

The benefits of doing it the "hard way" from the beginning, i.e. building the list itself only out of meaningful keyphrases, are four-fold:
A) Anything on the list is "worth having" - there's no junk
B) A by-product of the list building process itself is that you will find plenty of domains that are commercially valuable but unregistered (you won't find ANY of these by starting from the zone files, since you'd need to search through the 290,000,000 unregistered names - difference between com/net/org and .co.uk - to find them!)
C) You start to get a real feel for "who has what". This can be useful down the road for trading, referring sales etc.
D) You leave no stone unturned. If you take things one topic/niche at a time, by the time you finish you can be pretty confident that either you know what's still available or you've found everything that's relevant and who owns it

It's basically the difference between finding all the needles in a haystack starting from huge bundles of hay mixed with a handful of needles, or by starting from needles! With the latter approach, you'll never quite know when you've found all of them, but there's zero wasted effort sifting away straw...

NOTE: I've only covered the post-compilation benefits above, but of course you also need no programming skills, and you don't need a massive DAC quota that you dedicate full-time for months to looking up millions of names!
 
Many thanks again for your replies.

Edwin, ok, so we go down the hard route and select our specific niches etc we want to target.

Say we were going for 'shoes' as our target niche. What system do you use to generate the phrases / words around shoes?

Once this list is generated, do you just add .co.uk to the end of each entry, then bulk-enter them into WHOIS and see what is registered?

Is this all manual work? Surely just one niche would take months to get anywhere?

Thanks again,
Andy
 
If it's "shoes" then visit a bunch of shoe ecommerce sites, and look at the types of shoe they have listed as products. Focus only on GENERICS i.e. don't go for "adidas" or "nike" but things like: shoes, running shoes, walking shoes, leather shoes, walking boots and so on.

Then start with keywords such as "shoe", "shoes", "boots" and so on in Google Keyword Tool (set it to "exact match") and see what gets a good number of searches. Again, mentally discard anything with trademarks in it, even if the number of searches is massive - you're looking for GENERICS.

It shouldn't take you more than an hour or two to "do" a niche - for that, I mean visit 10-20 ecommerce sites, see the main keywords and keyphrases they're using, cut and paste them into Notepad, and use them as the "seed" to do Google Keyword Tool searches to come up with more terms.

Of course, the first few times take getting used to, but after that it's much quicker.

BTW, I'd generally stick to one qualifier, maximum. So "running shoes" is fine - but not "blue running shoes" even if the latter had decent searches, as it's just too specific.

Once you have your list of all the keywords/keyphrases, search and replace to get rid of the spaces, stick .co.uk on the end, and look them all up.

Most important of all: use your brain! By which I mean that if something doesn't "look right" as a keyphrase or in Google Keyword Tool, then skip it. GKT is not a perfect tool by any means, so it's not a substitute for thinking...

Done.
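The search-and-replace step above is only a few lines if you script it. Stripping hyphens and apostrophes as well as spaces is my own assumption (you may want to keep hyphenated variants on the list too):

```python
# Sketch: keyphrase list -> candidate domain list.
# Removing hyphens/apostrophes as well as spaces is an assumption;
# adjust to match the variants you actually want to look up.

def keyphrases_to_domains(phrases, suffix=".co.uk"):
    """Collapse each keyphrase into a bare label and append the suffix."""
    domains = []
    for phrase in phrases:
        label = phrase.lower()
        for ch in (" ", "-", "'"):
            label = label.replace(ch, "")
        if label:
            domains.append(label + suffix)
    return domains

print(keyphrases_to_domains(["running shoes", "walking boots", "leather shoes"]))
# ['runningshoes.co.uk', 'walkingboots.co.uk', 'leathershoes.co.uk']
```

Feed the output into whatever lookup method you're using (WHOIS, or DAC if you're a Nominet member).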
 