What's the best way to hold newly purchased domains over 2 years?
-
Hi,
A friend has just bought 3 domains and is not planning to build websites with them for around 2 years. He asked me what the best thing to do with these domains was... I have 2 ways of looking at it:
a) Put a holding page on each and submit them to Google Webmaster Tools - this way they are indexed by Google and hold search engine trust by the time the sites finally go up. HOWEVER, if they are not updated with fresh content, would that work against them in 2 years' time?
b) Simply redirect them to their existing site and don't do anything else.
Let me know your thoughts.
Adido.
-
"everywhere I've read says it is important"
My early SEO days weren't pretty. I read a lot of information which was far from credible. Much of the information made perfect sense in a "of course the world is flat! If it was round the people at the bottom would fall off" type of way.
Much of my day used to be spent optimizing meta keywords, building links from blogs with followed links, and many other activities which are frankly crap but were promoted by random sites as having value. Then I stopped trying to learn SEO from anyone who seemed like an expert and restricted my focus to truly credible sources. That change was the single best move I made.
If I may make a suggestion, stop reading SEO information from "everywhere", as the overwhelming majority of it is crap. Focus on a few reliable sources of information. Even then, always question and test new learnings. A few sites to start with are: SEOmoz, mattcutts.com/blog, Matt's videos on YouTube, and Google's official blog. It would take months or years to sift through the information on these sites alone. You will pick up links to other credible sources of information and be able to form your own opinions.
-
"A 10 year old domain is not worth any more than a domain you acquire today." Why do you think this?
When asked this question, Matt Cutts replied: "We can't promise that it's not a factor, but if it is a factor, it's super tiny. It's a really small thing. We're using it primarily in conjunction with other things to try and see what's going on." It's often hard to pin Matt down on things of this nature, and that response is about as clear and definitive as one could hope for.
Overall, the theory is often supported with correlative results. "Hey look at these older sites which rank better". Older sites often have shorter, more brandable .com URLs. Older domains have had a lot of time to earn links and recognition. These are the value factors. Going through the process of picking up domains which otherwise don't have value, then hosting these domains and adding random content to put them through the "aging" process is not an acknowledged SEO best practice. It is an idea someone came up with that caught on, nothing more.
You are welcome to disagree with me. I freely acknowledge I could be wrong. As long as search engine algorithms are kept secret we could all be wrong about a great many things. All we can do is rely upon what search engine employees such as Matt Cutts share with us, and our testing. Based on the above information, I maintain the belief domain age is not a ranking factor. Even if I am wrong, the factor is given such an incredibly small weight it is unlikely to ever make a noticeable difference.
Some links which may help:
http://www.seomoz.org/blog/age-of-site-and-old-links-whiteboard-friday
http://www.seomoz.org/blog/5-common-pieces-of-seo-advice-i-disagree-with
-
I respectfully disagree with this statement: "A 10 year old domain is not worth any more than a domain you acquire today." Why do you think this?
I have helped people with 10 year old domains do SEO and have seen very fast and excellent ranking results, even on sites that were crap. The only way I can explain that is that they built trust with Google due to domain age.
My suggestion for someone buying domains that they plan on sitting on for a couple of years and then using is to prime them: create a two-page site with each page having about 2,000 words of relevant content, get a few links pointing to it, and let Google find (and index) the site on its own. Then the site is considered a live site, not parked or 301-redirected, and it will begin building up age.
Even if we cannot completely confirm either way that domain age is a factor, it is always best to play it safe. There is no way a brand new site would be seen by Google as more trusted than an established, aged site, right? That is, as long as the site you set up is not a site with flimsy pages. Google doesn't care how many pages the site has; it could be just 2 pages, and as long as those 2 pages have a lot of relevant content it won't get smacked down.
-
Thanks for your quick response, Ryan... can anyone else shed light on whether domain age is important? Everywhere I've read says it is important... do any other Mozzers have input on this one?
Cheers,
Adido
-
Given the choices offered, option b).
Purchasing a domain in and of itself offers no value. A 10 year old domain is not worth any more than a domain you acquire today. Any word to the contrary is a myth.
There can be a correlation whereby an active domain earns links over time, and the longer a domain has existed the more links it has had time to earn. But buying a domain and sitting on it doesn't help a domain's SEO value on any level.
If the domain names you acquired are in some way related to an existing domain, you can perform a 301 redirect to the main site and gain some value from them in that manner. For example, seomoz.com redirects to seomoz.org. Some people might mistakenly visit the .com site and redirecting to the main site has some value.
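If you go the redirect route, a host-level 301 is the usual mechanism. A minimal .htaccess sketch, assuming an Apache host with mod_rewrite enabled and using placeholder domains (swap in the real ones):

```apache
# Send every request on the typo/parked domain to the canonical site,
# preserving the requested path. R=301 marks it permanent, which is
# what tells search engines to consolidate value onto the target.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://example.org/$1 [R=301,L]
```

The `$1` back-reference carries the path across, so example.com/blog lands on example.org/blog rather than the homepage.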
Related Questions
-
How best to handle partial domain move?
The scenario is this: we have a website, e.g. ABC.com, where the content is in two language folders (en-us and en-gb). We have created a new brand with website XYZ.com for the USA market. Of course, this domain will take a while to rank because it is completely fresh. My question is how best to deactivate the en-us content on the old site to: a) prevent it showing up on Google US, and b) pass the US traffic to the new website to boost its rankings. I was thinking of removing the en-us pages from ABC.com and using a 410 error page containing a link to XYZ.com. Would it be better to replace the content on en-us instead (with a link)? I'm not keen to use a straight 301 redirect as sometimes we get traffic from other countries to the en-us content. Thanks in advance Mozzers 🙂
Intermediate & Advanced SEO | esseljay
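For the 410 approach mentioned above, a rough Apache sketch (mod_alias; hypothetical paths) that retires the en-us section while serving a custom 410 page that can carry the link over to XYZ.com:

```apache
# Serve a custom page body with 410 responses. The page itself must
# live outside the retired section or it would match the rule below.
ErrorDocument 410 /gone-to-xyz.html
# "gone" returns 410 for everything under /en-us/ with no target URL.
RedirectMatch gone ^/en-us/
```

A 410 is a slightly stronger "this is intentionally removed" signal than a 404, so Google tends to drop the URLs a bit faster.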
What's the best way to implement rel="next/prev" if we have filters?
Hi everyone, The filtered view results in paginated content and has different URLs, for example: https://modli.co/dresses.html?category=45&price=13%2C71&size=25 Look at what it says in Search Engine Land: http://searchengineland.com/implementing-pagination-attributes-correctly-for-google-114970 (see the Advanced Techniques paragraph). Do you agree? It seems like Google will index the page multiple times for every filter variant. Thanks, Yehoshua
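For what it's worth, the pattern described in that article can be sketched like this - hypothetical head markup for page 2 of one filtered series, keeping the filter parameter consistent across the whole prev/next chain:

```html
<!-- Page 2 of the size=25 filtered series (hypothetical URLs) -->
<link rel="prev" href="https://modli.co/dresses.html?size=25&p=1">
<link rel="next" href="https://modli.co/dresses.html?size=25&p=3">
<!-- Self-referencing canonical: each filter variant forms its own
     self-consistent series rather than canonicalizing into another -->
<link rel="canonical" href="https://modli.co/dresses.html?size=25&p=2">
```

The key point is that prev/next links must never jump between filtered and unfiltered URLs; each variant gets its own chain (or, if a filter adds no search value, canonicalize it to the unfiltered page instead and skip prev/next on it).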
Intermediate & Advanced SEO | Yehoshua
Any issue? Redirecting 100s of domains into one website's internal pages
Hi all, Imagine if you will I was the owner of many domains, say 100 demographically rich keyword domains, and my plan was to redirect these into one website, each into a different relevant subfolder. e.g. www.dewsburytilers.com > www.brandname.com/dewsbury/tilers.html www.hammersmith-tilers.com > www.brandname.com/hammersmith/tilers.html www.tilers-horsforth.com > www.brandname.com/horsforth/tilers.html ...another hundred or so 301 redirects. The backlinks to these domains were slim but relevant (the majority of the domains do not have any backlinks at all). Can anyone see a problem with this practice? If so, what would your recommendations be?
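Mechanically, a redirect set like that is easiest to maintain in one place. A rough nginx sketch using the three example domains from the question (the real config would list all ~100 entries; brandname.com is the placeholder target):

```nginx
# http-level: map each acquired host to its target page on the main
# site. "hostnames;" enables hostname matching, where a leading dot
# matches the bare domain and any subdomain (www. included).
map $host $redirect_target {
    hostnames;
    .dewsburytilers.com      https://www.brandname.com/dewsbury/tilers.html;
    .hammersmith-tilers.com  https://www.brandname.com/hammersmith/tilers.html;
    .tilers-horsforth.com    https://www.brandname.com/horsforth/tilers.html;
    default                  "";
}

# Catch-all server for the parked domains: 301 when a mapping exists.
server {
    listen 80;
    server_name _;
    if ($redirect_target) {
        return 301 $redirect_target;
    }
}
```

One table, one server block, and adding domain number 101 is a single new map line.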
Intermediate & Advanced SEO | Fergclaw
How should I handle URLs created by an internal search engine?
Hi, I'm aware that internal search result URLs (www.example.co.uk/catalogsearch/result/?q=searchterm) should ideally be blocked using the robots.txt file. Unfortunately the damage has already been done and a large number of internal search result URLs have already been created and indexed by Google. I have double-checked, and these pages only account for approximately 1.5% of traffic per month. Is there a way I can remove the internal search URLs that have already been indexed and then stop this from happening in the future? I presume the last part would be to disallow /catalogsearch/ in the robots.txt file. Thanks
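For the prevention part, the robots.txt rule would just be (assuming the /catalogsearch/ path from your example):

```
User-agent: *
Disallow: /catalogsearch/
```

One caveat on ordering: robots.txt stops crawling, not de-indexing, so URLs already in the index tend to linger. A common sequence is to first let the search pages serve a meta robots noindex tag until they drop out (or use the URL removal tool on the directory), and only then add the Disallow - a page blocked in robots.txt can't be recrawled for Google to see the noindex.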
Intermediate & Advanced SEO | GrappleAgency
What's the best way to check Google search results for all pages NOT linking to a domain?
I need to do a bit of link reclamation for some brand terms. From the little bit of searching I've done, there appear to be several thousand pages that meet the criteria, but I can already tell it's going to be impossible or extremely inefficient to save them all manually. Ideally, I need an exported list of all the pages mentioning brand terms but not linking to my domain, and then I'll import them into BuzzStream for a link campaign. Anybody have any ideas about how to do that? Thanks! Jon
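One way to pull a candidate list is a search operator query that at least excludes results hosted on your own domain (hypothetical brand and domain):

```
"Brand Name" -site:example.com
```

Note that -site: only filters out pages on your domain, not external pages that already link to you, so the exported list still needs a link check before outreach to avoid contacting sites that already link.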
Intermediate & Advanced SEO | JonMorrow
What's the best way to remove search-indexed pages on Magento?
A new client (aqmp.com.br) called me yesterday and told me that since they moved to Magento they have dropped more than US$20,000 in monthly sales revenue... I've just checked Webmaster Tools and discovered the number of crawled pages went from 3,260 to 75,000 since Magento went live... Magento is creating lots of pages with queries like search and filters. Example: http://aqmp.com.br/acessorios/lencos.html http://aqmp.com.br/acessorios/lencos.html?mode=grid http://aqmp.com.br/acessorios/lencos.html?dir=desc&order=name Is adding an instruction to robots.txt the best way to remove these unnecessary pages from the search engine?
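A rough robots.txt sketch for the filter/sort parameters visible in those example URLs. The patterns are assumptions based only on the URLs shown - verify each rule against real site URLs in Webmaster Tools' robots.txt tester before deploying, since Google honors the * wildcard here:

```
User-agent: *
# Layered-navigation / sort parameters, whether first or chained:
Disallow: /*?dir=
Disallow: /*&dir=
Disallow: /*?mode=
Disallow: /*&mode=
Disallow: /*?order=
Disallow: /*&order=
# Internal search results:
Disallow: /catalogsearch/
```

Since these 75,000 URLs are already indexed, blocking alone won't remove them quickly; pairing this with canonical tags on the filtered views (pointing at the clean category URL) addresses the duplication for pages Google has already crawled.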
Intermediate & Advanced SEO | SeoMartin1
What is the best way to learn SEO?
I was wondering if it's worth taking an SEO training course. If so, is it better to take a live class or an online class? Or is it better to just read all the SEO books out there? Or is there a good video series anyone can recommend? What is the best way to learn SEO? I have a good understanding of SEO but I'm not a pro (yet). Obviously SEO is always evolving, so even the pros are constantly updating their skill sets, but I want to make sure my foundation is solid and complete. Advice please. Thank you all.
Intermediate & Advanced SEO | bronxpad
Website monitoring online censorship in China - what's holding us back?
We run https://greatfire.org, a non-profit website which lets you test if a website or keyword is blocked or otherwise censored in China. There are a number of websites that nominally offer this service, and many of them rank better than us in Google. However, we believe this is unfortunate since their testing methods are inaccurate and/or not transparent. More about that further down*. We started GreatFire in February 2011 as a reaction to ever more pervasive online censorship in China (where we are based). Due to the controversy of the project and the political situation here, we've had to remain anonymous. Still, we've been able to reach out to other websites and to users. We currently have around 3,000 visits per month, of which about 1,000 are from organic search. However, SEO has been a headache for us from the start. There are many challenges in running this project and our team is small (and not making any money from this). Those users that do find us on relevant keywords seem to be happy, since they spend a long time on the website. Examples:

- "websites blocked in china": 6 minutes+
- "great firewall of china test": 8 minutes+

So, here are some SEO questions related to GreatFire.org. If you can give us advice it would be greatly appreciated, and you would truly help us in our mission to bring transparency and spread awareness of online censorship in China:

1. Each URL tested in our database has its own page. Our database contains 25,000 URLs (and growing). We have previously been advised that one SEO problem is that we appear to have a lot of duplicate data, since the individual URL pages are very similar. Because of this, we've added automatic tags to most pages. We then exclude certain pages from this rule that are considered high-priority, such as domains ranked highly by Alexa and keywords that are blocked. Is this a good approach? Do you think the duplicate content factor is still holding us back? Can we improve?
2. Some of our pages have meta descriptions, but most don't. Should we add them on URL pages? They would be set to a certain pattern, which again might make them look very similar and could set off the duplicate content warning. Suggestions?
3. Many of the users that find us in Google search for keywords that aren't relevant to what we offer, such as "https.facebook.com" and lots of variations of that. Obviously, they leave the website quickly. This means that the average time people coming from Google spend on our website is quite low (2 minutes) and the bounce rate quite high (68%). Can we or should we do something to discourage being found on non-relevant keywords?
4. Are there any other technical problems you can see that are holding our SEO back?

Thank you very much!

*Competitors ranking higher when searching for "test great firewall china":

1. http://www.greatfirewallofchina.org. They are only a frontend website for this service: http://www.viewdns.info/chinesefirewall. ViewDNS only checks DNS records, which is one of three major methods used to block websites. So many websites and keywords that are not DNS-poisoned, but are blocked by IP or by keyword, will be reported as available when in fact they are blocked. Our system uses actual test locations inside China to try to download the URL to be tested and checks for different types of censorship.
2. http://www.websitepulse.com/help/testtools.china-test.html. This is a better service in that they seem to do actual testing from inside China. However, they only display partial results, they do not explain test results, and they do not offer historic data on whether the URL was blocked in the past. We do all of that.

Intermediate & Advanced SEO | GreatFire.org