What's the best way to hold newly purchased domains over 2 years?
-
Hi,
A friend has just bought 3 domains and is not planning to build websites with them for around 2 years. He asked me what the best thing to do with these domains was... I have 2 ways of looking at it:
a) Put a holding page on each and submit them to Google Webmaster Tools - this way they are indexed by Google and build search engine trust for when the sites finally go up - HOWEVER, if they are not updated with fresh content, would that work against them in 2 years' time?
b) Simply redirect them to their existing site and don't do anything else.
Let me know your thoughts.
Adido.
-
"everywhere I've read says it is important"
My early SEO days weren't pretty. I read a lot of information which was far from credible. Much of it made perfect sense in an "of course the world is flat! If it was round, the people at the bottom would fall off" type of way.
Much of my day used to be spent optimizing meta keywords, building links from blogs with followed links, and many other activities which are frankly crap but were otherwise promoted on random sites as having value. Then I stopped learning SEO from anyone who seemed like an expert and restricted my focus to truly credible sources. That change was the single best move I made.
If I may make a suggestion, stop reading SEO information from "everywhere," as the overwhelming majority of it is crap. Focus on a few reliable sources of information. Even then, always question and test new learnings. A few sites to start with are SEOmoz, mattcutts.com/blog, Matt's videos on YouTube, and Google's official blog. It would take months or years to sift through the information on these sites alone. You will pick up links to other credible sources of information and be able to form your own opinions.
-
"A 10 year old domain is not worth any more than a domain you acquire today." Why do you think this?
When asked this question, Matt Cutts replied: "We can't promise that it's not a factor, but if it is a factor, it's super tiny. It's a really small thing. We're using it primarily in conjunction with other things to try and see what's going on." It's often hard to pin Matt down on things of this nature, and that response is about as clear and definitive a response as one could hope for.
Overall, the theory is often supported with correlative results: "Hey, look at these older sites which rank better." Older sites often have shorter, more brandable .com URLs. Older domains have had a lot of time to earn links and recognition. Those are the real value factors. Going through the process of picking up domains which otherwise don't have value, then hosting them and adding random content to put them through an "aging" process, is not an acknowledged SEO best practice. It is an idea someone came up with that caught on, nothing more.
You are welcome to disagree with me. I freely acknowledge I could be wrong. As long as search engine algorithms are kept secret we could all be wrong about a great many things. All we can do is rely upon what search engine employees such as Matt Cutts share with us, and our testing. Based on the above information, I maintain the belief domain age is not a ranking factor. Even if I am wrong, the factor is given such an incredibly small weight it is unlikely to ever make a noticeable difference.
Some links which may help:
http://www.seomoz.org/blog/age-of-site-and-old-links-whiteboard-friday
http://www.seomoz.org/blog/5-common-pieces-of-seo-advice-i-disagree-with
-
I respectfully disagree with this statement: "A 10 year old domain is not worth any more than a domain you acquire today." Why do you think this?
I have helped people with 10 year old domains do SEO and have seen very fast and excellent ranking results, even on sites that were crap, and the only way I can explain that is that they had built trust with Google due to domain age.
My suggestion for someone buying domains that they plan on sitting on for a couple of years and then using is to prime them: create a two page site with each page having about 2000 words of relevant content, get a few links pointing to it, and let Google find (and index) the site on its own. Then the site is considered a live site, not parked or 301-redirected, and it will begin building up age.
Even if we cannot confirm either way whether domain age is a factor, it is always best to play it safe. There is no way a brand new site would be seen by Google as more trusted than an established, aged site, right? That is, as long as the site you set up is not a site with flimsy pages. Google doesn't care how many pages the site has - it could be 2 pages - just as long as those 2 pages have a lot of relevant content, it won't get smacked down.
-
Thanks for your quick response Ryan... can anyone else shed light on whether domain age is important? Everywhere I've read says it is important... do any other Mozzers have input on this one?
Cheers,
Adido
-
Given the choices offered, option b).
Purchasing a domain in and of itself offers no value. A 10 year old domain is not worth any more than a domain you acquire today. Any word to the contrary is a myth.
There can be a correlation whereby an active domain earns links over time, and the longer a domain has existed the more links it has had time to earn. But buying a domain and sitting on it doesn't help a domain's SEO value on any level.
If the domain names you acquired are in some way related to an existing domain, you can perform a 301 redirect to the main site and gain some value from them in that manner. For example, seomoz.com redirects to seomoz.org. Some people might mistakenly visit the .com site and redirecting to the main site has some value.
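For anyone implementing the 301 approach, a minimal sketch for Apache (assuming mod_rewrite is enabled; `example.com` and `www.main-site.com` are placeholder domains, not from the original thread) looks like this:

```apache
# .htaccess in the parked domain's document root
# Permanently (301) redirect every request to the main site,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://www.main-site.com/$1 [R=301,L]
```

You can verify the redirect with `curl -I http://example.com` and check that the response status is `301 Moved Permanently` with a `Location:` header pointing at the main site. The `R=301` flag is what passes link value; a default (302) redirect signals a temporary move and is treated differently by search engines.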