What's the best way to hold newly purchased domains over 2 years?
-
Hi,
A friend has just bought 3 domains and is not planning to build websites with them for around 2 years. He asked me what the best thing to do with these domains was... I have 2 ways of looking at it:
a) Put a holding page on each and submit them to Google Webmaster Tools - this way they are indexed by Google and have built up some search engine trust when the sites finally go up - HOWEVER, if they are not updated with fresh content, would that work against them in 2 years' time?
b) Simply redirect them to his existing site and don't do anything else.
Let me know your thoughts.
Adido.
-
"everywhere I've read says it is important"
My early SEO days weren't pretty. I read a lot of information which was far from credible. Much of it made perfect sense in an "of course the world is flat! If it were round, the people at the bottom would fall off" kind of way.
Much of my day used to be spent optimizing meta keywords, building links from blogs with followed links, and doing many other things which are frankly crap but are promoted by random sites as having value. Then I stopped learning SEO from anyone who seemed like an expert and restricted my focus to truly credible sources. That change was the single best move I made.
If I may make a suggestion, stop reading SEO information from "everywhere", as the overwhelming majority of it is crap. Focus on a few reliable sources of information. Even then, always question and test what you learn. A few sites to start with are SEOmoz, mattcutts.com/blog, Matt's videos on YouTube, and Google's official blog. It would take months or years to sift through the information on these sites alone. You will pick up links to other credible sources of information and be able to form your own opinions.
-
"A 10-year-old domain is not worth any more than a domain you acquire today." Why do you think this?
When asked this question, Matt Cutts replied, "We can't promise that it's not a factor, but if it is a factor, it's super tiny. It's a really small thing. We're using it primarily in conjunction with other things to try and see what's going on." It's often hard to pin Matt down on things of this nature, and that response is about as clear and definitive a response as one could hope for.
Overall, the theory is often supported with correlative results: "Hey, look at these older sites which rank better." Older sites often have shorter, more brandable .com URLs. Older domains have had a lot of time to earn links and recognition. These are the value factors. Going through the process of picking up domains which otherwise don't have value, then hosting them and adding random content to put them through the "aging" process, is not an acknowledged SEO best practice. It is an idea someone came up with that caught on, nothing more.
You are welcome to disagree with me. I freely acknowledge I could be wrong. As long as search engine algorithms are kept secret, we could all be wrong about a great many things. All we can do is rely upon what search engine employees such as Matt Cutts share with us, and upon our own testing. Based on the above information, I maintain the belief that domain age is not a ranking factor. Even if I am wrong, the factor is given such an incredibly small weight that it is unlikely to ever make a noticeable difference.
Some links which may help:
http://www.seomoz.org/blog/age-of-site-and-old-links-whiteboard-friday
http://www.seomoz.org/blog/5-common-pieces-of-seo-advice-i-disagree-with
-
I respectfully disagree with this statement: "A 10-year-old domain is not worth any more than a domain you acquire today." Why do you think this?
I have helped people with 10-year-old domains do SEO and have seen very fast, excellent ranking results, even on sites that were crap, and the only way I can explain that is that they had built up trust with Google due to domain age.
My suggestion for someone buying domains that they plan on sitting on for a couple of years and then using is to prime them: create a two-page site with each page having about 2,000 words of relevant content, get a few links pointing to it, and let Google find (and index) the site on its own. The site is then considered a live site rather than parked or 301 redirected, and it will begin building up age.
Even if we cannot confirm either way whether domain age is a factor, it is always best to play it safe. There is no way a brand-new site would be seen by Google as more trusted than an established, aged site, right? That is, as long as the site you set up is not one with flimsy pages. Google doesn't care how many pages the site has; it could be 2 pages, and as long as those 2 pages have a lot of relevant content it won't get smacked down.
-
Thanks for your quick response, Ryan... Can anyone else shed light on whether domain age is important? Everywhere I've read says it is important. Do any other Mozzers have input on this one?
Cheers,
Adido
-
Given the choices offered, option b).
Purchasing a domain in and of itself offers no value. A 10-year-old domain is not worth any more than a domain you acquire today. Any word to the contrary is a myth.
There can be a correlation whereby an active domain earns links over time, and the longer a domain has existed, the more links it has had time to earn. But buying a domain and sitting on it doesn't help its SEO value on any level.
If the domain names you acquired are in some way related to an existing domain, you can perform a 301 redirect to the main site and gain some value from them that way. For example, seomoz.com redirects to seomoz.org. Some people might mistakenly visit the .com address, and redirecting them to the main site has some value.
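If the spare domains sit on an Apache server, that kind of domain-level 301 can usually be set up with a few lines of .htaccess. The snippet below is only a rough sketch using placeholder names (example.com for the parked domain, example-main.com for the main site); the exact directives depend on the hosting setup, and most registrars and hosts also offer a domain-forwarding option that achieves the same thing.
# Sketch only: example.com and example-main.com are placeholders
RewriteEngine On
# Match the parked domain, with or without www
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
# Send everything to the main site with a permanent (301) redirect, preserving the path
RewriteRule ^(.*)$ http://example-main.com/$1 [R=301,L]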