Domain Age. What's a good age?
-
I have a new site that ranks very well and is rich with content. I believe it could rank even better, but since the site is new I assume it's being held back. My question: how long does it take for a site to mature?
-
Thanks Keri
-
Open Site Explorer isn't a live crawl, and the data there can be a little old. There is an update scheduled for tomorrow, so I'd wait a day and check for your links tomorrow -- the data will be a lot fresher then, but still a few weeks old.
Keri
-
Thanks for your advice, I appreciate it. I used the Open Site Explorer tool here, but for some reason I don't see the links that are pointing to my site. Google Webmaster Tools shows I have over 1,000 links, all of them natural, and another tool shows I have over 750 links.
What would you change about the site?
How would you rebuild the site?
It is a work in progress so your advice helps.
Thanks again.
-
Hi Joel,
Just took a look at your website. I'll give you some quick points.
Bluntly, your website needs work. It needs to be completely re-built.
If you want to build up authority for your site, you need to do so through link building. If you want natural links, as opposed to paying for links or submitting to low-quality directories, you need link-worthy content. That takes me back to my first point about a new website.
SEOmoz's Open Site Explorer is a great tool: you can look at the websites that are ranking well for your top keywords and see where those sites are getting their links from.
-
Now I see sites above me that have domain age on their side, but they don't have as many links. I have a lot of organic links, and I also have a more robust site full of unique content that my competitors don't have. So in this case, what steps should I take? Thank you for your advice.
-
So what is the best strategy for competing against other sites that have domain age and authority on their side? I am a Realtor, and my site is up against some large national sites. I am targeting local keywords with local info. Thanks for your advice.
-
I think this is one of those things that SEOs hear a little bit about, then stress out about. Although high rankings and an older domain may be highly correlated, that does not mean there is a cause-and-effect relationship between ranking and domain age. It's simply natural that the longer a domain is around (and the longer an actual website resides on that domain), the more opportunity that website has to build up its domain authority and its rankings.
-
I've found a few different threads about this, and the consensus seems to be that domain age does (or doesn't) matter; people don't really have a definite answer. There is a study showing that a large percentage (over 50%) of #1 search results are domains over 10 years old, while domains younger than 10 years were spread fairly evenly. The target keywords also play a factor: if the keyword you're going after has been competitive for a long time, the older domains ranking for it will be much harder to topple with a young domain.
Another side note is simply the number of links a site can accumulate over 10 years compared with a very young domain.
Related Questions
-
When creating a sub-domain, does that sub-domain automatically start with the DA of the main domain?
We have a website with a high DA, and we are considering a sub-folder or a sub-domain. One of the great benefits of a sub-folder is that we know we get to keep the high DA; is this also the case for sub-domains? Also, if you could provide any sources of information that specify this, that would help; I can't seem to find anything!
Intermediate & Advanced SEO | Saba.Elahi.M.
Change Google's version of Canonical link
Hi. My website has millions of URLs, and some of them have duplicate versions. We did not set canonicals all these years; now we want to implement them and fix all the technical SEO issues. I wanted to consolidate and 301 redirect all the variations of a URL to the highest-pageview version and use that as the canonical, because all of these variations have the same content. While doing this, I found in Google Search Console that Google has already selected another variation of the URL as canonical, not the highest-pageview version. My questions: (1) I have millions of URLs for which I have to set up 301s and canonicals. How can I find all the canonical URLs that Google has auto-selected, given that Search Console has a daily quota of around 100 inspections? (2) Is it possible to override Google's choice of canonical? Meaning, if I set a variation as canonical that differs from what Google has already selected, will it change over time in Search Console? (3) Should I just 301 to the highest-pageview variation of the URL and not set canonicals at all? That way, the canonical Google auto-selected would get redirected to the highest-pageview variation. Any advice or help would be greatly appreciated.
Intermediate & Advanced SEO | SDCMarketing
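As an editorial aside, the two mechanisms this question weighs come down to a small piece of markup and a server redirect. A minimal sketch, with hypothetical URLs standing in for the site's real variations:

```html
<!-- On every duplicate variation, point engines at the chosen version -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />
```

```apacheconf
# .htaccess: permanently redirect one known duplicate to the preferred URL
RewriteEngine On
RewriteRule ^widgets/blue-widget\.html$ /widgets/blue-widget/ [R=301,L]
```

A 301 is a stronger consolidation signal than a canonical hint, which is one reason redirecting to the highest-pageview variation (option 3 above) is often preferred when the duplicate URLs don't need to stay live.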
Can 'follow' rather than 'nofollow' links be damaging partner's SEO
Hey guys, and happy Monday! We run a content-rich website, 12+ years old, focused on travel in a specific region, and advertisers pay for banners/content etc. alongside editorial. We have never used 'nofollow' on website links, as they're not explicitly paid for by clients, but a partner has asked us to make all links to them 'nofollow', stating that the way we currently link is damaging their SEO. Could this be true in any way? I'm only assuming it would adversely affect them if our website were penalized by Google for 'selling links', which we're not. Perhaps they're just keen to follow best practice for fear of being seen to be buying links. FYI, we now plan to make fuller use of 'nofollow', but I'm trying to work out what the client is referring to without seeming ill-informed on the subject! Thank you for any advice 🙂
Intermediate & Advanced SEO | SEO_Jim
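For reference, the change the partner is asking for is a single `rel` attribute on each outbound link (the partner URL below is a placeholder):

```html
<!-- Current: a normal editorial link; engines may pass link equity -->
<a href="https://partner.example.com/">Partner site</a>

<!-- Requested: nofollow asks engines not to count the link for ranking -->
<a href="https://partner.example.com/" rel="nofollow">Partner site</a>
```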
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it: a new experience for our users and some changes in site architecture, including a change in URL structure for existing URLs and the introduction of some new URLs. Now, my question is: what's the best way to A/B test the new version? We can't launch it for a portion of users (say, make it live for 50% of users while the remaining 50% see only the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions. Will this work if I reduce the crawl rate to zero during the A/B tenure? How will this impact us from an SEO perspective? How will the old-to-new 301 redirects affect our users? Have you ever faced/handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is something new to you, I would love to know your recommendations before taking the final call. Note: we're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs which are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible and can't be redirected to a valid URL on the old version.
Intermediate & Advanced SEO | _nitman
How should I handle URLs created by an internal search engine?
Hi, I'm aware that internal search result URLs (www.example.co.uk/catalogsearch/result/?q=searchterm) should ideally be blocked using the robots.txt file. Unfortunately, the damage has already been done, and a large number of internal search result URLs have already been created and indexed by Google. I have double-checked, and these pages only account for approximately 1.5% of traffic per month. Is there a way I can remove the internal search URLs that have already been indexed and then stop this from happening in the future? I presume the last part would be to disallow /catalogsearch/ in the robots.txt file. Thanks
Intermediate & Advanced SEO | GrappleAgency
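A sketch of the two pieces involved, reusing the /catalogsearch/ path from the question above. Note that ordering matters: a robots.txt disallow stops Google from recrawling the pages, so it would never see a noindex on them. The usual approach is noindex first, then disallow once the pages have dropped out of the index:

```html
<!-- Step 1: on each internal search result page, until deindexed -->
<meta name="robots" content="noindex" />
```

```
# Step 2: robots.txt -- block future crawling of internal search results
User-agent: *
Disallow: /catalogsearch/
```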
Is it a good idea to have a directory on your website?
Currently I have a directory on a sub-domain, but Google apparently sees it as part of my main domain, so could all the outgoing links be affecting my rankings?
Intermediate & Advanced SEO | Valarlf
Why is my site's 'Rich Snippets' information not being displayed in SERPs?
We added hRecipe microformat markup to our site in April and then migrated to the Schema.org Recipe format in July, but our content is still not being displayed as Rich Snippets in search engine results. Our pages validate okay in the Google Rich Snippets Testing Tool. Any idea why they are not being displayed in SERPs? Thanks.
Intermediate & Advanced SEO | Techboy
Aside from creative link bait, what does a solid link-building strategy involve?
All things considered, directories, blogs, articles, press releases, forums, social profiles, student discount pages, etc, what do you consider to be a strong, phased, link building strategy? I'm talking beyond natural/organic link bait, since many larger accounts will not allow you to add content to their website or take 6 months to approve a content strategy. I've got my own list, but would love to hear what the community considers to be a strong, structured, timeline-based strategy for link building.
Intermediate & Advanced SEO | stevewiideman