Domain Age. What's a good age?
-
I have a new site that ranks very well and is rich with content. I believe it could rank even better, but since it's new I'm assuming it is being held back. My question is: how long does it take for a site to mature?
-
Thanks Keri
-
Open Site Explorer isn't a live crawl, and the data there can be a little old. There's an update scheduled for tomorrow, so I'd wait a day and check for your links then -- the data will be a lot fresher, though still a few weeks old.
Keri
-
Thanks for your advice, I appreciate it. I used the Open Site Explorer tool here, but for some reason I don't see the links that are pointing to my site. Google Webmaster Tools shows I have over 1000 links, all of which are natural links, and another tool shows that I have over 750 links.
What would you change about the site?
How would you rebuild the site?
It is a work in progress so your advice helps.
Thanks again.
-
Hi Joel,
Just took a look at your website. I'll give you some quick points.
Bluntly, your website needs work. It needs to be completely rebuilt.
If you want to build up authority for your site, you need to do so through link building. If you want to earn natural links, as opposed to paying for links or submitting to low-quality directories, you need link-worthy content. That takes me back to my first point about a new website.
SEOmoz's Open Site Explorer is a great tool: you can look at the websites that are ranking well for your top keywords and see where those sites are getting their links from.
-
Now I see sites above me that have domain age on their side, but they don't have as many links. I have a lot of links, all organic, and I also have a more robust site full of unique content that my competitors don't have. So in this case, what steps should I take? Thank you for your advice.
-
So what is the best strategy for competing against other sites that have domain age and authority on their side? I am a Realtor, and my site is up against some large national sites. I am targeting local keywords with local info. Thanks for your advice.
-
I think this is one of those things that SEOs hear a little bit about and then stress out about. Although high rankings and an older domain may be highly correlated, that does not mean there is a cause-and-effect relationship between ranking and domain age. It's simply natural that the longer a domain is around (and the longer an actual website resides on that domain), the more opportunity that website has to build up its domain authority and, with it, its rankings.
-
I've read a few different threads about this, and the consensus seems to be that nobody has a definite answer on whether domain age matters. There is a study showing that a large percentage (over 50%) of #1 search results are domains over ten years old, but among domains younger than ten years the distribution seemed to be pretty even. The target keywords would also play a factor: if the keyword you're going for has been competitive for a long time, the older domains that are ranking for it will be much harder to topple with a young domain.
Another side note is simply the number of links a site can accumulate over ten years compared with a very young domain.
-
Related Questions
-
Does a product environment have an impact on the main website's SEO?
We have two environments. The first is the product environment, where login is necessary and where our customers work; it also hosts our help desk, Q&A and knowledge base, so it's a pretty sophisticated set of pages on a specific topic. The second is our main page, where we promote our products, company, events, etc. The main page is www.example.com, while the product environment is login.example.com. Does this product environment have an impact on my main page's SEO?
-
I need help on how best to do a complicated site migration: replacing certain pages with all new content and tools while keeping the same URLs. The rest just need to disappear safely. Somehow.
I'm completely rebranding a website but keeping the same domain. All content will be replaced, and it will use a different theme and mostly new plugins. I've been building the new site as a separate site in Dev mode on WPEngine, which means it currently has a made-up domain that needs to replace the current site. I know I need to somehow redirect the content from the old version of the site, but I'm never going to use that content again. (I could transfer it to be a Dev site for the current domain and automatically replace it with the click of a button, just as another option.) What's the best way to replace blahblah.com with a completely new blahblah.com if I'm not using any of the old content? There are only about 4 URLs, such as blahblah.com/contact, that will remain the same, with all content replaced. There are about 100 URLs that will no longer be in use or have any part of them ever used again. Can this be done safely?
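As a rough illustration of one common way to handle the roughly 100 retired URLs (a hedged sketch only, not advice from the thread: it assumes an Apache host where .htaccess rules are honoured, and the old paths shown are made up for the example, not real URLs from the site):

# Retired pages that have a close equivalent on the rebuilt site: permanent redirect
# (here to the homepage as a placeholder target)
Redirect 301 /old-services/ https://blahblah.com/
# Retired pages with no replacement anywhere on the new site: tell crawlers they are gone for good
Redirect gone /old-portfolio/
Redirect gone /old-news/

The handful of URLs that keep their addresses, such as blahblah.com/contact, need no rule at all; the new content simply goes live at the same paths.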
-
Why do people put XML sitemaps in subfolders? Why not just the root? What's the best solution?
Just read this: "The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/." (from http://www.sitemaps.org/protocol.html#location). Yet surely it's better to put the sitemaps at the root, so you have either:
(a) http://example.com/sitemap.xml
http://example.com/sitemap-chocolatecakes.xml
http://example.com/sitemap-spongecakes.xml
and so on, OR this kind of approach:
(b) http://example.com/sitemap.xml
http://example.com/sitemap/chocolatecakes.xml
http://example.com/sitemap/spongecakes.xml
I would tend towards (a) rather than (b) - which is the best option? Also, can I keep the structure the same for sitemaps that are subcategories of other sitemaps? For example, for a subcategory of http://example.com/sitemap-chocolatecakes.xml I might create http://example.com/sitemap-chocolatecakes-cherryicing.xml, or should I add a subfolder to turn it into http://example.com/sitemap-chocolatecakes/cherryicing.xml? Look forward to reading your comments - Luke
-
Will disallowing URLs in the robots.txt file stop those URLs from being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
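For context on the mechanism involved (a hedged sketch, not a statement about this particular CMS): a robots.txt Disallow only stops crawling, so URLs that are already indexed tend to stay in the index; to get them dropped, they generally need to remain crawlable while returning a noindex signal. Assuming an Apache server with mod_headers enabled, that signal could be sent for the gallery URLs with something like the following, placed in the server or virtual-host configuration:

# Hypothetical sketch: serve a noindex header for the gallery module's URLs
# so Google can recrawl them and drop them from the index.
<LocationMatch "^/catalog/product/gallery/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>

Once the pages have fallen out of the index, adding Disallow: /catalog/product/gallery/ to robots.txt would then stop them from being crawled again.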
-
How to switch from URL-based navigation to Ajax when 1000s of URLs will be gone
Hi everyone, We have thousands of URLs generated by numerous product filters on our ecommerce site, e.g. /category1/category11/brand/color-red/size-xl+xxl/price-cheap/in-stock/. We are thinking of moving these filters to Ajax in order to offer a better user experience and get rid of these useless URLs. In your opinion, what is the best way to deal with this huge move?
1. Leave the existing URLs responding as before: as they will disappear from our sitemap (they won't be linked anymore), I imagine robots will someday consider them obsolete.
2. Redirect them permanently (301) to the closest existing URL.
3. Mark them as gone (4xx).
I'd vote for option 2. Bots will suddenly see thousands of 301s, but this reflects what is really happening, right? Do you think this could result in some penalty? Thank you very much for your help. Jeremy
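To illustrate what option 2 could look like in practice (a hedged sketch only, assuming an Apache host with mod_rewrite and the hypothetical URL pattern from the example above; a real rule set would need to distinguish filter segments from genuine subcategory or product pages):

# .htaccess sketch: 301 each retired filter URL back to its parent category page,
# e.g. /category1/category11/brand/color-red/... -> /category1/category11/
RewriteEngine On
RewriteRule ^(category1/category11)/.+$ /$1/ [R=301,L]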
-
List of SEO "to-dos" to increase organic rankings
We are looking for a complete list of all the white hat SEO "to-dos" that an SEO firm should work through in order to help increase Google/Bing/Yahoo organic rankings. We would like to use this list to be sure that the SEO company or individual we choose uses all of these white hat items as part of an overall SEO strategy to increase organic rankings. Can anyone please point me in the right direction as to where we can obtain this complete list? If this is not the best approach, please let me know what is, as I am not an SEO person. Thank you kindly in advance.
-
What's with the Keyword Apocalypse?
Hi, nine of my tracked keywords have dropped by more than 20 positions since last week. The nastiest drops in ranking are by 36, 38, and 46 places. For the last month I have been chipping away at the duplicate content with 301 redirects and was expecting my keyword rankings to improve slightly as a result, not the opposite. I don't have any manual actions logged against my site and am at a bit of a loss to explain this sudden drop. Any suggestions would be most welcome.
-
Site was moved, but still exists on the old server and is being outranked for its own name
Recently, a client went through a split with a business partner. They both had websites on the same domain, but within their own subdirectories. There is a main landing page, which links to both sites, and the landing page sits on the root. I.e. example.com is a landing page with links to example.com/partner1 and example.com/partner2.
Partner 2 will be my client for this example. After the split, partner 2 downloaded his website and put it up on his own server, but he no longer has any kind of access to the old server's FTP, and partner 1 is refusing to cooperate in any way to have the site removed from the old server. They did add a 301 redirect for the home page on the old server for partner 2, so example.com/partner2/index.html is 301'ing to the new site on the new server. HOWEVER, every other page is still live on that old server and is outranking the new site in every instance. The home page is also being outranked, even with the 301 redirect in place. What are some steps I can take to rectify this? The client's main concern is that this old website, containing the old partner's name, is outranking him for his own name and the name of his practice. So far, here's what I've been thinking: since the site has poor on-page optimization, I'll start by cleaning all of that up. I'll then optimize the home page to better depict the client's name and practice through proper usage of heading tags, titles, alt attributes, etc., as well as the meta title and description. The only other thing I can think of would be to start building some backlinks. Any help/suggestions would be greatly appreciated! Thanks.