Best Way to Determine Age of Site
-
What's the best way to determine the age of a site?
Where by its beginning I mean when it went through the Google Sandbox and has been a functioning site ever since.
Thanks!
-
I think archive.org may be my best bet. Thanks for the good advice!
-
Are you talking about versions of the site, to see how old that particular website is, or about the domain?
Obviously, WHOIS information is great for domains:
http://www.networksolutions.com/whois/index.jsp
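For the domain route, the useful field in a WHOIS record is the creation date. As a rough sketch (the regex labels and the sample response below are illustrative assumptions, not a real lookup — registrars format the field differently), you can pull it out of raw WHOIS text like this:

```python
# Sketch: extract the registration (creation) date from raw WHOIS output.
# Registrars label this field differently ("Creation Date", "Created On",
# "created"), so we match a few common variants. The sample response is
# made up for illustration.
import re
from datetime import datetime

def creation_date(whois_text):
    pattern = re.compile(
        r"(?:Creation Date|Created On|created):\s*(\S+)", re.IGNORECASE)
    match = pattern.search(whois_text)
    if not match:
        return None
    raw = match.group(1)
    # Try a few common WHOIS timestamp formats, e.g. 1995-08-14T04:00:00Z
    for fmt in ("%Y-%m-%dT%H:%M:%SZ", "%Y-%m-%d", "%d-%b-%Y"):
        try:
            return datetime.strptime(raw, fmt)
        except ValueError:
            continue
    return None

sample = """Domain Name: EXAMPLE.COM
Registrar: Example Registrar, Inc.
Creation Date: 1995-08-14T04:00:00Z
"""
print(creation_date(sample).year)  # → 1995
```

Bear in mind the domain's registration date is only an upper bound on the site's age — domains get re-registered, and a site may have launched long after the domain was bought.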
There is also a way to see old versions of websites at the Wayback Machine (archive.org).
-
I've previously used Webconfs to research domain age - it's a pretty good resource. I don't think you'll be able to tell exactly when it made its way through the Google Sandbox, but you should at least be able to determine when it went online. Although, if it was any time after 1998-99, then it's almost guaranteed to have made a trip to the box.
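One way to approximate when a site actually went online (as opposed to when the domain was registered) is to ask the Wayback Machine for its earliest snapshot. A sketch, assuming the public CDX API endpoint at web.archive.org; the parsing is split out so it can be exercised without a network call:

```python
# Sketch: find the year of a site's earliest Wayback Machine capture via
# the public CDX API. CDX results are returned oldest-first by default,
# so limit=1 gives the first known snapshot.
from urllib.parse import urlencode
from urllib.request import urlopen

CDX = "https://web.archive.org/cdx/search/cdx"

def earliest_snapshot_url(domain):
    query = urlencode({
        "url": domain,
        "fl": "timestamp",   # only return the capture timestamp
        "limit": 1,
    })
    return f"{CDX}?{query}"

def parse_first_year(body):
    # CDX plain-text output: one line per capture, e.g. "19961022175643"
    line = body.strip().splitlines()[0]
    return line[:4]

def first_seen(domain):
    # Live lookup (requires network access).
    with urlopen(earliest_snapshot_url(domain)) as resp:
        return parse_first_year(resp.read().decode())
```

The first capture date is only a lower bound on age — the crawler may not have found the site immediately — but combined with the WHOIS creation date it brackets the launch fairly well.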
Related Questions
-
Anyone suspect that a site's total page count affects SEO?
I've been trying to find out the underlying reason why so many websites are ranked higher than mine despite seemingly having far worse links. I've spent a lot of time researching and have read through all the general advice about what could possibly be hurting my site's SEO, from page speed to h1 tags to broken links, and all the various on-page SEO optimization advice... so the issue here isn't very obvious. From viewing all of my competitors, they seem to have a much higher number of web pages on their sites than mine does. My site currently has 20 pages or so, and most of my competitors are well into the hundreds, so I'm wondering if this could be part of the issue. I know Google has never officially said that page count matters, but does anyone suspect that page count matters for SEO, and that competing sites with more total pages than yours might have an advantage SEO-wise?
Algorithm Updates | ButtaC1 -
More pages or fewer pages for best SEO practices?
Hi all, I would like to know the community's opinion on this. Will a website with more pages or fewer pages rank better? Websites with more pages have the advantage of more landing pages for targeted keywords. Fewer pages have the advantage of concentrating PageRank on a limited set of pages, which might result in better rankings for those pages. I know this is highly dependent; I mean to get answers for an ideal website. Thanks,
Algorithm Updates | vtmoz1 -
Penguin 3.0 Site Dropped after Update
Hi, we were hit by the Penguin update a long time ago and lost a lot of traffic/positions because of it. For a long time we worked really hard to identify all of our links that may have caused us to receive this penalty. After months of work we submitted the disavow file and a reconsideration request, and in June 2014 we received confirmation from Google in Webmaster Tools that the manual spam action had been revoked. Over time we then started to receive more traffic and better positions in the SERPs; however, since Penguin 3.0 we have dropped again for a range of keywords, many going from page 1 to 2, or from page 2 to 3/4. Any ideas what we should do here? Any help will be really appreciated, as I'm totally confused. We haven't done any link building at all since the penalty/recovery.
Algorithm Updates | AMG1000 -
How much link juice does a sites homepage pass to inner pages and influence inner page rankings?
Hi, I have a question regarding the power of internal links: how much link juice they pass, and how they influence search engine ranking positions. Take the example of an ecommerce store that sells kites.

Scenario 1
It can be assumed that it is easier for the kite ecommerce store to earn links to its homepage by writing great content on its blog, as any blogger who links to the content will likely use the site name and homepage as anchor text. So if we follow this through, it can be assumed that there will eventually be a large number of high-quality backlinks pointing to the site's homepage from various high-authority blogs that love the content being posted on the site's blog. The question is: how much link juice does this homepage pass to the category pages, and from the category pages to the product pages, and what influence does this have on rankings? I ask because I have often seen strong ecommerce sites with very strong DA or domain PR, but with no backlinks to the product or category pages, ranked in the top 10 of search results for the respective category and product pages. It therefore leads me to assume that internal links must have a strong influence on search rankings... Could it therefore also be assumed that a site with a PR of 5 and no links to a specific product page would rank higher than a site with a PR of 1 but with 100 links pointing to that specific product page? Assuming they were both trying to rank for the same product keyword and all other factors were equal, i.e. neither of them built spammy links or over-optimised anchor text?

Scenario 2
Does internal linking work both ways? In my example above I spoke about the homepage carrying link juice downward to the inner category and product pages. Can a powerful inner page carry link juice upward to category pages and then to the homepage?
For example, say the blogger who liked the kite store's blog content piece linked directly to it from his site, and the piece was hosted at www.xxxxxxx.com/blog/blogcontentpiece. As authority links are built to this blog content piece from other bloggers linking to it, will it then pass link juice up to the main blog category page, and then to the kite site's homepage? And if there is a link with relevant anchor text as part of the blog content piece, will this cause the link juice flowing upwards to be stronger? I know the above is quite long-winded, but I couldn't find anywhere that explains the power of internal linking on SERPs... Look forward to your replies on this.
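The flow described in both scenarios can be illustrated with a toy PageRank calculation. This is only a sketch of the classic algorithm, not how Google actually weights internal links today: the three-page site, the link graph, and the 0.85 damping factor are all illustrative assumptions. It does show that juice flows in every direction a link points — downward and upward alike:

```python
# Toy PageRank over a tiny internal-link graph (a hypothetical kite store).
# Rank flows along every link, so a page linked from a strong page gains
# authority, and links from inner pages back to the homepage flow "upward".
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across the site.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

site = {
    "home":     ["category"],           # external backlinks land here
    "category": ["home", "product"],
    "product":  ["home", "category"],
}
ranks = pagerank(site)
```

Running this, the product page ends up with a meaningful share of rank despite having no external links at all — it inherits it through the internal structure, which matches the observation about strong-DA sites ranking product pages with zero direct backlinks.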
Algorithm Updates | sanj50500 -
What is the best SEO solution for pagination?
Dear all, What is the best SEO solution for pagination? For example, what code do I need to put on these individual pages? /page-1 /page-2 /page-3 (final page) Thanks!
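For the three pages listed, one classic answer is rel="prev"/rel="next" link tags in each page's `<head>`, with each page self-canonicalizing (example.com is a placeholder domain):

```html
<!-- On /page-2 of the paginated series: -->
<link rel="canonical" href="https://example.com/page-2" />
<link rel="prev" href="https://example.com/page-1" />
<link rel="next" href="https://example.com/page-3" />

<!-- /page-1 omits rel="prev"; /page-3 (the final page) omits rel="next". -->
```

Each page should canonicalize to itself, not to page 1, so the deeper pages stay indexable. Note that Google has since stated it no longer uses rel="prev"/"next" as an indexing signal, though the markup remains valid HTML.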
Algorithm Updates | HMK-NL0 -
Best approach to ranking locally
Hello, What are the best approaches to ranking locally? I.e., a user in Dallas googles "mechanics" and local results are returned. I understand Google+ Local pages to be an important factor for the location-based listings on maps. What about location-specific pages on the site? Meaning, if we have a page on the site talking about the areas we serve in Dallas. Other suggestions?
Algorithm Updates | CallRingTalk0 -
What is the point of XML site maps?
Given how Google uses Page Rank to pass link juice from one page to the next if Google can only find a page in an XML site map it will have no link juice and appear very low in search results if at all. The priority in XML sitemaps field also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly. Though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus for most of the web the significant content of an existing page doesn't change regularly, instead new pages are added with new content. This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site their crawling could be much more efficient. The site map would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen most site map generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
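Following the question's own argument, if lastmod is the one field worth trusting, the sitemap reduces to a minimal urlset listing recently changed pages with just `loc` and `lastmod` (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-article</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
</urlset>
```

Per the sitemaps.org protocol, `loc` is the only required child of `<url>`; `lastmod`, `changefreq`, and `priority` are all optional, so a lastmod-only sitemap like this is fully valid.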
Algorithm Updates | pasware0 -
Ranking Drop After Switching Sites
I have a client whose rankings dropped after switching to our site. We know that rankings can drop a little after a switch, but we are concerned that hers are still low. Any suggestions? As far as I can tell, the links to her site remained the same. Thanks, Holly
Algorithm Updates | hwade1