Best Way to Determine Age of Site
-
What's the best way to determine the age of a site?
By its beginning I mean when it went through the Google Sandbox and has been a functioning site ever since.
Thanks!
-
I think archive.org may be my best bet. Thanks for the good advice.
-
Are you talking about versions of sites, to see how old that particular website is, or the domain?
Obviously, WHOIS information is great for domains:
http://www.networksolutions.com/whois/index.jsp
There is also a way to see old versions of websites here:
-
I've previously used Webconfs to research domain age - it's a pretty good resource. I don't think you'll be able to tell exactly when it made its way through the Google Sandbox, but you should at least be able to determine when it went online. Although, if it was any time after 1998-99, it's almost guaranteed to have made a trip to the box.
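If you want to check a batch of domains rather than one at a time, the Wayback Machine's public CDX API can report the earliest archived capture of a site, which is a decent proxy for when it went online. A rough Python sketch (function names are my own; the CDX endpoint and its `limit`/`fl` parameters are the public interface):

```python
import urllib.parse
from datetime import datetime

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def earliest_snapshot_url(domain):
    """Build a CDX API query asking for the oldest capture of a domain."""
    params = urllib.parse.urlencode({
        "url": domain,
        "output": "json",
        "limit": "1",        # first (oldest) capture only
        "fl": "timestamp",   # we only need the capture timestamp
    })
    return f"{CDX_ENDPOINT}?{params}"

def parse_cdx_timestamp(ts):
    """Convert a 14-digit CDX timestamp (YYYYMMDDhhmmss) to a date string."""
    return datetime.strptime(ts, "%Y%m%d%H%M%S").strftime("%Y-%m-%d")
```

Fetching `earliest_snapshot_url("mysite.com")` and running the returned timestamp through `parse_cdx_timestamp` gives the date of the first archived copy. Note this only tells you when the Wayback Machine first saw the site, not when it cleared the Sandbox.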
Related Questions
-
Our site dropped by April 2018 Google update about content relevance: How to recover?
Hi all, After Google's confirmed core update in April 2018, we dropped globally and couldn't recover later. We found that the update was about content relevance, as officially stated by Google afterwards. We wonder how we are not relevant in terms of content after ranking for the same keywords for years, and we are hoping to find a solution to this. Are there any standard ways to measure content relevancy? Please suggest! Thank you
Algorithm Updates | vtmoz0 -
What's the best method to tackle a website traffic drop?
On 14th November, following a DNS error (which seems to have been a Google error, as local servers were not affected and it happened to many people everywhere), my traffic started dropping. Within 5 days it was down 50%. In a panic to resolve the situation, I thought it was because I was on a shared host and moved to a VPS, which was a disaster; I've had server errors since. After desperately looking everywhere for news on the DNS error and whether there was an algorithm change, I decided I should have a look at my site at www.mutantspace.com and see if there were any internal issues. In the Google Webmaster forum, a helpful moderator suggested that I do 2 things:
1. Deal with the fact that I had keyword-stuffed my alt tags. Basically, I run an arts blog, so I have 8-10 images per post, and I was putting the same text in each one, i.e. [artists name][artform][name of artwork]. I stupidly didn't realise what I had been doing, and have since been deleting my alt tags for every image except one per post. However, I have 17,000 images, so it's going to take a while.
2. She also linked me to https://ahrefs.com/site-explorer/overview/subdomains/http%253A%252F%252Fwww.mutantspace.com%252F and wondered why I had such volatile inbound links. I don't know why, and can't figure it out.
As far as everything else goes, I don't know what I could be doing wrong to deserve a penalty - if it is a penalty. I don't build backlinks, so all my links are natural (from artists, galleries, art blogs, tumblrs, etc.). I don't sell advertising (yet, anyway). Having said that, I've been told I have too many links on each page (I run a WordPress site and so have categories, etc.), so I'm wondering if I should nofollow my categories?
In short, I'm wondering what advice anyone has on doing a systematic shake-up of my site. I'm currently doing the following:
1. Deleting most of the alt tags on my posts. I've got back as far as 2012 and will keep going till they're all done.
2. Redirecting all crawl errors.
3. Nofollowing more outbound links and links to social networks.
4. Checking all inbound links to see if there are any suspicious domains.
5. Sorting out the fact that I've had numerous server errors for the last 2 weeks (would that affect SERPs?).
Is there anything else I can do? Should do? Much appreciated
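Since auditing 17,000 images by hand is slow, the duplicate-alt-text check could be scripted. A minimal Python sketch using only the standard library (it assumes each post's HTML is available as a string; the class and function names are illustrative):

```python
from collections import Counter
from html.parser import HTMLParser

class AltTextCollector(HTMLParser):
    """Collect the alt attribute of every <img> tag in a chunk of HTML."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt", "").strip()
            if alt:
                self.alts.append(alt)

def duplicated_alt_texts(html):
    """Return alt texts that appear on more than one image in the post."""
    collector = AltTextCollector()
    collector.feed(html)
    return [alt for alt, n in Counter(collector.alts).items() if n > 1]
```

Running `duplicated_alt_texts` over each post's content (e.g. pulled from the WordPress database) would flag the posts that still repeat the same [artist name][artform][artwork] string across images, so you can prioritise them instead of walking every post manually.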
Algorithm Updates | mutant20080 -
SEO having different effects for different sites
Hi, I hope this isn't a dumb question, but I was asked by a local company to have a look at their website and make suggestions on how to strengthen and improve their rankings. After time spent researching their competitors and analysing their own website, I was able to determine that they are actually in a good position. They have a well-structured site that follows the basic search rules, they add new relevant content regularly, and they are working on their social strategy. Most of their pages are rated A within Moz, and they spend a lot of time tweaking the site. When I presented this to them, they asked why there are sites ranking above them that don't seem to take as much care over their websites. For example, one of their main competitors doesn't engage in any social networking and rarely adds content to their site. I was just wondering if anyone could shed any light on why this happens? I appreciate there's probably no simple answer, but it would be great to hear some different input. Many thanks
Algorithm Updates | dantemple880 -
Using a site's custom codebase for multiple websites: good or bad?
Is it bad to utilize a custom codebase for multiple websites? Does that play a factor with Google? Also, what about hosting sites with the same custom codebase on the same dedicated server?
Algorithm Updates | WebServiceConsulting.com0 -
External Linking Best Practices Question
Is it frowned upon to use basic anchor text such as "click here" within a blog article when linking externally? I understand that, ideally, you want to provide descriptive anchor text, especially when linking internally, but can it negatively affect your own website if you don't use descriptive anchor text when linking externally?
Algorithm Updates | RezStream80 -
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" - but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large # of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this - perhaps they were looking at cached data or something - but it's been months and the "About 7,500 results" just won't change, even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still be reporting a high index count that doesn't actually reflect what is currently indexed? Thanks!
Algorithm Updates | CassisGroup0 -
Best practice for someone wanting to repost / translate some of your blog posts?
I've been contacted by several sites, a few in other countries, who would like to repost some of our articles on their site; the ones abroad would like to translate them into their own language. (We have a site about raising a child with Down syndrome, so they are wanting to use our info to help people, not "beat us" in rankings or anything like that.) I didn't know what the best practice on this was. I don't want to get dinged for duplicate content or have someone rank higher than me for my own article, etc. Just curious what the best way to go about this was. I'm also assuming the articles that are translated wouldn't be an issue at all, since the content will be in another language. Is this right? Thanks!
Algorithm Updates | NoahsDad0 -
What is the best way for a local business site to come up in the SERPs for a town that they are not located in?
At our agency, we work with many local small business owners who often want to come up in multiple towns near their business where they do not have a physical address. We explain to them again and again that, with the recent changes Google in particular has made to its algorithms, it is very difficult to come up in the new "blended" organic and Places results for a town in which you don't have a physical address. However, many of these towns are within 2 or 3 miles of the physical location and well within driving distance for potential new clients. Google, in its infinite wisdom, doesn't seem to account for areas of the country, such as New Jersey, where these limitations can seriously affect a business's bottom line. What we would like to know is: what are other SEOs doing to help their clients come up in neighboring towns in a way that is both organic and white hat?
Algorithm Updates | Mike-i0