Best Way to Determine Age of Site
-
What's the best way to determine the age of a site?
By its beginning I mean when it came through the Google Sandbox and has been a functioning site ever since.
Thanks!
-
I think archive.org may be my best bet. Thanks for the good advice.
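If you want to script the archive.org route, the Wayback Machine exposes a public availability API that returns the capture closest to a given timestamp, so asking for a very early date surfaces roughly the oldest snapshot it holds. A minimal sketch (the endpoint and JSON shape are as publicly documented; the domain is whatever you're researching):

```python
import json
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def first_capture_timestamp(api_response):
    """Pull the capture timestamp (YYYYMMDDhhmmss) out of an
    availability-API response dict, or None if nothing was archived."""
    snapshot = api_response.get("archived_snapshots", {}).get("closest")
    return snapshot["timestamp"] if snapshot else None

def earliest_snapshot(domain):
    """Ask the Wayback Machine for the snapshot closest to a very early
    date, which in practice is about the oldest capture for the domain."""
    url = "%s?url=%s&timestamp=19960101" % (WAYBACK_API, domain)
    with urllib.request.urlopen(url, timeout=10) as resp:
        return first_capture_timestamp(json.load(resp))
```

The first capture date is only a lower bound on the site's age, but it is usually close to when the site went live.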
-
Are you talking about old versions of the site, to see how old that particular website is, or about the domain?
Obviously WHOIS information is great for domains:
http://www.networksolutions.com/whois/index.jsp
There is also a way to see old versions of websites here:
-
I've previously used Webconfs to research domain age - it's a pretty good resource. I don't think you'll be able to tell exactly when it made its way through the Google Sandbox, but you should at least be able to determine when it went online. Although, if that was any time after 1998-99, it's almost guaranteed to have made a trip to the box.
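For the domain-age side specifically, a WHOIS lookup needs no third-party tool: the protocol is just a line of text sent over TCP port 43. A rough sketch (whois.verisign-grs.com is the registry server for .com/.net only; other TLDs use other servers, and the "Creation Date:" label varies by registry):

```python
import socket

def whois_query(domain, server="whois.verisign-grs.com"):
    """Raw WHOIS lookup: open TCP port 43, send the domain name,
    and read the server's full reply."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

def creation_date(whois_text):
    """Scan a WHOIS reply for the registration-date line."""
    for line in whois_text.splitlines():
        if line.strip().lower().startswith("creation date:"):
            return line.split(":", 1)[1].strip()
    return None
```

Note that domain age and site age can differ: a domain may have been parked, or changed hands, long before the current site launched, which is why pairing WHOIS with archive.org snapshots gives a fuller picture.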
Related Questions
-
Best way to handle outdated, years-old blog posts?
Hi all, we have almost 1,000 pages or posts from our blog indexed in Google. A few of them are years old but still have relevant, credible content that appears in search results. I'm worried about the hundreds of other non-relevant posts that are years old. By hosting hundreds of them, our website is holding lots of useless indexed pages, which might be having a small negative impact on us. What's the best way to handle them? Are these pages okay, or should they be de-indexed or deleted? Thanks
Algorithm Updates | | vtmoz0 -
Do we take an SEO hit for consolidating content onto an infinite-scroll page vs. a site with many pages/URLs? If we do take a hit, how large would it be?
We are redesigning a preschool website which has over 100 pages. We are looking at 2 options and want to make sure we meet the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window shade the content. For instance, on the curriculum page there would be an overview and each age group program would open via window shade. Option 2 is to have an overview and then each age program links to its own page. Do we lose out on SEO if there are not unique URLS? Or is there a way using metatags or other programming to have the same effect?
Algorithm Updates | | jgodwin0 -
Guides to determine if a client's website has been penalized?
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized? I'm also not talking about an obvious drop in traffic/rankings, but I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data. One that also helps you take into account all the different variables that may not be related to the engines. Thanks!
Algorithm Updates | | EEE30 -
How to optimise a news site? - tomorrow's chip-paper terms
Are there any specific tips on how to gain traffic from very short-lived search terms? Say the site you are SEO/SEMing wants to go for searches related to the latest celebrity breakup, or a fashion event that lasts less than a week. The on-site stuff seems pretty good, as on-site SEO tools generally give it an A grade. Is it just a case of doing the same stuff as normal, but faster? 😉
Algorithm Updates | | Fammy0 -
What is the best way to organize a category for SEO purposes?
I work for an organic vitamin and supplement company, and we are looking to rank for our categories by creating more specific ones. For example, under the category "vitamin d" we are going to add smaller, more relevant (longer-tail) categories like "spray vitamin d" and "vegan vitamin d" and try to rank for those searches instead, as well as for searches containing words where we already have more authority with Google, like "natural" or "organic". I know that putting the product pages a level deeper will only hurt us, so I want to avoid that, but I'm wondering if anyone has advice on how to organize categories for longer-tail keywords that we actually have a chance to rank for. Any help figuring this out would be greatly appreciated. Here is our page as it currently stands; like I said, we want to create subcategories that are effective for SEO but also make searching and navigating the site easier. http://www.mynaturalmarket.com/Vitamin-D.html Thanks, ThatKwameGuy
Algorithm Updates | | ThatKwameGuy1 -
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large # of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" just won't change even though the actual pages indexed keeps dropping! Does anyone know why Google would be still reporting a high index count, which doesn't actually reflect what is currently indexed? Thanks!
Algorithm Updates | | CassisGroup0 -
YouTube, video SEO, & my site
For our business we are building a collection of videos including product info, how-tos, and some funny content. My understanding is that if we embed these onto my site from YouTube, we don't get any credit for the videos on our website, even if we submit a video sitemap. My thinking is to post these videos to YouTube and also host them on our own site, submitting a video sitemap that includes the videos on our site. We would change the name, description, etc. on YouTube vs. what's on our website. Question is: is this the best strategy? Do I get penalized for duplicate content? They are important for both the social aspects of YouTube and the content value of our website.
Algorithm Updates | | uwaim20120 -
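On the video-sitemap point above: Google's video-sitemap extension lets you describe self-hosted videos explicitly, including the on-site title and description. A minimal sketch of one entry (all URLs and text here are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/product-demo.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/demo.jpg</video:thumbnail_loc>
      <video:title>Product demo</video:title>
      <video:description>Short walkthrough of the product.</video:description>
      <video:content_loc>http://www.example.com/media/demo.mp4</video:content_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>
```

The `video:title` and `video:description` tags are where the on-site versions of the metadata would go if you keep them distinct from the YouTube copies.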
Accidentally blocked our site for an evening?
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com and we are ranked in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw the site was blocked, so I used Fetch as Googlebot and corrected it. Now the message says: "Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)"
robots.txt file: http://www.posnation.com/robots.txt - Downloaded 1 hour ago - Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
Algorithm Updates | | POSNation0
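For anyone wanting to double-check a robots.txt file the way a crawler would read it, Python's standard library includes a parser. A small sketch (pass it the raw robots.txt text and the path you care about):

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, path="/"):
    """Parse a robots.txt body and report whether Googlebot
    may fetch the given path under its rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", path)
```

Against a live site you would instead call `parser.set_url("http://www.posnation.com/robots.txt")` followed by `parser.read()`, then run the same `can_fetch` check, which is a quick way to confirm an unblock actually took effect.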