Does Google use the Wayback Machine to determine the age of a site?
-
I have a site that I had removed from the Wayback Machine because I didn't want old versions to show. However, I've since noticed that many SEO tools show the site's domain age as zero instead of the six years since I registered it.
My question is: what do the actual search engines use to determine age when they factor it into their ranking algorithms? Does having the site removed from the Wayback Machine make the search engines think it is brand new?
Thanks
-
Hopefully that is correct. I would hate to get knocked down just because I took myself out of the Wayback Machine. I have had the domain registered in my name since 2006, so I should be OK then.
-
Google uses WHOIS to determine the age of a domain under its current ownership. However, if the WHOIS data changes and the content of the site changes dramatically at the same time, Google may effectively reset the domain's age.
I'm not sure, but I would imagine Google also keeps its own records of domain age outside of WHOIS registration data.
-
Using the Wayback Machine would be strange. A quick WHOIS lookup is all it takes to determine age.
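To illustrate the kind of lookup described above, here is a minimal sketch of how a tool might estimate domain age from a WHOIS record. It assumes the system `whois` command is available and that the registry labels the creation date with one of a few common field names; the `domain_age_years` helper is hypothetical and not part of any particular SEO tool.

```python
import re
import subprocess
from datetime import datetime, timezone

def domain_age_years(domain):
    """Estimate a domain's age in years by parsing its WHOIS record.

    Assumes the system `whois` command is installed. Registries label the
    creation date differently ("Creation Date", "created", "Registered on"),
    so a few common patterns are tried.
    """
    output = subprocess.run(
        ["whois", domain], capture_output=True, text=True, timeout=30
    ).stdout

    patterns = [
        (r"Creation Date:\s*(\d{4}-\d{2}-\d{2})", "%Y-%m-%d"),
        (r"created:\s*(\d{4}-\d{2}-\d{2})", "%Y-%m-%d"),
        (r"Registered on:\s*(\d{2}-\w{3}-\d{4})", "%d-%b-%Y"),
    ]
    for pattern, date_format in patterns:
        match = re.search(pattern, output, re.IGNORECASE)
        if match:
            created = datetime.strptime(match.group(1), date_format)
            created = created.replace(tzinfo=timezone.utc)
            age = datetime.now(timezone.utc) - created
            return age.days / 365.25
    return None  # unrecognised WHOIS format

if __name__ == "__main__":
    print(domain_age_years("example.com"))
```

Note that this only reflects the public registration record; whatever age signals Google actually uses are not published.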
Related Questions
-
I am using the Wix website creator. Will Google be able to read the JavaScript?
I tried using some of the Moz tools, like the On-Page Grader, and they were not able to read any of the writing on my web page because Wix uses JavaScript. Will this impact my rankings on Google compared to my competitors? The new Wix websites allow you to build a website in HTML. Should I switch to this? Thanks, Jonathan
Technical SEO | H1_Marketing_Solutions
-
Use of multiple keywords that are similar for one local site
Hi, I thought that if I wanted to rank a local site for the core keyword 'Landscaping Location', variations of this keyword should be used on the same page. But I recently read that if I wanted to rank for:
Landscaping Location
Landscaping in Location
Landscaping Services in Location
I should use a separate page for each term. Is this correct? A small local website will probably only have a few pages, so making up pages solely to go after keywords can't be right. But then would opportunities be missed? Thanks for your help with this!
Technical SEO | CamperConnect14
-
How can I get Google to re-index my site as quickly as possible?
I am facing a question about re-indexing in the Google search engine. The case is: I have changed my site's meta description, but the indexed result displays only part of the description. Why? My site is http://www.green-lotus-trekking.com/everest-base-camp-trek/. What is the problem with the meta description tag? Please let me know about this.
Technical SEO | agsln
-
Mobile site content and main site content
Help, please! I have one main site and a mobile version of that site (m.domain.com). The main site has more pages, more content, and differently named URLs. The main site has consistently done well in Google. The mobile site has not: it is buried. I am working on adding more content to the mobile site, but am concerned about duplicate content. Could someone please tell me the best way to deal with these two versions of our site? I can't use rel=canonical because the URLs do not correspond to the same names on the main site, or can I? Does this mean I need to change the URL names, offer different (abridged) content, etc.? I really am at a loss as to how to interpret Google's rules for this. Could someone please tell me what I am doing wrong? Any help or tips would be greatly appreciated! Thanks!
Technical SEO | lfrazer
-
Google having problems accessing part of my site
Hi, my site is www.in2town.co.uk, and for a few weeks now Google has had trouble accessing part of it. Today Google Webmaster Tools tells me that Google is having major problems: it shows 123 pages where access was denied. I have spoken to my hosting company, who could not find a problem, so I'm not sure what to do now. Can anyone please give me advice on what the problem may be? Any help would be great.
Technical SEO | ClaireH-184886
-
Should I be using rel=author in this case?
We have a large blog, and it appears one of our regional blogs (managed separately) is simply scraping content off of it and adding it to theirs. Would adding rel=author (for all of our guest bloggers) help stop Google from seeing the regional blog's content as scraped or duplicate? Is rel=author the best solution here?
Technical SEO | VistageSEO
-
Using DNS & 301 redirects to gain control over a rogue site
I'd appreciate people's views on the following, please. We have been approached by a client whose website does not rank #1 for their own distinctive brand name, because that position is taken by a site an affiliate developed for them some years back. The affiliate's site is clearly seen by Google as the definitive site for the brand: it is older, has more links, and is listed in both Yahoo and DMOZ. The relationship with the affiliate has soured, and the client wants to take control of the affiliate site and have it 301 redirect to the 'real' brand site. The affiliate won't cooperate (funny, that). However, whilst the client doesn't have control over the affiliate's website, they do own the domain. Given this, it seems that an option is to temporarily create a one-page website on another server, change the affiliate domain's DNS settings to point to this, and in turn have that 301 redirect to the client's website. This is a bit of a roundabout approach, but necessary because the affiliate won't directly 301 the site they control, despite the client owning it. (As I say, the relationship has soured.) If you think there's a better alternative approach to this problem (aside from litigation), I'd appreciate hearing it, please. Thanks.
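For what it's worth, the interim one-page website described above could be as simple as a catch-all 301 redirect. Below is a minimal sketch using Python's standard library; the target URL is a placeholder, and in practice the same thing is usually done with a line or two of web-server configuration rather than a custom script.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder for the client's real brand site.
TARGET = "https://www.example-brand-site.com"

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every request with a permanent (301) redirect to the real
    brand site, preserving the requested path and query string."""

    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", TARGET + self.path)
        self.end_headers()

    # HEAD requests get the same redirect headers without a body.
    do_HEAD = do_GET

if __name__ == "__main__":
    # Port 80 so the re-pointed DNS record resolves straight to this server
    # (binding to a low port requires sufficient privileges).
    HTTPServer(("", 80), RedirectHandler).serve_forever()
```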
Technical SEO | SureFire
-
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL and, in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask you is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe. We are considering the following approaches:
First Click Free for Web Search - an opt-in service by Google which is widely used for this sort of approach and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards.
Noindex, nofollow - if we simply noindex, nofollow the individual deal pages where the overlay sits, will this remove the "cloaking offense" and therefore the risk of a penalty?
Partial view - if we show one or two paragraphs of text from the deal page, with the rest covered by the subscribe-now lock-up, will this still be cloaking?
I will write up my first SEOmoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.
Technical SEO | Red_Mud_Rookie