"Revisit-after" Metatag = Why use it?
-
Hi Mozfans,
Just been thinking about the robots revisit meta tag. All pages on my website (200+ pages) have the following tag on them:
<meta name="revisit-after" content="7 days" />
I'm wondering: what is the purpose of this tag?
Surely it's best to allow robots (such as Googlebot or Bingbot) to crawl your site as often as possible, so the index and rankings get updated as quickly as possible?
Thanks in advance everyone!
Ash
-
Haha thanks for the example Ryan.
OK, I think I should let my web developer know; he seems to put it on all of his sites (he knows his stuff, so maybe it's an old habit he's never bothered to research).
Your example prompted me to find the following page: http://www.seoconsultants.com/clueless/seo/tips/meta/
Quite a good read IMO.
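If you're curious which of your 200+ pages actually carry the tag before asking your developer to strip it, here's a minimal sketch using only Python's standard library (fetching each page is left out; this just parses HTML you already have on hand):

```python
from html.parser import HTMLParser

class RevisitAfterFinder(HTMLParser):
    """Collects the content value of every <meta name="revisit-after"> tag."""
    def __init__(self):
        super().__init__()
        self.values = []

    def handle_starttag(self, tag, attrs):
        # Self-closing tags like <meta ... /> are routed here as well,
        # via HTMLParser's default handle_startendtag.
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "revisit-after":
                self.values.append(d.get("content"))

def find_revisit_after(html):
    """Return a list of revisit-after values found in an HTML document."""
    finder = RevisitAfterFinder()
    finder.feed(html)
    return finder.values

page = '<html><head><meta name="revisit-after" content="7 days" /></head><body></body></html>'
print(find_revisit_after(page))  # ['7 days']
```

Point it at the saved source of each page and any page returning a non-empty list still carries the tag.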
-
The "revisit-after" tag has absolutely no value in HTML nor SEO. At no point of time did this tag ever have any value. There was a single search engine which was never of any significance which created this tag, but it was never adopted by Google nor anyone else.
If anyone disagrees, then I would suggest they add the following meta tag to their page:
<meta name="rank-me-first" content="always" />
It is no more effective than the "revisit-after" tag, but at least it's original!
-
At one point this was taken as a "suggestion," but I believe almost all search engines automatically ignore it nowadays.
I think even back when it was treated as a valid directive, it was still more often than not ignored by Googlebot.
Shane
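If the goal behind the tag was to hint at how often pages change, the mechanism search engines actually document is an XML sitemap. A minimal sketch of one entry (the URL and date are placeholders; note that Google treats <changefreq> as a weak hint at best and pays far more attention to an accurate <lastmod>):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page</loc>
    <lastmod>2012-03-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Submitting the sitemap in Google Webmaster Tools / Bing Webmaster Tools is a far better use of effort than any revisit-after tag.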