"Revisit-after" Metatag = Why use it?
-
Hi Mozfans,
Just been thinking about the robots "revisit-after" meta tag. All pages on my website (200+ pages) have the following tag on them:
<meta name="revisit-after" content="7 days" />
I'm wondering: what is the purpose of the tag?
Surely it's best to allow robots (such as Googlebot or Bingbot) to crawl your site as often as possible, so the index and rankings get updated as quickly as possible?
Thanks in advance everyone!
Ash
-
Haha, thanks for the example, Ryan.
OK, I think I should let my web developer know; he seems to put it on all of his sites (he knows his stuff, so maybe it's an old habit he's never bothered to research).
Your example prompted me to find the following page: http://www.seoconsultants.com/clueless/seo/tips/meta/
Quite a good read IMO.
-
The "revisit-after" tag has absolutely no value in HTML nor SEO. At no point of time did this tag ever have any value. There was a single search engine which was never of any significance which created this tag, but it was never adopted by Google nor anyone else.
If anyone disagrees, then I would suggest they add a meta tag of their own invention to their page. It is no more effective than the "revisit-after" tag, but at least it's original!
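To illustrate, here is a minimal sketch of such an invented tag; the name and value below are made up purely for this example, and no crawler recognizes them:

<!-- Invented purely for illustration: crawlers treat unknown meta tags like this as noise and ignore them -->
<meta name="bring-me-coffee" content="every 7 days" />

Googlebot gives this exactly as much weight as "revisit-after", which is to say none.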
-
At one point this was taken as a "suggestion", but I believe almost all search engines automatically ignore it nowadays.
I think even when it was considered a valid command, it was still more often than not ignored by Googlebot.
Shane
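For contrast, here is a minimal sketch of robots meta directives that Google and Bing do document supporting. Note that these control indexing and link following rather than crawl frequency; how often a page is recrawled is decided by the crawler itself, and there is no widely supported tag for requesting a specific revisit interval:

<!-- Supported robots meta directives (affect indexing and link following, not crawl frequency) -->
<meta name="robots" content="noindex" />
<!-- keeps the page out of the search index -->
<meta name="robots" content="nofollow" />
<!-- tells crawlers not to follow links on the page -->
<!-- By contrast, "revisit-after" is not part of any published robots standard and is simply ignored -->

If you want pages recrawled quickly, the practical levers are generally considered to be keeping content fresh and submitting an up-to-date XML sitemap, not a meta tag.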
Related Questions
-
Is "Author Rank," User Comments Driving Losses for YMYL Sites?
Hi, folks! So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.

We were big winners with the August Medic update in 2018 and the subsequent update in September/October. However, the Medic update in March and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at the attached Moz chart, which shows the jumps and falls around the various Medic updates. The pattern is very similar on many of our sites. As per JT Williamson's stellar article on EAT, I feel like we've done a good job in meeting those criteria, which has left us wondering what isn't jiving with the new core updates. I have two theories I wanted to run past you all:

1. Are user comments on YMYL sites problematic for Google now? I was thinking that maybe user comments underneath health news and perspectives articles might be concerning on YMYL sites now. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, the people commenting -- how qualified are they? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?

2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress'ish). Even worse -- our resource pages don't even list the author.

Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks!
Algorithm Updates | Michael_Nace1
-
Where does Google find "Soft 404" and "Not found" links?
Hi all, We can see very old links, or anonymous links to our website, suddenly listed under Soft 404 or 404 (Not Found) in GSW. According to Google, some of them are script-generated links that can be ignored. Others are pages that were actually deleted but not redirected. I wonder how Google gets these years-old links even though there are no source links available for them anymore. Must these be fixed even though they are not linked anywhere from our internal or external pages? Thanks
Algorithm Updates | vtmoz0
-
Anyone else noticing "Related Topics" featured snippet? Is this new?
First time I've seen this type of featured snippet, and now I have seen it twice in the space of a couple of hours. Queries on Google UK desktop: "surgical instruments" and "Hawking radiation". Is this new? It definitely is for the "surgical instruments" search. Google is highlighting related topics/keywords in bold beneath the usual featured snippet.
Algorithm Updates | Ria_0
-
Proactively Use GWT Removal Tool?
I have a bunch of links on my site from sexualproblems.net (not a porn site; it's a legit doctor's site, and I've talked to the doctor on the phone in America). The problem is his site got hacked and has tons of links on his homepage to other pages, and mine is one of them. I have asked him multiple times to take the link down, but his webmaster is his teenage son, who basically just doesn't feel like it. My question is: since I don't think they will take the link down, should I proactively remove it, or just wait till I get a message from Google? I'd rather not tell Google I have spam links on my site, even if I am trying to get them removed. However, I have no idea if that's a legitimate fear or not. I could see the link being removed and everything continuing fine, or I could see reporting the removal request as signaling a giant red flag for my site to be audited. Any advice? Ruben
Algorithm Updates | KempRugeLawGroup0
-
Does the use of an underscore in filenames adversely affect SEO?
We have had a page which until recently was ranked first or second by Google UK, and also worldwide, for the term "Snowbee". It is now no longer in the top 50. I ran a page optimization report on the URL and got a very good score. The only criticism was that I had used an atypical character in the URL; the only unusual character was an underscore ("_"). We use the underscore in most file names without apparent problems with search engines. In fact, they are automatically created in HTML file names by our ecommerce software, and other pages do not seem to have been so adversely affected. Should we discontinue this practice? It will be difficult, but I'm sure we can overcome it if this is the reason why Google has marked us down. I attach images of the SEO report pages.
Algorithm Updates | FFTCOUK0
-
New Google "Knowledge Graph"
So according to CNN an hour ago, regarding the new Google update: "With Knowledge Graph, which will begin rolling out to some users immediately, results will be arranged according to categories with which the search term has been associated." http://www.cnn.com/2012/05/16/tech/web/google-search-knowledge-graph/index.html?hpt=hp_t3 Does this mean we need to start optimizing for Categories as well as Keywords?
Algorithm Updates | JFritton0
-
Google decreased use of Meta Description Tag
Over the past month or so I have noticed that Google is not using the meta description for my pages but is instead pulling text from the actual page to show on the SERP. Is Google placing less emphasis on meta descriptions?
Algorithm Updates | PerriCline0
-
Is using WPML (WordPress Multilingual Plugin) OK for On-Page SEO?
Hi Mozzers, I'm investigating multilingual site setup and translating content for a small website of 15-20 pages, and I came across WPML (WordPress Multilingual Plugin), which looks like it could help, but I am curious as to whether it has any major international SEO limitations before trialing/buying. It seems to offer the option to automatically set up language folder structures such as www.domain.com/it/ or www.domain.com/es/ etc., which is great, and it seems to offer an easy way of linking out to translators (for an extra fee), which could be convenient. However, what about the on-page optimization -- URL names, title tags, and other on-page elements? I wonder if anyone has any experience with using this plugin or any alternatives to it. Hoping for your valued advice!
Algorithm Updates | emerald0