"Revisit-after" Metatag = Why use it?
-
Hi Mozfans,
I've just been thinking about the robots revisit-after meta tag. All pages on my website (200+ pages) have the following tag on them:
<meta name="revisit-after" content="7 days" />
I'm wondering what is the purpose of the tag?
Surely it's best to allow robots (such as Googlebot or Bingbot) to crawl your site as often as possible, so the index and rankings get updated as quickly as possible?
Thanks in advance everyone!
Ash
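For illustration only, here is a minimal sketch of how this tag typically appears in a page's head (a hypothetical page, not taken from Ash's site):
<head>
  <title>Example Product Page</title>
  <!-- Non-standard hint: not part of the HTML spec and not documented by Google or Bing -->
  <meta name="revisit-after" content="7 days" />
</head>
As the answers below note, there is no documented behaviour attached to this tag in any major crawler.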
-
Haha thanks for the example Ryan.
OK, I think I should let my web developer know; he seems to put it on all of his sites (he knows his stuff, so maybe it's an old habit he's never bothered to research).
Your example prompted me to find the following page: http://www.seoconsultants.com/clueless/seo/tips/meta/
Quite a good read IMO.
-
The "revisit-after" tag has absolutely no value in HTML nor SEO. At no point of time did this tag ever have any value. There was a single search engine which was never of any significance which created this tag, but it was never adopted by Google nor anyone else.
If anyone disagrees, then I would suggest they add the following meta tag to their page:
It is no more effective than the "revisit-after" tag, but at least it's original!
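A purely hypothetical tag in the same spirit (the name below is invented for illustration and is not the exact tag from this reply) might look like:
<!-- Invented tag name: no crawler recognises this, just as none recognises "revisit-after" -->
<meta name="rank-me-first" content="always" />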
-
At one point this was taken as a "suggestion", but I believe almost all search engines automatically ignore this nowadays.
I think even when it was treated as a valid hint, it was still more often than not ignored by Googlebot.
Shane
Related Questions
-
If my article is reposted on another blog, using rel=canonical, does that count as a link back?
Hey all! My company blog is interested in letting another blog repost our article. We would ask them to use rel="canonical" in the markup to avoid Google digging through "duplicate" info out there. I was wondering: if the other site does use the rel="canonical", will that appear as a backlink or not? I understand that metrics will flow back to my original URL rather than the reposted copy, but I am wondering if the repost will additionally show as a backlink. Thanks!
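For reference, a minimal sketch of the markup being discussed, with hypothetical URLs; note the attribute is rel="canonical", and it belongs in the head of the reposted copy on the other blog:
<!-- In the <head> of the repost on the other blog (URL is hypothetical) -->
<link rel="canonical" href="https://www.your-company-blog.example/original-article/" />
Whether third-party link indexes report that element as a backlink varies by tool; it is not an anchor link in the body of the page.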
Algorithm Updates | cmguidry
-
Anyone else noticing the "Related Topics" featured snippet? Is this new?
First time I've seen this type of featured snippet, and now I've seen it twice in the space of a couple of hours. Queries on Google UK desktop: "surgical instruments" and "Hawking radiation". Is this new? It definitely is for the "surgical instruments" search. Google is highlighting related topics/keywords in bold beneath the usual featured snippet.
Algorithm Updates | Ria_
-
Is parallax scrolling, when used with the "hash bang" technique, good for SEO or not?
Hello friends, one of my client's websites, http://chakracentral.com/, uses parallax scrolling, with most of the URLs containing a hash "#" fragment. Please see a few sample URLs below:
http://chakracentral.com/#panelBlock4 (service page)
http://chakracentral.com/#panelBlock3 (about-us page)
I am planning to use the "hash bang" technique on this website so that Google can read all the internal pages (those containing the hash "#" fragment) with the current site architecture, as the client is not comfortable changing it. Reference: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started#2-set-up-your-server-to-handle-requests-for-urls-that-contain-escaped_fragment
The problem I am facing is that many industry experts do not consider parallax websites (even with the hash bang technique) good for SEO, especially for mobile devices. Some references:
http://searchengineland.com/the-perils-of-parallax-design-for-seo-164919
https://moz.com/blog/parallax-scrolling-websites-and-seo-a-collection-of-solutions-and-examples
So please find my queries below, for which I need help:
1. Will it be good to use the "hash bang" technique on this website and perform SEO to improve rankings on desktop as well as mobile devices?
2. Is the "hash bang" technique for a parallax scrolling website good only for desktop and not recommended for mobile devices, meaning we should have a separate mobile version (without parallax scrolling) of the website for mobile SEO?
3. Is the parallax scrolling technique (even with "hash bang") not good for SEO on either desktop or mobile devices, and should it be avoided if we want a good, SEO-friendly website?
4. Any issues with Google Analytics tracking for the same website?
Regards,
Sarmad Javed
Algorithm Updates | chakraseo
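A rough sketch of the "hash bang" / escaped_fragment setup referred to above, with illustrative fragment names (note that Google has since deprecated this AJAX crawling scheme, so treat this as background rather than a recommendation):
<!-- User-facing URL:           http://chakracentral.com/#!panelBlock4                    -->
<!-- URL the crawler requests:  http://chakracentral.com/?_escaped_fragment_=panelBlock4  -->
<!-- The server must answer that second URL with a full HTML snapshot of the panel.       -->

<!-- For pages whose URLs carry no #!, the scheme used an opt-in tag in the <head>: -->
<meta name="fragment" content="!">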
-
What is Google really NOT SAYING in "How Search Works"?
Hi all SEOmoz members and team, as I was reading this, I wondered: is it true that Google does this? Simply put, I don't think so; I haven't experienced anything like what is being talked about here: http://www.fairsearch.org/search-manipulation/what-google-isnt-saying-in-how-search-works/ Come on, let us discuss the real thing about Google. Teginder Ravi
Algorithm Updates | Futura
-
Are Some Websites "White Listed"?
I track several niches that I am not in, so I am not too biased by my own, and I noticed one site that, despite its rather mediocre quality, never moves. I have seen other websites rise and fall in rank, a few with pretty good content. He writes reviews, but has very obviously never touched the products he reviews. However, I see other sites with real photos and good advice for making a decision, and they sit on page two or three. I haven't done a lot of research other than the size of the sites and the links, and they are about equal. Sometimes the ranking site is smaller (it's about 90 pages in Google). The other sites I have seen also have more content on one topic, which makes it interesting that Google opts for his one-page "once over" review over something more in-depth and authentic. It got me thinking about whether some sites are white-listed by Google, as in hand-picked to rank despite what else is out there. Is this possible?
Algorithm Updates | PrivatePartners
-
Don't use an h1 and just use h2's?
We just overhauled our site, and as I was auditing the overhaul I noticed that there were no h1's on any of the pages. I asked the company that does our programming why, and he responded that h1's are spammed so much that he doesn't want to put them in. Instead he put in h2's. I can't find anything to back this up. I can find claims that h1's are over-optimized, but nothing that says to skip them altogether. I think he's crazy. Anyone have anything to back him up?
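For what it's worth, a minimal sketch of the conventional structure being debated, on a hypothetical page: one h1 for the page topic, h2's for its sections:
<body>
  <h1>Blue Widgets</h1>            <!-- single h1 describing the page -->
  <h2>Features</h2>
  <h2>Pricing</h2>
  <h2>Customer Reviews</h2>
</body>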
Algorithm Updates | Dave_Whitty
-
New Google "Knowledge Graph"
So according to CNN an hour ago, regarding the new Google update: "With Knowledge Graph, which will begin rolling out to some users immediately, results will be arranged according to categories with which the search term has been associated." http://www.cnn.com/2012/05/16/tech/web/google-search-knowledge-graph/index.html?hpt=hp_t3 Does this mean we need to start optimizing for categories as well as keywords?
Algorithm Updates | JFritton
-
eCommerce site being "filtered" by the last Panda update: ideas and discussion
Hello fellow internet-goers! Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive, so I am reaching out to the community for help. Before I get into the questions I would like to provide some background.
I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us.
Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days from being filtered, our first site bounced back (as if it had never been "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30-day "filter"?
Why has only one site recovered?
Algorithm Updates | WEB-IRS