Is the If-Modified-Since HTTP Header still relevant?
-
I'm relatively new to the technical side of SEO and have been trying to brush up on my skills by going through Google's online Webmaster Academy, which suggests that your site should support the If-Modified-Since HTTP header. I checked and apparently our web server doesn't support this.
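For what it's worth, this is roughly how I checked (a rough sketch using Python's requests library; the URL is just a placeholder for one of our pages):

```python
import requests  # third-party: pip install requests

URL = "https://www.example.com/some-page"  # placeholder: use one of your own pages

# First request: note the Last-Modified header the server sends (if any).
first = requests.get(URL)
last_modified = first.headers.get("Last-Modified")
print("Last-Modified:", last_modified)

if last_modified:
    # Second request: echo that date back. A server that honours conditional
    # GETs should reply 304 Not Modified instead of 200.
    second = requests.get(URL, headers={"If-Modified-Since": last_modified})
    print("Status with If-Modified-Since:", second.status_code)
else:
    print("No Last-Modified header sent, so conditional requests can't be honoured.")
```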
I've been told by a good colleague that the If-Modified-Since header is no longer relevant, as the spiders will frequently revisit a site as long as you regularly update and refresh the content (which we do).
However, our site doesn't seem to have been reindexed for a while, as the cached versions are still showing pages from over a month ago.
So two questions really: is the If-Modified-Since HTTP header still relevant, and should I make sure it's supported?
And is there anything else I should be doing to make sure the spiders crawl our pages? (apart from keeping them nice, fresh and useful)
-
If the web server does not support (or the admin does not want to enable) this feature, you could always have your front-end templates include a small string that holds the date/time when the page was last updated. Something along the lines of "Last updated on: ..." at the top or bottom of the content area. It's also a useful bit of information for users.
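If you can change the application code rather than the server config, that same date can also be sent as an HTTP header so conditional requests get answered with 304 Not Modified. A minimal sketch, assuming a Flask app (the route, date source and markup are all hypothetical, not part of the original question):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

from flask import Flask, Response, request

app = Flask(__name__)

# Hypothetical: in practice this date would come from your CMS or database.
PAGE_LAST_UPDATED = datetime(2012, 5, 1, 12, 0, tzinfo=timezone.utc)

@app.route("/some-page")  # placeholder route
def some_page():
    # If the crawler sent If-Modified-Since and the page hasn't changed, answer 304.
    ims = request.headers.get("If-Modified-Since")
    if ims:
        try:
            if parsedate_to_datetime(ims) >= PAGE_LAST_UPDATED:
                return Response(status=304)
        except (TypeError, ValueError):
            pass  # malformed date: just serve the page normally

    body = "<p>Last updated on: %s</p>" % PAGE_LAST_UPDATED.date()
    resp = Response(body, mimetype="text/html")
    resp.headers["Last-Modified"] = format_datetime(PAGE_LAST_UPDATED, usegmt=True)
    return resp
```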
-
Hi Annie
I'm surprised there haven't been lots of answers to your question.
Check out this video on SEOmoz entitled "Whiteboard Interview - Google's Matt Cutts on Redirects, Trust + More", in which Rand asks Matt Cutts some questions. It opens with a partial answer to your first question:
"These days we use it a little less" (2 years ago) ~ basically means that in locations such as the US, most of Europe, Japan... & so on, where Bandwidth is rarely an issue anymore, 'If-Modified-Since' isn't taken notice of, it's not worth including anymore.
In say developing countries where bandwidth is sometimes still on the low side, it may still be used, hence why a sweeping 'it doesn't matter anymore' statement wasn't given.
**Your second question:**
- Fresh, unique, value-adding content that's engaging and shareable is always a positive thing to work on, and it can lead to some awesome new links. This encourages the bots to visit more regularly.
- Ensuring that your site doesn't have any technical issues (say causing significant downtime).
- Ensuring that robots.txt isn't wrongly disallowing any pages from being crawled (a quick check is sketched below this list).
- Keeping an eye on Google Webmaster Tools (& Bing Webmaster Tools) for any messages or errors.
- You can alter the crawl rate in GWT, though it's usually best to leave it on the default automatic setting.
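As a quick sanity check on the robots.txt point above, something like this will tell you whether key pages are crawlable (a small sketch using Python's standard library; the domain and URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()

# Spot-check a few important URLs against the Googlebot user agent.
for url in ["https://www.example.com/", "https://www.example.com/products/widget"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "blocked by robots.txt")
```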
Hope that helps,
Simon