Incorrect cached page indexing in Google while correct page indexes intermittently
-
Hi,
we are a South African insurance company.
We have a page http://www.miway.co.za/midrivestyle which has a 301 redirect to http://www.miway.co.za/car-insurance.
The problem is that the former page is ranking in the index rather than the latter. The latter page does occasionally rank in the same position, but rarely.
This is primarily for search phrases like "car insurance" and "car insurance quotes".
The ranking was knocked down the index by Penguin 2.0. The page was not ranking at all, but we have managed to recover to position 12/13. This anomaly has only been occurring since the recovery.
The correct page does rank for other search terms, like "insurance for car".
Your help would be appreciated, thanks!
-
It would seem that http://www.miway.co.za/midrivestyle is actually doing a 302 (temporary) redirect, which means it's not passing any link equity. This could be the reason it is still showing in the index instead of your intended page. You would need to implement a 301 (permanent) redirect to completely remove the old page in favor of the new one.
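The redirect type is just a status code chosen server-side. As a minimal illustration (a hypothetical Python stdlib handler, not MiWay's actual stack, which would configure this in Apache/nginx/IIS instead), the only difference between the two behaviors is the number sent with the Location header:

```python
# Minimal illustration of a permanent vs. temporary redirect using only
# Python's standard library. The "/summer-sale" path is a hypothetical
# stand-in; a real site would configure this in its web server instead.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    # old path -> (status code, new location)
    "/midrivestyle": (301, "/car-insurance"),  # permanent: passes link equity
    "/summer-sale": (302, "/specials"),        # temporary: engines keep the old URL
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            status, location = REDIRECTS[self.path]
            self.send_response(status)
            self.send_header("Location", location)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

# Attach RedirectHandler to an HTTPServer to serve these redirects.
```

The point of the sketch: a crawler asking for /midrivestyle gets a 301 and treats /car-insurance as the canonical replacement, while a 302 tells it the move is temporary and the old URL should stay indexed.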
Edit: I double-checked it with a few more user agents in SEOBook's HTTP Status Codes Checker and actually seem to be getting 301s sometimes and 302s other times. I'm not sure why it would be doing that, but I would still double-check that you have your redirects implemented correctly, as that could be the culprit.
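You can reproduce that kind of check yourself without a third-party tool. Here is a stdlib-only sketch that fetches a URL's raw status code under a chosen user agent, without following the redirect; if the answer flips between 301 and 302 per user agent, the redirect rules are likely user-agent dependent:

```python
# Query a URL's raw HTTP status without following the redirect, under a
# chosen User-Agent. This mirrors what tools like SEOBook's HTTP Status
# Codes Checker do.
import http.client
from urllib.parse import urlparse

def redirect_status(url, user_agent):
    """Return (status_code, Location header) for a single GET request."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("GET", parts.path or "/",
                     headers={"User-Agent": user_agent})
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

# Example (not run here):
#   redirect_status("http://www.miway.co.za/midrivestyle",
#       "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
# A correctly configured permanent redirect should report 301 for every
# user-agent, with Location pointing at the /car-insurance page.
```

Run it with a browser user agent and a Googlebot one; any disagreement between the two is the kind of inconsistency described in the edit above.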
Related Questions
-
Implications of extending browser caching for Google?
I have been asked to leverage browser caching on a few scripts in our code:
http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds)
http://www.google.com/jsapi (1 hour)
https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
https://www.google-analytics.com/analytics.js (2 hours)
https://www.youtube.com/iframe_api (expiration not specified)
https://ssl.google-analytics.com/ga.js (2 hours)
The number beside each link is the cache expiration applied by the owners. I'm being asked to extend the time to 24 hours. Part of this task is to make sure doing this is a good idea; it would not be in our best interest to do something that would disrupt the collection of data. Some of what I'm seeing recommends keeping a local copy, which would mean missing updates from GA/GTM, or calls for a cron job to download any updates on a daily basis. Another concern: would caching these cause a delay or disruption in collecting data? That's an unknown to me; it may not be to you. There is also the concern that Google recommends not caching these outside of their own settings. Any help on this is much appreciated. Do you see any issues/risks/benefits/etc. to doing this from your perspective?
Intermediate & Advanced SEO | chrisvogel
-
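The expirations quoted in that question come from each script's Cache-Control response header. As a small hedged sketch (stdlib only, no network calls), this is how a `max-age` directive maps to the human-readable lifetimes listed:

```python
# Convert a Cache-Control header's max-age directive into a human-readable
# lifetime, like the "(2 hours)" annotations above. A sketch for auditing
# headers you have already fetched; it makes no network requests itself.
import re

def cache_lifetime(cache_control):
    """Return max-age as 'Hh Mm Ss', or None if no max-age directive."""
    match = re.search(r"max-age=(\d+)", cache_control or "")
    if not match:
        return None  # e.g. YouTube's iframe_api: expiration not specified
    seconds = int(match.group(1))
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{hours}h {minutes}m {secs}s"

# A header of "private, max-age=982" comes out as "0h 16m 22s", matching
# the gtm.js figure above; "max-age=7200" is the 2-hour case.
```

Extending these lifetimes beyond what the owners set only changes how stale a visitor's copy of the script can get, not the data already collected, but that trade-off is exactly what the question is asking about.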
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most popular of those products, I created unique, high-quality content, written by a professional player, that describes how and why those products are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to noindex the products that only have the same copy-pasted descriptions all other websites have. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!
Intermediate & Advanced SEO | EndeR-
-
Keywords Directing Traffic To Incorrect Pages
We're experiencing an issue where we have keywords directing traffic to incorrect child landing pages. For a generic example using fake product types, a keyword search for XL Widgets might send traffic to a child landing page for Commercial Widgets instead. In some cases, the keyword phrase might point to a child landing page for a completely different type of product (ex: a search for XL Widgets might direct traffic to XL Gadgets instead). It's tough to figure out exactly why this might be happening, since each page is clearly optimized for its respective keyword phrase (an XL Widgets page, a Commercial Widgets page, an XL Gadgets page, etc.), yet one page ends up ranking for another page's keyword, while the desired page is pushed out of the SERPs. We're also running into an issue where one keyword phrase is pointing traffic to three different child landing pages, where either none of the ranking pages is the page we've optimized for that keyword phrase, or the desired page appears lower in the SERPs than the other two pages (ex: a search for XL Widgets shows XL Gadgets on the first SERP, Commercial Widgets on the second SERP, and then finally XL Widgets down on the third or fourth SERP). We suspect this may be happening because we have too many child landing pages targeting keyword terms that are too similar, which might be confusing the search engines. Can anyone offer some insight into why this may be happening, and what we could potentially do to help get the right pages ranking how we'd like?
Intermediate & Advanced SEO | ShawnHerrick
-
Huge Google index on E-commerce site
Hi Guys, I've got a question which I can't figure out. I'm working on an e-commerce site which recently got a CMS update, including URL updates.
We did a lot of 301s on the old URLs (around 3,000-4,000, I guess) and submitted a new sitemap (around 12,000 URLs, of which 10,500 are indexed). The strange thing is, when I check the indexing status in Webmaster Tools, Google tells me there are over 98,000 URLs indexed.
Doing a site:domainx.com search, Google tells me there are 111,000 URLs indexed. Another strange thing, which another forum member describes here: Cache date has been reverted. And next to that, old URLs (which have had a 301 for about a month now) keep showing up in the index. Does anyone know what I could do to solve the problem?
Intermediate & Advanced SEO | ssiebn7
-
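One way to start narrowing down where the extra ~86,000 indexed URLs come from is to diff what the sitemap actually declares against URLs sampled from site: queries or server logs. A minimal stdlib-only sketch for extracting the sitemap's URL set (you fetch the XML however you like and pass in the text):

```python
# Extract the set of URLs declared in a sitemap.xml so it can be compared
# against URLs found via site: queries or server logs. Stdlib only.
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> values in a <urlset> sitemap."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

# Any indexed URL NOT in this set (old pre-CMS-update URLs, parameter
# variants, session IDs, dev subdomains) is a candidate source of the
# index bloat described above.
```

With ~12,000 sitemap URLs against ~98,000 reported as indexed, the difference set is where to look for leftover old URLs that still need 301s or removal.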
Google cached pages and search terms
Here's something I noticed. We have a grade-A page that's ranking 10 in Google search results. When I hover my mouse over our search result, Google gives us a preview, and it also highlights in red where the search keyword is present on the page. Reviewing our page, even though we have the keyword in the h1 header and intro paragraph, Google is highlighting it halfway down the page. Any ideas why? I reviewed ranks 1-5, and for those Google highlights the keyword in the intro paragraph and h1 header. Have you guys experienced anything like this? It makes me think Google could be crawling my site and thinking I haven't got the keyword in the h1 or intro paragraph. Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs
-
Indexing non-indexed content and Google crawlers
On a news website we have a system where articles are given a publish date which is often in the future. The articles were showing up in Google before the publish date despite us not being able to find them linked from anywhere on the website. I've added a 'noindex' meta tag to articles that shouldn't be live until a future date. When the date comes for them to appear on the website, the noindex disappears. Is anyone aware of any issues doing this - say Google crawls a page that is noindex, then 2 hours later it finds out it should now be indexed? Should it still appear in Google search, News etc. as normal, as a new page? Thanks. 🙂
Intermediate & Advanced SEO | Alex-Harford
-
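The scheduling logic described in that question boils down to a date check when the page template decides what robots directive to emit. A hypothetical helper sketching that setup (not any specific CMS's API):

```python
# Decide the robots meta directive for an article based on its publish
# date: noindex before publication, normal indexing once live. This is a
# hypothetical helper mirroring the setup described above, not any
# specific CMS's API.
from datetime import datetime

def robots_directive(publish_date, now=None):
    """Return the content value for <meta name="robots" content="...">."""
    now = now or datetime.utcnow()
    if now < publish_date:
        return "noindex, follow"  # crawled early: tell engines to hold off
    return "index, follow"        # publish date reached: index normally
```

Once Google recrawls the page after the publish date, it is treated as indexable again; resubmitting the URL via the sitemap can speed that recrawl up, which addresses the "crawled 2 hours before the date" worry.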
How to make Google forget my pages ?
Hello all! I've decided to delete many pages from my website which had poor content. I've made a PHP 301 redirect from all these old pages to a single page (not the home page, a deep page). My problem is that this modification was made a week ago and my position in the SERPs has crashed down... What can I do? I believe that I'll come back up again when Google sees that these pages don't exist anymore, but it could take a long time 😞 (these pages are in the Google cache with a date older than my modification date). I've read somewhere that I should put a link to the destination page (where the old pages are 301-redirected), but I don't understand how it could help... Can someone help me? Tell me what I've done wrong... These pages were very poor and I deleted them in order to boost the overall quality of my site... It should help me in the SERPs, not penalize me...
Intermediate & Advanced SEO | B-CITY
-
Google indexing flash content
Hi, Would Google's indexing of Flash content count towards page content? For example, I have over 7,000 Flash files, with one unique Flash file per page followed by a short two-paragraph snippet. Would Google count the Flash as content towards the overall page? At the moment I've X-Robots-Tagged the Flash files with noindex, nofollow and noarchive to prevent them from appearing in the search engines. I'm just wondering if, when the Google bot visits and accesses a Flash file, it'll get the X-Robots-Tag noindex, nofollow and then stop processing. I think this may be why the Panda update also had an effect. Thanks
Intermediate & Advanced SEO | Flapjack