Why is this page de-indexed?
-
This page has dropped out of the rankings for all of its first-page keywords:
https://www.key.co.uk/en/key/dollies-load-movers-door-skates
Can anyone see an issue?
I'm trying to find one...
We did just migrate to HTTPS, but other areas have no problems.
-
Hi
Yes, it turned out to be an issue with my rank-tracking software - phew!
Thank you
-
OK, great - thanks for your help.
I'll keep an eye on everything
-
You are 5th for 'Heavy Duty Dolly'.
I don't see what the problem is - the page is doing really well.
Regards
Nigel
-
This is not an issue.
It's totally normal to have some HTTP pages left in the index, and even more common if the migration is recent. Don't be afraid of this, Becky.
-
Yep, give Google a little time to re-crawl the whole new site. I'd allow nearly a month before considering that Google has seen the new version of the site completely, always checking the number of indexed pages in GSC and the results appearing for a site: search.
Being out of the top 100 is a clue that you are in the middle of the transition. And for the keyword 'Heavy Duty Dolly' I do see your page - check the attached image.
Best of luck.
GR.
-
Hi Becky
I just searched in a normal browser, so it could be Google skewing the results for you.
To find the indexed HTTP pages:
site:key.co.uk inurl:http:
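Beyond the site: search, a minimal stdlib-only Python sketch (not from the thread; the helper names are mine) can spot-check the leftover URLs it turns up - after a clean migration, each old HTTP URL should return a 301 pointing at its HTTPS equivalent:

```python
from urllib.parse import urlsplit, urlunsplit
import http.client

def https_equivalent(url: str) -> str:
    """Rewrite an http:// URL to its https:// counterpart."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts)[1:])

def check_redirect(url: str, timeout: float = 10.0):
    """HEAD the URL without following redirects; return (status, Location header)."""
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

# After a clean migration we expect, for every leftover HTTP URL:
#   status == 301 and location == https_equivalent(url)
```

Note that check_redirect issues a live HEAD request, so run it only against your own URLs.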
Regards
Nigel
-
How did you find these http pages?
I did a search in Incognito, but I couldn't see anything myself.
I'll try again, thanks!
-
Hi
Thanks for this. Yes, I've checked in Google Search Console; I can find the page among the indexed pages, but far fewer pages are indexed since the migration:
HTTP - indexed: 13,013 / blocked: 12,891
HTTPS - indexed: 2,814 / blocked by robots.txt: 5,713
Do I just wait?
Example keywords for that page would be 'Heavy Duty Dolly' and 'load moving dolly'.
We were position 1 and are now out of the top 100.
We're working on page speed/load time for the whole site, but why would it affect that one page so badly?
-
Hi Becky,
Without knowing the relevant search terms, there's almost no analysis to be done.
I've noticed that the page takes a very long time to load - here's a GTmetrix report. Remember that migrating to HTTPS makes Google re-crawl all of your website's pages and re-evaluate all ranking factors.
My advice is to wait a little longer; it might take a few weeks. Also, always monitor the Google Search Console profile - there could be a message there. Take a look at the indexed pages too; it may be that fewer pages are indexed now than before the migration.
Hope I've helped.
Best of luck.
GR.
-
Hi Becky
Load Movers - Pos 3
Wooden dollies - Pos 1
Maybe open an incognito browser with history cleared.
I don't see a problem
Regards
Nigel
PS: You still have 748 HTTP pages indexed, but that's only 10% of the total.