Homepage not indexed - seems to defy explanation
-
Hey folks
Hoping to get some more eyes on a specific problem I am seeing with a client's site.
Site: http://www.ukjuicers.com
We have checked everything we can think of and the usual suspects here are not present:
- Canonical URL is in place
- Site is shown as indexed in search console
- No Crawl, DNS, Connectivity or server errors
- No robots.txt blocking - verified in search console
- No robots meta tags or directives
- Fetch as Google works
- Fetch & render works
- site: command returns all other pages
- info: command does not return the homepage
- homepage is cached and the cache has been updated since this issue started: http://webcache.googleusercontent.com/search?q=cache:www.ukjuicers.com
- homepage is indexed in Yahoo and Bing
- all variations redirect to the www.ukjuicers.com domain (.co.uk, .com, www, non-www etc.)
The only issue I found after some extensive digging was that both the HTTP and HTTPS versions of the site were available and each specified itself as the canonical version: the HTTP pages used HTTP canonicals and the HTTPS pages used HTTPS canonicals. So there was a conflict, with the canonical tag exacerbating the very problem it is there to solve.
The HTTPS site is not indexed though. We have set it up in Webmaster Tools, and the web developer has now added redirects so that all versions, even the HTTPS ones, 301 redirect to http://www.ukjuicers.com, so these canonical issues have been ironed out.
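For anyone who wants to sanity-check this kind of setup themselves, here is a minimal sketch (Python, assuming the third-party requests library is installed - not the site's actual tooling) that follows each URL variant's redirect chain and reports the canonical tag on the final page:

```python
# Minimal diagnostic sketch: follow each variant's redirect chain and report
# the canonical tag on the final page. The variant list simply mirrors the
# ones discussed above; adjust it for your own site.
import re
import requests

PREFERRED = "http://www.ukjuicers.com"
VARIANTS = [
    "http://ukjuicers.com",
    "https://ukjuicers.com",
    "https://www.ukjuicers.com",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [f"{r.status_code} {r.url}" for r in resp.history] + [f"{resp.status_code} {resp.url}"]
    # Naive canonical extraction; assumes rel appears before href in the tag.
    match = re.search(r'rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', resp.text, re.I)
    canonical = match.group(1) if match else "none found"
    ok = resp.url.rstrip("/") == PREFERRED and canonical.rstrip("/") == PREFERRED
    print(url)
    print("  chain:     " + "  ->  ".join(chain))
    print("  canonical: " + canonical)
    print("  looks consistent" if ok else "  needs a closer look")
```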
But... it's still not indexing the homepage.
The practical implications of this are quite scary - the site used to be somewhere between 1st and 4th for keywords like 'juicers', 'juicer' etc. Now they are at the bottom of page 1 or the top of page 2 with an internal page. They were jostling with the big boys (Amazon, Argos, John Lewis etc.) but now they are right at the bottom of the second page.
It's a strange one - I have seen all manner of technical problems over the years but this one seems to defy sensible explanation. The next step is to do a full technical SEO audit of the site, but I am always of the opinion that with many eyes all bugs are shallow, so if anyone has any input or experience with odd indexation problems like this, I would love to hear it.
Cheers
Marcus -
Glad you figured it out. I honestly didn't think it would have been the canonicals. I'm a little surprised that the bots didn't just choose not to respect the suggestion as opposed to blanking your site from the index. Didn't think that was even a possibility from incorrect canonicals. Good to know for the future though in case anything like this comes up with anyone else's site.
-
Yep - it's back. Looks like resolving the canonical issue fixed it. Seems it was a usual suspect after all.
-
Yep - bit of a weird one but in the end looks like the canonicals were the issue. Thanks for taking a look though man - super appreciated.
-
Hey Bernadette - thanks for the feedback. The site is back in the index now; it looks like the canonicals were the culprit, but the owners are keen to avoid any future issues so I will dig in and take a look at these points. Cheers!
-
Hey folks
24 hours after we identified and fixed the canonical issue the site is now indexed again, so it does look like it was indeed a canonical conundrum. Both the HTTP and HTTPS sites were claiming to be the canonical version, in some respects creating a conflict. We removed this conflict and the homepage is now indexed.
Thanks for the extra eyes folks - appreciated, and if anyone ever needs another pair of eyes to look at a problem, give me a shout.
Cheers
Marcus -
Hey Marcus. You just need some links from a high authority website like Moz :) People say you're indexed, so case closed, job done :)
-
I just noticed that clicking on the entire slider, even out to the sides where it appears to be just white space, takes you to another page. At first I didn't realize what I was clicking that got me to the next page. When I do Ctrl+A on the page, the full width of the slider images shows highlighted in blue, but the area to the side of those images, outside of those bounds, is also linked. I'm wondering if Google sees this as cloaking and kicked out the homepage as a result.
*I did see that AGM pointed out it's indexed now, but that's not to say this wasn't the cause of the original de-index.
-
As of this writing it looks like the page is indexed. Searching site:ukjuicers.com, it comes up in the search results along with about 861 other results. Not sure if there is anything you changed to get things working again, but it seems to be in their index now.
-
I took a look at all of the usual suspects as well... which amounts to pretty much everything that everyone else mentioned, but I was intrigued by this issue and thought maybe another set of eyes might notice something that was off. Nothing was wrong in the page source from what I saw, no issues crawling it myself, and I didn't see any penalties. Normally I'd think that if your homepage wasn't appearing for branded organic searches then a penalty was levied against you, but when that is the case the homepage is still normally findable in a site: operator search. Maybe it is related to all the backlinks that were lost/deleted in the past month, but I'm not sure why that would be the case unless removing the homepage from the index was a Penguin response to link issues... but I was under the impression that Penguin was devaluing the link source, not the link recipient, and deleting/removing links seems to be a preferred method of handling Penguin-related issues. So if there is a relationship between Penguin and your homepage being deindexed, I am not sure at all why, nor am I certain how to fix it, as I'm not seeing anything in particular that screams "linking issue" at me (though I only did a fairly cursory inspection of things).
So I am stumped. Whenever the issue is figured out, I would love to know how/why this came to be.
-
Marcus, I know this is frustrating. I've checked several things, and looked at many of the possibilities that you've already brought up. I don't have access to the Google Search Console, so I cannot comment about any of that data. I'm assuming that you don't have a manual action on the site or any other messages from Google.
What I've seen in the past is issues with schema markup, especially when it comes to reviews and how they're handled on sites. I'm not saying that this is the issue--but I've seen Google have problems with these (especially because the word "hidden" appears there in the code). So, you might look into that some more.
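Not saying it applies here either, but if anyone wants to eyeball that markup quickly, here is a rough, purely illustrative sketch (Python, assuming the requests library; the URL is just the homepage discussed in this thread) that pulls out JSON-LD blocks and any review-related or "hidden" markup for manual inspection:

```python
# Rough sketch for manual inspection only: dump JSON-LD blocks and any lines
# containing review/schema attributes or a "hidden" class from the homepage.
import re
import requests

html = requests.get("http://www.ukjuicers.com", timeout=10).text

# Any JSON-LD structured data blocks
for block in re.findall(r'<script[^>]+application/ld\+json[^>]*>(.*?)</script>', html, re.S | re.I):
    print("JSON-LD:", block.strip()[:300], "\n")

# Microdata / review attributes, plus anything flagged as hidden
for line in html.splitlines():
    if re.search(r'itemprop|itemtype|aggregateRating|class=["\'][^"\']*hidden', line, re.I):
        print(line.strip()[:200])
```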
The issue could also be related to links--look at the links to the site's home page to see if there is an issue with low quality links pointing to that page or other unnatural links.
If someone has copied the page, added a canonical tag, and then added a "meta noindex tag" to their page, it's possible that they could have taken your page out of the index. This has happened before.
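To rule that scenario out, a quick and purely illustrative check is to fetch any suspected copies and see what canonical and robots directives they declare. A sketch along those lines (Python with the requests library; the URL below is a hypothetical placeholder, not a real scraper site):

```python
# Illustrative sketch only: report the canonical and robots meta tags declared
# by suspected copies of the page. SUSPECTED_COPIES is a hypothetical placeholder.
import re
import requests

SUSPECTED_COPIES = [
    "http://example-scraper-site.com/copied-homepage/",  # hypothetical URL
]

for url in SUSPECTED_COPIES:
    html = requests.get(url, timeout=10).text
    canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.I)
    robots = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    print(url)
    print("  canonical tag:", canonical.group(0) if canonical else "none")
    print("  robots meta:  ", robots.group(0) if robots else "none")
```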
-
Unfortunately you're not Amazon, so maybe you must try harder ;)
Or force the main page to be indexed with some software or an indexer website, then wait a while.
I'm thinking it could be some negative SEO aimed at your main page, but so far I can't see any symptoms.
-
This is a strange one then.... very strange.
Just performed a site: search and, like you said, it is not showing up as indexed. There is normally something technical to explain an issue like this, but I cannot see anything after looking at your site's robots.txt and source code.
-
Hey Krzysztof
Yeah, the page has little textual content but... neither does the Amazon homepage. Ultimately the page is a jump-in point for all the products and the content suits that. Certainly, I could understand Google not liking the page, but would that not result in a reduced rank rather than a complete removal like this?
On the dodgy links front, they have never done anything like that - so anything there would be surprising (or just incidental cruft that is out there on scraper sites and the like).
Super odd.
-
Yep - super odd. 15 years or so in this game and I have never seen anything quite like this. Transient drops, sure, but usually they boiled down to some simple technical error or, more often, user error (cough noindex / robots.txt cough).
-
Hey - the real issue here is that the page is just not indexed. It's not there. It's not that another page is a more suitable or preferred result. Ultimately that was the best page for a user to jump in at... The page is not even returned in a brand search, so... I can't see how any other page could be more suitable for that kind of search.
-
Hi Marcus
The only thing I think could be the issue is the number of words on the main page. Mostly I see images and words from menus and links, not main content. Digging deeper can help (an SEO audit).
This could be Penguin too, but to know the answer a full link analysis is needed. After a quick glance I see some unnatural links, but not in large numbers. Maybe they have footprints that are not visible at first glance (same IP, C class, content with link etc.).
-
You're not kidding, this does defy explanation. When did it drop out of the index?
In all honesty, I don't have a solution, you've already checked everything I would have. I'm mostly commenting so I can keep up with this issue and see how it unfolds. Very curious to see if anyone can identify what's happening here.
-
Hmmm, is it a case of Google simply feeling the homepage is not as engaging and relevant to your users' searches, so it puts more emphasis on the product pages it chooses to feature instead?
I often find that for key terms our product pages almost always rank higher than the homepage unless it is a brand-only search.
Secondly, is this a recent change? Could the most recent Penguin update have simply resulted in your competitors getting a boost, whereas before the previous algorithm was holding them back, resulting in your positions sliding?