Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Sitemap indexed pages dropping
-
About a month ago I noticed the number of pages indexed from my sitemap is dropping. There are 134 pages in my sitemap and only 11 are indexed. It used to be 117 pages and just died off quickly. I still seem to be getting consistent search traffic, but I'm not sure what's causing this. There are no warnings or manual actions required in GWT that I can find.
-
Just wanted to update this: it took a month, but since I completely removed the canonical tags and handled duplicate content with URL rewrites and 301 redirects instead, I now have 114 out of 149 pages indexed from my sitemap, which is much better. It had dropped to 5 out of 149 at one point.
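For anyone taking the same route, the rewrite side of this can be sketched in a few lines. This is a minimal, hypothetical example (the parameter names and lowercasing rule are made up, not Stephen's actual rules): collapse duplicate URL variants to one canonical form, and 301 anything that doesn't already match it.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical rule: drop tracking/session parameters and lowercase the
# host and path, so every duplicate variant redirects to one canonical URL.
DROP_PARAMS = {"sessionid", "ref", "sort"}

def canonical_redirect(url):
    """Return the canonical URL to 301 to, or None if `url` is already canonical."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in DROP_PARAMS]
    canonical = urlunsplit((parts.scheme, parts.netloc.lower(),
                            parts.path.lower(), urlencode(kept), ""))
    return canonical if canonical != url else None
```

Your web server or front controller would then issue a `301 Moved Permanently` to the returned URL whenever the function gives a non-None result.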
-
Hi Stephen,
Great that you've probably found the cause - this will absolutely cause mass de-indexation. I had a client a year ago canonicalise their entire site (two sites, actually) to the home page. All their rankings and indexed pages dropped off over a matter of about six days (we spotted the tag immediately but the fix went into a "queue" - ugh!).
The bad news is that it took them a long time to get properly re-indexed and regain their rankings (I am talking months, not weeks). Having said this, the sites were nearly brand new - they had very few backlinks and were both less than six months old. I do not believe that an older site would have had as much problem regaining rankings, but I can't be sure and I have only seen that situation take place first-hand once.
-
I may have found the issue today. Most of the articles are pulled from a database and I think I placed a wrong canonical tag on the page which screwed up everything. Does anyone know how long it takes before a fix like this will show?
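A template-level canonical mistake on database-driven pages is easy to catch with a quick script. Here's a sketch (assuming you can fetch each page's HTML yourself; the URLs are illustrative) that flags any page whose canonical points somewhere other than itself:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(html, page_url):
    """Return any canonical hrefs that do NOT point back at the page itself."""
    finder = CanonicalFinder()
    finder.feed(html)
    return [href for href in finder.canonicals if href != page_url]
```

Run this over every URL in the sitemap; a long list of pages all canonicalised to the same URL (e.g. the home page) is exactly the kind of bug described above.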
-
That's a good catch, I fixed that. I do use that in WMT and it has been fine for the longest time. I guess it's not that big of an issue; my main concern was the pages being indexed. I was reading another Q&A thread and used the info: qualifier to check some of the pages, and all the ones I checked are indexed - more than the 11. I just don't understand why it's dropped all of a sudden, and whether that number really means anything.
-
How are the indexed numbers looking in WMT today? I see 3,370 results for a site: search on the domain, but those can be iffy in terms of up-to-date accuracy: https://www.google.co.uk/search?q=site%3Agoautohub.com&oq=site%3Agoautohub.com&aqs=chrome..69i57j69i58.798j0j4&sourceid=chrome&es_sm=119&ie=UTF-8
Not that this should matter too much if you are submitting a sitemap through WMT, but your robots.txt file specifies sitemap.xml. There is a duplicate sitemap at that URL (http://goautohub.com/sitemap.xml) - are you using sitemap.php, which you mention here, in WMT? .php can be used for sitemaps, but I would update the robots.txt file to reflect the correct URL - http://i.imgur.com/uSB1P1g.png - whichever is meant to be right. I am not aware of problems with having duplicate sitemaps as long as they are identical, but I'd use just one if it were me.
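In other words, if sitemap.php is the one actually submitted in WMT (that's an assumption - use whichever file you maintain), the robots.txt directive would need to match it, along these lines:

```
# robots.txt - Sitemap should reference the file actually submitted in WMT
Sitemap: http://goautohub.com/sitemap.php
```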
-
Thanks for checking, I haven't found anything yet. The site is goautohub.com. It's a custom site and the sitemap file is auto-generated - it's goautohub.com/sitemap.php. I've done it like that for over a year. I did start seeing an error message about high response times and I've been working on improving that. It makes sense because we have been advertising more to get the site seen. In regards to the rest of William's points, I have checked those but no improvement yet. Thank you
-
Hi Stephen,
Checking in to see if you had checked the points William has raised above. Do you see anything that could have resulted in the drop? Also, are you comfortable sharing the site here? We might be able to have a look too (feel free to PM if you are not comfortable sharing publicly).
Cheers,
Jane
-
Try to determine when the drop off started, and try to remember what kinds of changes the website was going through during that time. That could help point to the reason for the drop in indexing.
There are plenty of reasons why Google may choose not to index pages, so this will take some digging. Here are some places to start the search:
-
Check your robots.txt to ensure those pages are still crawlable
-
Check to make sure the content on those pages isn't duplicated somewhere else on the Web.
-
Check to see if there were any changes to canonical tags on the site around when the drop started
-
Check to make sure the sitemap currently on the site matches the one you submitted to Webmasters, and that your CMS didn't auto-generate a new one
-
Make sure the quality of the pages is worth indexing. You said your traffic didn't really take a hit, so Google probably isn't de-indexing your quality content.
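The first check on that list can be scripted with the standard library. A sketch using urllib.robotparser (the rules and URLs below are made-up examples; in practice fetch your live robots.txt and loop over every URL in your sitemap):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules; in practice read them from yoursite.com/robots.txt
RULES = """\
User-agent: *
Disallow: /admin/
"""

def crawlable(url, rules=RULES, agent="Googlebot"):
    """Return True if `agent` is allowed to fetch `url` under `rules`."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch(agent, url)
```

Run it against each sitemap URL and flag anything that comes back False; those are pages you've told Google not to crawl.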
-