Google Sitemap only indexing 50%: is that a problem?
-
We have about 18,000 pages submitted in our Google sitemap, and only about 9,000 of them are indexed. Is this a problem?
We have a script that creates the sitemap daily, and it is submitted daily. Am I better off only doing it once a week? Is this why I never get to the full 18,000 indexed?
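For reference, here is a minimal sketch of the kind of daily sitemap script described here (Python; get_pages() is a hypothetical stand-in for whatever database or CMS query actually drives the script):
import datetime
import xml.etree.ElementTree as ET
def get_pages():
    # Hypothetical data source: replace with your own database/CMS query.
    # Each entry is (absolute URL, date the content last changed).
    return [
        ("https://www.example.com/", datetime.date(2013, 11, 6)),
        ("https://www.example.com/contact/", datetime.date(2013, 11, 6)),
    ]
def build_sitemap(pages, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, lastmod in pages:
        node = ET.SubElement(urlset, "url")
        ET.SubElement(node, "loc").text = url
        # lastmod is only meaningful if it reflects a real content change,
        # not just the date the file was regenerated.
        ET.SubElement(node, "lastmod").text = lastmod.isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)
if __name__ == "__main__":
    build_sitemap(get_pages())
Regenerating and resubmitting the file daily should be harmless in itself; the thing worth checking is that lastmod reflects real content changes rather than the date the script ran.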
-
My robots, tags, and redirects are all good now. Anything else to look at?
-
Have you done some troubleshooting? If there's that much of a percentage change, did you check your robots, tags, redirects, etc. to see whether anything on the technical side may be hindering indexing?
-
It is a large e-commerce site with pretty much the exact situation described. We redid the site about six weeks ago, and the previous site was always close to 100% indexed, at about 17,900 out of 18,000.
-
Great answer, Donford. We have a large site with many items that are basically the same but usually differ by one attribute value. So Google will typically index a parent page and list the rest as:
Results 1 - 15 of 15 – Medium Duty - Swivel Top Plate - Capacity to 400 lbs ...
Even though the page may not be in the primary index, it will still help the visitor get to what they are looking for. I would advise grabbing a snippet of text from a page that is not indexed and searching for it as an exact-match query to see if this is the case.
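If it helps, here is a rough way to script that check (Python; the sitemap URL is a placeholder, and the 50-word offset is just a crude way to skip nav/header boilerplate):
import re
import urllib.request
import xml.etree.ElementTree as ET
SITEMAP = "https://www.example.com/sitemap.xml"   # placeholder: use your own sitemap
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
def sitemap_urls(sitemap_url):
    # Collect every <loc> entry from the sitemap file.
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return [el.text.strip() for el in tree.iter(NS + "loc")]
def snippet(url, words=8):
    # Pull a crude plain-text snippet from the page to use as a quoted query.
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    text = re.sub(r"<[^>]+>", " ", html)              # strip tags (rough, good enough here)
    text = re.sub(r"\s+", " ", text).strip()
    return " ".join(text.split(" ")[50:50 + words])   # skip boilerplate at the top
for url in sitemap_urls(SITEMAP)[:20]:                # spot-check the first 20 URLs
    print(url)
    print('  search Google for: "' + snippet(url) + '"')
If the quoted snippet brings up the page or its parent page, the content is findable even though the URL itself is not in the primary index.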
-
Google will index more as they find value in more links. The last e-commerce site I worked on had 12,000 pages; as of the end of the year, it was 85% indexed.
In my experience it is quite common for larger sites to take a while to be fully indexed, if they ever are. Here is what Google says about ensuring proper setup, but beyond what they say, it's all about content and uniqueness. That is a particular challenge for e-commerce sites that sell items that are similar in nature, like a 1/2" x 1" screw vs. a 5/8" x 1" screw. It's very hard to develop unique content for items that similar.
Related Questions
-
Staging website got indexed by Google
Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and get the staging site noindexed? Note: we have already added a meta NOINDEX in the head tag.
Intermediate & Advanced SEO | Asmi-Ta
-
Moz is showing that I have non-indexed blog tag posts. Are they supposed to be non-indexed? My articles are indexed, just not the blog tags that take you to other similar articles. Do I need to fix this, or is it OK?
Moz is showing that my blog post tags are not indexed. My question is: should they be indexed? My articles are indexed, just not the tag pages that take you to similar posts. Do I need to fix this or not? Thank you.
Intermediate & Advanced SEO | Tyler5891
-
Only a fraction of the sitemap gets indexed
I have a large international website. The content is subdivided into 80 countries, with largely the same content, all in English. The URL structure is https://www.baumewatches.com/XX/page (where XX is the country code).
Language annotations (hreflang) seem to be set up properly. In Google Search Console I registered https://www.baumewatches.com and the 80 instances of https://www.baumewatches.com/XX in order to geo-target the directory for each country, and I have declared a single global sitemap for https://www.baumewatches.com (https://www.baumewatches.com/sitemap_index.xml, structured in a hierarchical way). The problem is that the site has been online for more than 8 months and only 15% of the sitemap URLs have been indexed, with no sign of new indexation in the last 3 months. I cannot think of a solution for this.
Intermediate & Advanced SEO | Lvet
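One way to sanity-check the hreflang setup described above is to spot-check return tags; here is a rough Python sketch (the /US/page URL is only a placeholder, and a full audit would cover every country directory):
import urllib.request
from html.parser import HTMLParser
class HreflangParser(HTMLParser):
    # Collects <link rel="alternate" hreflang="..." href="..."> tags from the page.
    def __init__(self):
        super().__init__()
        self.alternates = {}
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")
def hreflang_map(url):
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    p = HreflangParser()
    p.feed(html)
    return p.alternates
page = "https://www.baumewatches.com/US/page"   # placeholder for one real country URL
for lang, href in sorted(hreflang_map(page).items()):
    # Each alternate should list the original page among its own alternates (the "return tag").
    back = hreflang_map(href)
    status = "ok" if page in back.values() else "MISSING return tag"
    print(lang, href, status)
Missing return tags cause Google to ignore the hreflang annotations, which is worth ruling out before looking at duplication across the 80 largely identical directories.
-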
XML Sitemaps - how to create the perfect XML Sitemap
Hello. We have a site that is not updated very often. Currently we have a script running to create/update the XML sitemap every time a page is added, edited, or deleted. I have a few questions about best practices for creating XML sitemaps.
1. If the site is not updated for months on end, is it a bad idea to force the script to update, i.e. changing the dates once a month? Will Google notice nothing has changed except the date, i.e. all the content on the site is exactly the same? Will they start penalising you for updating an XML sitemap when there is nothing new about the website?
2. Is it worth automating the XML file to link into Bing/Google to update via Webmaster Tools, as I say, even if the site is never updated?
3. Is the use of "priorities" necessary?
4. The changefreq: does that mean Google/Bing expects to see a new file every month?
5. The ordering of the pages: the script seems pretty random and puts the pages in a random order. Should we make it order the pages with the most important ones first? Should the home page always be first?
6. Below is a sample of how our XML sitemap appears. Is there anything that we should change, i.e. is it all marked up properly?
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.domain.com/contact/</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.domain.com/sitemap/</loc>
    <lastmod>2013-11-06</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
Hope someone can help enlighten us on best practices.
Intermediate & Advanced SEO | JohnW-UK
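On question 2, the mechanical part is simple. Both engines have historically exposed a plain ping endpoint for sitemaps (Google has since retired its version, so check current guidance before relying on it), and it only makes sense to ping when the file has actually changed. A rough Python sketch:
import urllib.parse
import urllib.request
SITEMAP_URL = "http://www.domain.com/sitemap.xml"   # placeholder, as in the sample above
def ping(endpoint, sitemap_url):
    # Notify the engine that the sitemap file has changed.
    with urllib.request.urlopen(endpoint + urllib.parse.quote(sitemap_url, safe="")) as resp:
        return resp.status
for endpoint in ("https://www.google.com/ping?sitemap=",
                 "https://www.bing.com/ping?sitemap="):
    print(endpoint, ping(endpoint, SITEMAP_URL))
-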
Broken sitemaps vs no sitemaps at all?
The site I am working on is enormous. We have 71 sitemap files, all linked to from a sitemap index file. The sitemaps are not up to par with "best practices" yet, and realistically it may be another month or so until we get them cleaned up. I'm wondering if, for the time being, we should just remove the sitemaps from Webmaster Tools altogether. They are currently "broken", and I know that sitemaps are not mandatory. Perhaps they're doing more harm than good at this point? According to Webmaster Tools, there are 8,398,082 "warnings" associated with the sitemap, many of which seem to be related to URLs being linked to that are blocked by robots.txt. I was thinking that I could remove them and then keep a close eye on the crawl errors/index status to see if anything changes. Is there any reason why I shouldn't remove these from Webmaster Tools until we get the sitemaps up to par with best practices?
Intermediate & Advanced SEO | edmundsseo
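For the robots.txt-related warnings described above, a quick way to size the problem before deciding is to check every sitemap URL against the robots.txt rules. A rough Python sketch (the URLs are placeholders, and Python's parser is simpler than Googlebot's, so treat the results as approximate):
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
SITEMAP_INDEX = "https://www.example.com/sitemap_index.xml"   # placeholder
ROBOTS_URL = "https://www.example.com/robots.txt"             # placeholder
def locs(xml_url):
    # Return every <loc> from a sitemap or sitemap index file.
    with urllib.request.urlopen(xml_url) as resp:
        tree = ET.parse(resp)
    return [el.text.strip() for el in tree.iter(NS + "loc")]
rp = urllib.robotparser.RobotFileParser(ROBOTS_URL)
rp.read()
blocked = []
for child in locs(SITEMAP_INDEX):        # iterate over the child sitemaps
    for url in locs(child):
        if not rp.can_fetch("Googlebot", url):
            blocked.append(url)
print(len(blocked), "sitemap URLs appear blocked by robots.txt")
for url in blocked[:20]:
    print(url)
-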
Google indexing issue?
Hey guys, after a lot of hard work we finally fixed the problem where our site didn't seem to show meta descriptions in Google, as well as "noindex, follow" on tag pages. Here's my question: in our source code I am now seeing meta descriptions on both pages and posts, as well as noindex, follow on tag pages; however, Google is still showing the old results, and the tags are also still showing in Google search after about 36 hours. Is it just a matter of time now, or is something else wrong?
Intermediate & Advanced SEO | ttb
-
Are there any disadvantages to switching from XML sitemaps to .asp sitemaps in GWT?
I have been using multiple XML sitemaps for products for over 6 months, and they are indexing well in GWT. I have been having them manually amended when a product becomes obsolete or we no longer stock it. I now have the option to automate the sitemaps from a SQL feed, but using .asp sitemaps that I would submit the same way in GWT. I'd like your thoughts on the pros and cons of this: the plus for me is real-time updates; the con is that I perceive GWT to prefer XML files. What do you think?
Intermediate & Advanced SEO | robertrRSwalters
-
Why do I not receive Google traffic?
Over the last 4-5 months I have published over 3,000 unique articles, which I have paid well over 10,000 USD for, but I still only receive about 20 Google visitors a day for that content. I uploaded the 3,000 articles after I 301-redirected the old site to a new domain (the old site had 1,000 articles and at least 300 visits from Google a day), and all the old content receives its traffic fine (the 301 redirect is working 100 percent now, and PR went from 0 to 3). The articles are also good, ranging from 400-800 words. 90 percent of them are indexed by Google, and most of them have been bookmarked on Digg, Reddit, etc. The website domain is over 10 years old: alltopics.com. Why doesn't Google send me the traffic I deserve?
Intermediate & Advanced SEO | rxesiv