Only a fraction of the sitemap gets indexed
-
I have a large international website. The content is subdivided into 80 country directories, with largely the same content, all in English.
The URL structure is: https://www.baumewatches.com/XX/page (where XX is the country code)
Language annotations (hreflang) seem to be set up properly. In Google Search Console I registered:
- https://www.baumewatches.com
- the 80 instances of https://www.baumewatches.com/XX, in order to geo-target the directories for each country
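For reference, the per-page hreflang annotations in a setup like this would look something like the following (the country codes shown and the x-default target are illustrative examples, not taken from the actual site):

```html
<!-- In the <head> of https://www.baumewatches.com/us/page (example country codes) -->
<link rel="alternate" hreflang="en-us" href="https://www.baumewatches.com/us/page" />
<link rel="alternate" hreflang="en-gb" href="https://www.baumewatches.com/gb/page" />
<!-- ...one entry per country directory; each page must list all 80 alternates,
     and every listed alternate must link back reciprocally... -->
<link rel="alternate" hreflang="x-default" href="https://www.baumewatches.com/us/page" />
```

A common cause of partial indexing in exactly this scenario is missing reciprocal links: if page A lists B as an alternate but B does not list A, Google ignores the annotation.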
I have declared a single global sitemap for https://www.baumewatches.com (https://www.baumewatches.com/sitemap_index.xml, structured hierarchically)
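A hierarchical sitemap index of that kind would look roughly like this (the per-country child sitemap layout is an assumption for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- served at https://www.baumewatches.com/sitemap_index.xml -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.baumewatches.com/us/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.baumewatches.com/gb/sitemap.xml</loc>
  </sitemap>
  <!-- ...one child sitemap per country directory, 80 in total -->
</sitemapindex>
```

Splitting the index per country also lets Search Console report indexed counts per child sitemap, which makes it much easier to see whether the 15% that got indexed is spread evenly or concentrated in a few countries.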
The problem is that the site has been online for more than 8 months and only 15% of the sitemap URLs have been indexed, with no sign of new indexation in the last 3 months.
I cannot think of a solution for this.
-
Hey Luca,
I'm sorry that you are running into this trouble. Here are a few things to review in order to get more of your pages indexed in Google:
1. Review your internal linking structure. Adding an XML sitemap and submitting it to Google and Bing can help, but internal links are what help Google discover and index new and deeper pages
2. Add content to key pages on the site, like the homepage, category pages, and individual product pages. If your site is thin (has pages with fewer than 200 words of content), Google will be less likely to index it deeply
3. Build external links to deep pages on the site, such as product category pages
I hope this helps - I would start with content and internal linking
Related Questions
-
Changing sitemaps in console
Hi there, Does anyone have any experience submitting a completely new sitemap structure - including URLs - to Google Search Console? We've changed our sitemap plugin, so rather than /sitemap-index.xml, our main sitemap home is /sitemap.xml (as an example). Is it better to 410 the old ones or 301 redirect them to the new sitemaps? If 301, what do we do about sitemaps that don't completely correlate - what was divided into item1.xml, item2.xml is now by date, so items-from-2015.xml, items-from-2016.xml and so on. On a related note, am I right in thinking that there's no longer a "delete/remove sitemap" option in Search Console? In which case, what happens to the old ones, which will now 404? Thanks anyone for any insight you may have 🙂
Intermediate & Advanced SEO | Fubra
Google index
Hello, I removed my site from the Google index via GWT ("Temporarily remove URLs that you own from search results"; status: Removed), because the site had not been ranking well in Google for the last 2 months. Now my question is: what will happen if I re-include the site URL after 1 or 2 weeks? Is there any chance it will rank well when Google re-indexes the site?
Intermediate & Advanced SEO | Getmp3songspk
How can I optimize pages in an index stack
I have created an index stack. My home page is http://www.southernwhitewater.com. My home page (if you look at it through the Moz bar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer. What is the best way to optimize all these pages individually and still have customers arrive at the top, with links directed to the home page (which is actually the 1st page)? I feel a rel=canonical might be needed somewhere. Any help would be great!!
Intermediate & Advanced SEO | VelocityWebsites
Pages getting into Google Index, blocked by Robots.txt??
Hi all, So yesterday we set out to remove URLs that got into the Google index that were not supposed to be there, due to faceted navigation. We searched for the URLs by using this in Google Search:
site:www.sekretza.com inurl:price=
site:www.sekretza.com inurl:artists=
This brings up a list of "duplicate" pages, and they have the usual: "A description for this result is not available because of this site's robots.txt – learn more." So we removed them all, and Google removed them all, every single one. This morning I did a check, and I found that more are creeping in. If I take one of the suspected dupes to the robots.txt tester, Google tells me it's blocked - and yet it's appearing in their index?? I'm confused as to why a path that is blocked is able to get into the index. I'm thinking of lifting the robots.txt block so that Google can see that these pages also have a meta NOINDEX,FOLLOW tag - but surely that will waste my crawl budget on unnecessary pages? Any ideas? Thanks.
Intermediate & Advanced SEO | bjs2010
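What the question above describes is expected behavior: robots.txt controls crawling, not indexing. A sketch of the conflict, using the faceted-navigation paths from the question (the exact robots.txt contents are an assumption):

```text
# robots.txt - assumed rules blocking the faceted URLs
User-agent: *
Disallow: /*price=
Disallow: /*artists=
```

Because Googlebot is forbidden from fetching these pages, it can still index their bare URLs when other pages link to them, but it can never see a noindex robots meta tag in their HTML. So the usual fix is exactly what the poster suggests: lift the Disallow long enough for the noindex to be crawled, then re-block once the pages have dropped out of the index.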
URL with a # but no ! being indexed
Given that it contains a #, how come Google is able to index this URL?: http://www.rtl.nl/xl/#/home It was my understanding that Google can't handle a # properly unless it's paired with a ! (the #! "hashbang" fragment). site:http://www.rtl.nl/xl/#/home returns nothing, but: site:http://www.rtl.nl/xl returns http://www.rtl.nl/xl/#/home in the result set
Intermediate & Advanced SEO | EdelmanDigital
Strange indexing of multi language site
I've been looking at a site which has a strange ranking/indexing issue. The website has several translated versions of the site for different languages, and these translated pages seem to be outranking the UK pages in the UK search results. All of the translated pages are in subfolders, e.g. domain.com domain.com/fr domain.com/sv domain.com/it domain.com/es I can't work out how or why Google would see these pages with non-English content as more relevant than the UK pages. One thing I did notice is that there are no meta language tags on there. Could this be the issue?
Intermediate & Advanced SEO | edwardlewis
Submitting sitemaps every 7 days
Question: if you had a site with more than 10 million pages (that you wanted indexed) and you considered each page to be equal in value, how would you submit sitemaps to Google? Would you submit them all at once: 200 sitemaps of 50K URLs each in a sitemap index? Or would you submit them slowly? For example, would it be a good idea to submit 300,000 at a time (in 6 sitemaps of 50K each), leave those 6 sitemaps available for Google to crawl for 7 days, then delete them and add 6 more with 300,000 new links? Then repeat this process until Google has crawled all the links? If you implemented this process you would never at one time have more than 300,000 links available for Google to crawl in sitemaps. I read somewhere that eBay does something like this, it could be bogus info though. Thanks David
Intermediate & Advanced SEO | zAutos
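For scale: the sitemaps.org protocol caps each sitemap file at 50,000 URLs (and 50MB uncompressed), so 10 million URLs do fit in 200 full child sitemaps under one index, all submitted at once. A minimal Python sketch of that all-at-once option (the domain and child filenames are placeholders, not a real site):

```python
SITEMAP_URL_LIMIT = 50_000  # per-file cap in the sitemaps.org protocol

def chunk_urls(urls, limit=SITEMAP_URL_LIMIT):
    """Split a flat URL list into sitemap-sized chunks."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

def sitemap_index(base, n_chunks):
    """Build a sitemap index referencing one child sitemap per chunk."""
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i + 1}.xml</loc></sitemap>"
        for i in range(n_chunks)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</sitemapindex>"
    )

# Small demo: 120,000 URLs -> 3 child sitemaps (10M URLs would yield 200).
urls = [f"https://example.com/page/{i}" for i in range(120_000)]
chunks = chunk_urls(urls)
print(len(chunks))  # 3
```

Note that the index itself may list up to 50,000 child sitemaps, so there is no protocol reason to rotate batches in and out; submitting everything once and leaving it stable avoids confusing Google with sitemaps that appear and vanish every 7 days.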
Google indexing issue?
Hey Guys, After a lot of hard work, we finally fixed the problem on our site that didn't seem to show Meta Descriptions in Google, as well as "noindex, follow" on tags. Here's my question: In our source code, I am seeing both Meta descriptions on pages, and posts, as well as noindex, follow on tag pages, however, they are still showing the old results and tags are also still showing in Google search after about 36 hours. Is it just a matter of time now or is something else wrong?
Intermediate & Advanced SEO | ttb