Sitemap indexation
-
Three days ago I submitted a new sitemap for a new platform. It contains 23,412 pages, but so far only 4 pages (!!) are indexed according to Webmaster Tools. Why so few? By mistake, our stage environment got indexed (more than 50K pages) within a few days.
-
Thanks! I'll see if this changes anything.
-
It's not that complicated; it's really easy...
In Google Webmaster Tools, go to Crawl > Fetch as Google. The top-level URL will be displayed at the top of the page. Press the Fetch button to the right.
Google will fetch the page, and the result will be displayed underneath on the same page. To the right of that line you will see a button to submit to the index. When you press it, a pop-up box will appear where you can choose to submit either just this page, or this page and all links from it. Select the all-links option (you can only use this full crawl/submit option 10 times in a calendar month; single-page submissions are limited to 500 a month) and then press Submit.
Google will then add all those pages to its index.
Hope that helps.
Bruce
-
Regarding the error: Google crawled our https://stage.musik.dk instead of just https://musik.dk. We now have authorization on the subdomain, which produces errors in our account. I made another post about this, and it seems it shouldn't harm our ranking.
Webmaster Tools is an extremely messy tool when you're working with multiple subdomains and non-HTTPS variants.
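One common way to keep a staging subdomain out of Google (alongside the HTTP authorization you added) is to serve a blocking robots.txt on the staging host only. A minimal sketch, assuming stage.musik.dk serves its own robots.txt separate from production:

```
# robots.txt served on stage.musik.dk only
# (production musik.dk must serve its own, non-blocking robots.txt)
User-agent: *
Disallow: /
```

Note that a Disallow only stops further crawling; to get already-indexed staging pages removed you'd still use the URL removal tool in Webmaster Tools, or serve an `X-Robots-Tag: noindex` header (without blocking crawling, so Google can actually see it).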
-
Yeah. I've tested it several times with no errors. Today it's up to 35 indexed pages, but there's a long way to go...
-
What do you mean by manually submitting the site? It's more than 23,000 links, so a manual process is kind of a no-go.
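Before resubmitting anything, it can be worth verifying that the sitemap actually parses and lists the expected ~23,000 URLs. A rough sanity check using only the Python standard library (the sitemap URL below is a placeholder; substitute your real one):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap XML namespace
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_urls(xml_bytes):
    """Count <loc> entries in a urlset sitemap. Raises if the XML is malformed."""
    root = ET.fromstring(xml_bytes)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

def fetch_and_count(sitemap_url):
    """Download a sitemap and return its URL count."""
    with urllib.request.urlopen(sitemap_url) as resp:
        return count_urls(resp.read())

if __name__ == "__main__":
    # Hypothetical location; use your site's actual sitemap URL
    print(fetch_and_count("https://musik.dk/sitemap.xml"))
```

If the count comes back far below 23,412, or parsing raises an error, the sitemap itself is the problem rather than Google's crawl rate.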
-
Hi,
Are you sure you submitted the right sitemap format and files? We've had it in the past that our sitemap was broken up into multiple files, and we had to submit sitemap-index.xml, sitemap-1.xml ... sitemap-16.xml. Have you checked it again?
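For reference, a sitemap index is a small XML file that points at the individual sitemap files, each of which may hold at most 50,000 URLs. A minimal sketch (filenames and URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://musik.dk/sitemap-1.xml</loc>
    <lastmod>2014-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://musik.dk/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```

If you submit only the index file, make sure every child sitemap it references is reachable, since a broken child file can silently cost you all the URLs it contains.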
regards
Jarno
-
Not sure what the problem was with the "by mistake" indexing.
Go to Google Webmaster Tools and "manually" submit the site: fetch the home page and submit it along with all linked pages. This will at least get the ball rolling whilst you investigate the other possible problems. Also revisit the sitemap to check that it is complete and hasn't missed off a bunch of pages.
Bruce