Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
-
Hi!
The Problem
We have submitted a sitemap index to GSC. Within that index there are four XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed approximately 3,000 (according to GSC). The mobile sitemap has 1,000 URLs, of which Google has indexed only 74.
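The submitted index has roughly this structure (the file names and the example.com domain below are placeholders, not our real sitemap names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-desktop.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-mobile.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-category.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-posts.xml</loc></sitemap>
</sitemapindex>
```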
The pages are crawlable and the site structure is logical. Performing a landing page URL search in Google Analytics (filtered to the google/organic source/medium), I can see that hundreds of those mobile URLs are receiving organic landings. A mobile search for a long-tail keyword from a randomly selected page even shows a SERP result for the mobile page that, judging by GSC, has not been indexed.
Could this be because we recently added rel=alternate tags to our desktop pages (and, of course, corresponding rel=canonical tags on mobile)? Would Google then decline to index the rel=alternate page versions?
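For reference, the switchboard annotations we added look roughly like this (example.com and the exact paths are placeholders for our real URLs):

```html
<!-- On the desktop page, e.g. https://www.example.com/page/ -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://www.example.com/m/page/">

<!-- On the corresponding mobile page, https://www.example.com/m/page/ -->
<link rel="canonical" href="https://www.example.com/page/">
```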
Thanks for any input on this one.
-
Hi Allison, any updates on this?
From my understanding, it is possible that Google is not indexing the mobile versions of pages if they simply correspond to the desktop pages (and are indicated as such with the rel=alternate mobile switchboard tags). With that information, Google may simply index the desktop pages and then display the mobile URL in search results.
It is also possible that the GSC data is not accurate - if you do a 'site:' search for your mobile pages (I would try something like 'site:domain/m/' and see what shows up), does it show a higher number of mobile pages than what you're seeing in GSC?
Can you check data for your mobile rankings and see what URLs are being shown for mobile searchers? If your data is showing that mobile users are landing on these pages from search, this would indicate that they are being shown in search results, even if they're not showing up as "indexed" in GSC.
-
Apologies for the delayed reply, and thank you for providing this information!
Has there been any change in this trend over the last week? I do know that subfolder mobile sites are generally not recommended by search engines. That said, I do not feel the mobile best practices would change as a result. Does the site automatically redirect users based on their device? If so, be sure Googlebot is redirected appropriately as well.
"When a website is configured to serve desktop and mobile browsers using different URLs, webmasters may want to automatically redirect users to the URL that best serves them. If your website uses automatic redirection, be sure to treat all Googlebots just like any other user-agent and redirect them appropriately."
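As a rough illustration of that advice, the redirect decision should depend only on the User-Agent header, with no special-casing of crawlers. This is just a sketch (the example.com domain, the /m/ path scheme, and the deliberately naive UA sniffing are illustrative assumptions, not a production implementation):

```python
# Sketch: choose the redirect target from the User-Agent alone, treating
# Googlebot's smartphone crawler exactly like any other mobile visitor.
# example.com and the /m/ subdirectory scheme are stand-ins for the real site.

MOBILE_HINTS = ("Mobile", "Android", "iPhone")  # deliberately naive sniffing

def redirect_target(path: str, user_agent: str) -> str:
    """Return the URL this visitor should be served, desktop or mobile."""
    if any(hint in user_agent for hint in MOBILE_HINTS):
        return f"https://www.example.com/m/{path}"
    return f"https://www.example.com/{path}"

# Googlebot Smartphone announces a mobile UA, so it gets the /m/ URL
# just like a human phone visitor, with no crawler special-casing needed:
GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
print(redirect_target("widgets/", GOOGLEBOT_SMARTPHONE))
# https://www.example.com/m/widgets/
print(redirect_target("widgets/", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
# https://www.example.com/widgets/
```

The key point is that the crawler never appears in the logic at all; it is classified by the same rules as everyone else.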
Here is Google's documentation on best practices for mobile sites with separate URLs. I do believe the canonical and alternate tags should be left in place. It may be worth experimenting with the removal of these mobile URLs from the sitemap though I feel this is more of a redundancy issue than anything.
I would also review Google's documentation on 'Common Mobile Mistakes'; perhaps there is an issue that is preventing search engines from crawling the mobile site efficiently.
Hope that helps!
-
Hi Paul and Joe
Thanks for the reply!
Responsive is definitely in the works...
In the meantime to answer:
-
GSC is set up for the mobile site. However, it's not on a subdomain; it's a subdirectory mobile site. So rather than m.site.com we have www.site.com/m for the mobile pages. A sitemap has been submitted, and that's where I can see the data shown in the image.
-
Because the mobile site is a subdirectory site, the data becomes a little blended with the main domain data in Google Search Console. If I want to see Crawl Stats, for example, Google advises: "To see stats and diagnostic information, view the data for (https://www.site.com/)."
-
re: "My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file." I had read this previously, but due to the subdirectory setup of the site, the mobile sitemap became part of the sitemap index rather than being one large standalone sitemap.
Thoughts?
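For what it's worth, one quick sanity check is to parse each child sitemap in the index and count its `<url>` entries, to confirm the 3,300/1,000 submitted figures match what the files actually contain. A minimal sketch (the inline sample XML stands in for fetching the real files):

```python
# Sketch: count <url> entries in a sitemap file to verify submitted counts.
# In practice you would fetch each <loc> listed in the sitemap index;
# here an inline sample stands in for the real mobile sitemap.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_urls(urlset_xml: str) -> int:
    """Count the <url> entries in a single (non-index) sitemap."""
    root = ET.fromstring(urlset_xml)
    return len(root.findall("sm:url", NS))

sample = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/m/page-1/</loc></url>
  <url><loc>https://www.example.com/m/page-2/</loc></url>
</urlset>"""

print(count_urls(sample))
# 2
```

Comparing these raw counts with GSC's "submitted" column rules out the sitemap files themselves as the source of the discrepancy.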
-
As Joe says, set up a separate GSC profile for the mobile site (in your case, a URL-prefix property for the /m/ subdirectory). Then use that to submit the mobile sitemap directly if you wish. You'll get vastly better data about the performance of the mobile site by having it split out, instead of mixed into and obfuscated by the desktop data.
Paul
-
Hi Alison,
While this is a bit late, I would recommend moving to a responsive site when/if possible. It's much easier to manage and causes fewer issues with search engines.
My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file.
Also, do you have Google Search Console set up for both the m. mobile site and the desktop version? It does not appear so, since all sitemaps are listed in a single property in your screenshot. If not, I recommend setting this up, as you may gain some valuable insights into how Google is crawling the mobile site.
I'd also review Google's Common Mobile Mistakes guide to see if any of these issues could be impacting your situation. Hope this helps!