Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
-
Hi!
The Problem
We have submitted a sitemap index to GSC. Within that index there are four XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed approximately 3,000 (according to GSC). The mobile sitemap has 1,000 URLs, of which Google has indexed only 74.
The pages are crawlable and the site structure is logical. Performing a landing page URL search in Google Analytics (filtered to the Google/Organic source/medium), I can see that hundreds of those mobile URLs are receiving organic landings. And a mobile search for a long-tail keyword from a randomly selected page shows the mobile page in the SERPs, even though, judging by GSC, that page has not been indexed.
Could this be because we have recently added rel=alternate tags on our desktop pages (and, of course, corresponding canonical tags on the mobile pages)? Would Google then 'not index' the rel=alternate page versions?
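For reference, the annotation pair described above (rel=alternate on desktop pointing to mobile, canonical on mobile pointing back) can be sanity-checked with a short script. This is only a minimal sketch using placeholder example.com URLs and inline HTML, not our actual pages, and it relies solely on Python's standard library:

```python
# Sketch: verify the desktop/mobile annotation pair for a separate-URLs
# mobile setup. URLs and HTML below are hypothetical placeholders.
from html.parser import HTMLParser

class LinkTagParser(HTMLParser):
    """Collect the attributes of every <link> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            self.links.append(dict(attrs))

def check_pair(desktop_html, mobile_html, desktop_url, mobile_url):
    """True if the desktop page declares the mobile page as rel=alternate
    and the mobile page canonicalises back to the desktop page."""
    d, m = LinkTagParser(), LinkTagParser()
    d.feed(desktop_html)
    m.feed(mobile_html)
    has_alternate = any(
        l.get("rel") == "alternate" and l.get("href") == mobile_url
        for l in d.links
    )
    has_canonical = any(
        l.get("rel") == "canonical" and l.get("href") == desktop_url
        for l in m.links
    )
    return has_alternate and has_canonical

desktop = ('<head><link rel="alternate" '
           'media="only screen and (max-width: 640px)" '
           'href="https://www.example.com/m/page"></head>')
mobile = '<head><link rel="canonical" href="https://www.example.com/page"></head>'
print(check_pair(desktop, mobile,
                 "https://www.example.com/page",
                 "https://www.example.com/m/page"))  # True
```

If either half of the pair is missing or points at the wrong URL, the check returns False, which would be worth fixing before suspecting anything else.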
Thanks for any input on this one.
-
Hi Allison, any updates on this?
From my understanding, it is possible that Google is not indexing the mobile versions of pages if they simply correspond to the desktop pages (and are indicated as such with the rel=alternate mobile switchboard tags). If Google has that information, it may simply index the desktop pages and then display the mobile URL in search results.
It is also possible that the GSC data is not accurate - if you do a 'site:' search for your mobile pages (I would try something like 'site:domain/m/' and see what shows up), does it show a higher number of mobile pages than what you're seeing in GSC?
Can you check data for your mobile rankings and see what URLs are being shown for mobile searchers? If your data is showing that mobile users are landing on these pages from search, this would indicate that they are being shown in search results, even if they're not showing up as "indexed" in GSC.
-
Apologies on the delayed reply and thank you for providing this information!
Has there been any change in this trend over the last week? I do know that subfolder mobile sites are generally not recommended by search engines. That being said, I do not feel the mobile best practice would change as a result. Does the site automatically redirect the user based on their device? If so, be sure Google is redirecting appropriately as well.
"When a website is configured to serve desktop and mobile browsers using different URLs, webmasters may want to automatically redirect users to the URL that best serves them. If your website uses automatic redirection, be sure to treat all Googlebots just like any other user-agent and redirect them appropriately."
Here is Google's documentation on best practices for mobile sites with separate URLs. I do believe the canonical and alternate tags should be left in place. It may be worth experimenting with the removal of these mobile URLs from the sitemap though I feel this is more of a redundancy issue than anything.
I would also review Google's documentation on 'Common Mobile Mistakes', perhaps there is an issue that is restricting search engines from crawling the mobile site efficiently.
Hope that helps!
-
Hi Paul and Joe
Thanks for the reply!
Responsive is definitely in the works...
In the meantime to answer:
-
GSC is set up for the mobile site. However, it's not on a subdomain, it's a subdirectory mobile site. So rather than m.site.com we have www.site.com/m for the mobile pages. A sitemap has been submitted, and that's where I can see the data shown in the image.
-
Because the mobile site is a subdirectory site the data becomes a little blended with the main domain data in Google Search Console. If I want to see Crawl Stats for example Google advises "To see stats and diagnostic information, view the data for (https://www.site.com/)."
-
re: "My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file." I had read this previously, but due to the subdirectory setup of the site, the mobile sitemap became part of the sitemap index, rather than our having just one large sitemap.
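For context, the sitemap index in question is just an XML file pointing at its child sitemaps, so which sitemaps it contains (and whether the mobile one is still in it) can be checked with the standard library. A minimal sketch with placeholder file names, not our real sitemap URLs:

```python
# Sketch: list the child sitemaps declared in a sitemap index file.
# The index XML below is a hypothetical stand-in for the real one.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

index_xml = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-desktop.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-mobile.xml</loc></sitemap>
</sitemapindex>"""

def child_sitemaps(xml_text):
    """Return the <loc> value of every child sitemap in the index."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:sitemap/sm:loc", NS)]

print(child_sitemaps(index_xml))
```

Dropping the mobile sitemap, as suggested, would just mean removing its `<sitemap>` entry from the index; the desktop sitemap and the rest of the index are unaffected.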
Thoughts?
-
-
As Joe says, set up a separate GSC profile for the mdot subdomain. Then use that to submit the mdot sitemap directly if you wish. You'll get vastly better data about the performance of the mdot site by having it split out, instead of mixed into and obfuscated by the desktop data.
Paul
-
Hi Alison,
While this is a bit late, I would recommend moving to a responsive site when/if possible. Much easier to manage, fewer issues with search engines.
My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file.
Also, do you have Google Search Console set up for both the m. mobile site and the desktop version? It does not seem so with all sitemaps listed in the one property in your screenshot. If not, I recommend setting this up as you may receive some valuable insights into how Google is crawling the mobile site.
I'd also review Google's Common Mobile Mistakes guide to see if any of these issues could be impacting your situation. Hope this helps!