Desktop & Mobile Sitemaps Covering The Same Ground - Any Benefit To Having Both?
-
If my URL structure is the same for the desktop and mobile experience, is there any benefit to creating a mobile sitemap, considering that the sitemap for our desktop site covers the same URLs?
-
Yes, it's responsive design with the exact same URLs for both mobile and desktop.
Thanks for your helpful response!
-
Hi John,
When you say that the URLs have the same structure: do you mean that they are different URLs but organised the same way (e.g. www.domain.com is the same as m.domain.com, www.domain.com/page-1 is the same as m.domain.com/page-1, etc.)? Or is it a responsive site with the same URLs regardless of device?
The primary benefit of a sitemap is for discovery by the search engine crawlers. If you have a responsive site, you don't need a separate mobile sitemap. If you have a different set of URLs for mobile devices, even if it follows the same structure as the desktop site, I'd recommend creating a mobile sitemap.
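For example, in the separate-URL case, each desktop entry in the sitemap can carry an annotation pointing at its mobile equivalent. Here's a minimal sketch reusing the example domains above (the media value is the one Google's documentation uses for smartphone alternates):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- The desktop URL -->
    <loc>http://www.domain.com/page-1</loc>
    <!-- Points crawlers at the mobile equivalent of this page -->
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="http://m.domain.com/page-1"/>
  </url>
</urlset>

The mobile pages would then carry a rel="canonical" back to their desktop counterparts. For a responsive site, none of this is needed: one sitemap with the single set of URLs covers both experiences.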
Hope that helps!
-
John,
Yes! In search marketing today, everything is about an edge. Anything you can do to make it easier for search engines to determine that serving your company as a result is the better choice increases the possibility of a higher ranking. Well over 50% of search happens on mobile devices today, so serving this traffic well is a priority. You should also take some time to look at your menus on mobile devices to see whether they are organized in a way that is convenient for a mobile user. I would highly recommend testing on a smartphone with a smaller screen to make sure the buttons are easy to use. Although these UI adjustments don't directly affect your rankings, they do affect your user engagement, which in turn decreases your bounce rate and improves your conversion rate. That, in turn, is factored into your rankings.
Going back to my first point: if you have video or audio content, I would recommend submitting sitemaps for these as well.
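As a rough illustration, a video sitemap entry follows Google's sitemap-video schema; the sketch below is trimmed to the required tags, and all URLs are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The page the video is embedded on -->
    <loc>http://www.example.com/videos/some-page</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <video:content_loc>http://www.example.com/media/video123.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>

As far as I know, audio doesn't have its own sitemap extension, so audio pages are usually just listed as regular URLs in a standard sitemap.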
Hope this helps,
Ron
Related Questions
-
Only a fraction of the sitemap gets indexed
I have a large international website. The content is subdivided into 80 countries, with largely the same content, all in English. The URL structure is https://www.baumewatches.com/XX/page (where XX is the country code). The hreflang language annotations seem to be set up properly. In the Google Search Console I registered:
https://www.baumewatches.com
the 80 instances of https://www.baumewatches.com/XX, in order to geo-target the directories for each country
a single global sitemap for https://www.baumewatches.com (https://www.baumewatches.com/sitemap_index.xml, structured in a hierarchical way)
The problem is that the site has been online for more than 8 months and only 15% of the sitemap URLs have been indexed, with no signs of new indexation in the last 3 months. I cannot think of a solution for this.
Intermediate & Advanced SEO | Lvet
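For reference, hreflang annotations declared in a sitemap generally look like the sketch below (assuming the urlset declares the xhtml namespace, as in the mobile example earlier; the country codes and page path here are illustrative). Every country version needs its own <url> block carrying the full set of alternates, including a self-reference, and Google ignores annotations whose return links aren't reciprocal, so with 80 variants that's worth auditing:

<url>
  <loc>https://www.baumewatches.com/us/page</loc>
  <xhtml:link rel="alternate" hreflang="en-us" href="https://www.baumewatches.com/us/page"/>
  <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.baumewatches.com/gb/page"/>
  <!-- ...one alternate per country version, 80 in total -->
</url>
-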
Best Sitemap for Large Website
I have more than 3,500 pages on my website. Please let me know the best sitemap plugin for it.
Intermediate & Advanced SEO | Michael.Leonard
-
Pagination & SEO
Hi, in one of my other Q&As someone mentioned I may need to look at pagination. For instance, are these pages counted as 'new' pages in Google's eyes when clicking through the pagination?
http://www.key.co.uk/en/key/plastic-storage-boxes
http://www.key.co.uk/en/key/plastic-storage-boxes#productBeginIndex:30&orderBy:5&pageView:list&
Does anyone have any advice on what I could do? It's not something I have much experience with. Thank you, Becky
Intermediate & Advanced SEO | BeckyKey
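One note on the example above: everything after the # is a URL fragment, which search engine crawlers generally ignore, so the second URL is normally treated as the same page as the first rather than a 'new' page. Where paginated pages do live on distinct URLs, a common way to annotate the sequence is with rel="prev"/"next" links in the head; a hedged sketch with illustrative query-string paths, since the site's actual paginated URLs may differ:

<!-- On page 2 of the series -->
<link rel="prev" href="http://www.key.co.uk/en/key/plastic-storage-boxes">
<link rel="next" href="http://www.key.co.uk/en/key/plastic-storage-boxes?page=3">
-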
What are the best practices with website redesign & redirects?
I have a website that is not very pretty but has great rankings. I want to redesign the website, lose as few rankings as possible, and still clean up the navigation. What are the best practices? Thanks in advance.
Intermediate & Advanced SEO | JHSpecialty
-
International SEO, Ecommerce & Rich Snippets
I have an Australian ecommerce site. I also sell to NZ and the USA. As part of the user experience, it detects where you are and changes the currency accordingly, so when Google crawls, the currency will always be USD, I guess (because it is a US IP address). My question: how can I embed ecommerce microdata that will show the correct currency/price for the correct country in the SERPs?
Intermediate & Advanced SEO | s_EOgi_Bear
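For what it's worth, the currency in this kind of markup is declared explicitly per offer rather than inferred from a symbol, so whatever price Googlebot is served (usually the USD version, given US crawl IPs) is what ends up in the snippet. A minimal schema.org/Offer sketch in microdata; the product name and price are placeholders:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example watch</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <!-- Declare the currency explicitly instead of relying on the $ symbol -->
    <meta itemprop="priceCurrency" content="USD">
    $<span itemprop="price">199.00</span>
  </div>
</div>
-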
XML Sitemap Questions For Big Site
Hey guys, I have a few questions about XML sitemaps.
For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found out that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml).
If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep updating your XML profiles using third-party software (sitemapwriter)?
If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this?
Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it? Thank you!
Intermediate & Advanced SEO | keywordwizzard
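On the "automatically add new profiles" point: sitemaps are just generated files, so the usual pattern is for the application itself to append new profile URLs to a child sitemap and list the children in a sitemap index, rather than relying on third-party software. A sketch with placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Start a new child file once the current one nears the 50,000-URL limit -->
  <sitemap>
    <loc>http://www.example.com/sitemaps/profiles-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemaps/profiles-2.xml</loc>
  </sitemap>
</sitemapindex>

Profiles that opt out would be omitted from these files and served with a meta robots noindex tag, which drops them from the index on the next crawl.
-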
Link Reclamation & Redirects
Hello, I'm in the middle of a link reclamation project wherein we're identifying broken links, links pointing to dupe content, etc. I found a forgotten co-brand which is effectively dupe content across 8 sub-domains, some of which have a significant number of links (200+ linking domains | 2k+ inbound links). The question for the group is: what's the optimal redirect option?
Option 1: set 301s that maintain 1:1 URL mapping. This will pass all equity to the applicable PLPs and theoretically improve rank for the related keyword(s); it requires a bit more configuration time and will likely have a small effect on rank given that the links are widely distributed across URLs.
Option 2: set 301s to redirect all requests to the associated sub-domain, e.g. foo.mybrand.cobrand.com/page1.html and foo.mybrand.cobrand.com/page2 both redirect to foo.mybrand.com/. This will accumulate all equity at the sub-domain level, which theoretically will be roughly distributed throughout the underlying pages, and will limit the risk of a penalty to that sub-domain.
Option 3: set 301s to redirect all requests to our homepage. This is the easiest to configure and maintain and will accumulate the maximum equity on a priority page, which should positively affect domain authority; but it runs the risk of being penalized for accumulating links en masse, risks a penalty for spammy links on our primary www sub-domain, and won't pass keyword-specific equity to the applicable pages.
To be clear, I've done an initial scrub of anchor text and there were no signs of spam. I'm leaning towards #3, but interested in others' perspectives. Cheers,
Stefan
Intermediate & Advanced SEO | PCampolo
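If Option 1 wins out and the co-brand runs on Apache, the 1:1 mapping is typically one rewrite per sub-domain rather than thousands of individual rules. A hedged sketch, assuming mod_rewrite is available and the co-brand hosts resolve to a vhost you control (sub-domain names as in the example above):

# Map every path on the co-brand sub-domain onto the same path on the main brand
RewriteEngine On
RewriteCond %{HTTP_HOST} ^foo\.mybrand\.cobrand\.com$ [NC]
RewriteRule ^(.*)$ http://foo.mybrand.com/$1 [R=301,L]
-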
Indexing/Sitemap - I must be wrong
Hi all, I would guess that a great number of us new to SEO (or not) share some simple beliefs in relation to Google indexing and sitemaps, and as such get confused by what Webmaster Tools shows us. It would be great if someone with experience/knowledge could clear this up once and for all 🙂
Common beliefs:
Google will crawl your site from the top down, following each link and recursively repeating the process until it bottoms out/becomes cyclic.
A sitemap can be provided that outlines the definitive structure of the site, and is especially useful for links that may not be easily discovered via crawling.
In Google's Webmaster Tools, in the sitemap section, the number of pages indexed shows the number of pages in your sitemap that Google considers worthwhile indexing.
If you place a rel="canonical" tag on every page pointing to the definitive version, you will avoid duplicate content and aid Google in its indexing endeavour.
These preconceptions seem fair, but must be flawed. Our site has 1,417 pages as listed in our sitemap. Google's tools tell us there are no issues with this sitemap, but a mere 44 are indexed! We submit 2,716 images (because we create all our own images for products) and a disappointing zero are indexed. Under Health -> Index status in WM Tools, we apparently have 4,169 pages indexed. I tend to assume these are old pages that now yield a 404 if they are visited.
It could be that Google's indexed quotient of 44 means "pages indexed by virtue of your sitemap, i.e. we didn't find them by crawling, so thanks for that", but despite trawling through Google's help, I don't really get that feeling.
This is basic stuff, but I suspect a great number of us struggle to understand the disparity between our expectations and what WM Tools yields, and we go on to either ignore an important problem or waste time on non-issues. Can anyone shine a light on this once and for all? If you are interested, our map looks like this: http://www.1010direct.com/Sitemap.xml
Many thanks, Paul
Intermediate & Advanced SEO | fretts
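On the rel="canonical" belief above: the tag sits in the head of each page and, on the definitive version, simply points at itself; a minimal sketch with a placeholder path:

<!-- Self-referencing canonical on the page's definitive URL -->
<link rel="canonical" href="http://www.1010direct.com/some-page">

Note that canonical tags consolidate duplicate variants; they don't get additional pages indexed, so on their own they wouldn't explain a 44-of-1,417 gap.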