Sitemap for a smartphone site
-
Hello
I have a smartphone site (e.g. m.abc.com). To my understanding, we do not need a mobile sitemap, as it is not a traditional mobile site. Should I add the mobile site links to my regular www XML sitemap, or not bother, since we already have rel="canonical" in place (on m.abc.com) and rel="alternate" (on the www site) pointing to the respective pages? Please suggest a solution.
I really look forward to an answer, as I haven't found the "official" answer to this question anywhere.
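For reference, a minimal sketch of the annotation setup described above for separate smartphone URLs, with an illustrative /page-1 path: the desktop page declares its smartphone equivalent with rel="alternate", and the smartphone page points back with rel="canonical".

```html
<!-- On the desktop page, e.g. http://www.abc.com/page-1 (path is illustrative) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.abc.com/page-1">

<!-- On the corresponding smartphone page, http://m.abc.com/page-1 -->
<link rel="canonical" href="http://www.abc.com/page-1">
```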
-
Here is something from this article that confuses me: http://www.seroundtable.com/mobile-sites-google-sitemaps-12709.html
Yes, with "mobile" we mean the traditional mobile phone browsers, not smart-phones (which we generally treat the same as desktop browsers given their advanced capabilities). Using special CSS/HTML templates for smart-phones would be fine and would not require submitting them via mobile Sitemap. It keeps things a bit easier if you want to focus on smart-phones, but there are still a gigantic number of more traditional phones with limited internet browsing capabilities out there :-).
Based on this, we don't need a mobile sitemap for smartphone sites. It's only needed for traditional mobile sites.
-
I suggest adding the m.abc.com domain to a mobile sitemap if you can. It's not an important requirement, particularly if you already have a normal XML sitemap and use rel=alternate. But if you can create the mobile sitemap without much extra effort, why not?
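If you do want to reflect the m.abc.com URLs in the regular www XML sitemap, one documented option (sketched below with an illustrative /page-1 path) is to annotate each desktop URL with its smartphone alternate via an xhtml:link element, rather than submitting a separate mobile sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.abc.com/page-1</loc>
    <!-- Declares the smartphone page as an alternate of the desktop URL -->
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="http://m.abc.com/page-1" />
  </url>
</urlset>
```

The "mobile sitemap" referred to in the quoted Google comment is the separate feature-phone format, which uses the http://www.google.com/schemas/sitemap-mobile/1.0 namespace and an empty <mobile:mobile/> tag per URL; it is not required for smartphone pages.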
Related Questions
-
Two sites with the same content
Hi everyone, I have two listing websites, both marketplaces: website A has roughly 12k listing pages, and website B has roughly 2k pages from one specific brand. All 2k listings on website B also exist on website A with the same URL structure, just a different domain name; the header and footer differ slightly, but the body is the same code. The listings on website B are all partners of a specific insurance company, and that company pays me to maintain the site. They also track the organic traffic going into it, so I cannot robots-block or noindex the site. How can I be as transparent as possible with Google? My idea is to apply a canonical on website B (the insurance partner site) pointing to the corresponding listing on website A, which would signal that the best version of each product page is on website A. For example, www.websiteb.com/productxxx would have a canonical pointing to www.websitea.com/productxxx, and www.websiteb.com/productyyy would have a canonical pointing to www.websitea.com/productyyy. Any thoughts? Cheers
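A minimal sketch of the cross-domain canonical described in this question, using its placeholder URLs:

```html
<!-- In the <head> of www.websiteb.com/productxxx -->
<link rel="canonical" href="http://www.websitea.com/productxxx">
```

Note that a cross-domain canonical asks Google to consolidate indexing and ranking signals into the website A version, so the website B URLs would generally not rank on their own.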
Intermediate & Advanced SEO | Evoe
-
What is the optimal sitemap for a large website?
My website has more than 3,500 posts. Please let me know which sitemap plugin I should use and what the best practice is.
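As a point of reference, 3,500 URLs fit comfortably within the 50,000-URL limit of a single sitemap file, but sitemap plugins for larger sites typically split URLs across several files and reference them from a sitemap index; a sketch with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- File names below are hypothetical examples -->
  <sitemap>
    <loc>https://www.example.com/post-sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/post-sitemap2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```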
Intermediate & Advanced SEO | Michael.Leonard
-
On-site Search - Revisited (again, *zZz*)
Howdy Moz fans! Okay, so there's a mountain of information out there on the webernet about internal search results, but I'm finding some contradictions and a lot of pre-2014 material. I'd like to hear some 2016 opinions, specifically around a couple of thoughts of my own, as well as some I've deduced from other sources. For clarity, I work on a large retail site with over 4 million products (product pages), and my predicament is this: I want Google to be able to find and rank my product pages. Yes, I can link to a number of the best ones through well-planned categorisation, silos, efficient menus, etc. (done), but can I use site search for this purpose? It was my understanding that Googlebot doesn't/can't/won't use a search function. How could it? It's like expecting it to find your members-only area; it can't log in! How can it find and index the millions of combinations of search results without typing in "XXXXL underpants" and every other search combination? Do I really need to robots.txt my search query parameter? How, why, and when would Googlebot generate that query parameter? Site search is B.A.D. I read this everywhere I go, but is it really? I've read "it eats up all your search quota", "search results have no content and are classed as spam", and "results pages have no value". I want to find a positive SEO outcome from having a search function on my website, not just try to stifle Mr. Googlebot. What I'm trying to learn here is what the options are and what their outcomes are. So far I have:
- Robots.txt: remove the search pages from Google.
- Noindex: allow the crawl but don't index the search pages.
- Nofollow: I'm not sure this is even a valid idea, but I picked it up somewhere out there.
- Just leave it alone: some of your search results might get ranked and bring traffic in.
It appears that each and every option has its positive and negative connotations. It'd be great to hear from the community about their experiences in this area.
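For reference, a minimal sketch of the noindex option on an internal search results page (the URL pattern and parameter name are illustrative). A robots.txt rule such as Disallow: /*?q= stops crawling but does not necessarily remove URLs that Google has already discovered through links, whereas a meta noindex requires the page to stay crawlable so the tag can be seen:

```html
<!-- Served in the <head> of internal search result pages, e.g. /search?q=xxxxl-underpants -->
<meta name="robots" content="noindex, follow">
```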
Intermediate & Advanced SEO | Mark_Elton
-
International Site Migration
Hi guys, we're in the process of launching an international ecommerce site (Magento CMS) for two different countries (Australia and the US), and will later expand to other countries like the UK, Canada, etc. The plan is for each country to have its own sub-folder, e.g. www.domain.com/us, www.domain.com.au/au, www.domain.com.au/uk. A lot of the content between these English-based countries is the same, e.g. the same product descriptions. So in order to prevent duplication, from what I've read we will need to add hreflang tags to every single page on the site, for both the Australian and the United States pages (see the sketch below this question). Just wanted to make sure this is the correct strategy (will hreflang prevent duplicate content issues?) and whether there is anything else I should be considering. Thank you, Chris
Intermediate & Advanced SEO | jayoliverwright
-
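A hedged sketch of reciprocal hreflang annotations for the Australian and US versions of a page, as described in the question above; the URLs and the x-default choice are illustrative rather than taken from the original post:

```html
<!-- Placed on both the /au and /us versions of the page; annotations must be reciprocal and self-referencing -->
<link rel="alternate" hreflang="en-au" href="https://www.domain.com/au/product-page/" />
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/us/product-page/" />
<link rel="alternate" hreflang="x-default" href="https://www.domain.com/product-page/" />
```

hreflang tells Google which regional version to serve to which audience; Google's guidance has been that regional variants annotated this way are not treated as deceptive duplicate content, though very similar pages may still be folded together in results.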
Temporarily shut down a site
What would be the right way to temporarily shut down a site without having a negative impact on SEO?
Intermediate & Advanced SEO | LibertyTax
-
What is the practical influence of priority in a sitemap?
I have a directory site with thousands of entries. Is there any benefit to be gained from adjusting the priorities of various entries in the sitemap? I was thinking I might give higher priority to entries that have upgraded their directory listing. Thanks.
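For reference, the optional <priority> field looks like the sketch below (URL illustrative). In the sitemap protocol it ranges from 0.0 to 1.0 with a default of 0.5, and it is only a hint about relative importance within your own site, not a directive to search engines:

```xml
<url>
  <loc>https://www.example.com/directory/upgraded-entry/</loc>
  <priority>0.8</priority>
</url>
```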
Intermediate & Advanced SEO | flow_seo
-
Limiting URLs in the HTML Sitemap?
So I started making a sitemap for our new golf site, which has quite a few "low-level" pages (about 100 for the golf courses in the area and about 50 for course architects). My question/open discussion is simple: in an HTML sitemap that already has about 50 links, should we include these other 150 low-level links? Of course, the link to the main "Golf Courses" page is there, along with a link to the main "Course Architects" page (which subdivides further on those pages). I have read that the limit is around 150 links on the sitemap.html page, and while it would be nice to rank long tail for the golf courses, our site architecture itself is easily crawlable as well. So the main question is: include ALL the links, or just the main ones? Thoughts?
Intermediate & Advanced SEO | JamesO
-
Key page of site not ranking at all
Our site has the largest selection of dog clothes on the Internet. We've been (ever so slowly) creeping up in the rankings for the "dog clothes" term, but for some reason only our home page ranks. Even though the home page (and every page on the domain) links to our specific Dog Clothes page, that page doesn't rank anywhere when searching Google with "dog clothes site:baxterboo.com". http://www.google.com/webhp?source=hp&q=dog+clothes+site:baxterboo.com&#sclient=psy&hl=en&site=webhp&source=hp&q=dog+clothes+site:baxterboo.com&btnG=Google+Search&aq=f&aqi=&aql=&oq=dog+clothes+site:baxterboo.com&pbx=1&bav=on.2,or.r_gc.r_pw.&fp=f4efcaa1b8c328f Pages 2+ of product results from that page rank, but not the base page. It's not excluded in robots.txt, and all on-site links to that page use the same URL. That page is loaded with text that includes the keywords, and I don't believe there's duplicated content. What am I missing? Has the page somehow been penalized?
Intermediate & Advanced SEO | BBPets