Mobile Sitemap Issue
-
Hi there,
I am having some difficulty with an error on Webmaster Tools. I'm concerned with a possible duplicate content penalty following the launch of my mobile site.
I have attempted to update my sitemap to inform Google that a different mobile page exists in addition to the desktop page.
I have followed Google's guidelines as outlined here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34648
I'm having problems with my sitemap.xml file. Webmaster Tools is reporting that it is not able to read the file, and when I validate it I get an error stating that the 'Namespace prefix xhtml on link is not defined'.
All I am trying to do is create a sitemap that uses rel="alternate" to inform Google that there is a mobile version of that specific page in addition to the desktop version.
An instance of the code I am using is below:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="gss.xsl"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url>
    <loc>http://www.mydomain/info/detail/</loc>
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="http://m.mydomain.com/info/detail.html"/>
    <lastmod>2013-02-01T16:03:48+00:00</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.50</priority>
  </url>
</urlset>
Any help would be much appreciated.
Thanks
-
Make sure you're using the rel="canonical" tag on that mobile page. That tells Google that the original content is elsewhere on your site, and it will help keep you from getting dinged by search engines.
You can learn more about the rel="canonical" tag here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
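For what it's worth, the 'Namespace prefix xhtml on link is not defined' error usually just means the xhtml namespace was never declared on the urlset element. Here is a minimal sketch of how the two annotations typically fit together, using placeholder URLs modeled on the question rather than real ones. The desktop sitemap entry declares the xhtml namespace it uses:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.mydomain.com/info/detail/</loc>
    <!-- tells Google a mobile alternate exists for this desktop URL -->
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="http://m.mydomain.com/info/detail.html"/>
  </url>
</urlset>

And in the head of the mobile page, the canonical points back at the desktop URL:

<!-- placeholder URL; use the matching desktop page -->
<link rel="canonical" href="http://www.mydomain.com/info/detail/"/>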
Related Questions
-
Possible duplicate content issue
Hi, here is a rather detailed overview of our problem; any feedback or suggestions are most welcome.

We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in search engines for the past few years.

Here is where we have the problem. Our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush, and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console.

Here are a few of the assumptions we've made and things we've checked:
- Crawl or technical issues: nothing serious found
- Bad backlinks: no new spammy backlinks
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to the US (we suspect this to be the issue, but more below)
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not impact things such as title, h1, URL or body content; we replaced some call-to-action blocks from a custom one to one that was built into the framework (Div)
- Manual or algorithmic penalties: nothing reported by Search Console
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place.

Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this); during this process, an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the duplicate content page on the UK site, thereby dropping it off the index.

What further contributes to this theory is that a search on Google.co.uk returns the US version of the page. With country targeting on (i.e. only return pages from the UK), the UK version of the page is not returned. Also, a site: query from Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe that it's a duplicate content filter issue due to incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page.

What we have done to counter this is as follows:
- Turned on geotargeting for the US site
- Ensured that the title of the UK page says UK and not US
- Edited both pages to trigger a new last-modified date and so the two pages share fewer similarities
- Recreated a sitemap and resubmitted it to Google
- Re-crawled and requested a re-index of the whole site
- Fixed a few of the smaller issues

If our theory is right and our actions do help, I believe it's now a waiting game for Google to re-crawl and reindex. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of senior management are very nervous given the impact of losing this page and are keen to get a second opinion on the matter.

Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
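A side note on disambiguating the two country versions: hreflang annotations on both pages are the usual mechanism for telling Google which version targets which market. A minimal sketch, with invented URLs rather than the real ones from the post:

<!-- in the <head> of the UK page (placeholder URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/key-page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/key-page/" />
<!-- the US page carries the same two tags, so each version references itself and the other -->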
Intermediate & Advanced SEO | | Clickmetrics0 -
Can I use two sitemaps?
I have a Magento website. I am going to add a WordPress blog under /blog. If I set up each with its own Webmaster Tools profile to submit a sitemap, does it hurt anything?
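As a point of reference, the two sitemaps can also be tied together with a single sitemap index file submitted once for the domain. A minimal sketch, with file names that are assumptions rather than what Magento or WordPress actually generate:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <!-- placeholder path for the Magento store sitemap -->
    <loc>http://www.example.com/sitemap_store.xml</loc>
  </sitemap>
  <sitemap>
    <!-- placeholder path for the WordPress blog sitemap under /blog -->
    <loc>http://www.example.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>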
Intermediate & Advanced SEO | | Tylerj0 -
Different URL structure Desktop vs Mobile regarding SEO when building a new separate mobile site
Hi, I have an old osCommerce webshop that I will keep for now, but I have built a completely new mobile site for mobile devices, and it has a different URL structure. Can I launch this site without any problems, given that it is Google's mobile search that indexes the mobile site, and just add the necessary rel="alternate" tags on the desktop site for the product pages and main categories where I can? There will be some differences in the URLs that I can't make an alternate for.
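For reference, a sketch of the per-page annotation; the desktop and mobile URLs do not need to share a structure, as long as each desktop URL points at its specific mobile counterpart (URLs below are invented):

<!-- in the <head> of a desktop product page, e.g. /product_info.php?products_id=42 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/products/blue-widget" />
<!-- the mobile page should carry a rel="canonical" back to that desktop URL -->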
Intermediate & Advanced SEO | | noerdar0 -
Manage category pages and duplicate content issues
Hi everybody, I am now auditing the website www.disfracessimon.com. This website has some issues with canonicals and other things, but right now I have found something I would like your opinion on. When I was checking parts of the content in Google to find duplicate content issues, I found this: in Google I searched "Chaleco de streck decorado con botones" and found that the first result, "Hombre trovador", is the page I was checking -> correct.
The following results are category pages that the product is listed in. I was wondering if this could cause any problem related to duplicate content. Should I noindex the category pages or keep them?
The first result in Google was the product page, and I think category pages are good for passing link juice and capturing some searches from Google. Any advice? Thank you
Intermediate & Advanced SEO | | teconsite0 -
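On the noindex question above: if the category pages were to be kept out of the index, the usual mechanism is a robots meta tag in the category template. A minimal sketch, purely illustrative; whether the trade-off is worth it is exactly what is being asked:

<!-- in the <head> of a category page: drops it from the index but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">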
Subdomain Blog Sitemap link - Add it to regular domain?
Example of setup:
www.fancydomain.com
blog.fancydomain.com
Because of certain limitations, I'm told we can't put our blogs at the subdirectory level, so we are hosting our blogs at the subdomain level (blog.fancydomain.com). I've been asked to incorporate the blog's sitemap link on the regular domain, or even in the regular domain's sitemap.
1. Putting a link to blog.fancydomain.com/sitemap_index.xml in www.fancydomain.com/sitemap.xml -- isn't this against the sitemaps.org protocol?
2. Is there even a reason to do this? We do have a link to the blog's home page from the www.fancydomain.com navigation, and the blog is set up with its sitemap and a link to the sitemap in its footer.
3. What about just including a text link "Blog Sitemap" (linking to blog.fancydomain.com/sitemap_index.html) in the footer of www.fancydomain.com, adjacent to the existing "Sitemap" text link for www.fancydomain.com's sitemap?
Just trying to make sense of this, and figure out why or if it should be done. Thanks!
Intermediate & Advanced SEO | | EEE30 -
Limiting URLs in the HTML Sitemap?
So I started making a sitemap for our new golf site, which has quite a few "low level" pages (about 100 for the golf courses that exist in the area, and then about 50 for course architects, etc.). My question/open discussion is simple: in a sitemap that already has about 50 links, should we include these other 150 low-level links? Of course, the links to the "Golf Courses" and "Course Architects" MAIN pages are there (which subdivide on THOSE pages). I have read the limit is around 150 links on a sitemap.html page, and it would be nice to rank long tail for the golf courses. All in all, our site architecture itself is easily crawlable as well. So the main question is whether to include ALL the links or just the main ones? Thoughts?
Intermediate & Advanced SEO | | JamesO0 -
Online Sitemap Generator
I have a site that has around 5,000 pages now. Are there any recommended online free/paid tools to generate a sitemap for me?
Intermediate & Advanced SEO | | rhysmaster0 -
What is the best method for segmenting HTML sitemaps?
Sitemaps create a table of contents for web crawlers and users alike, and because of how PageRank is passed, HTML sitemaps play a critical role in how Googlebot and other crawlers spider and catalog content. I get asked this question a lot and, in most cases, it's easy to categorize sitemaps and create 2-3 category-based maps that can be linked to from the global footer. However, what do you do when a client has 40 categories with 200+ pages of content under each category? How do you segment your HTML sitemap in a case like this?
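One pattern that scales to that size is a two-level HTML sitemap: a hub page in the global footer linking to one sitemap page per category, each of which lists (and, if needed, paginates) that category's pages. A rough sketch with invented paths and names:

<!-- /sitemap/ : hub page linked from the global footer, one entry per category -->
<ul>
  <li><a href="/sitemap/widgets/">Widgets sitemap</a></li>
  <li><a href="/sitemap/gadgets/">Gadgets sitemap</a></li>
  <!-- ...one entry for each of the 40 categories -->
</ul>

<!-- /sitemap/widgets/ : lists the 200+ pages in that category, paginated if it gets unwieldy -->
<ul>
  <li><a href="/widgets/blue-widget/">Blue widget</a></li>
  <li><a href="/widgets/red-widget/">Red widget</a></li>
  <!-- ... -->
</ul>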
Intermediate & Advanced SEO | | stevewiideman0