Mobile Site - Same Content, Same Subdomain, Different URL - Duplicate Content?
-
I'm trying to determine the best way to handle my mobile commerce site.
I have a desktop version and a mobile version using a 3rd party product called CS-Cart.
Let's say I have a product page. The URLs are...
mobile:
store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857
desktop:
store.domain.com/two-toned-tee.html
I've been trying to find information on how to handle mobile sites with different URLs with regard to duplicate content. However, most of the results assume that "different URL" means m.domain.com rather than the same subdomain with a different address.
I'm leaning towards using a canonical URL on the mobile store pages, if possible. I see quite a few people suggesting not to do this, but again, I believe that's because they assume we're just talking about m.domain.com vs. www.domain.com.
Any additional thoughts on this would be great!
-
That's an awfully interesting mobile URL =/ Ideally you should only have one URL. If that's not possible, you want the mobile URLs to follow the same structure as the desktop URLs as closely as possible. If that's not possible either, you'll want to add a rel="alternate" link on each desktop page pointing to the corresponding mobile URL, and a rel="canonical" tag on each mobile page pointing back to the desktop URL.
https://developers.google.com/webmasters/smartphone-sites/details
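In markup, that annotation pattern looks something like this, using the sample URLs from the question (the max-width value is an arbitrary example, and note that crawlers generally ignore everything after the #, which is part of what makes that mobile URL problematic):

```html
<!-- On the desktop page: store.domain.com/two-toned-tee.html -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857">

<!-- On the mobile page: point the canonical back at the desktop URL -->
<link rel="canonical" href="http://store.domain.com/two-toned-tee.html">
```

Each annotation is one-directional, so both halves are needed for Google to pair the two URLs up.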
By the way, I'd strongly suggest detecting based on screen size or width rather than user-agent wherever possible.
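For example, a CSS media query can restyle the same page below a chosen breakpoint without any user-agent sniffing at all (the class names and the 640px breakpoint here are hypothetical):

```css
/* Same HTML served to every device; the layout adapts by viewport width */
@media only screen and (max-width: 640px) {
  .product-sidebar { display: none; } /* hide secondary content on small screens */
  .product-image   { width: 100%; }   /* let the product photo fill the viewport */
}
```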
-
If CS-Cart doesn't work out for you, Shopgate will build you a search-optimized mobile site with your current website's look and feel.
-
Have you tried doing internal URL mapping, rather than allowing two different URLs?
Do you have access to .htaccess?
First detect the mobile device and then do the internal map for that.
If you can do it, there is still only the friendly URL showing externally.
Of course, it depends how the internal system works and how pages are output, but you would just stop the search engines from accessing the internally mapped links.
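A rough sketch of that idea in .htaccess with mod_rewrite (the user-agent pattern is deliberately simplified, and the internal target is hypothetical since it depends on how CS-Cart routes requests):

```apache
RewriteEngine On
# Match a few common mobile user agents (simplified; real detection lists are much longer)
RewriteCond %{HTTP_USER_AGENT} (iphone|ipod|android|blackberry|mobile) [NC]
# Internally rewrite the friendly URL to the mobile handler.
# No R flag, so there is no redirect: the friendly URL stays in the address bar.
RewriteRule ^([a-z0-9-]+)\.html$ index.php?dispatch=mobile.product&slug=$1 [L,QSA]
```

The QSA flag preserves any existing query string, and because the rule never issues an external redirect, search engines only ever see the friendly URL.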
-
Thank you for the response.
We do have user agent detection going, and users are redirected appropriately. It's just that I have to redirect them to a non-SEO friendly URL for the mobile store. There's nothing I can do about that. I'm thinking I just need a canonical URL tag on the mobile store pages so that Google credits the right link and not the ugly link.
I would love to be able to try something like using the same page with CSS friendly design for mobile devices, but I don't have that option in any way using this 3rd party solution.
-
You need to serve up the right page to the right crawler: Googlebot and Googlebot-Mobile.
You need to detect the user agent and redirect to the correct version. This is Google's recommended approach, but I think it's a lot of work and bound to cause confusion. You can also set up a mobile sitemap to help indicate the difference.
How I have done my mobile pages is to use the same page with the tag and CSS; this approach is much easier, I believe.
See this page, near the bottom:
http://msdn.microsoft.com/en-us/hh553501
Here is another example
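For reference, a mobile sitemap uses Google's mobile sitemap namespace and marks each URL with an empty mobile tag. A minimal example, using a simplified version of this thread's mobile URL (the fragment after # is dropped, since fragments don't belong in sitemaps, and the & must be escaped as &amp; in XML):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://store.domain.com/index.php?dispatch=categories.catalog&amp;product_id=857</loc>
    <mobile:mobile/>
  </url>
</urlset>
```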