Multiple Versions of Mobile Site
-
Hey Guys,
We have recently finished the latest version of our mobile site, which means we currently have two mobile sites. Which site you are presented with depends on your device and OS.
e.g.
iPhone 3 or 4 users on iOS 4 will get version 1 of our mobile site.
iPhone 5 users on iOS 5 will get the new version (version 2) of our mobile site.
Our old mobile site is currently indexed in Google and performing pretty well.
Since the launch of the second mobile site we have not seen any major changes to our visibility in Google. My main concern here is duplicate content, so I am curious: can Google detect that we have two mobile sites that we serve depending on device? And if Google can detect this, why have our sites not been penalized?
Thanks,
LW
I know the first thing that comes to your mind is duplicate content.
-
Hi LW,
Sorry for the extreme delay here - the Q&A notification system went wonky for a bit and I never got the response message for this thread.
I'm sure you're past this issue by now, but yes - Googlebot Mobile should just index the mobile version of the page.
Best,
Mike
-
Hey Mike,
Thanks for your feedback, it is really helpful.
We are serving up unique source code on the same URL per device, with the user agent being detected on the server-side.
Am I right in assuming that Googlebot Mobile will only see one version of the pages and index accordingly?
Cheers,
LW
-
Hi LW,
I'm wondering about some particulars of your setup for this.
How are URLs handled between the three sites (1 desktop, 2 mobile)?
Are you serving up unique source code on the same URL per device, or do you have device-specific URLs for all content?
What are you using to detect the useragent and redirect the user? Is this happening server-side, or with JavaScript?
The particulars of your setup will determine your best approach. When in doubt I would follow the instructions on this page.
I would not expect two mobile versions of your site to cause a duplicate content issue - more likely that Googlebot Mobile will only see one version of the pages and index those (but as above, the technical particulars will determine this).
Best,
Mike
-
Thanks for your response. You raise a very valid point about the time taken for Google to index it. The new site has been live for a couple of weeks now, so I was hoping to see it starting to get indexed by Google by now!
In regards to rel="canonical": yes, we have implemented it on the mobile site, referencing the desktop site.
The reason behind developing a new version rather than just updating the previous one was that we had new functionality to include and a fair few design changes based on learnings from the old site. That being said, code from the first site was still being used, so it wasn't a completely new build.
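Where the mobile pages live on their own URLs (e.g. an m-dot subdomain, which L W's canonical-to-desktop setup suggests), the usual pairing is a rel="canonical" on the mobile page plus a rel="alternate" on the desktop page. A minimal sketch, with placeholder example.com URLs and hypothetical helper names:

```javascript
// Hypothetical helpers generating the head tags for paired desktop/mobile URLs.

// On the mobile page: point the canonical at the desktop equivalent.
function mobileHeadTags(desktopUrl) {
  return `<link rel="canonical" href="${desktopUrl}">`;
}

// On the desktop page: point crawlers at the mobile equivalent.
function desktopHeadTags(mobileUrl) {
  return `<link rel="alternate" media="only screen and (max-width: 640px)" href="${mobileUrl}">`;
}

console.log(mobileHeadTags("https://www.example.com/page"));
console.log(desktopHeadTags("https://m.example.com/page"));
```

The bidirectional annotation is what tells Google the two URLs are the same content for different devices, rather than duplicates competing with each other.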
-
If you have only just launched the new version of your mobile site, it may take some time before Google indexes it and detects that there is duplicate content with your previous version. Googlebot doesn't crawl all new sites instantly.
Just wondering, have you done anything to prevent a duplicate content penalty, such as using the rel="canonical" tag? Also, why not update your previous version instead of creating an entirely different mobile site?