Multiple Versions of Mobile Site
-
Hey Guys,
We have recently finished the latest version of our mobile site, which means we currently have two mobile sites. Which site you are served depends on your device and OS.
e.g.
iPhone 3 or 4 users on iOS 4 will get version 1 of our mobile site.
iPhone 5 users on iOS 5 will get the new version (version 2) of our mobile site.
Our old mobile site is currently indexed in Google and performing pretty well. Since the launch of the second mobile site we have not seen any major changes to our visibility in Google. My main concern here is duplicate content, so I am curious: can Google detect that we serve two mobile sites depending on device? And if Google can detect this, why have our sites not been penalized?
Thanks,
LW
I know the first thing that comes to mind is duplicate content.
-
Hi LW,
Sorry for the extreme delay here - the Q&A notification system went wonky for a bit and I never got the response message for this thread.
I'm sure you're past this issue by now, but yes - Googlebot Mobile should just index the mobile version of the page.
Best,
Mike -
Hey Mike,
Thanks for your feedback, it is really helpful.
We are serving up unique source code on the same URL per device, with the user agent being detected on the server-side.
Am I right in assuming that Googlebot Mobile will only see one version of the pages and index accordingly?
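For context, here's a minimal sketch of what our server-side detection does (simplified - the hint strings, template names, and helper functions here are illustrative, not our actual code). The key points are that both payloads live on one URL and that we send a `Vary: User-Agent` header so caches and crawlers know the response differs by device:

```python
# Minimal sketch of dynamic serving: one URL, two HTML payloads
# chosen by User-Agent. Hypothetical names; real detection logic
# is more involved than a substring check.

MOBILE_UA_HINTS = ("iPhone", "iPod", "Android", "Mobile")

def select_template(user_agent):
    """Return which template to render for the given User-Agent string."""
    if any(hint in user_agent for hint in MOBILE_UA_HINTS):
        return "mobile.html"
    return "desktop.html"

def build_headers():
    # Signal that the response varies by User-Agent, so the mobile
    # and desktop payloads are not conflated by caches or crawlers.
    return {"Vary": "User-Agent"}
```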
Cheers,
LW
-
Hi LW,
I'm wondering about some particulars of your setup for this.
How are URLs handled between the three sites (1 desktop, 2 mobile)?
Are you serving up unique source code on the same URL per device, or do you have device-specific URLs for all content?
What are you using to detect the useragent and redirect the user? Is this happening server-side, or with JavaScript?
The particulars of your setup will determine your best approach. When in doubt I would follow the instructions on this page.
I would not expect two mobile versions of your site to cause a duplicate content issue - more likely that Googlebot Mobile will only see one version of the pages and index those (but as above, the technical particulars will determine this).
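To illustrate the device-specific-URLs alternative mentioned above: the usual pattern is a server-side redirect of mobile user agents to a parallel `m.` host, keeping the path so desktop and mobile URLs map one-to-one. A hedged sketch, assuming a simple substring check and an `m.` subdomain convention (both hypothetical here):

```python
# Sketch of a server-side mobile redirect for the device-specific-URL
# setup. Returns the m. URL to 302 to, or None if no redirect applies.
from urllib.parse import urlsplit, urlunsplit

MOBILE_UA_HINTS = ("iPhone", "iPad", "Android")

def mobile_redirect_target(url, user_agent):
    """Map a desktop URL to its m. equivalent for mobile user agents."""
    if not any(hint in user_agent for hint in MOBILE_UA_HINTS):
        return None  # desktop user agent: serve the page as-is
    parts = urlsplit(url)
    host = parts.netloc
    if host.startswith("m."):
        return None  # already on the mobile host
    if host.startswith("www."):
        host = host[4:]
    return urlunsplit(parts._replace(netloc="m." + host))
```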
Best,
Mike -
Thanks for your response. You raise a very valid point about the time it takes Google to index a new site. The new site has been live for a couple of weeks now, so I was hoping it would have started to be indexed by Google by now.
Regarding rel="canonical": yes, we have implemented it on the mobile site, referencing the desktop site.
The reason we developed a new version rather than just updating the previous one was that we had new functionality to include and a fair few design changes based on what we learned from the old site. That said, code from the first site was still being used, so it wasn't a completely new build.
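For anyone following along: when the mobile pages live on separate URLs, the annotation is usually bidirectional - the mobile page points rel="canonical" at the desktop page, and the desktop page points rel="alternate" back at the mobile page. A small sketch of hypothetical helpers that build those two tags (the media query value is the commonly used one for smartphone pages):

```python
# Hypothetical helpers that build the bidirectional annotations for a
# separate-mobile-URL setup. The output strings go in each page's <head>.

def canonical_tag(desktop_url):
    # Placed on the mobile page, pointing at its desktop equivalent.
    return '<link rel="canonical" href="%s">' % desktop_url

def alternate_tag(mobile_url):
    # Placed on the desktop page, pointing at its mobile equivalent.
    return ('<link rel="alternate" '
            'media="only screen and (max-width: 640px)" '
            'href="%s">' % mobile_url)
```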
-
If you have only just launched the new version of your mobile site, it may take some time before Google indexes it and detects any duplicate content with your previous version. Googlebot doesn't crawl all new sites instantly.
Just wondering, have you done anything to prevent a duplicate content penalty, such as using a rel="canonical" tag? Also, why not update your previous version instead of creating an entirely new mobile site?