Multiple Versions of Mobile Site
-
Hey Guys,
We have recently finished the latest version of our mobile site, which means we currently have two mobile sites. Which site you are presented with depends on your device and OS.
e.g.
iPhone 3 or 4 users on iOS4 will get version 1 of our mobile site
iPhone 5 users on iOS5 will get the new version (version 2) of our mobile site.
Our old mobile site is currently indexed in Google and performing pretty well.
Since the launch of the second mobile site we have not seen any major changes to our visibility in Google. My main concern here is duplicate content, so I am curious: can Google detect that we have two mobile sites that we serve depending on the device? And if Google can detect this, why have our sites not been penalized?
Thanks,
LW
I know the first thing that comes to your mind is duplicate content.
-
Hi LW,
Sorry for the extreme delay here - the Q&A notification system went wonky for a bit and I never got the response message for this thread.
I'm sure you're past this issue by now, but yes - Googlebot Mobile should just index the mobile version of the page.
Best,
Mike
-
Hey Mike,
Thanks for your feedback, it is really helpful.
We are serving up unique source code on the same URL per device, with the user agent being detected on the server-side.
Am I right in assuming that Googlebot Mobile will only see one version of the pages and index accordingly?
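For context, here is a minimal sketch of the kind of setup being described - one URL, server-side user-agent detection, plus the Vary: User-Agent response header that Google's dynamic serving guidance recommends so caches and crawlers know the markup differs by device. The Flask framework, template names, and user-agent hints below are illustrative assumptions, not LW's actual stack.

```python
# Minimal sketch of dynamic serving: one URL, different markup per device.
# Flask, the template names, and the UA hints are illustrative assumptions.
from flask import Flask, make_response, render_template, request

app = Flask(__name__)

MOBILE_HINTS = ("iphone", "ipod", "android", "blackberry", "windows phone")

def is_mobile(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(hint in ua for hint in MOBILE_HINTS)

@app.route("/products")
def products():
    ua = request.headers.get("User-Agent", "")
    template = "products_mobile.html" if is_mobile(ua) else "products_desktop.html"
    response = make_response(render_template(template))
    # Signal that the response varies by user-agent, so intermediaries
    # (and Googlebot) know to request the same URL with different UAs.
    response.headers["Vary"] = "User-Agent"
    return response
```

The Vary: User-Agent header is the piece Google's dynamic serving documentation calls out, since it prompts Googlebot to crawl the same URL with both desktop and smartphone user-agents.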
Cheers,
LW
-
Hi LW,
I'm wondering about some particulars of your setup for this.
How are URLs handled between the three sites (1 desktop, 2 mobile)?
Are you serving up unique source code on the same URL per device, or do you have device-specific URLs for all content?
What are you using to detect the useragent and redirect the user? Is this happening server-side, or with JavaScript?
The particulars of your setup will determine your best approach. When in doubt I would follow the instructions on this page.
I would not expect two mobile versions of your site to cause a duplicate content issue - more likely that Googlebot Mobile will only see one version of the pages and index those (but as above, the technical particulars will determine this).
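One quick way to sanity-check which version a crawler gets is to request the URL with different user-agent strings and compare the responses. A rough sketch follows - the URL is a placeholder, and the Googlebot smartphone UA string is an abbreviated example that Google can change at any time:

```python
# Spot-check what the server returns for different user-agents.
# Requires the third-party "requests" package (pip install requests).
import requests

URL = "https://www.example.com/products"  # placeholder URL

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot-smartphone": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
        "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    ),
}

for label, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{label}: status={resp.status_code} bytes={len(resp.content)} "
          f"vary={resp.headers.get('Vary')}")
```

If the response sizes differ per user-agent and the Vary header is present, the device detection is behaving as described; Fetch as Google in Webmaster Tools remains the more authoritative check.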
Best,
Mike
-
Thanks for your response. You raise a very valid point about the time it takes Google to index a new site. The new site has been live for a couple of weeks now, so I was hoping it would be starting to get indexed by now!
In regards to rel="canonical": yes, we have implemented it on the mobile site, referencing the desktop site.
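If any of the mobile pages live on their own URLs (rather than the dynamic-serving setup described earlier), the documented pattern pairs that canonical with a rel="alternate" link on the desktop page. A small illustrative sketch - the example.com URLs and the helper function are made up for demonstration:

```python
# Illustrative helper that builds the <head> annotations documented for
# separate desktop/mobile URLs. The example.com URLs are placeholders.
def mobile_annotations(desktop_url: str, mobile_url: str) -> dict:
    return {
        # Goes in the <head> of the desktop page:
        "desktop_head": (
            '<link rel="alternate" '
            'media="only screen and (max-width: 640px)" '
            f'href="{mobile_url}">'
        ),
        # Goes in the <head> of the mobile page:
        "mobile_head": f'<link rel="canonical" href="{desktop_url}">',
    }

tags = mobile_annotations(
    "https://www.example.com/products",
    "https://m.example.com/products",
)
print(tags["desktop_head"])
print(tags["mobile_head"])
```

With that pairing in place, Google consolidates signals on the desktop URL instead of treating the two versions as duplicates.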
The reason for developing a new version rather than just updating the previous one was that we had new functionality to include and a fair few design changes based on what we learned from the old site. That being said, code from the first site was still being used, so it wasn't a completely new build.
-
If you have only just launched the new version of your mobile site, it may take some time before Google indexes it and detects any duplicate content with your previous version. Googlebot doesn't crawl all new sites instantly.
Just wondering, have you done anything to prevent a duplicate content penalty, such as using a rel="canonical" tag? Also, why not update your previous version instead of creating an entirely separate mobile site?