Duplicate Content Issue: Mobile vs. Desktop View
-
Setting aside my personal issue with Google's favoritism toward responsive websites, which I believe doesn't always provide the best user experience, I have a question regarding duplicate content...
I created a section of a WordPress web page (using Visual Composer) that displays differently on mobile than it does in desktop view. This section has the same content for both views but is formatted differently to give a better user experience on mobile devices. I did this by creating two different text elements, formatted differently but containing the same content. The problem is that both sections appear in the source code of the page. According to Google, does that mean I have duplicate content on this page?
-
Hi Dino,
I don't see any issues. It is okay to use multiple H1 tags for reasons such as this; Google has confirmed that multiple H1 tags are fine.
My example above was probably more alarming to you than I realized. My intent was to point out a simple case of how to use CSS for multiple device types. In your case, having different text is for the benefit of the user, which is exactly as it should be.
Good job,
Don
-
My developer (in training) figured out a solution to eliminate the duplicate content; however, I'm still wondering if having two H1 tags (one shows on mobile and the other shows on desktop) in the source code will hurt my SEO. I usually like to stick to one H1 so there is no confusion for Googlebot. Here's one of the pages in question:
view-source:http://new.brooklynmanhattanlocksmith.com/services/automotive/
Thanks for the help! Dino
-
Hi Dino,
Before I said too much I had to look at Visual Composer. I spent about 10 minutes there and didn't really see how the code turns out. Perhaps you'd like to post a link to the webpage, or just message me if you don't want it public. I'll be happy to review the source and offer a thumbs up or any suggestions I can.
Good luck,
Don
-
Thanks, Don. Would this work if I have a separate H1 tag for each version as well? I want Google to recognize each H1 for each version and not get confused as to which headline is the priority.
Regards,
Dino
-
Hi Dino,
Is your code something (basic) like this: two separate heading elements containing the same text, where you then use a switch to determine which view to show? Something like the sketch below.
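A rough sketch of that kind of markup; the class names and breakpoint here are just illustrative:

```html
<!-- Two copies of the same heading, one per device type (illustrative class names) -->
<h1 class="desktop-heading">I love lamp!</h1>
<h1 class="mobile-heading">I love lamp!</h1>
```

```css
/* The "switch": show one heading or the other depending on screen width */
.mobile-heading { display: none; }

@media (max-width: 767px) {
  .desktop-heading { display: none; }
  .mobile-heading  { display: block; }
}
```

Both headings stay in the page source, which is what raises the duplicate-content question.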
If so, the correct way would be to use the switch to select which CSS to load instead. That way you can use the same element with the same class, and it will display differently based on the user's view, as in the sketch below.
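A minimal sketch of that approach, again with an illustrative class name and breakpoint; the heading exists once in the source and only its styling changes per viewport:

```html
<!-- One heading in the markup; only its styling changes per viewport -->
<h1 class="page-heading">I love lamp!</h1>
```

```css
.page-heading { font-size: 2.5rem; text-align: left; }

@media (max-width: 767px) {
  /* Same element, restyled for small screens instead of duplicated */
  .page-heading { font-size: 1.5rem; text-align: center; }
}
```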
Here is a nice article about switching CSS based on views.
Hope that helps,
Don
-
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs: one for desktop, one for tablet, one for phone.
Web Design | CurtisB
The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div into the other two. But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. In my case the page that is rendered (after it loads and the jQuery code is executed) contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs.
Here is the layout of one landing page as served by the server: the desktop div holds the 1000 words of text, and the tablet and phone divs start out with no text (jQuery will copy the text from div id="desktop" into them); see the sketch after this question.
My question is: will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and that the duplicate content is there only to achieve a responsive design? Thank you!
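A rough sketch of the served layout described above; the div id "desktop" comes from the question, while the "tablet" and "phone" ids, the class name, and the jQuery version are just illustrative:

```html
<!-- As served: only the desktop div contains the 1000 words of text;
     which div is visible is controlled by CSS media queries (not shown) -->
<div id="desktop" class="landing-copy">1000 words of text goes here.</div>
<div id="tablet" class="landing-copy"><!-- No text; filled by jQuery on load --></div>
<div id="phone" class="landing-copy"><!-- No text; filled by jQuery on load --></div>

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
  // After the page loads, copy the desktop text into the other two divs,
  // so the rendered page contains the same text three times
  $(function () {
    var copy = $('#desktop').html();
    $('#tablet, #phone').html(copy);
  });
</script>
```
-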
Best Practices for Leveraging Long Tail Content & Gated Content
Our B2B site has a lot of long-form content (e.g., transcriptions from presentations and webinars). We'd like to leverage the long-tail SEO traffic driven to these pages and convert those visitors to leads. Essentially, we'd like Google to index all this lengthy, keyword-rich content AND we'd like to put up a read gate that requires users to register before viewing the full article. This is a B2B site, and the goal is to generate leads. Some considerations and questions:
1. How much of the content should we share before requiring registration? Ask too soon and it's a terrible user experience; give too much away and our business objectives are not met.
2. Design-wise, what are good ways to do this? I notice Moz uses a "teaser" to block Mozinar content, and I've seen modals and blur bars on other sites (a rough sketch of the teaser pattern follows this question).
3. Any gotchas that Google doesn't like that we should be aware of? Trying to avoid anything that might seem like cloaking.
4. Is it better to split the content across several pages (split a 10K word doc across 10 URLs and include a read gate on each) or keep it to one page?
Thank you!
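A minimal sketch of the teaser pattern mentioned above, assuming the gated portion of the article is only delivered after registration; the class names, the /register path, and the copy are illustrative:

```html
<article class="gated-article">
  <div class="teaser">
    <p>The opening section of the transcript stays fully readable here...</p>
  </div>
  <div class="read-gate">
    <p>Register to keep reading the full transcript.</p>
    <a class="register-button" href="/register">Register</a>
  </div>
</article>

<style>
  /* Clip the teaser and fade it out so it is obvious more content exists */
  .gated-article .teaser { position: relative; max-height: 12em; overflow: hidden; }
  .gated-article .teaser::after {
    content: "";
    position: absolute; left: 0; right: 0; bottom: 0; height: 4em;
    background: linear-gradient(rgba(255, 255, 255, 0), #fff);
  }
  .gated-article .read-gate { text-align: center; padding: 1em; }
</style>
```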
Web Design | Allie_Williams
-
Show new mobile site to 60% users & old mobile site to 40% users
Hi, we are planning to show the new mobile site to 60% of users and the old mobile site to 40% of users. We will show the old site to the Google crawler. Our old site has some interlinking through the footer and content, whereas the new site does not have it. We wanted to do this since our new site does not support some browsers. Will there be an issue with Google if we show the site like this? The mobile site and desktop site will have the same URL across devices and browsers. Regards
Web Design | vivekrathore
-
How To Avoid Duplicate Content
We are an eCommerce site for auto parts. It is basically impossible to avoid duplicate content, and I think we are getting penalized by Google for it. Here is why it is impossible. Let's say I sell a steering rack for a 2000 Honda Accord. I need an SEO-rich page for "2000 Honda Accord Steering Rack." I sell steering racks for more than 25 years of Honda Accords. I can try to make the copy different, but there is no way to spin the copy that many times and make it seem like it is not duplicate copy. This gets even more complicated because I sell hundreds of parts for each year of a Honda Accord, and a lot of times you even have to go down to the engine size of the car for the right part. I can't use a redirect, i.e. a 301 redirect, because they are not the same pages. One is for a 2000 Honda Accord and the other a 2001 Honda Accord, and so on. Is there a redirect out there that I do not know about that would help me out in this case? Also, if there is no way around this and I am getting penalized, would it be better to eliminate all these pages, possibly losing my ability to rank high on searches such as "2000 Honda Accord Steering Rack," and just replace them with a page that has Year, Make, Model, and Part dropdowns which just takes the customer to a checkout page?
Web Design | joebuilder
-
Google Bot cannot see the content of my pages
When I go to Google Webmaster Tools and type in any URL from the site http://www.ccisolutions.com in the "Fetch as Googlebot" feature, and then click the link that says "success," Googlebot is seeing my pages like this:
HTTP/1.1 200 OK
Date: Tue, 26 Apr 2011 19:11:50 GMT
Server: Apache/2.2.6 (Unix) mod_ssl/2.2.6 OpenSSL/0.9.7a DAV/2 PHP/5.2.4 mod_jk/1.2.25
Set-Cookie: CCISolutions-UT-Status=66.249.72.55.1303845110495128; path=/; expires=Thu, 25-Apr-13 19:11:50 GMT; domain=.ccisolutions.com
Last-Modified: Tue, 28 Oct 2008 14:36:45 GMT
ETag: "314b26-5a-2d421940"
Accept-Ranges: bytes
Content-Length: 90
Keep-Alive: timeout=15, max=99
Connection: Keep-Alive
Content-Type: text/html
Any clue as to why this could be happening?
Web Design | danatanseo
-
How will engines deal with duplicate head elements e.g. title or canonicals?
Obviously duplicate content is never a good thing... on separate URLs. The question is, how will the engines deal with duplicate meta tags on the same page? Example head tag: <title>Example Title - #1</title> <title>Example Title - #2</title> My assumption is that Google (and others) will take the first instance of the tag, such that "Example Title - #1" and canonical = "http://www.example.com" would be considered for ranking purposes while the others are disregarded. My assumption is based on how search engines deal with duplicate links on a page. Is this a correct assumption? We're building a CMS-like service that will allow our SEO team to change head tag content on the fly. The easiest solution, from a dev perspective, is to simply place new/updated content above the preexisting elements. I'm trying to validate/invalidate the approach. Thanks in advance.
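Laid out on separate lines, the head in question would look something like this; the duplicate titles and the canonical URL are taken from the question above, and the surrounding <head> element is just for context:

```html
<head>
  <!-- Duplicate <title> elements and the canonical, as described in the question -->
  <title>Example Title - #1</title>
  <title>Example Title - #2</title>
  <link rel="canonical" href="http://www.example.com" />
</head>
```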
Web Design | PCampolo
-
Flat vs. Silo Site Architecture, What's Better
I'm in the midst of converting a fairly large website (500+ pages) into WordPress as a content management system. I know that there are two schools of thought regarding site architecture:
1. Those who believe that everything should be categorized, e.g. website.com/shoes/reebok/running
2. Those who believe that the fewer clicks it takes from the homepage, the better.
As it stands, our current site has a completely flat architecture, with landing pages being added randomly to the root, e.g. website.com/affordable-shoes-in-louisville-ky. I'm beginning to think that there is a gray area with this. I spoke to someone who says that you should never have a page more than 2 categories/subfolders deep. But if we plan on adding a lot of content, doesn't it make sense to set the site up into many categories so we can set a good foundation for adding massive amounts of content? Also, will 301 redirecting to the new structure cause us to lose rankings for certain terms? Any help here is appreciated.
Web Design | C-Style
-
Crawl Budget vs Canonical
Got a debate raging here and I figured I'd ask for opinions. We have our websites structured as site/category/product. This is fine for URL keywords, etc. We also use this for breadcrumbs. The problem is that we have multiple categories into which a product fits, so "product" could live at any of:
site/cat1/product
site/cat2/product
site/cat3/product
Web Design | Highland
Obviously this produces duplicate content. There's no reason why it couldn't live under 1 URL, but it would take some time and effort to do so (time we don't necessarily have). As such, we're applying the canonical band-aid and calling it good. My problem is that I think this will still kill our crawl budget (this is not an insignificant number of pages we're talking about). In some cases the duplicate pages are bloating a site by 500%. So what say you all? Do we simply apply canonicals and call it good, or do we need to take the crawl budget into account and actually remove the duplicate pages? Or am I totally off base and canonical solves the crawl budget issue as well?
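A minimal sketch of the canonical band-aid described above, with an illustrative domain; each duplicate category path points at whichever URL is chosen as the primary version:

```html
<!-- Placed in the <head> of site/cat2/product and site/cat3/product
     (and, self-referentially, on site/cat1/product itself) -->
<link rel="canonical" href="https://www.example.com/cat1/product" />
```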