Hiding content or links in responsive design
-
Hi,
I've found a lot of information about responsive design and SEO, but it's mostly theory with no real experiments, and I'd like to find a clear answer if someone has actually tested this.
Google says:
Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to hide content or links completely (not accessible at all to the visitor) on small resolutions (mobile) using CSS ("visibility:hidden" or "display:none").
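For context, the kind of hiding being asked about is usually done with a media query. A minimal sketch (the class name and breakpoint are made up for illustration):

```css
/* Hypothetical example: a block of secondary links that is shown
   on desktop but fully removed from the rendered layout on small
   screens. The markup is still in the HTML served to every device. */
.secondary-links {
  display: block;
}

@media (max-width: 480px) {
  .secondary-links {
    display: none; /* not visible and takes up no space on mobile */
  }
}
```

Note that with this approach the same HTML goes to all devices; only the CSS rendering changes, which is what Google's responsive-design description above refers to.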
Is this counted as hidden content and could penalize your site or not?
What do you guys do when you create responsive design websites?
Thanks!
GaB
-
Hi,
Saijo and Bradley are right in saying that hiding elements on a smaller screen should not be an issue (as it's a correct implementation of responsive design). Bear in mind as well that there is a Googlebot and a Smartphone Googlebot, so as long as the Googlebot is seeing what desktop users see and the Smartphone Googlebot (which uses an iPhone5 user agent) is seeing what mobile users see, it shouldn't be a problem.
The only thing I would add:
If you are going to use display:none to prevent a user from seeing something when they view your site, it's good to include an option to 'view full site' or 'view desktop site'. Also, in that case I would question whether you actually need that content on the desktop site at all, since best practice is to provide the same content regardless of device.
If it's hidden but still accessible to the mobile user (in a collapsible div for instance) there's no cloaking involved so it shouldn't cause a problem.
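The collapsible approach mentioned above can be very simple. One sketch, using the native details element (an assumption on my part about your markup — older browsers may need a JavaScript fallback):

```html
<!-- Hypothetical example: the content stays in the HTML served to
     every device, so Googlebot and the user can both reach it; it is
     simply collapsed by default on small screens. -->
<details>
  <summary>Product specifications</summary>
  <p>Full specification text that a mobile user can expand on demand.</p>
</details>
```

Because the text is present in the page and reachable by the visitor, this is the opposite of cloaking.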
As a side note: the Vary HTTP header is really for a dynamically served website (that is, a single URL which checks user agent and then serves the desktop HTML to desktop devices and mobile HTML to mobile devices).
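To illustrate that side note, a dynamically served setup might look like this at the HTTP level (a sketch, with made-up paths; responsive sites serving identical HTML to everyone don't strictly need this header):

```http
GET /products HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (iPhone; ...)

HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Vary: User-Agent

<!-- mobile HTML variant served here -->
```

The `Vary: User-Agent` header tells caches and crawlers that the response body differs depending on the requesting user agent, so the mobile and desktop variants should be fetched and cached separately.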
Hope that helps!
-
The way I see it.
Google does not have a problem with proper use of things like media queries. More info: https://developers.google.com/webmasters/smartphone-sites/details. They only have a problem when hidden text is made available solely to search engines for SERP manipulation.
Read more into the "Vary HTTP header" section in the link above, and see this explanation from Matt Cutts: http://www.youtube.com/watch?v=va6qtaiZRHg&feature=player_detailpage#t=219
-
I understand what you are referring to about having to hide certain elements on smaller screens. Sometimes not everything fits or flows correctly.
When this happens, however, I try to hide design elements as opposed to text or links. I'm also OK with hiding images. If a block of text or a link seems out of place or doesn't flow properly, I will build a dropdown for it. I'm sure you've seen mobile sites with dropdown navigation menus.
I wouldn't leave it up to Google to interpret what you are doing. Don't hide any links.