Hiding content or links in responsive design
-
Hi,
I found a lot of information about responsive design and SEO, but it's mostly theory with no real experiments, and I'd like to find a clear answer if someone has actually tested this.
Google says:
Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to completely hide content or links on your page (not accessible at all to the visitor) at small resolutions (mobile), using CSS ("visibility:hidden" or "display:none").
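To make the technique concrete, here is a minimal sketch of the kind of rule being described — a plain media query that removes an element on small screens (the class name is made up for illustration):

```css
/* Hide a block of secondary links on small screens.
   .secondary-links is a hypothetical class name. */
@media (max-width: 480px) {
  .secondary-links {
    display: none; /* removed from the layout entirely, not just invisible */
  }
}
```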
Does this count as hidden content, and could it penalize your site or not?
What do you guys do when you create responsive design websites?
Thanks!
GaB
-
Hi,
Saijo and Bradley are right in saying that hiding elements on a smaller screen should not be an issue (as it's a correct implementation of responsive design). Bear in mind as well that there is a Googlebot and a Smartphone Googlebot, so as long as the Googlebot is seeing what desktop users see and the Smartphone Googlebot (which uses an iPhone5 user agent) is seeing what mobile users see, it shouldn't be a problem.
The only thing I would add:
If you are going to use display:none to prevent mobile users from seeing something, it's good to include a 'view full site' or 'view desktop site' option. In that case, though, I would also question whether you actually need that content on the desktop site at all, since best practice is to provide the same content regardless of device.
If it's hidden but still accessible to the mobile user (in a collapsible div for instance) there's no cloaking involved so it shouldn't cause a problem.
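A minimal sketch of that "hidden but still accessible" pattern, using the native HTML details element so the content stays in the markup and is one tap away for the mobile user (the text here is purely illustrative):

```html
<!-- Collapsed by default, but present in the HTML and expandable
     by the visitor - so nothing is cloaked from users or bots. -->
<details>
  <summary>Full product specifications</summary>
  <p>All the detail a desktop visitor would see, just collapsed.</p>
</details>
```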
As a side note: the Vary HTTP header is really for a dynamically served website (that is, a single URL which checks user agent and then serves the desktop HTML to desktop devices and mobile HTML to mobile devices).
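For a dynamically served site, the response would carry that header so caches and crawlers know the HTML varies by user agent — an illustrative response fragment:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent
```

A responsive site serving identical HTML to every device doesn't need `Vary: User-Agent` at all, which is the point of the distinction above.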
Hope that helps!
-
The way I see it:
Google does not have a problem with proper use of things like media queries. More info: https://developers.google.com/webmasters/smartphone-sites/details . They ONLY have a problem when the hidden text is made available solely to search engines for SERP manipulation.
Read more into the "Vary HTTP header" section in the link above, and see this explanation from Matt: http://www.youtube.com/watch?v=va6qtaiZRHg&feature=player_detailpage#t=219
-
I understand what you are referring to about having to hide certain elements on smaller screens. Sometimes not everything fits or flows correctly.
When this happens, however, I try to hide design elements as opposed to text or links. I'm also OK with hiding images. If a block of text or a link seems out of place or doesn't flow properly, I will build a dropdown for it. I'm sure you've seen mobile sites with dropdown navigation menus.
I wouldn't leave it up to Google to interpret what you are doing. Don't hide any links.