Will Google Penalize Content put in a Div with a Scrollbar?
-
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div tag that makes use of a scroll bar. Could a site be penalized by Google for this technique?
Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
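For context, here is a minimal sketch of the kind of markup being described: a block of category copy constrained to a fixed height, with the rest reachable only by scrolling. The class name and dimensions are invented for illustration; the actual implementation on the linked page may differ.

```html
<!-- Hypothetical example of "content in a div with a scrollbar".
     The text is fully visible to users, just tucked into a small,
     scrollable region at the bottom of the category page. -->
<div class="category-copy" style="max-height: 150px; overflow-y: auto;">
  <p>Several paragraphs of keyword-rich copy about the category...</p>
  <p>...which continue well past the visible height of the box.</p>
</div>
```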
-
I see this question was answered years ago, but how relevant is this issue today?
I just took on a client's website, and he wants to add SEO-optimized content in a scrollable div at the bottom of the page. I don't know whether that counts as spam or not. Can you please advise?
I'm eager to get a proper answer.
website is: www (dot) zdhsales (dot) com
-
I've actually wondered the same before. To the best of my knowledge, I've never heard anyone cite overflow: auto; as a negative signal, compared to the amount of press that display: none;, text-indent: -9999px;, etc. get. It could very well be abused just as badly, though. The only abuse check I can think of would be to weigh the amount of text in the corresponding div against what a practical min-height for that div should be, but that seems a bit excessive.
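For comparison, a rough sketch of the CSS patterns mentioned above. The first keeps the text visible but scrollable; the other two remove it from view entirely, which is why they get more scrutiny. Class names are made up for illustration.

```html
<style>
  /* Visible to users, just constrained to a scrollable box */
  .seo-copy       { max-height: 200px; overflow: auto; }

  /* Removed from the rendered page entirely */
  .hidden-copy    { display: none; }

  /* Technically rendered, but pushed far off-screen */
  .offscreen-copy { text-indent: -9999px; }
</style>
```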
I agree with Steven; it's come to a point where these CSS techniques have very legitimate uses and probably shouldn't be penalized. Plus, there are plenty of other ways to accomplish the same thing, whether it's document tree manipulation or any other kind of rendering of the page after the crawlable URL has been loaded (a rough sketch of what I mean is below). So at what point is it worth fighting such a thing?
edit: on a side note, what's the deal with those crazy underscores at the end of the URL? yuck.
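As a quick, hypothetical sketch of what is meant by rendering content after the crawlable URL has loaded: the copy is absent from the initial HTML and is appended by script. This is purely illustrative and makes no claim about how any particular site does it.

```html
<div id="category-copy"></div>
<script>
  // Content injected client-side after the page loads; a crawler that
  // doesn't execute JavaScript would never see this text in the source.
  document.getElementById('category-copy').innerHTML =
    '<p>Copy generated or fetched after the initial page load.</p>';
</script>
```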
-
Does Google actually still penalise overflow:hidden and display:none, or just off-screen placement such as left:-9999px? If they do, it's something I'm sure will be changed, as it's commonly used for "div switching" through navigational menus and tabs (for display:none at least).
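To illustrate that legitimate use, here is a simplified, CSS-only sketch of tab/div switching that relies on display:none. Real navigation menus usually toggle a class with JavaScript instead; this is only meant to show the pattern.

```html
<style>
  .tab-panel        { display: none; }   /* all panels hidden by default */
  .tab-panel:target { display: block; }  /* show the panel whose id matches the URL hash */
</style>

<nav>
  <a href="#description">Description</a>
  <a href="#reviews">Reviews</a>
</nav>

<section id="description" class="tab-panel">Product description...</section>
<section id="reviews" class="tab-panel">Customer reviews...</section>
```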
-
Thank you for the response, Ryan. Although the site is not outwardly "hiding" the copy, from a usability standpoint this method does not seem to carry much, if any, value for the person visiting the page. I figured Google would see this as a lame attempt at search engine bait and frown upon the practice.
-
To the best of my knowledge this has no impact on SEO. Googlebot doesn't like it when you hide content, but that only applies to overflow:hidden and display:none as far as I know.
-
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it: I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21).

My main site (galike.net) works as a frame brand for a series of projects aimed at promoting the cultural and natural heritage of a region in NW Spain through creative projects focused on the entertainment, tourism and educational areas. The projects themselves will be a concretion (putting into practice) of the general views of the brand, which acts more like a company brand. CodeX21 is one of those projects; it has its own logo, etc., and is actually like a child brand, yet more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be making other projects, each with their own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them might have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO.

Now, the problem is that my subsite codex21.gal re-states some principles, concepts and goals that have been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision of the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself such as "narrative tourism" and "geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also part of the philosophy of the project. BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to access the same product (*).

I have considered substituting those overlapping ideas with links to the main page of the site, though it seems unnatural from the user's point of view to be taken off the page to read a piece of info that is actually part of the project description (every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a single site on another server, but the problem of duplicated content might persist, and anyway I should link it to my brand Galike somewhere, because that's kind of the "production house" behind it.

So which would be the best (white hat) strategy, from an SEO point of view, to handle this brand-project philosophy overlap?

(*) "All the same IP address — that's really not a problem for us. It's really common for sites to be on the same IP address. That's kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that's also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say "this is kind of a collection of doorway sites" — in that essentially they're being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we'll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn't be so problematic because one of these sites would be showing up in the search results. On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we'll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they're selling. So then you don't have this collection of a lot of different sites that are essentially doing the same thing." (John Mueller, Senior Webmaster Trends Analyst at Google. https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that deals in the manufacturing of transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes as it really was not, but they seem to have been hit with a negative SEO campaign last year where their content was taken and posted across at least 15 different websites. Literally every page on their website had the same problem, and some content was even company specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked/compromised (some examples include charity websites). I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does this duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicate content started popping up? Thanks,
-
Third-party http links in the page source: "Social engineering content" warning from Google
Hi, we have received a "Social engineering content" warning from Google, and one of our important pages and its internal pages have been flagged as "Deceptive site ahead". We wonder what the reason behind this is, as Google didn't point to the specific part of the page that made us look deceptive. We don't employ any such content on the page, and the content has been the same for many months. As our site is WordPress hosted, we used a WordPress plugin for this page's layout, which injected 2 http (non-https) links into our page code. We suspect this may be the reason. Any ideas? Thanks
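If it helps to picture the suspected issue, here is a hypothetical example of the kind of plugin-injected, non-https references described above, together with the obvious fix. The URLs are invented, the actual plugin output will differ, and this may or may not be what triggered the warning.

```html
<!-- Plugin-injected assets loading over plain http on an https page -->
<link rel="stylesheet" href="http://plugin-cdn.example.com/layout.css">
<script src="http://plugin-cdn.example.com/layout.js"></script>

<!-- The same references switched to https, which removes the mixed-content issue -->
<link rel="stylesheet" href="https://plugin-cdn.example.com/layout.css">
<script src="https://plugin-cdn.example.com/layout.js"></script>
```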
-
Recovering from Google Penguin/algorithm penalty?
Anyone think recovery is possible? My site has been in Google limbo for the past 8 months to a year or so. Like a lot of sites, we had SEO work done a while ago and had tons of links that Google now looks down on. I have worked with an SEO company for a few months now and they seem to agree Penguin is the likely culprit; we are on page 8-10 for keywords that we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; some sites are hard to find contact information for, and some sites want money. I paid a few of them a couple of bucks in the hope it might help the process. Anyway, we now have around 600 domains on the disavow file we put up in March-April, with around 100 or 200 added recently as well. If need be, a new site could be an option as well, but I will wait and see if the site can improve on Google with a refresh. Anyone think recovery is possible in a situation like this? Thanks
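For anyone unfamiliar with the format being discussed, a disavow file is just a plain-text list uploaded through Google's disavow tool. The entries below are invented placeholders; the real file would contain the domains and URLs identified in the link audit.

```text
# Hypothetical disavow.txt entries (comment lines start with #)
domain:spammy-directory.example
domain:paid-links.example
http://hacked-charity-site.example/old-widget-page.html
```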
-
Is this website being punished by Google?
Hi, I just took over the SEO for a friend's website. Is this website being punished by Google? It has a strong link score, and while the homepage needs work as far as keyword targeting goes, it does not appear in Google's top 100 for any keyword. I am not sure whether the last SEO company did some harm. Can anyone give me some tips on getting my friend back into the mix? www.wallybuysell.com
-
Google messages & penalties
I just read the following comment in a response to someone else's question. The responder is an SEOmoz authority whose opinion I respect and have learned from (not sure if it's cool to mention names in a question), and it spurred my curiosity: "...Generally you will receive a warning from Google before your site is penalized, unless you are talking about just specific keywords." This is something I have been wondering about in relation to my own sudden ranking drop for 2 specific keywords, as I did not receive any warnings or notices. I have been proceeding as if I had overused these keywords on my home page due to an initial lesser drop, but identifying the cause of the huge drop still seems useful for a number of reasons. Can anyone explain this further?
-
Help required: difficulty removing a Google algorithmic penalty
I am not an SEO expert, but I am trying to recover my company's ranking on Google. We are a UK-based baby shower company, established in 2003. We used SEO companies a few years ago. On September 28th 2012 our rankings in Google dropped significantly on certain landing pages, while others, like our baby shower gifts page, have remained at position 1 for UK Google searches. Bing and Yahoo were unaffected. Searches for baby shower and baby shower decorations have gone from position 1 or 2 (behind Wikipedia) to these 2 landing pages being unranked in Google. I have, for the first time ever, gone through our backlinks, tried to locate bad or low-quality links, emailed where possible, and set up a disavow file in Webmaster Tools (currently not acted upon by Google). I have also amended the text in the baby shower department so it does not read as keyword stuffed. It has been two and a half months now, sales have dropped significantly, and the staff and I are getting very concerned. Our site is www.showermybaby.co.uk. We have not received a manual penalty. Any suggestions or help in removing this Google penalty would be greatly appreciated.
-
Instability on the server - punishment in Google?
My site has about 50k pages indexed in Google. We are well respected, and we believe our stories add a lot of value for the user. However, we are having serious problems with our hosting service and the site has been unavailable for more than 15 hours. Unfortunately, we also have no estimate for when it will be back online. I wonder if this kind of instability can cause some kind of penalty from Google and, if so, whether there is anything we can do to tell Google that we are aware of the problem and working to resolve it.