Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Will Google Penalize Content put in a Div with a Scrollbar?
-
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div tag that makes use of a scroll bar. Could a site be penalized by Google for this technique?
Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
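Roughly, the pattern I'm describing looks like this (a hypothetical sketch, not Moosejaw's actual markup; the class name and copy are made up):

```html
<!-- Hypothetical example: category copy confined to a small,
     fixed-height box with a scrollbar (not Moosejaw's real code). -->
<style>
  .category-copy {
    height: 150px;      /* fixed height keeps the block visually small */
    overflow: auto;     /* anything beyond 150px is reachable via the scrollbar */
    padding: 10px;
    border: 1px solid #ddd;
  }
</style>

<div class="category-copy">
  <p>Several paragraphs of keyword-rich category text go here...</p>
  <p>...continuing well past the visible area, so a scrollbar appears.</p>
</div>
```

The text is still rendered and selectable; it's just squeezed into a small scrollable box at the bottom of the page.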
-
I see this question was answered years ago, but how relevant is this issue today?
I just took on a client's website, and he wants to add SEO-optimized content in a scrollable area at the bottom of the page. I don't know whether that counts as spam. Can you please advise?
I'm eager to get a proper answer.
The website is: www (dot) zdhsales (dot) com
-
I've actually wondered the same thing before. To the best of my knowledge, I've never heard anyone cite overflow: auto; as a negative signal, compared with the amount of press that display: none;, text-indent: -9999px; and similar techniques get. It could very well be abused just as badly, though. The only abuse check I can think of would be to weigh the amount of text in the div against what a practical min-height for that div should be, but that seems a bit excessive.
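To make the comparison concrete, here's a rough sketch of the techniques in question (class names are made up, values illustrative):

```css
/* Scrollable box: the text stays visible and user-accessible. */
.seo-copy--scrollable {
  max-height: 200px;
  overflow: auto;
}

/* Classic hiding techniques that tend to attract scrutiny:
   the text is in the HTML, but visitors never see it. */
.seo-copy--hidden {
  display: none;          /* removed from rendering entirely */
}
.seo-copy--indented {
  text-indent: -9999px;   /* text pushed far off-screen (the old image-replacement trick) */
}
```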
I agree with Steven; it's come to a point where these CSS techniques have very legitimate uses and probably shouldn't be penalized. Plus, there are plenty of other ways to accomplish the same thing, whether it's document-tree manipulation or any other kind of rendering applied after the crawlable URL has loaded. So at what point is it worth fighting such a thing?
edit: on a side note, what's the deal with those crazy underscores at the end of the URL? yuck.
-
Does Google actually still penalise overflow:hidden and display:none, or just off-screen placement such as left:-9999px? If they do, I'm sure that will change, as these techniques are commonly used for "div switching" in navigational menus and tabs (display:none at least).
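For context, this is roughly the kind of "div switching" I mean, alongside the off-screen pattern (a minimal sketch with made-up class names):

```css
/* Tab/menu "div switching": only the active panel is rendered. */
.tab-panel {
  display: none;
}
.tab-panel.is-active {
  display: block;         /* toggled by a small script when a tab is clicked */
}

/* Off-screen placement, the pattern more often flagged as deceptive. */
.offscreen {
  position: absolute;
  left: -9999px;
}
```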
-
Thank you for the response, Ryan. Although the site is not outwardly "hiding" the copy, from a usability standpoint this method does not seem to carry much, if any, value for the person visiting the page. I figured Google would see this as a lame attempt at search engine bait and frown upon the practice.
-
To the best of my knowledge this has no impact on SEO. Googlebot doesn't like it when you hide content, but that only applies to overflow:hidden and display:none as far as I know.
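To illustrate that difference (a rough sketch, made-up class names): overflow:hidden clips the extra text so visitors can't reach it, while overflow:auto, as in the original question, keeps it reachable via a scrollbar.

```css
/* overflow: hidden clips anything beyond the box;
   visitors cannot reach the clipped text without extra scripting. */
.clipped-copy {
  height: 150px;
  overflow: hidden;
}

/* overflow: auto shows a scrollbar instead,
   so the same text remains readable by visitors. */
.scrollable-copy {
  height: 150px;
  overflow: auto;
}
```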
Related Questions
-
Significant "Average Position" dips in Search Console each time I post on Google My Business
Hi everyone, Several weeks ago I noticed that each Wednesday my site's Average Position in Search Console dipped significantly. Immediately I identified that this was the day my colleague published and back-linked a blog post, and so we spent the next few weeks testing and monitoring everything we did. We discovered that it was ONLY when we created a Google My Business post that the Average Position dipped, and on the 1st July we tested it one more time. The results were the same (please see attached image). I am 100% confident that Google My Business is the cause of the issue, but can't identify why. The image I upload belongs to me, the text isn't spammy or stuffed with keywords, the Learn More links to my own website, and I never receive any warnings from Google about the content. I would love to hear the community's thoughts on this and how I can stop the issue from continuing. I should note that my Google My Business insights are generally positive, i.e. no dips in search results etc. My URL is https://www.photographybymatthewjames.com/ Thanks in advance, Matthew
White Hat / Black Hat SEO | PhotoMattJames
-
Do Google and other search engines crawl meta tags if we set them using React.js?
We have a site with only one URL, and all other pages are its components rather than separate pages. Whichever page we click is rendered with React.js, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO? Website: http://www.mantistechnologies.com/
White Hat / Black Hat SEO | RobinJA
-
Forcing Google to Crawl a Backlink URL
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e. Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after 4 months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
White Hat / Black Hat SEO | Choice
-
How does Google determine if a link is paid or not?
We are currently doing some outreach to bloggers to review our products and provide us with backlinks (preferably followed). The bloggers get to keep the products (usually about $30 worth). According to Google's link schemes policy, this is a no-no. But my question is, how would Google ever know whether the blogger was paid or given freebies for their content? This is the "best" article I could find related to the subject: http://searchenginewatch.com/article/2332787/Matt-Cutts-Shares-4-Ways-Google-Evaluates-Paid-Links The article tells us what qualifies as a paid link, but it doesn't tell us how Google identifies whether links were paid or not. It also says that "loans" are okay, but "gifts" are not. How would Google know the difference? For all Google knows (maybe everything?), the blogger returned the products to us after reviewing them. Does anyone have any ideas on this? Maybe Google watches for terms like "this is a sponsored post" or "materials provided by 'x'". Even so, I hope that wouldn't be enough to warrant a penalty.
White Hat / Black Hat SEO | jampaper
-
Does a trademark symbol in a URL matter to Google?
Hello community! We are planning to clean up the TM and R symbols in the URLs on the website. Google has indexed these pages, but for some TM pages the URL shown in the SERP displays stray characters instead of the symbol. What are your thoughts on a "spring cleaning" effort to remove all TM, R, and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you! b.dig
White Hat / Black Hat SEO | b.digi
-
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and if these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404?), or is there any valid reason they should be kept? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? If you noindex them and then remove the sitemap, can Google still recrawl and recognize the noindex tag on its own?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Is having the same description for different products a bad thing, and does it count as duplicate content?
Is having the same description for different products a bad thing? The titles are all different, but they are the same product with different designs on them. Does this count as duplicate content?
White Hat / Black Hat SEO | Casefun
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community, Our portfolio of around 15 internationalized web properties has received a significant, seemingly IP-wide, Google penalty starting in November 2010, and we have yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty.
As quick background information:
The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
The content is highly targeted to each market, which means there are no duplicate-content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
There are some promotional one-way links to sports betting and casino sites positioned on the pages.
The external linking structure of the pages is very keyword- and front-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached; the same behaviour can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty, given the background information? Since the drops already started in November 2010, we doubt that the Panda updates had any correlation with this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we move to two new domains and forward all content via 301s to the new pages? Are there other things we should try first?
Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro