Will Google Penalize Content put in a Div with a Scrollbar?
-
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div tag that makes use of a scroll bar. Could a site be penalized by Google for this technique?
Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
-
I see this question was answered years ago, but how relevant is this issue today?
I just took on a client's website, and he wants to add SEO-optimized content in a scrollable div at the bottom of the page. I don't know whether that counts as spam. Can you please advise?
I'm eager to get a proper answer.
website is: www (dot) zdhsales (dot) com
-
I've actually wondered the same thing before. To the best of my knowledge, I've never heard anyone cite overflow: auto; as a negative signal, compared to the amount of press display: none;, text-indent: -9999px;, etc. get. It could be abused just as badly, though. The only abuse check I can think of would be to weigh the amount of text in the div against what a practical min-height for that div should be, but that seems a bit excessive.
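For reference, the techniques being contrasted in this thread look roughly like this (selectors and values are illustrative, not taken from Moosejaw's actual markup):

```css
/* Scrollable div: content stays visible and reachable by the user */
.category-copy {
  max-height: 150px;  /* only a slice is visible at once */
  overflow: auto;     /* scrollbar appears when content exceeds the box */
}

/* Classic "hidden text" techniques that have drawn penalties */
.hidden-copy {
  display: none;        /* removed from rendering entirely */
}
.offscreen-copy {
  text-indent: -9999px; /* text pushed off the visible canvas */
  /* or: position: absolute; left: -9999px; */
}
```

The key difference is that with overflow: auto the text is still accessible to any user who scrolls, whereas the other two leave it readable only to crawlers.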
I agree with Steven; it's come to a point where these CSS techniques have very legitimate uses and probably shouldn't be penalized. Plus, there are plenty of other ways to accomplish the same thing, whether it's document-tree manipulation or any other kind of rendering that happens after the crawlable URL has been loaded. So at what point is it worth fighting such a thing?
edit: on a side note, what's the deal with those crazy underscores at the end of the URL? yuck.
-
Does Google actually still penalize overflow: hidden and display: none, or just off-screen placement such as left: -9999px? If it does, I'm sure that will change, as these properties are commonly used for "div switching" in navigation menus and tabs (display: none at least).
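The legitimate "div switching" use mentioned here is the standard tabbed-panel pattern, sketched minimally below (markup and class names are illustrative):

```html
<style>
  .tab-panel { display: none; }         /* inactive panels hidden */
  .tab-panel.active { display: block; } /* active panel shown */
</style>
<div class="tab-panel active">Panel one content</div>
<div class="tab-panel">Panel two content</div>
<script>
  // Moving the .active class between panels swaps the visible content.
  // The hidden text is one click away from the user, not cloaked from them.
</script>
```

In this pattern display: none hides content only until the user asks for it, which is presumably why a blanket penalty on the property would cause so much collateral damage.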
-
Thank you for the response, Ryan. Although the site is not outwardly "hiding" the copy, from a usability standpoint this method doesn't seem to carry much, if any, value for the person visiting the page. I figured Google would see it as a lame attempt at search engine bait and frown upon the practice.
-
To the best of my knowledge this has no impact on SEO. Googlebot doesn't like it when you hide content, but as far as I know that only applies to overflow: hidden and display: none.