Will Google Penalize Content Put in a Div with a Scrollbar?
-
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div tag that makes use of a scroll bar. Could a site be penalized by Google for this technique?
Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
-
I see this question was answered years ago, but how relevant is this issue today?
I just took on a client's website, and he wants to add SEO-optimized content in a scrollable div at the bottom of the page. I don't know whether that counts as spam. Can you please advise?
I'm eager to get a proper answer.
website is: www (dot) zdhsales (dot) com
-
I've actually wondered the same thing before. To the best of my knowledge, I've never heard anyone cite overflow: auto; as a negative signal, compared to the amount of press that display: none;, text-indent: -9999px;, etc. get. It could very well be abused just as badly, though. The only abuse check I can think of would be to weigh the amount of text in the div against what a practical min-height for that div should be, but that seems a bit excessive.
I agree with Steven; it's come to the point where these CSS techniques have very legitimate uses and probably shouldn't be penalized. Plus, there are plenty of other ways to accomplish the same thing, whether it's document-tree manipulation or any other kind of rendering of a page after the crawlable URL has been loaded. So at what point is it worth fighting such a thing?
Edit: on a side note, what's the deal with those crazy underscores at the end of the URL? Yuck.
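For what it's worth, the pattern being discussed is just a fixed-height container with scrolling overflow: the text stays in the DOM and remains readable, unlike display:none or off-screen indents. A minimal sketch (the class name, dimensions, and copy are illustrative, not Moosejaw's actual markup):

```html
<!-- Scrollable div: content is fully present in the HTML and visible
     to users; only the viewport onto it is constrained. -->
<div class="category-copy" style="height: 150px; overflow-y: auto;">
  <p>Long block of category copy that scrolls within the div...</p>
</div>

<!-- By contrast, the techniques that usually draw scrutiny remove
     or displace the text from view entirely: -->
<p style="display: none;">Text hidden from all users</p>
<h1 style="text-indent: -9999px;">Text pushed off-screen</h1>
```

The key difference is that a scrolling div degrades honestly: what the crawler sees is what a user can read by scrolling.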
-
Does Google actually still penalise overflow:hidden and display:none, or just off-screen placement such as left:-9999px? If they do, it's something I'm sure will change, as these properties are commonly used for "div switching" in navigation menus and tabs (display:none at least).
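To illustrate the legitimate "div switching" use mentioned above: tabbed interfaces commonly toggle display:none via a class, with a script swapping the class as the user clicks tabs. A minimal sketch (selectors are hypothetical):

```css
/* Tab panels are hidden by default... */
.tab-panel { display: none; }

/* ...and revealed when a script adds the "active" class
   to the panel matching the selected tab. */
.tab-panel.active { display: block; }
```

Every panel's content is intended for users and is one click away, which is why a blanket penalty on display:none would catch so many innocent sites.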
-
Thank you for the response, Ryan. Although the site is not outwardly "hiding" the copy, from a usability standpoint this method does not seem to carry much, if any, value for the person visiting the page. I figured Google would see this as a lame attempt at search engine bait and frown upon the practice.
-
To the best of my knowledge this has no impact on SEO. Googlebot doesn't like it when you hide content, but that only applies to overflow:hidden and display:none as far as I know.