Is white text on a white background an issue when...?
-
Hi guys,
This question was loosely answered here (http://www.seomoz.org/q/will-google-index-a-site-with-white-text-will-it-give-it-bad-ratings), but I wanted to elaborate on the concern.
The issue I have is this:
http://www.searchenginexperts.com.au/preview/white-text-white-background-issue
Of the four div elements on the page:
-
which is best practice for SEO? and
-
which of them would not be penalised by Google on the grounds of hidden text?
The reason I ask is that I have a site currently using the first div's styling, but if you either remove the image or uncheck `repeat-x` (in the browser inspector), the text is left as white on white.
I have added the transparent image over green to show that backing up the tiled image with a background colour is not always going to work. What can be done in this scenario?
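For reference, the styling in the first example is presumably something like the following. This is a hypothetical sketch (the class name, image path, and colours are mine, not taken from the test page), but it illustrates the failure mode described above: white text over a dark tiled image with no solid-colour fallback.

```css
/* Hypothetical sketch of the first example: white text over a
   dark tiled background image, with no background-color set.
   If the image fails to load, or repeat-x is disabled, the text
   renders white on the page's white background. */
.footer-links {
  color: #fff;
  background-image: url("tile-dark.png");
  background-repeat: repeat-x;
}
```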
Thanks in advance,
Dan (from my manager's account)
-
-
Yes, Dan, something like that could get reported. You should do your best not to let this happen, especially at a large scale; a single incident would likely be ignored.
-
Thanks, gents.
To clarify, the content in question was footer links on my client's site.
It sounds like the consensus is that the approaches in my example should be fine, as my intention is not to deceive, and only visitors (most likely competitors) would manually flag this if it were.
What remains unanswered is whether the last two examples on my test page will still create issues.
The third example inadvertently has a transparent section of the background image where text exists; you can see this if you click and drag over the middle section. I would imagine this would get flagged by visitors as hidden text (it currently shows white text on white). Aside from giving the div element, or the entire site, a complementary background colour (say, a pastel), is there a better way to manage this than the fourth example, where I simply offer a fallback green colour? That looks pretty bad.
Thanks again...
Dan
-
Hey Dan
Ultimately, I don't think this would be a problem on an otherwise non-spammy site. There is generally a big difference between a site that uses a set of spammy or manipulative techniques and one that makes a simple mistake like this, so I doubt you have much to worry about if everything else is as it should be.
That said, I guess the simple question here is:
If you are using a background image and white text, why not use a background colour as well?
This would address the obvious usability issue of the image not displaying, and make clear there is no intent to trick anyone. Better for users, better for search engines, and better for your SEO-penalty-related anxiety.
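As a sketch of that suggestion (class name, image path, and colours are illustrative, not taken from Dan's test page), declare a solid fallback colour alongside the image, ideally one close to the image's dominant tone:

```css
/* Fallback colour declared with the image via the background
   shorthand: the colour paints first, and shows wherever the
   image is missing, still loading, or transparent, so the white
   text stays readable in every case. */
.footer-links {
  color: #fff;
  background: #2e6b3a url("tile-green.png") repeat-x;
}
```

Picking a fallback close to the image's dominant colour also handles the third example's transparent region: the text over that gap sits on the matching colour instead of white, without the jarring contrast of an arbitrary green block.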
Hope that helps.
Marcus
-
Dan, the rule of thumb is: if the text is readable and not purposefully hidden, then you're safe. The operative word there is purposefully.
I will also add that, in general, crawlers are not going to find these types of problems; rather, they are reported by users or, more often than not, your competition. From there, search engines may have a human evaluate the report and make a manual ruling.
-
OK, the thing is: if text is human-readable, you are safe. Using white text and then having something go wrong with the styling so that the text goes invisible for a few days will not necessarily get your website banned. However, I am assuming here that you are not stuffing keywords.