Is style="display: none;" safe for SEO?
-
I want a function that shortens text in the category view of my shop.
Apple does this in their product configurator;
see the "Learn More" button on the right side:
http://store.apple.com/us/configure/MC915LL/A
Apple does it by loading the content dynamically, but I want a more SEO-friendly approach that leaves the content indexable by Google.
I know from a search that this technique was used in past years by black hat SEOs to cover up keyword stuffing.
I also read an article from Google about it.
I believe that was years ago, and keyword stuffing is completely off the table by now.
So I believe Google would simply recognize it the way it is meant to be used.
But if I were sure, I would not be asking here.
What do you think?
-
Exactly, so in this case you are completely safe.
-
Thanks a lot!
-
If you are doing it as a way of formatting the page and still offering an option or button that lets the user see the rest of the text, then it is not the same thing Google warns about. Google states that you should not hide text on the page to purposely try to trick the search engine.
In e-commerce it is very common to hide part of the text. For instance, when you have four tabs for "description, features, specifications, colors, etc.", it is a good idea to use 'display: none' so that the three inactive tabs are not all shown at once. This is not considered black hat; it is considered good design.
Matt Cutts has said quite a few times: if it is good for the user, it is good for Google.
It is when you intentionally hide a block of text on the page with no way for the user to view it that you are using a black hat technique.
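As a concrete illustration of the "good design" pattern described above (the markup, IDs, and button labels here are hypothetical, not from the original thread): the full text ships in the HTML where crawlers can read it, and a button only toggles its visibility. The element and button arguments below are plain objects shaped like DOM nodes so the logic can be followed without a browser.

```javascript
// Minimal "learn more" toggle: the full text stays in the HTML source
// (and stays indexable); only its CSS display value changes.
function toggleMore(section, button) {
  const wasHidden = section.style.display === "none";
  section.style.display = wasHidden ? "block" : "none";
  button.textContent = wasHidden ? "Show less" : "Learn more";
  return wasHidden; // true when the extra text has just been revealed
}

// In an actual page this would be wired up roughly like:
//
//   <p>Short category description…</p>
//   <div id="more-text" style="display: none;">Full description…</div>
//   <button id="more-btn">Learn more</button>
//
//   document.getElementById("more-btn").addEventListener("click", () =>
//     toggleMore(document.getElementById("more-text"),
//                document.getElementById("more-btn")));
```

The key point for SEO is that the hidden block is present in the served HTML from the start; the script never injects content, it only flips `display`.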
-
Yes, we have a button with real text on it that says "More information" or similar.
The funny thing is that Google once clearly said not to do this while the text is still available. I remember that guidance coming out something like five years ago.
-
I think it really depends on the purpose. I make websites every day, and I use style="display:none;" on almost every page of them. I think if it is used for a design purpose it is completely fine, and no, I don't think it is keyword stuffing. Is there a function on the site where a user action unhides this content, or are you trying to hide it permanently?