Content within a toggle, Juice or No Juice?
-
Greetings Mozzers,
I recently added a significant amount of information to a single page, using toggles to hide the content from the user until they click to reveal it. Since the code technically reads "display:none" to start, would that be considered "black hat", or would the content simply be treated as "not there" by crawlers? It isn't displayed in any sort of spammy way; the toggles were used purely for the visitor's UX.
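To illustrate, here is a stripped-down sketch of the kind of toggle I mean (the class names and copy are made up for this example, not my actual markup). The full text is in the HTML from the first byte; it just starts out as display:none and a click reveals it:

<style>
  .toggle-body { display: none; }          /* hidden on page load */
  .toggle-body.is-open { display: block; } /* revealed after the click */
</style>

<section class="faq-item">
  <button class="toggle-trigger" aria-expanded="false">Shipping details</button>
  <div class="toggle-body">
    <p>The detailed information lives here and is present in the page source
       even while it is visually hidden.</p>
  </div>
</section>

<script>
  // Flip the panel open/closed when its trigger is clicked.
  document.querySelectorAll('.toggle-trigger').forEach(function (btn) {
    btn.addEventListener('click', function () {
      var panel = btn.nextElementSibling;
      var open = panel.classList.toggle('is-open');
      btn.setAttribute('aria-expanded', open ? 'true' : 'false');
    });
  });
</script>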
Thoughts and advice are greatly appreciated!
-
Glad I could help. I would run it through one or two more versions of that tool just for peace of mind. You can Google "spider view tool" and try out a couple of others.
-
Thanks Marisa,
Looks like it shows up there. I appreciate the tip on that tool.
-
It definitely depends on how you're doing it. Without seeing it, I can't say for sure, but usually this method only hides things from the user, not the search engines. I'd recommend running your page through a service that shows you what the spiders see.
Try: http://www.iwebtool.com/spider_view
If you see your text in the results, you're probably safe.
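If you'd rather check by hand as well, here's a quick sketch (the URL and phrase are only placeholders): fetch the raw HTML the way a crawler's first request would and look for a distinctive sentence from inside your toggle. It runs in Node 18+ or in the browser console while you're on your own site:

fetch('https://www.example.com/your-toggled-page/')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    // Any distinctive sentence from inside the toggled content.
    var phrase = 'a sentence from inside your toggle';
    console.log(html.includes(phrase)
      ? 'The text is in the raw source.'
      : 'The text is NOT in the raw source.');
  });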
One of the sites I designed, customwovenlabels.com, uses this with JavaScript. If you go there and look at the copy on the homepage, you will see a link that says "read more." Google sees all the text and doesn't distinguish between what's initially visible and what's hidden.
So if you discover that your method isn't ideal, there are alternatives.
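For what it's worth, the general shape of that "read more" pattern is something like the sketch below (simplified for illustration, not the actual code on that site). All of the copy is in the HTML; the link only changes what's visible:

<div id="intro-copy">
  <p>The first paragraph that every visitor sees...</p>
  <div id="intro-rest" style="display: none;">
    <p>The rest of the copy, still present in the page source while hidden.</p>
  </div>
  <a href="#" id="read-more">read more</a>
</div>

<script>
  document.getElementById('read-more').addEventListener('click', function (e) {
    e.preventDefault();                                    // stay on the page
    document.getElementById('intro-rest').style.display = 'block';
    this.style.display = 'none';                           // hide the link once expanded
  });
</script>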
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it: I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21).

My main site (galike.net) works as a frame brand for a series of projects aimed at promoting the cultural and natural heritage of a region in NW Spain through creative projects focused on entertainment, tourism and education. The projects themselves are a concretion (a putting into practice) of the general views of the brand, which acts more like a company brand. CodeX21 is one of those projects: it has its own logo and so on, and is effectively a child brand, though more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be making other projects, each with their own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them may have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO.

Now, the problem is that my subsite codex21.gal restates some principles, concepts and goals that have been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision of the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself such as "narrative tourism" and "the geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also part of the philosophy of the project. BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to the same product (*).

I have considered substituting those overlapping ideas with links to the main site, though it seems unnatural from the user's point of view to be taken off the page to read a piece of information that is actually part of the project description (and every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a standalone site on another server, but the duplicate content problem might persist, and in any case I should link it to my brand Galike somewhere, because that's effectively its "production house". So which would be the best (white hat) strategy, from an SEO point of view, to handle this overlap of brand and project philosophy?

(*) "All the same IP address — that's really not a problem for us. It's really common for sites to be on the same IP address. That's kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that's also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say "this is kind of a collection of doorway sites" — in that essentially they're being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we'll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn't be so problematic because one of these sites would be showing up in the search results. On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we'll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they're selling. So then you don't have this collection of a lot of different sites that are essentially doing the same thing." (John Mueller, Senior Webmaster Trends Analyst at Google: https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
White Hat / Black Hat SEO | PabloCulebras
-
Moving content from a non-performing site to a performing site - without 301 redirection
I have two different websites: one has a good amount of traffic and the other has no traffic at all. The second site has lots of valuable content but no visitors, and I want to move its content to the performing site (I don't want to redirect). My only concern is duplicate content. I was thinking of setting the pages to "noindex" on the original website and waiting until they no longer appear in Google's index; then I'd move them over to the performing domain to be indexed again. Will this create any copied-content issue? What should I take care of when moving content from one site to another?
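For reference, the noindex step I have in mind is just the standard robots meta tag in the head of each page on the old site (a sketch):

<!-- Placed in the <head> of each page on the non-performing site: asks search
     engines to drop the page from their index while still following its links.
     The content would move to the performing domain only after these pages
     have dropped out of the index. -->
<meta name="robots" content="noindex, follow">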
White Hat / Black Hat SEO | HuptechWebseo
-
Content placement in HTML and display
Does Google penalize content that is placed at the top of the page in the HTML but displayed to users at the bottom of the page? The technique is done with CSS. Thank you in advance for your feedback!
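To be concrete, the kind of setup I'm asking about looks roughly like this (the class names are only for illustration): the copy comes first in the source but renders below the products:

<style>
  .page { display: flex; flex-direction: column; }
  .page .descriptive-copy { order: 2; }  /* rendered last on screen */
  .page .product-grid     { order: 1; }  /* rendered first, though later in the source */
</style>

<div class="page">
  <div class="descriptive-copy">Long descriptive copy, first in the HTML...</div>
  <div class="product-grid">Product listings...</div>
</div>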
White Hat / Black Hat SEO | Aerocasillas
-
Buying a quality domain name with juice
I run the SEO side for our company, and the Moz tools have been quite helpful for tracking keywords and bumping the rankings of certain pages by filtering errors and filling in missing tags (H1, H3, meta, etc.). However, the website is a webshop selling niche products, which makes getting quality backlinks quite a challenge; besides some forums and directories, there is little I think I can do to get more quality backlinks without entering grey hat or even black hat practices. There aren't even blogs related to this niche in the relevant language, otherwise we would send them product samples so they would write about them. Recently (six months ago, which in SEO time is ages, I think) one of our competitors went bust and their domain name has become available for purchase. Its Domain Authority is 21/100 while our own stands at 20/100. Now come my questions: If we were to purchase this domain and do a 301 redirect, would it pass the juice on to our site? What else can I do to improve rankings beyond the usual (title tags, H1, meta tags, img alt, valuable text, etc.)? And what other ways are there to get quality backlinks in niche markets? I don't want to buy backlinks, as I consider that a short-term solution and black hat, and since 80% of our traffic comes organically from Google, the last thing I want is a penalty from our lord Google.
White Hat / Black Hat SEO | sami80
-
Duplicate content for product pages
Say you have two separate pages, each featuring a different product. They have so many common features that their content is virtually duplicated by the time you get to the bullets that break it all down. To avoid a penalty, is it advisable to paraphrase? It seems to me it would benefit the user to see it all laid out the same, apples to apples. Thanks. I've considered combining the products on one page, but will be examining the data to see if there's a lost benefit to not having separate pages. Ditto for just not indexing the one that I suspect may not have much traction (requesting data to see).
White Hat / Black Hat SEO | SSFCU
-
Content ideas?
We run a printing company and we are struggling to come up with unique content that people will actually want to read. Is there any way of getting the ball rolling? We were thinking of ideas such as an exhibition guide, but this seems to have been overdone. Any help would be appreciated.
White Hat / Black Hat SEO | BobAnderson
-
Will aggregating external content hurt my domain's SERP performance?
Hi, we operate a website that helps parents find babysitters. As a small add-on we currently run a small blog on childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially set them to "noindex" or make the "rel canonical" point to the original author. Any thoughts, anyone? Thx! Daan
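For clarity, the two options I mention would look something like this in the head of each re-published article (the canonical URL is a placeholder for the original post, and we would use one option or the other, not both):

<!-- Option A: keep the featured article out of the index entirely. -->
<meta name="robots" content="noindex, follow">

<!-- Option B: let it be crawled, but point the canonical at the original author's post. -->
<link rel="canonical" href="https://original-blog.example/original-article/">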
White Hat / Black Hat SEO | daan.loening
-
'Stealing' link juice from 404's
As you all know, it's valuable but hard to get competitors to link to your website. I'm wondering if the following could work: Sometimes I spot that a competitor is linking to a certain external page, but he made a typo in the URL (e.g. the competitor meant to link to awesomeglobes.com/info-page/ but the link says aewsomeglobes.com/info-page/). Could I then register the typo domain and 301 it to my own domain (i.e. aewsomeglobes.com/info-page/ to mydomain.com/info-page/) and collect the link juice? Does it also work if the link is a root domain?
White Hat / Black Hat SEO | RBenedict