Moving content into tabs
-
Hi,
I'm kind of an SEO newbie, so please bear with me.
On one of the sites I'm working on, I got a request to move large blocks of content, currently just placed directly on the page, into tabs.
This makes sense. We tried it and it makes navigating through the information much easier for visitors.
My question is: Will Google consider this hiding information? It's not loaded dynamically. It's all there in the source when the page loads, but not displayed until the visitor clicks the tab.
Will this cause SEO issues?
Thank you!
-
It works with AJAX (some JavaScript and some CSS). The content is rendered inside the page, but is only displayed (via JavaScript) when the visitor clicks a tab.
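A minimal sketch of the pattern described here, with illustrative names (not from any particular library): every tab panel exists in the markup at load time, and clicking a tab only flips a visibility flag that would drive a CSS class toggle, so crawlers see all the content in the initial HTML source.

```typescript
// Each panel's content stays in the DOM; only its visibility changes.
type Panel = { id: string; visible: boolean };

// Pure helper: given the panel list and the clicked tab's id,
// return the new visibility state (exactly one panel shown).
function selectTab(panels: Panel[], clickedId: string): Panel[] {
  return panels.map(p => ({ ...p, visible: p.id === clickedId }));
}

// In the browser this would drive a class toggle, e.g.:
//   panelEl.classList.toggle("is-hidden", !panel.visible);
const panels: Panel[] = [
  { id: "overview", visible: true },
  { id: "specs", visible: false },
  { id: "reviews", visible: false },
];
const next = selectTab(panels, "specs");
console.log(next.map(p => `${p.id}:${p.visible}`).join(" "));
// → overview:false specs:true reviews:false
```

Because the hiding is purely presentational (a CSS class), this is the kind of setup the question is asking about: nothing is fetched on click.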
-
What are you using to accomplish this? CSS? HTML5? AJAX, or something else? All of these technologies have a search-engine-friendly way of doing it; it really depends on how you implement it. It sounds like AJAX to me, but it could also be CSS... so, it depends.
Related Questions
-
Does changing text content on a site affect SEO?
Hi, I have changed some H1s and H2s, changed and added paragraphs, fixed plagiarism and grammar, and added some pics with alt text. I did all of this today, and I am currently ranking on the second page. Question 1: Is this going to affect my two months of SEO effort? Question 2: Do I have to submit the sitemap to Google again? Question 3: Does changing content on the site frequently hurt SEO?
Algorithm Updates | Sam09schulz
Are backlinks wasted when the anchor text or surrounding content doesn't match the page content?
Hi Community, I have seen a number of backlinks where the content around the link doesn't match the target page's content. For example, page A links to page B, but the content is not really relevant apart from the brand name: a page about "vertigo tiles" links to a page about "vertigo paints", where "vertigo" is the brand name. Will these kinds of backlinks be completely wasted? I have also found some broken links that I'm planning to redirect to existing pages just to reclaim the backlinks, even though the content relevancy is limited to the brand name. Are these backlinks beneficial or not? Thanks
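The broken-link reclamation described here is usually a small redirect map served as HTTP 301s. A hedged sketch, with entirely hypothetical URLs:

```typescript
// Map each dead inbound URL to the closest existing page;
// fall back to the home page when no close match exists.
// All paths here are made up for illustration.
const redirectMap: Record<string, string> = {
  "/old/vertigo-tiles": "/products/tiles",
  "/old/vertigo-paints": "/products/paints",
};

// In practice the result would be served as a 301 redirect.
function redirectTarget(path: string, fallback: string = "/"): string {
  return redirectMap[path] ?? fallback;
}

console.log(redirectTarget("/old/vertigo-tiles")); // → /products/tiles
console.log(redirectTarget("/gone-forever"));      // → /
```

The closer each target page is topically to the dead URL, the more of the link's value the redirect is likely to preserve.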
Algorithm Updates | vtmoz
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation:
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many pages.
4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs from live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation; we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might well incur a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page link juice, but what are the actual big-dog issues here? So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
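The faceted-search explosion in the scenario above is just multiplication, and the usual containment is canonicalizing filtered URLs back to the base category page. A hedged sketch (filter names and counts are hypothetical):

```typescript
// Each independent, optional filter multiplies the URL space by
// (options + 1): one variant per option, plus "filter not applied".
function facetUrlCount(categories: number, filterOptionCounts: number[]): number {
  const variantsPerCategory = filterOptionCounts.reduce(
    (acc, opts) => acc * (opts + 1),
    1
  );
  return categories * variantsPerCategory;
}

// 100 categories with three filters (5 colors, 4 sizes, 3 sort orders):
console.log(facetUrlCount(100, [5, 4, 3])); // → 12000

// One common mitigation: point rel="canonical" on every filtered URL
// at the base category page by dropping the filter parameters.
function canonicalOf(url: string, filterParams: string[]): string {
  const u = new URL(url);
  for (const p of filterParams) u.searchParams.delete(p);
  return u.toString();
}

console.log(
  canonicalOf("https://example.com/cameras?color=red&sort=price", [
    "color",
    "size",
    "sort",
  ])
); // → https://example.com/cameras
```

This is why a modest 100-category site can surface tens of thousands of crawlable URLs: the multiplier sits in the filters, not the categories.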
Algorithm Updates | seo_plus
Moving to HTTPS and back to HTTP: would it hurt?
We have redirected everything on our blog from HTTP to HTTPS. Our blog is in a subfolder, so it now looks like this: https://ourdomain/blog. But everything else, i.e. our shop, continues to be on HTTP at http://ourdomain. We are wondering:
1 - Does domain authority for SEO purposes have different values for the HTTP and the HTTPS version of a domain?
2 - If yes, is there a way to check the difference in authority between the HTTP and the HTTPS version?
3 - If we do have higher authority on our HTTP version (as historically we have mostly been on HTTP), would it make sense to move the blog back to HTTP so it enjoys that authority too?
4 - Would changing our minds and going back to HTTP after a few months of having just moved to HTTPS send any negative signals to Google? Would Google care if we go back and forth like this? Many thanks!
Algorithm Updates | TVape
Does it impact a website's ranking if its content is used by other external sources?
Hi Moz & members, I just want to check on the website www.1st-care.org: does it impact this website's ranking if the same content (from the About Us or home care services pages) is used by other external sources or local citation sites? Does that duplicated content cause any ranking drop for this website and weaken its content? It was in 9th position on Google.com before, and now it has slipped to 29th position. Why? Is there a content issue, or something else I am not aware of?
Algorithm Updates | Futura
See the content used:
Home page content
About us page content
Regards, Teginder Ravi
Will parked domains hurt my SEO as duplicate content?
Hello, I have one website (Migration Lawyers) and an extra 8 parked domains, so they are basically cloning the content of the site. If the main site is migrationlawyers.co.za and I have an addon domain migration-lawyers.com, is that good or bad? Is there a proper way to redirect the sites? Would 301-redirecting the domains be more effective? Thanks for your input 🙂
Algorithm Updates | thealika
Does a large number of thin-content pages indexed affect overall site performance?
Hello Community, Question on negative impact of many virtually identical calendar pages indexed. We have a site that is a b2b software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed pages number shown in Webmaster tools to about 54,000. Since then, we "no-followed" the links on the calendar pages that allow you to view future months, and added "no-index" meta tags to all future month pages (beyond 6 months out). Our number of pages indexed value seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report. So, the question that has been raised is: Does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites that these pages would cause the performance of this site to drop as well. Thanks for your feedback. Chris
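The "noindex future months beyond 6 months out" rule described above reduces to a small date comparison at month granularity. A hedged sketch (the function name and shapes are illustrative, not from the poster's CMS):

```typescript
// A calendar month page, identified by year and month (1-12).
type MonthPage = { year: number; month: number };

// Decide whether a calendar month page should carry a robots
// "noindex" meta tag: true when it is more than six months ahead
// of the current month.
function shouldNoindex(page: MonthPage, today: MonthPage): boolean {
  const monthsAhead =
    (page.year - today.year) * 12 + (page.month - today.month);
  return monthsAhead > 6;
}

console.log(shouldNoindex({ year: 2017, month: 12 }, { year: 2017, month: 1 }));
// → true (11 months ahead)
console.log(shouldNoindex({ year: 2017, month: 3 }, { year: 2017, month: 1 }));
// → false (2 months ahead)
```

Past months and the next six months stay indexable; only the effectively blank far-future pages get excluded, which matches the drop in indexed-page counts the poster observed.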
Algorithm Updates | cogbox
Rankings moving every 2 days.
Hi, I am seeing strange behavior from Google with my rankings. I have a very competitive keyword (I can't disclose it because of an NDA), but take, for example, a keyword like "Cameras" on google.co.uk. It stayed in the #4 spot for years, and we have been practising all good SEO techniques for it. For the last couple of weeks, it goes to 9 and stays there for 2 days, then comes back to 4 for 2 days, then goes back to 9, and after 2 days comes back to 4, and it keeps moving like that... at the time of this post, it is at 9. Our site has been on top for very competitive keywords; even for the word "Camera" we are at 2, which is stable. None of our other keyword ranks have changed; only the word "Cameras" is affected. I am not sure how and why this is happening. I have been following SEOmoz for a long time and hope I can find a solution here. Note: the keywords used here are not the actual ones, but carry equal importance. Please let me know.
Algorithm Updates | spopli