How can we optimize content that is specific to particular tabs but loaded all on one page?
-
Hi,
Our website generates stock reports. Within those reports, we organize information into particular tabs. The entire report is loaded on one page, and JavaScript is used to hide and show the different tabs.
This makes it difficult for us to optimize the information on each particular tab. We're thinking about creating separate pages for each tab, but we're worried about affecting the user experience.
We'd like to create separate pages for each tab, put links to them at the bottom of the reports, and still have the reports operate as they do today.
Can we do this without getting in trouble with Google for having duplicate content? If not, is there another solution to this problem that we're not seeing?
Here's a sample report: http://www.vuru.co/analysis/aapl
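For reference, the hide/show behaviour described above boils down to something like the sketch below; the class names, IDs, and data attributes are hypothetical stand-ins, not the actual report markup.

```javascript
// Rough sketch of the single-page pattern described above: every tab's content
// is already in the HTML, and clicking a tab only toggles visibility.
document.querySelectorAll('.tab-button').forEach(function (button) {
  button.addEventListener('click', function () {
    // Hide every panel, then show the one the clicked tab points at.
    document.querySelectorAll('.tab-panel').forEach(function (panel) {
      panel.style.display = 'none';
    });
    var target = document.getElementById(button.dataset.tab);
    if (target) {
      target.style.display = 'block';
    }
  });
});
```

Because everything lives at the single report URL, there is nothing tab-specific for search engines to evaluate, which is the root of our problem.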
In advance, thanks for your help!
-
Thanks for the response.
Yeah, we want to avoid affecting the user experience if possible. We feel that making each tab load as its own page could severely impact it.
Any other ideas? Thoughts on the issue of duplicate content?
-
If you have a very fast website and a decent amount of content within each tab, you could have each tab be its own page and load its content dynamically when the tab is selected. Everything will look the same to the user, but you can optimize for that specific tab. However, this will not work well if you have a slow website.
Otherwise I would suggest making each tab its own individual page.
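A minimal sketch of that first option, assuming hypothetical per-tab URLs such as /analysis/aapl/financials (not the site's actual structure): each tab link points to a real standalone page, and JavaScript fetches it in the background and updates the address bar, so the report still behaves like a single page for visitors while each tab gets its own crawlable, optimizable URL.

```javascript
// Sketch only: assumes each tab link's href points to a real standalone page
// (e.g. /analysis/aapl/financials) and that both the report and those pages
// wrap the tab body in a hypothetical #tab-content container.
document.querySelectorAll('a.tab-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault(); // keep the single-page feel for visitors with JS

    fetch(link.href)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        var doc = new DOMParser().parseFromString(html, 'text/html');
        var panel = doc.querySelector('#tab-content');
        if (panel) {
          document.querySelector('#tab-content').innerHTML = panel.innerHTML;
          // Give the tab its own URL so it can be linked, shared, and indexed.
          history.pushState({}, '', link.href);
        }
      });
  });
});

// Crawlers and visitors without JavaScript simply follow the links to the
// standalone tab pages, so each one can be optimized on its own.
```

Whether that background fetch feels instant is what the "fast website" caveat is about; on a slow site the plain separate pages are the safer bet.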
Related Questions
-
Bespoke Website With Lack of Front Page Content
Hey guys, I wanted to ask your opinion. If you had a website - portfolio style, for argument's sake - and it was based on WordPress, obviously the front page won't be SEO friendly if you want to keep the minimalistic approach - there will be hardly any content to tell Google what to rank your site for. So my question is: can you use a plugin so that Google can 'see' content - such as a long, unique article - that the user can't see, in order to help you rank? I.e. for Gbot, the plugin would load the content as plain HTML, but 'hide' it from most people visiting the site. What would you do in this scenario? Your response would be much appreciated! Thanks in advance for your help!
Intermediate & Advanced SEO | | geniusenergyltd0 -
HELP! How does one prevent regional pages from being counted as "duplicate content," "duplicate meta descriptions," et cetera?
The organization I am working with has multiple versions of its website geared towards different regions. US - http://www.orionhealth.com/ CA - http://www.orionhealth.com/ca/ DE - http://www.orionhealth.com/de/ UK - http://www.orionhealth.com/uk/ AU - http://www.orionhealth.com/au/ NZ - http://www.orionhealth.com/nz/ Some of these sites have very similar pages which are registering as duplicate content, meta descriptions and titles. Two examples are: http://www.orionhealth.com/terms-and-conditions http://www.orionhealth.com/uk/terms-and-conditions Now even though the content is the same, the navigation is different since each region has different product options/services, so a redirect won't work: the navigation on the main US site is different from the navigation for the UK site. A rel=canonical seems like a viable option, but (correct me if I'm wrong) it tells search engines to only index the main page - in this case the US version - but I still want the UK site to appear to search engines. So what is the proper way of treating similar pages across different regional directories? Any insight would be GREATLY appreciated! Thank you!
Intermediate & Advanced SEO | | Scratch_MM0 -
"Authorship is not working for this webpage" Can a company G+ page be both Publisher AND Author?
When using the Google Structured Data Testing Tool I get a message saying "Authorship is not working for this webpage." Here are the results of the data for the page http://www.webjobz.com/jobs/ - Email address on the webjobz.com domain has been verified on this profile: Yes. Public contributor-to link from Google+ profile to webjobz.com: Yes. Automatically detected author name on webpage: Not Found. Publisher markup is verified for this page. Linked Google+ page: https://plus.google.com/106894524985345373271. Question - Can this company Google+ account "Webjobz" be both the publisher AND the author? Can I use https://plus.google.com/106894524985345373271 as the author of this and all other pages on our site?
Intermediate & Advanced SEO | | Webjobz0 -
Same content pages in different versions of Google - is it duplicate?
Here's my issue: I have the same page twice, with the same content but on a different URL for each country - for example, www.example.com/gb/page/ and www.example.com/us/page. So one for the USA and one for Great Britain. Or it could be a subdomain, gb. or us., etc. Is it duplicate content if the US version of Google indexes one page and the UK version indexes the other (same content, different URL)? The UK search engine will only see the UK page and the US one the US page - different URLs but the same content. Is this bad with regard to the Panda update, or does it get away with it? People suggest it is OK and good for localised search for an international website - I'm not so sure. Really appreciate advice.
Intermediate & Advanced SEO | | pauledwards0 -
How many inner links on one page?
I have seen Matt Cutts' video about links per page and know that too many links "may" harm the flow of link juice. But what should e-commerce sites do? We have category pages with more than a few thousand products in each of them. So does linking to each of them dilute the PR flow? We could use pagination, but doesn't that hurt the user experience when someone needs to go 10 links deep to reach a product? And Google's robots won't update the information frequently because it will be on the lowest level of our site. Now our goal is to make all our products appear like a Facebook-style scroll-down page. We know that Google doesn't use Ajax to see more links, so robots and all the users that don't have JavaScript could still see the paginated results. Is this a good way to present all the products and links?
Intermediate & Advanced SEO | | komeksimas1 -
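One way to read the approach described in the question above: serve ordinary paginated category pages that work without JavaScript, then let a script turn them into a continuous scroll for everyone else. A rough sketch, with hypothetical URLs and selectors:

```javascript
// Sketch only: assumes paginated category URLs (e.g. /category/shoes?page=2),
// a product container with the hypothetical class .product-list, and a plain
// "next page" link (a.next-page). Robots and no-JS users simply follow the
// link; JS users get Facebook-style continuous scrolling instead.
window.addEventListener('scroll', function () {
  var nextLink = document.querySelector('a.next-page');
  if (!nextLink || nextLink.dataset.loading) {
    return; // nothing left to load, or a load is already in flight
  }
  var nearBottom = window.innerHeight + window.scrollY >= document.body.offsetHeight - 500;
  if (!nearBottom) {
    return;
  }
  nextLink.dataset.loading = 'true';

  fetch(nextLink.href)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      var doc = new DOMParser().parseFromString(html, 'text/html');
      // Append the next page's products below the ones already shown.
      document.querySelector('.product-list').insertAdjacentHTML(
        'beforeend',
        doc.querySelector('.product-list').innerHTML
      );
      // Point the link at the following page, or drop it after the last page.
      var newNext = doc.querySelector('a.next-page');
      if (newNext) {
        nextLink.href = newNext.href;
        delete nextLink.dataset.loading;
      } else {
        nextLink.remove();
      }
    });
});
```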
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Do you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | | gregelwell0 -
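For illustration, the setup described in the question above would come down to a rule like the following in robots.txt (the path is hypothetical); whether that is actually the right fix for duplicate content is exactly what is being asked.

```
# Hypothetical robots.txt sketch: block crawling of the duplicate URL that is
# reachable from the second spot in the navigation.
User-agent: *
Disallow: /widgets/duplicate-copy-of-page/
```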
Have completed keyword analysis and on-page optimization. What else can I do to help improve SERP ranking besides adding authoritative links?
Looking for concrete ways to continue to improve SERP results. Thanks!
Intermediate & Advanced SEO | | casper4340 -
Can obfuscated JavaScript be used for too many links on a page?
Hi mozzers, just looking for opinions/answers on whether it is ever appropriate to use obfuscated JavaScript on links when a page has many links but they need to be there for usability. It seems grey/black hat to me, as it shows users something different from what Google sees (alarm bells are sounding already!), BUT if the page has many links it's losing juice which could be saved... Any thoughts appreciated, thanks.
Intermediate & Advanced SEO | | TrevorJones0