Is this tabbed implementation of SEO copy correct (i.e., good for getting indexed, and in an OK spot in the HTML as seen by search bots)?
-
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, but where all tabs (only 2 right now) are viewed as one document by the search engines.
I am pretty sure we have this working for the most part, but would love some quick feedback from you all as I have never worked with this approach before and these pages are some of our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
**Old version:**
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/
**New version with tabs:**
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/
Notes:
- With JavaScript turned off, the content is not displayed stacked in the browser, but it is present in the source code.
- The content shows up in the text version of Google's cache of the new page.
- In our implementation, the JS currently cancels the click event before the default behavior of appending #about (in this case) to the URL string - this can be changed; should it be?
- Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with the copy, which I imagine could be read as spammy in a human review (this wasn't intentional).
- This portion of the code falls below the truncated view in Fetch as Googlebot, so we couldn't use that tool to verify it.
- Are there any issues with hidden text, or is this content too far down in the HTML?
Any/all feedback appreciated. I know our copy is old; we're in the process of updating it for this season.
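For context on the hash issue above, here's a minimal sketch of what honoring the #about deep link might look like instead of suppressing it. The tab ids, class names, and `openTab` wiring are hypothetical placeholders, not our actual code:

```javascript
// Deep-link handling: when the page loads with a fragment like
// /miami-heat-tickets/#about (or the hash changes later), open the
// matching tab instead of ignoring it.

// Pure helper: map a location.hash value to a known tab id,
// falling back when the hash is empty or unrecognized.
function tabIdFromHash(hash, knownTabs, fallback) {
  var id = (hash || '').replace(/^#/, '');
  return knownTabs.indexOf(id) !== -1 ? id : fallback;
}

// DOM wiring, guarded so the file also loads where no DOM exists.
if (typeof window !== 'undefined' && typeof document !== 'undefined') {
  var TABS = ['listings', 'about']; // hypothetical tab/panel ids

  function openTab(id) { // stand-in for the real tab switcher
    document.querySelectorAll('.tab-panel').forEach(function (panel) {
      panel.style.display = panel.id === id ? '' : 'none';
    });
  }

  function syncToHash() {
    openTab(tabIdFromHash(location.hash, TABS, TABS[0]));
  }

  document.addEventListener('DOMContentLoaded', syncToHash);
  window.addEventListener('hashchange', syncToHash);
}
```

Because the real click handlers would no longer call preventDefault, the fragment lands in the URL, and the hashchange listener keeps the visible tab in sync.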
-
Cool. When we launched them separately, we overvalued the potential of "ticket prices" rankings, and our site had so little authority with the engines that ranking both pages was hard. Also, I wasn't as on my game with SEO back then.
I think merging is the way to go; I'm filing it in our dev queue for the coming weeks.
-
I'd probably agree with that merge decision. The topic is basically the same; from what I can see, the primary difference is the inclusion of "price" in the keyword targeting, and that can likely be achieved with one master page.
Furthermore, having awesome data integrated like that will attract links, because it's better than what most crappy ticket sites offer. The resulting boost in Page Authority (PA) will lead to better rankings than keeping the two separate pages, IMO.
-
Thanks for the helpful response. I'm definitely with you on having better data on all our pages. I initially set them up separately, but I've been leaning toward merging those ticket-price pages into the tickets pages and killing them off (301ing the price pages to the tickets pages). Make sense?
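If our price-page URLs follow one predictable pattern, those 301s could be a single rewrite rule rather than one redirect per page. A hedged Apache sketch, assuming a `/{team}-ticket-prices` → `/{team}-tickets/` pattern (the actual URL scheme and server setup may differ):

```apache
# Hypothetical URL pattern: 301 each /{team}-ticket-prices page to the
# matching /{team}-tickets/ page so link equity consolidates there.
RewriteEngine On
RewriteRule ^([a-z0-9-]+)-ticket-prices/?$ /$1-tickets/ [R=301,L]
```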
-
My general rule of thumb is that as long as all of the content is delivered via HTML (which it appears to be) and the switching of tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue, though: the current code on http://seatgeek.com/miami-heat-tickets/ doesn't degrade gracefully. You recognized this in your notes - if a user doesn't have JavaScript enabled, they can't access the text. That's a usability problem, and you could argue it might be bad for SEO; either way, I believe it should be fixed. When JavaScript isn't enabled, the content should still load below the event listings. Typically that means the content loads that way by default, and JavaScript hides the inactive tab when the page loads, then shows it once the user clicks the tab.
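To illustrate, here's a minimal sketch of that pattern (the `.tab-panel` / `.tabs` class names are hypothetical placeholders, not SeatGeek's actual markup): the panels ship visible in the raw HTML, and only the script collapses them, so crawlers and no-JS visitors see everything stacked.

```javascript
// Progressive-enhancement tabs: all panels are visible in the raw HTML;
// only browsers that run this script collapse them to one tab.

// Pure helper: which panel ids stay visible for a given active id,
// falling back to the first panel if the id is unknown.
function visiblePanels(panelIds, activeId) {
  return panelIds.indexOf(activeId) !== -1 ? [activeId] : panelIds.slice(0, 1);
}

// DOM wiring, guarded so the file also loads where no DOM exists.
if (typeof document !== 'undefined') {
  var panels = Array.from(document.querySelectorAll('.tab-panel'));
  var ids = panels.map(function (panel) { return panel.id; });

  function show(activeId) {
    var keep = visiblePanels(ids, activeId);
    panels.forEach(function (panel) {
      // Hiding happens only here, after JS has proven it can run.
      panel.style.display = keep.indexOf(panel.id) !== -1 ? '' : 'none';
    });
  }

  show(ids[0]); // collapse to the first tab only once JS is active

  document.querySelectorAll('.tabs a').forEach(function (link) {
    link.addEventListener('click', function () {
      show(link.getAttribute('href').slice(1)); // e.g. href="#about"
    });
  });
}
```

Note that nothing here calls preventDefault on the links, so the #about fragment still lands in the URL and each tab stays directly linkable.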
Ideally the content would be made more easily accessible (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help.
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Something like the price stats is genuinely interesting information while shopping for tickets - maybe there's a way to present it graphically on the page in a more engaging way than text?
Update - I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way - do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation: I've seen companies load tab content from an XML file using JavaScript. That is definitely not SEO-friendly and can cause indexation issues.