Is this tabbed implementation of SEO copy correct (i.e. good for getting indexed, and in an OK spot in the HTML as viewed by search bots)?
-
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, where all tabs (only two right now) are viewed as one document by the search engines.
I am pretty sure we have this working for the most part, but I would love some quick feedback from you all, as I have never worked with this approach before and these pages are some of our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
**Old Version:**
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/
**New Version with tabs:**
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/
Notes:
- The content is not displayed stacked in the browser when JavaScript is turned off, but it is in the source code.
- The content shows up in the text-only version of the Google cache of the new page.
- In our implementation, the JS currently prevents the default behavior of appending #about (in this case) to the URL string. This can be changed; should it be?
- Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not open the tab with the copy, which I imagine could be considered spammy from a human review perspective (this wasn't intentional). A rough sketch of what changing both behaviors might look like is below these notes.
- This portion of the code is below the point where the Fetch as Googlebot view is truncated, so we couldn't use that resource to check it.
- Are there any issues with hidden text, and is this copy too far down in the HTML?
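For concreteness, here is what I mean by letting the default hash behavior through and honoring a direct #about link. This is a minimal sketch only; the `.tab-link` class and `#about` panel ID are assumptions rather than our actual markup:

```javascript
// Rough sketch only, not our production code. Assumes tab links like
// <a class="tab-link" href="#about"> pointing at panels like <div id="about">.
document.addEventListener('DOMContentLoaded', function () {
  var links = Array.prototype.slice.call(document.querySelectorAll('.tab-link'));

  function activate(hash) {
    links.forEach(function (link) {
      var href = link.getAttribute('href');
      var panel = document.querySelector(href);
      if (panel) { panel.style.display = (href === hash) ? 'block' : 'none'; }
    });
  }

  links.forEach(function (link) {
    link.addEventListener('click', function () {
      // No preventDefault(), so the browser still appends #about to the URL.
      activate(link.getAttribute('href'));
    });
  });

  // A visitor typing /miami-heat-tickets/#about directly lands on that tab.
  if (links.length) {
    activate(window.location.hash || links[0].getAttribute('href'));
  }
});
```

With something like this, clicking a tab updates the URL to /miami-heat-tickets/#about, and pasting that URL opens the matching tab directly.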
Any/all feedback is appreciated. I know our copy is old; we are in the process of updating it for this season.
-
Cool. When we launched them separately, we overvalued the potential of the ticket-price rankings and had so little authority with the engines that ranking both pages was hard. Also, I wasn't as on my game with SEO back then.
I think merging is the way to go; I am filing it in our dev queue for the coming weeks.
-
I'd probably agree with that merge decision. The topic is basically the same; the primary difference, from what I see, is the inclusion of "price" in the keyword targeting, and that can likely be achieved with one master page.
Furthermore, having awesome data integrated like that will lead to links, because it's better than what most crappy ticket sites offer. The big boost in PA from that leads to better rankings than keeping the two separate pages would, IMO.
-
Thanks for the helpful response. And I am definitely with you on the idea of having better data on all our pages. I initially set them up separately, but I have been leaning towards merging those ticket-price pages into the tickets pages and killing them off (301ing the price pages to the tickets pages). Make sense?
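For reference, the 301 itself would just be a permanent redirect from each price page to its tickets page. A minimal sketch, written here as a Node/Express-style handler purely for illustration (that's an assumption, not our actual stack; the same rule could live in the web server config):

```javascript
// Illustrative sketch only: the same 301 can be set up in any framework
// or directly at the web server level.
var express = require('express');
var app = express();

// Permanently point the old price page at the merged tickets page.
app.get('/miami-heat-ticket-prices', function (req, res) {
  res.redirect(301, '/miami-heat-tickets/');
});

app.listen(3000);
```

One rule per price page (or a single pattern-based rule) would cover the rest of the teams.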
-
My general rule of thumb is that as long as all of the content is delivered in the HTML (which it appears to be), and the switching of the tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue though: the current code on http://seatgeek.com/miami-heat-tickets/ doesn't degrade gracefully. You recognized this in your notes, but if a user doesn't have JavaScript turned on, they can't access the text. That's a usability issue, and you could argue it might be bad for SEO too, but either way I believe it should be fixed. When JavaScript isn't enabled, the content should still load below the event listings. Typically that means the content loads that way automatically, and JavaScript then hides the inactive tab when the page loads and shows it once the user clicks on the tab (sketched below).
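Here's a minimal sketch of that pattern, assuming panels marked with a `.tab-panel` class and links with a `.tab-link` class (both assumptions about your markup, not a definitive implementation): the copy is fully visible in the raw HTML, and JavaScript, when it runs, hides the inactive panels and wires up the tabs.

```javascript
// Sketch of the progressive-enhancement approach, assuming markup like
// <div class="tab-panel" id="about"> that is visible in the raw HTML.
document.addEventListener('DOMContentLoaded', function () {
  var panels = Array.prototype.slice.call(document.querySelectorAll('.tab-panel'));
  var links = Array.prototype.slice.call(document.querySelectorAll('.tab-link'));

  // With JavaScript off this never runs, so the copy simply stacks on the
  // page and stays readable by both users and crawlers.
  panels.forEach(function (panel, i) {
    panel.style.display = (i === 0) ? 'block' : 'none';
  });

  links.forEach(function (link) {
    link.addEventListener('click', function () {
      var target = link.getAttribute('href'); // e.g. "#about"
      panels.forEach(function (panel) {
        panel.style.display = ('#' + panel.id === target) ? 'block' : 'none';
      });
    });
  });
});
```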
Ideally the content would be made easily available (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help that?
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Stuff like the price stats is cool information that I would find interesting while shopping for tickets - maybe there's a way to show that graphically on the page in a more interesting way than text?
Update - I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way - do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation, I've seen companies load tab content from an XML file using JavaScript. That is definitely not SEO-friendly and can cause indexation issues.
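For contrast, that problematic pattern looks something like the sketch below (the file path and element ID are hypothetical): the copy never appears in the HTML source, only in the DOM after the request completes, so a crawler fetching the page can see an empty container.

```javascript
// The pattern to avoid (illustrative only): tab copy pulled from an XML
// file at runtime, so it never appears in the page source.
document.addEventListener('DOMContentLoaded', function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/content/team-copy.xml'); // hypothetical file
  xhr.onload = function () {
    var about = xhr.responseXML && xhr.responseXML.querySelector('about');
    document.getElementById('about').textContent = about ? about.textContent : '';
  };
  xhr.send();
});
```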