Is this tabbed implementation of SEO copy correct (i.e., is it good for getting indexed, and is the copy in an acceptable spot in the HTML as viewed by search bots)?
-
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, where all tabs (only 2 right now) are viewed as one document by the search engines.
I am pretty sure we have this working for the most part, but would love some quick feedback from you all, as I have never worked with this approach before and these pages are among our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
**Old Version:**
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/
**New Version with tabs:**
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/
Notes:
- Content is not displayed stacked in the browser when JavaScript is turned off, but it is present in the source code.
- Content shows up in the text-only version of Google's cache of the new page.
- In our implementation the JS currently prevents the default behavior of appending #about (in this case) to the URL string. This can be changed; should it be? (See the sketch after these notes.)
- Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with the copy, which I imagine could be considered spammy from a human-review perspective (this wasn't intentional).
- This portion of the code sits below the truncated view in Fetch as Googlebot, so we couldn't use that tool to check it.
- Are there any issues with hidden text, and is this copy too far down in the HTML?
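For reference, here is a rough sketch of the kind of change we're considering. This is not our production code; the element IDs and markup are assumptions for illustration only. The idea is to stop suppressing the default anchor behavior (so #about lands in the URL) and to respect a hash typed directly into the address bar:

```javascript
// Hypothetical tab handler, sketched for discussion only; IDs are made up.
window.onload = function () {
  var panel = document.getElementById('about');
  var tab = document.getElementById('about-tab');

  // If someone arrives at .../#about directly, open that tab instead of ignoring it.
  if (window.location.hash === '#about') {
    panel.style.display = 'block';
  }

  tab.onclick = function () {
    panel.style.display = 'block';
    // No preventDefault / return false here: letting the default anchor
    // behavior run appends #about to the URL, so the tab state is linkable.
  };
};
```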
Any/all feedback appreciated. I know our copy is old, we are in the process of updating it for this season.
-
Cool. When we launched them separately, we overvalued the potential of the ticket-price rankings and had so little authority with the engines that ranking both pages was hard. Also, I wasn't as on my game with SEO back then.
I think merging is the way to go; I'm filing it into our dev queue for the coming weeks.
-
I'd probably agree with that merge decision. The topic is basically the same; from what I see, the primary difference is the inclusion of "price" in the keyword targeting, and that can likely be achieved with one master page.
Furthermore, having awesome data integrated like that will lead to links, because it's better than most crappy ticket sites. A big boost in PA from that leads to better rankings than keeping the two pages separate, IMO.
-
Thanks for the helpful response. And I'm definitely with you on the idea of having better data on all our pages. I initially set them up separately, but I have been leaning towards merging the ticket-price pages into the tickets pages and killing off the price pages (301ing them to the tickets pages). Make sense?
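For what it's worth, the redirect itself would be something like the sketch below. Express-style routing is shown purely as an illustration (an assumption, not necessarily our actual stack); the point is just that the old price URL returns a permanent 301 to the merged tickets page:

```javascript
// Illustrative only: a permanent redirect from the old price page to the
// merged tickets page, shown with Express for brevity.
var express = require('express');
var app = express();

app.get('/miami-heat-ticket-prices', function (req, res) {
  // 301 (permanent) so engines consolidate the old page's equity onto the target.
  res.redirect(301, '/miami-heat-tickets/');
});

app.listen(3000);
```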
-
My general rule of thumb is that as long as all of the content is delivered via HTML (which it appears to be), and the switching of the tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue, though: the current code on http://seatgeek.com/miami-heat-tickets/ doesn't die gracefully. You recognized this in your notes, but if a user doesn't have JavaScript turned on, they can't access the text. That's a usability issue, and you could argue it might be bad for SEO, but either way I believe it should be fixed. When JavaScript isn't enabled, the content should still load below the event listings. Typically that means the content should render that way by default, and JavaScript should hide the panel when the page loads and show it once the user clicks the tab.
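To make that concrete, here's a minimal sketch of the pattern (the markup and IDs are assumptions, not SeatGeek's actual code). The copy is fully present in the HTML and visible by default, so visitors without JavaScript (and search bots) still see it; JavaScript then hides it on load and reveals it when the tab is clicked:

```html
<!-- Minimal sketch only; IDs and markup are assumed for illustration. -->
<div id="tabs">
  <a href="#about" id="about-tab">About</a>
</div>

<div id="about">
  <p>Team copy lives here in plain HTML, not injected by JavaScript.</p>
</div>

<script>
  // Progressive enhancement: hide the panel only once we know JS is running,
  // so users without JavaScript still get the content stacked on the page.
  var panel = document.getElementById('about');
  var tab = document.getElementById('about-tab');

  panel.style.display = 'none';

  tab.onclick = function () {
    panel.style.display = 'block';
  };
</script>
```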
Ideally the content would be made easily available (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help that?
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Stuff like the price stats is cool information that I would find interesting while shopping for tickets - maybe there's a way to show that graphically on the page in a more interesting way than text?
Update: I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way. Do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation, I've seen companies load tab content from an XML file using JavaScript. That is definitely not SEO friendly and can cause indexation issues.
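For contrast, that anti-pattern looks roughly like the sketch below (the file name and IDs are made up): the tab panel starts empty and its copy only arrives via an XMLHttpRequest after the user clicks, so the text never appears in the HTML source that gets crawled and cached:

```javascript
// Anti-pattern sketched for contrast only; do not do this for SEO copy.
document.getElementById('about-tab').onclick = function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/content/about.xml'); // hypothetical XML file
  xhr.onload = function () {
    // The copy is injected after page load, so it is absent from the
    // HTML source that search bots index.
    document.getElementById('about').innerHTML = xhr.responseText;
  };
  xhr.send();
};
```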