Is this tabbed implementation of SEO copy correct (i.e., good for getting indexed, and in an acceptable spot in the HTML as viewed by search bots)?
-
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, where all tabs (only two right now) are viewed as one document by the search engines.
I am pretty sure we have this working for the most part, but I would love some quick feedback from you all, as I have never worked with this approach before and these pages are some of our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
**Old Version:**
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/
**New Version with tabs:**
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/
Notes:
- Content is not displayed stacked in the browser when JavaScript is turned off, but it is in the source code.
- Content shows up in the text version of the Google cache of the new page.
- In our implementation, the JS currently prevents the default behavior of appending the fragment (#about, in this case) to the URL string - this can be changed; should it be?
- Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not open the tab with the copy, which I imagine could be considered spammy from a human-review perspective (this wasn't intentional). See the sketch after these notes for the direction I'm considering instead.
- This portion of the code falls below the truncated view in Fetch as Googlebot, so we didn't have that resource to check it.
- Are there any issues with the hidden text, and is it too far down in the HTML?
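To make the hash question concrete, here is a minimal sketch of the direction I have in mind: stop suppressing the default click behavior so #about lands in the URL, and honor an existing hash on page load. The markup, ids, and the showTab() helper are hypothetical stand-ins, not our actual code.

```html
<script>
  // 1. Don't call preventDefault() in the tab click handler, so clicking the
  //    "About" tab lets the browser append #about to the URL as usual.

  // 2. On page load, if the URL already carries a hash that matches a tab
  //    panel, open that tab so a direct link like /miami-heat-tickets/#about
  //    lands the visitor on the copy instead of the default view.
  window.addEventListener('load', function () {
    var id = window.location.hash.replace('#', '');
    var panel = id && document.getElementById(id);
    if (panel) {
      showTab(id); // hypothetical helper that activates the matching tab
    }
  });
</script>
```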
Any and all feedback is appreciated. I know our copy is old; we are in the process of updating it for this season.
-
Cool. When we launched them separately, we overvalued the potential of ticket-price rankings, and we had so little respect from the engines that ranking both pages was hard. Also, I wasn't as on my game with SEO back then.
I think merging is the way to go; I am filing it into our dev queue for the coming weeks.
-
I'd probably agree with that merge decision. The topic is basically the same; the primary difference, from what I see, is the inclusion of "price" in the keyword targeting, and that can likely be achieved with one master page.
Furthermore, having awesome data integrated like that will lead to links, because it's better than what most crappy ticket sites offer. A big boost in PA from that will lead to better rankings than the two separate pages would get, IMO.
-
Thanks for the helpful response. I'm definitely with you on the idea of having better data on all our pages. I initially set them up separately, but I have been leaning towards merging the ticket-price pages into the tickets pages and killing off the price pages (301ing them to the tickets pages). Make sense?
-
My general rule of thumb is that as long as all of the content is delivered in the HTML (which it appears to be) and the switching of the tabs is done via JavaScript (which it is), then you're mostly OK.
You do have one issue, though: the current code on http://seatgeek.com/miami-heat-tickets/ doesn't degrade gracefully. You recognized this in your notes, but if a user doesn't have JavaScript turned on, they can't access the text. That's a usability issue, and you could argue it might be bad for SEO as well; either way, I believe it should be fixed. When JavaScript isn't enabled, the content should still load below the event listings. Typically that means the content should render that way by default, and JavaScript should hide the tab panel when the page loads and show it again once the user clicks the tab - see the sketch below.
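A minimal sketch of that progressive-enhancement pattern, with simplified, hypothetical markup and class names rather than your actual templates:

```html
<!-- The copy lives in the HTML, so it is indexable and still readable,
     stacked below the listings, when JavaScript is off. -->
<ul class="tabs">
  <li><a href="#listings">Listings</a></li>
  <li><a href="#about">About</a></li>
</ul>

<div id="listings" class="tab-panel"><!-- event listings --></div>
<div id="about" class="tab-panel">
  <!-- SEO copy goes here -->
</div>

<script>
  var panels = document.querySelectorAll('.tab-panel');
  var links = document.querySelectorAll('.tabs a');

  // Show one panel, hide the others.
  function showPanel(id) {
    for (var i = 0; i < panels.length; i++) {
      panels[i].style.display = (panels[i].id === id) ? 'block' : 'none';
    }
  }

  // Switch panels on tab clicks; the default behavior still updates the hash.
  for (var j = 0; j < links.length; j++) {
    links[j].addEventListener('click', function () {
      showPanel(this.getAttribute('href').slice(1));
    });
  }

  // Only now that we know JavaScript is running do we collapse to one panel.
  showPanel('listings');
</script>
```

The key point is that the hiding only happens once the script runs, so crawlers and no-JavaScript users still see the full text.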
Ideally the content would be made more easily discoverable (currently the tabs aren't as intuitive as they are on a Facebook page, for example). Putting them above the photo might help with that?
Also, from a user perspective, the written content is mostly there for SEO purposes right now. Stuff like the price stats is cool information that I would find interesting while shopping for tickets - maybe there's a way to show that graphically on the page in a more interesting way than text?
Update - I just noticed that those stats are displayed on http://seatgeek.com/miami-heat-ticket-prices in an awesome way - do stuff like that for all of your pages!
On the same tabs topic, but separate from your implementation: I've seen companies load tab content from an XML file using JavaScript. That is definitely not SEO friendly and can cause indexation issues.
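For contrast, a sketch of that anti-pattern (hypothetical URL and markup): the copy never appears in the HTML source, only after the script fetches and injects it, so a crawler that doesn't execute JavaScript sees an empty tab.

```html
<div id="about" class="tab-panel">
  <!-- Empty in the HTML source - nothing here for a crawler to index. -->
</div>

<script>
  // Tab content is pulled from an XML file and injected only at runtime.
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/content/about-tab.xml'); // hypothetical URL
  xhr.onload = function () {
    var copy = xhr.responseXML.getElementsByTagName('copy')[0];
    document.getElementById('about').innerHTML = copy.textContent;
  };
  xhr.send();
</script>
```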