The use of tabs on product pages: do or don't?
-
Does Google have any trouble reading content in tabs? The content is not loaded by AJAX and is already in the page source code.
Looking at some big e-commerce websites (amazon.com, for example), I see they get rid of the tabs and place the different content blocks below each other. Is this better for SEO purposes? But what about user experience? For users, I think it is easier to navigate by tabs than to scroll down a long page.
What do you guys think about this issue?
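For what it's worth, a quick way to confirm the tab content really is in the initial HTML (and not injected by AJAX) is to fetch the raw page source and search it for a phrase that only appears inside a tab. A minimal sketch, assuming Python with the requests library; the URL and phrase below are placeholders:

```python
import requests

# Placeholder values - swap in a real product URL and a phrase
# that only appears inside one of the tabs (e.g. a spec line).
PRODUCT_URL = "https://www.example-shop.com/product/123"
TAB_PHRASE = "Technical specifications"

# Fetch the raw HTML exactly as a crawler would receive it,
# i.e. without executing any JavaScript.
response = requests.get(PRODUCT_URL, timeout=10)
response.raise_for_status()

if TAB_PHRASE.lower() in response.text.lower():
    print("Tab content is present in the initial HTML source.")
else:
    print("Tab content was NOT found - it may be loaded via AJAX.")
```

If the phrase shows up in the raw response, the content is at least crawlable; how much weight hidden-until-clicked content gets is a separate question, as the answer below notes.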
-
OK. So if I want to get better SEO results (better organic rankings), I should get rid of the tabs and place the hidden content and hidden links below each other, like Amazon does?
Here you can see an example of our current product detail page with tabs: http://goo.gl/wce1EO -
I would never suggest you put content into anything that is hidden (until revealed with a click) and hope to get any SEO benefit from it. But set the SEO question aside for a moment and remember that there are occasions where tabs work well for the user experience, as you mentioned - and that is the reason to use them, nothing more.
Sometimes they will work better than others. It all depends upon the application in question.
-Andy
Related Questions
-
Country and language tags - Running an SEO audit on a site that definitely has more than one language, but nothing is pulling up. I don't quite understand hreflang or how to go about it. Help!
I ran an SEO audit and I don't really understand country and language tags. For example, sony.com definitely has more than one language, but how do I check its hreflang setup? Do I inspect the page?
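One quick way to see what a site declares is to fetch the page and list its <link rel="alternate" hreflang="..."> tags. A rough sketch, assuming Python with requests and BeautifulSoup; note that hreflang can also be declared via HTTP headers or the XML sitemap, which this doesn't cover:

```python
import requests
from bs4 import BeautifulSoup

# Any page you want to audit - sony.com is just the example from the question.
url = "https://www.sony.com/"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# hreflang annotations in the <head> look like:
# <link rel="alternate" hreflang="de-de" href="https://www.sony.de/" />
for link in soup.select('link[rel="alternate"][hreflang]'):
    print(link.get("hreflang"), "->", link.get("href"))
```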
Technical SEO | Mindgruver -
What is the SEO-friendly best practice for URLs filtered by 'tagged'?
Example: https://www.STORENAME.com/collections/all-deals/alcatel - tagged "Alcatel". When I run audits, I come across these URLs, which give me duplicate content and missing H1 warnings. This is the canonical: https://www.STORENAME.com/collections/all-deals/alcatel Any advice on how to tackle these? I have about 4k of them in my store! Thank you
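If it helps, you can spot-check what the canonical on those tagged URLs actually points to before deciding how to handle them. A small sketch, assuming Python with requests and BeautifulSoup, using the placeholder store URL from the question:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL from the question - replace with a real tagged collection URL.
tagged_url = "https://www.STORENAME.com/collections/all-deals/alcatel"

soup = BeautifulSoup(requests.get(tagged_url, timeout=10).text, "html.parser")

canonical = soup.select_one('link[rel="canonical"]')
h1 = soup.find("h1")

print("Canonical:", canonical.get("href") if canonical else "none found")
print("H1:", h1.get_text(strip=True) if h1 else "missing")
```

If the tagged pages canonicalise to the main collection URL, the audit warnings are mostly noise; if they canonicalise to themselves (as the example above suggests), that is where the duplicate-content flags come from.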
Technical SEO | Sscha003 -
Specific pages won't index
I have a few pages on my site that Google won't index, and I can't understand why. I've looked into possible issues with Robots, noindex, redirects, canonicals, and Search Console rules. I've got nothing. Example: I want this page to index https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/ When I Google the full URL, I get results including the non-subdomain homepage, and various pages on the subdomain, including a child page of the page I want, but not the page itself. Any ideas? Thanks for the help!
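A quick script can rule out the usual technical blockers (status code, redirects, X-Robots-Tag header, meta robots, and a canonical pointing elsewhere) in one pass. A minimal sketch, assuming Python with requests and BeautifulSoup:

```python
import requests
from bs4 import BeautifulSoup

url = "https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/"

resp = requests.get(url, timeout=10, allow_redirects=True)

print("Final URL:   ", resp.url)  # reveals any redirect in the chain
print("Status code: ", resp.status_code)
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

soup = BeautifulSoup(resp.text, "html.parser")
meta_robots = soup.find("meta", attrs={"name": "robots"})
canonical = soup.select_one('link[rel="canonical"]')

print("Meta robots: ", meta_robots.get("content") if meta_robots else "not set")
print("Canonical:   ", canonical.get("href") if canonical else "not set")
```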
Technical SEO | ericstites -
Received a notice regarding spammy structured data, but we don't have any structured data - or do we?
Got a message that we have spammy structured data on our site via Webmaster Tools and have no idea what they are referring to. We do not use any structured data or schema.org markup. Could they be referring to something else? The message was: To: Webmaster of http://www.lulus.com/, Google has detected structured markup on some of your pages that violates our structured data quality guidelines. In order to ensure quality search results for users, we display rich search results only for content that uses markup that conforms to our quality guidelines. This manual action has been applied to lulus.com/ . We suggest that you fix your markup and file a reconsideration request. Once we determine that the markup on the pages is compliant with our guidelines, we will remove this manual action. What could we be showing them that would be interpreted as structured data, or as spammy structured data?
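Worth noting that "structured data" covers more than hand-written schema.org JSON-LD: themes, platforms, and review widgets often emit microdata (itemscope/itemtype attributes) or RDFa without you adding anything. A rough sketch to see what a page is actually emitting, assuming Python with requests and BeautifulSoup:

```python
import requests
from bs4 import BeautifulSoup

url = "http://www.lulus.com/"  # the page from the notice

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# JSON-LD blocks (often injected by the platform or review widgets)
json_ld = soup.select('script[type="application/ld+json"]')
print(f"JSON-LD blocks: {len(json_ld)}")

# Microdata markup (itemscope / itemtype attributes baked into the template)
microdata = soup.select("[itemscope]")
itemtypes = sorted({str(el.get("itemtype")) for el in soup.select("[itemtype]")})
print(f"Microdata elements: {len(microdata)}")
print("Item types:", itemtypes)

# RDFa markup
rdfa = soup.select("[typeof], [vocab]")
print(f"RDFa elements: {len(rdfa)}")
```

If any of those counts are non-zero, that markup (often review or rating markup added by a plugin or theme) is likely what the manual action is pointing at.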
Technical SEO | KentH -
Sitemap URLs not being indexed
There is an issue on one of our sites where many of the sitemap URLs are not being indexed (at least 70% are not indexed). The URLs in the sitemap are normal URLs without any strange characters attached to them, but after looking into it, it seems a lot of the URLs get a "#." plus a character sequence appended once you actually visit them. We are not sure if the AddThis bookmark widget could cause this, or if it's another script doing it. For example: URL in the sitemap: http://example.com/example-category/0246 URL once you actually go to that link: http://example.com/example-category/0246#.VR5a For further information, the XML file does not have any style information associated with it and is in its most basic form. Has anyone had similar issues with their sitemap not being indexed properly? Could this be the cause of many of these URLs not being indexed? Thanks all for your help.
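Two things worth checking: the "#.VR5a"-style suffix is added client-side (AddThis address-bar sharing does exactly this), and URL fragments are never sent to the server, so the fragment itself is unlikely to be what blocks indexing. To rule out server-side problems, you can walk the sitemap and see what each URL actually returns. A minimal sketch, assuming Python with requests and the placeholder sitemap location from the question:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://example.com/sitemap.xml"  # placeholder - use the real sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.get(url, timeout=10, allow_redirects=True)
    # Flag anything that errors out or redirects somewhere else.
    note = ""
    if resp.url.rstrip("/") != url.rstrip("/"):
        note = f"-> redirected to {resp.url}"
    print(resp.status_code, url, note)
```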
Technical SEO | GreenStone -
Google Cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem comes because we "unpublish" one of the thousands of listings on the site every few days - this then creates a link that gives a 403. At the same time we also run some code that takes away any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links and so tries to access these pages again. This results in a 403. These errors show up in Google Webmaster Tools and when I click on "Linked From" I can verify that that there are no links to the 403 page - it's just Google's Cache being slow. My question is a) How much is this hurting me? b) Can I fix it? All suggestions welcome and thanks for any answers!
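One thing you can verify quickly is exactly what an unpublished listing returns to a crawler (status code, and whether any redirect is involved), since the Webmaster Tools report can lag well behind the current state of the site. A small sketch, assuming Python with requests and a hypothetical list of recently unpublished listing URLs:

```python
import requests

# Hypothetical list - fill in a few recently unpublished listing URLs.
removed_listings = [
    "https://www.example.com/listing/12345",
    "https://www.example.com/listing/67890",
]

for url in removed_listings:
    # allow_redirects=False shows the first response a crawler would see.
    resp = requests.get(url, timeout=10, allow_redirects=False)
    print(resp.status_code, url)
```

If the listings are never coming back, many sites serve 404 or 410 instead of 403, since those are an explicit "gone" signal; either way, the old errors tend to linger in the report until Google recrawls those URLs.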
Technical SEO | HireSpace -
Site 'filtered' by Google in early July.... and still filtered!
Hi, Our site got demoted by Google all of a sudden back in early July. You can view the site here: http://alturl.com/4pfrj and you may read the discussions I posted in Google's forums here: http://www.google.com/support/forum/p/Webmasters/thread?tid=6e8f9aab7e384d88&hl=en http://www.google.com/support/forum/p/Webmasters/thread?tid=276dc6687317641b&hl=en Those discussions chronicle what happened, and what we've done since. I don't want to make this a long post by retyping it all here, hence the links. However, we've made various changes (as detailed), such as getting rid of duplicate content (use of noindex on various pages etc), and ensuring there is no hidden text (we made an unintentional blunder there through use of a 3rd party control which used CSS hidden text to store certain data). We have also filed reconsideration requests with Google and been told that no manual penalty has been applied. So the problem is down to algorithmic filters which are being applied. So... my reason for posting here is simply to see if anyone here can help us discover if there is anything we have missed? I'd hope that we've addressed the main issues and that eventually our Google ranking will recover (ie. filter removed.... it isn't that we 'rank' poorly, but that a filter is bumping us down, to, for example, page 50).... but after three months it sure is taking a while! It appears that a 30 day penalty was originally applied, as our ranking recovered in early August. But a few days later it dived down again (so presumably Google analysed the site again, found a problem and applied another penalty/filter). I'd hope that might have been 30 or 60 days, but 60 days have now passed.... so perhaps we have a 90 day penalty now. OR.... perhaps there is no time frame this time, simply the need to 'fix' whatever is constantly triggering the filter (that said, I 'feel' like a time frame is there, especially given what happened after 30 days). Of course the other aspect that can always be worked on (and oft-mentioned) is the need for more and more original content. However, we've done a lot to increase this and think our Guide pages are pretty useful now. I've looked at many competitive sites which list in Google and they really don't offer anything more than we do..... so if that is the issue it sure is puzzling if we're filtered and they aren't. Anyway, I'm getting wordy now, so I'll pause. I'm just asking if anyone would like to have a quick look at the site and see what they can deduce? We have of course run it through SEOMoz's tools and made use of the suggestions. Our target pages generally rate as an A for SEO in the reports. Thanks!
Technical SEO | | Go2Holidays0 -
URLs don't want to show up in Google. Please help?
Hi Mozfans 🙂 I'm doing a site scan for a new client: http://www.vacatures.tuinbouw.nl/ It's a Dutch job site. Now the problem: the URL http://www.vacatures.tuinbouw.nl/vacatures/ is in Google. On that same page there are jobs (scroll down) with followed links to URLs like this: http://www.vacatures.tuinbouw.nl/vacatures/722/productie+medewerker+paprika+teelt/ The problem is that the second URL doesn't show up in Google. When I try to make a sitemap with GSiteCrawler, the second URL isn't in the sitemap either. :S What am I doing wrong? Thanks!
Technical SEO | MaartenvandenBos
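A useful first check for that last question is whether the deep job URLs are actually present as plain <a href> links in the listing page's raw HTML; if they are only injected by JavaScript, a simple crawler like GSiteCrawler won't see them, and Google may struggle with them too. A rough sketch, assuming Python with requests and BeautifulSoup:

```python
import requests
from bs4 import BeautifulSoup

LISTING_URL = "http://www.vacatures.tuinbouw.nl/vacatures/"

soup = BeautifulSoup(requests.get(LISTING_URL, timeout=10).text, "html.parser")

# Collect every link that looks like a job-detail URL, e.g.
# /vacatures/722/productie+medewerker+paprika+teelt/
job_links = [
    a.get("href")
    for a in soup.find_all("a", href=True)
    if "/vacatures/" in a.get("href")
]

print(f"Job-detail links found in raw HTML: {len(job_links)}")
for href in job_links[:10]:
    print(" ", href)
```

If the list comes back empty, the links only exist after JavaScript runs, which would explain both why GSiteCrawler misses them and why the detail pages struggle to get indexed.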