Why does Google rank a product page rather than a category page?
-
Hi, everybody,
In the Moz ranking tool account for one of our clients (the client sells sports equipment), there is a trend where more and more of their landing pages are product pages instead of category pages. The optimal landing page for the term "sleeping bag" is of course the sleeping bag category page, but Google is sending searchers to a product page for a specific sleeping bag.
What could be the critical factors that make the product page more relevant than the category page as the landing page?
-
Usually, the reason is that category pages tend to be light on content and thus not as indexable; sometimes category pages are even set to noindex for this reason. Google is likely looking at your product page and finding it more relevant, based on content, than the category page for people who are searching for "sleeping bags."
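A quick way to sanity-check this on the client's site is to compare the two pages directly. Here is a rough diagnostic sketch in Python (the URLs are placeholders, and the parsing is deliberately crude; a real audit would use a proper HTML parser and render JavaScript where relevant):

```python
import re
import urllib.request

def page_stats(url: str) -> dict:
    """Report rough thin-content signals for a page: whether it carries
    a noindex robots meta tag, and roughly how much text it serves."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    # Look for <meta name="robots" ... noindex ...> (crude check that
    # assumes the name attribute comes before the content attribute).
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))
    text = re.sub(r"<[^>]+>", " ", html)  # strip tags, very roughly
    return {"noindex": noindex, "word_count": len(text.split())}

# Placeholder URLs - compare the category page with the product page
# that outranks it:
# print(page_stats("https://example.com/sleeping-bags/"))
# print(page_stats("https://example.com/sleeping-bags/model-x"))
```

If the category page turns out to be noindexed, or serves a fraction of the product page's text, that alone can explain the ranking pattern.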
Related Questions
-
Does your page need to be unique to rank?
What I mean by unique is: let's imagine I want to rank for "seo ranking factors." In order to compete, do I need a page that (in terms of design) is totally different from everything out there, or can I rank with a page that is presented in a very similar way to everything else but with different content? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Page not ranking because of React.js?
Hey guys, I'm struggling with a part of my website that uses React.js. My developers chose it because it's much better and much quicker (which I think too), but we have really low traffic coming from Google compared to the parts of the website that don't use React.js. Moz gives the page a score of 85%, but we get fewer than 100 visits/day, while we were targeting 10,000 visits/day based on the traffic this section gets on our competitors' websites (our whole website has 60,000 visits/day). The section has been online for 3 months now. Can you help me see what is wrong there? I'm in Belgium, so we have the website in 3 languages (FR/NL/EN), but the most important ones are FR & NL.
FR: https://gocar.be/fr/prix-voitures-neuves/Audi/A3/A3-Sportback/1-0-TFSI_39CER
NL: https://gocar.be/nl/prijzen-nieuwe-wagens/Audi/A3/A3-Sportback/1-0-TFSI_39CER
EN: https://gocar.be/en/price-new-cars/Audi/A3/A3-Sportback/1-0-TFSI_39CER
Main competitors ranking better than us (example in FR):
https://www.moniteurautomobile.be/modele--audi--a3/prix.html
https://www.vroom.be/fr/prix/audi-a3/citadine-2012/197
Cheers! Jean-Philippe
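One thing worth ruling out before blaming React itself: is the car data present in the server's initial HTML at all, or only after JavaScript runs? If it is client-side only, ranking depends on Google's second (rendering) crawl pass, which has historically been slower and less reliable than plain HTML. A minimal check in Python, with a placeholder search phrase (pick a string that is visible on the rendered page but not part of the URL):

```python
import urllib.request

def in_raw_html(url: str, phrase: str) -> bool:
    """True if the phrase appears in the raw server response,
    i.e., without executing any JavaScript."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "ignore")
    return phrase in html

# If this prints False while the text is clearly visible in a browser,
# the content is rendered client-side, and server-side rendering or
# prerendering is worth investigating.
print(in_raw_html(
    "https://gocar.be/fr/prix-voitures-neuves/Audi/A3/A3-Sportback/1-0-TFSI_39CER",
    "prix"))  # placeholder phrase - replace with on-page text
```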
Intermediate & Advanced SEO | Gocar_be
-
Fresh page versus old page climbing up the rankings.
Hello, I have noticed that if I publish a webpage Google has never seen, it ranks right away, usually in a decent position to start with (not great, but decent), roughly top 30 to 50, and then over the months it slowly climbs up the rankings. However, if my page has existed for, let's say, 3 years and I make changes to it, it takes much longer to climb up the rankings. Has anyone else noticed this, and why is that?
Intermediate & Advanced SEO | seoanalytics
-
Ecommerce category pages
Hi there, I've been thinking a lot about this lately. I work on a lot of webshops made by the same company. I don't like to say this, but not all of their shops perform well SEO-wise. They use a filtering system that occasionally creates hundreds to thousands of category pages.

Basically what happens is this: a client that sells fashion has a site (www.client.com) with main categories like 'Men', 'Women', 'Kids', and 'Sale'. When you click on 'Men' in the main navigation, you get www.client.com/men/. Then you can filter by brand, subcategory, or color, so you get www.client.com/men/brand. The URL follows the order in which you filter, so you can also reach 'brand' via 'category': www.client.com/shoes/brand. Obviously, this page has the same content as www.client.com/brand/shoes, or even /shoes/brand/black and /men/shoes/brand/black if all of the brand's shoes happen to be black men's shoes.

Currently this is fixed by a dynamic canonical system that canonicalizes the brand/category combinations, so there can be 8,000 URLs on the site that canonicalize to about 4,000 URLs. I have a gut feeling that this is still not a good situation for SEO, and I also believe it would be a lot better to have the filtering system default to a defined order, like /gender/category/brand/color, so you don't even need these excessive amounts of canonicalization. You can canonicalize the whole bunch, but you'd still offer thousands of useless pages for Google to waste its crawl budget on, not to mention the time saved when crawling and analyzing with Screaming Frog or other audit tools. Any opinions on this matter?
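For what it's worth, the defined-order idea is easy to prototype. A minimal sketch in Python, assuming a lookup table from segment values to filter types (the vocabulary below is illustrative, not a real taxonomy): every combination of filters then resolves to exactly one URL, so the duplicates never exist in the first place and canonical tags become a safety net rather than the fix.

```python
# Emit filter segments in one fixed order, regardless of the order
# in which the visitor applied the filters.
SEGMENT_ORDER = ["gender", "category", "brand", "color"]
VOCABULARY = {  # illustrative values only
    "men": "gender", "women": "gender", "kids": "gender",
    "shoes": "category", "jackets": "category",
    "nike": "brand", "adidas": "brand",
    "black": "color", "red": "color",
}

def canonical_path(segments):
    """Map each segment to its filter type, then rebuild the path
    in the defined order."""
    by_type = {VOCABULARY[s]: s for s in segments if s in VOCABULARY}
    return "/" + "/".join(by_type[t] for t in SEGMENT_ORDER if t in by_type)

# Both filter orders collapse to the same URL:
print(canonical_path(["shoes", "nike", "men"]))  # /men/shoes/nike
print(canonical_path(["men", "nike", "shoes"]))  # /men/shoes/nike
```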
Intermediate & Advanced SEO | Adriaan.Multiply
-
How many times will Google read a page?
Hello! Do you know if Google reads a page more than once? We want to include a very robust menu that has a lot of links, so we were thinking about coding a very simple page that loads first, then immediately loading the other code that has all the links, thinking that perhaps Google will only read the first version and won't read it the second time with all the links. Do you know if we would get penalized? I'm not sure if I got the idea across; let me know if I need to expand more. Thanks,
Intermediate & Advanced SEO | alinaalvarez
-
Should I use individual product pages for different formats of the same product?
Hi All -- I'm working with a publishing client who is launching a new site. They have a large product catalogue offered in a number of format types (print, ebook, online learning, packages), with each format possessing a unique ISBN. From past experience, I know that ISBNs can be a really important ranking factor. We are currently trying to sort out product page guidelines. The proposed methods are:

1. A single product page for all formats. The user then has the option to select which format they wish to purchase. The page would contain all key descriptors for each format, including individual ISBN, format, title, price, author, etc. We would then use schema markup to assist search engines with understanding and crawling. BUT we worry that the single page won't rank as well as, say, an individual product page with a unique ISBN in the URL (for example: http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470573325.html). Which leads to the next option...

2. Individual URLs for each format. We understand that most e-commerce guidelines state you shouldn't dilute link equity among multiple pages with very similar products and descriptions. BUT we want searchers to be able to search by individual ISBN and still find that specific format in the SERPs. This seems to rule out canonicalizing, because we don't prefer one format over the other and still want, say, the ebook to show up as much as the print version.

If anyone has any other options or considerations that we haven't thought about, it would be greatly appreciated. Thanks, U
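For option 1, one way to keep every ISBN machine-readable on a single page is schema.org's Book/workExample pattern, where each format is modeled as an edition with its own isbn and bookFormat. A minimal sketch in Python that emits the JSON-LD (the title, author, and ISBNs are placeholders):

```python
import json

# Placeholder data - not real ISBNs, titles, or authors.
book = {
    "@context": "https://schema.org",
    "@type": "Book",
    "name": "Example Title",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "workExample": [
        {
            "@type": "Book",
            "isbn": "9780000000001",
            "bookFormat": "https://schema.org/Hardcover",
        },
        {
            "@type": "Book",
            "isbn": "9780000000002",
            "bookFormat": "https://schema.org/EBook",
        },
    ],
}

# JSON-LD block to embed in the product page's <head>:
print('<script type="application/ld+json">')
print(json.dumps(book, indent=2))
print("</script>")
```

Whether a combined page outranks per-format pages is still an open question, but this at least keeps each ISBN individually addressable by search engines.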
Intermediate & Advanced SEO | HarborOneBank
-
Google stripping down Page Titles
When viewing pages indexed by Google, I've noticed the page titles have been stripped down as follows:
Actual Page Title: CITY Keyword - STATE keyword
Google Indexed Page Title: 1 - Domain.com
None of the keywords in the actual page title are present; all the words have been replaced with a random digit and domain.com. We launched a new version of the site several months back. Any idea on what could be causing this?
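Before assuming Google is rewriting the titles, it is worth verifying what the relaunched site actually serves. A minimal sketch in Python (the URL is a placeholder) that extracts the title exactly as the server delivers it, so you can compare it with what the browser and Google show:

```python
import re
import urllib.request

def served_title(url: str) -> str:
    """Return the <title> from the raw server response, before any
    JavaScript runs."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return match.group(1).strip() if match else ""

# Placeholder URL. If this differs from what the browser tab shows, the
# title is being injected by JavaScript, which Google may not pick up.
# print(served_title("https://www.example.com/city-state-page"))
```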
Intermediate & Advanced SEO | SpadaMan
-
How to prevent Google from crawling our product filter?
Hi All, We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, and the number of possible filter URLs is virtually limitless. In order to prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow, and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it; however, it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin
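One note that may help frame answers: noindex directives affect indexing, not crawling, so they do not stop Googlebot from fetching the URLs. Keeping the crawler out entirely usually means robots.txt, and Google (unlike the original robots.txt standard) honors * wildcards there, so a rule like Disallow: /*? can block every URL that carries a query string. A small Python sketch that simulates that wildcard matching against sample URLs (the rule and paths are illustrative; verify real behavior with Google's own robots.txt testing tool):

```python
import re

# Candidate robots.txt rule. Google supports '*' wildcards; the original
# robots.txt spec (and some other crawlers) do not, so test per crawler.
DISALLOW_PATTERNS = ["/*?"]  # block any path containing a query string

def is_blocked(path):
    """Rough simulation of Google's wildcard matching for Disallow rules."""
    for pattern in DISALLOW_PATTERNS:
        regex = re.escape(pattern).replace(r"\*", ".*")
        if re.match(regex, path):
            return True
    return False

print(is_blocked("/sneakers?merk=nike&kleur=zwart"))  # True - filter URL
print(is_blocked("/sneakers"))                        # False - clean URL
```

Note that URLs blocked this way can still appear in the index without content if they are linked externally; robots.txt controls crawling, not indexing.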
Intermediate & Advanced SEO | footsteps