Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
My product category pages are not being indexed on Google. Can someone help?
-
My website has been indexed on Google and all of its pages can be found there except for the product category pages, which are exactly where we want our traffic to go, so this is a big problem for us.
Our website is www.skirtinguk.com
And an example of a page that isn't being indexed is https://www.skirtinguk.com/product-category/mdf-skirting-board/
-
Hi
I'm also having the same issue with this category, please:
https://artistsbloc.org/celebrity-biographies/
This is probably more of a ranking authority problem than an indexation problem. If you can force Google to render one of your category URLs within its search results, then it's highly likely the page is indeed indexed (it's just not ranking very well for associated keywords).
Follow this link:
https://www.google.co.uk/search?q=site%3Askirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
As you can see, the category URL which you referenced is indexed. Google can render it within their search results!
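If you want to repeat this check for other URLs, a `site:` query like the one linked above can be built programmatically. This is just a small sketch using the standard library (the query construction is the point here; actually scraping Google's results is against their terms, so paste the URL into a browser instead):

```python
from urllib.parse import urlencode

def site_query_url(page_url: str) -> str:
    """Build a Google UK 'site:' search URL for checking whether a page is indexed."""
    # The 'site:' operator restricts results to the given URL prefix;
    # if Google shows a result, the page is in the index.
    return "https://www.google.co.uk/search?" + urlencode({"q": "site:" + page_url})

url = site_query_url("skirtinguk.com/product-category/mdf-skirting-board/")
print(url)
```

Open the printed URL in a browser: a result appearing confirms indexation, which reframes the problem as one of ranking rather than crawling.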
Although Google know the page exists and it is in their index, they don't bother to keep a cache of the URL: http://webcache.googleusercontent.com/search?q=cache:https%3A%2F%2Fwww.skirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
This probably means that they don't think many people use the page or that it is of low value.
What you have to keep in mind is that lower-value long-tail terms (like product keywords or part-number keywords) are much easier to rank for. Category terms are worth more in terms of search volume, so competition for them is higher. If your site ranks for product terms but not for category terms, it probably means your authority and/or trust metrics (as well as UX metrics) are lower. Remember: Google don't consider their ranking results to be a space to advertise lots of companies. They want to render the best results possible for the end-user (that way people keep 'Googling' and Google continue to leverage revenue from Google AdWords etc.).
Let's look at your site's domain-level metrics and see if they paint a picture of an 'authoritative' site which should be ranking for such terms...
Domain Level Metrics from Moz
Domain Authority: 24 (low)
Total Inbound Links: 1,200+
Total Referring Domains (much more important than total link count!): 123 - This is too many links from too few domains IMO
Ranking keywords: 38
Domain Level Metrics from Ahrefs
Homepage URL Rating: 11 (very low)
Domain Rating: 11 (very low)
Total Inbound Links: 2,110+
Referring Domains: 149 - Again, the disparity here could be causing problems! Not a diverse backlink profile
Ranking Keywords: 374 (Ahrefs usually finds more, go with this figure)
SEO Traffic Insights: Between 250 and 380 visits (from SEO) a day on average, not much traffic at all from SEO before November 2016 when things improved significantly
SEMrush Traffic Insights (to compare against Ahrefs): Estimates between 100 and 150 visits from SEO per day, though this is narrowed to the UK only. It seems to tally with what Ahrefs is saying; the Ahrefs data is probably more accurate
Domain Level Metrics from Majestic SEO
Trust Flow: 5 - This is extremely low and really bad! Basically, Majestic measure how closely your site is linked from a seed set of trusted sites (on a scale of 0 to 100). A low number indicates that trustworthy seed sites aren't linking to you, directly or indirectly
Citation Flow: 24 - low but not awful
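As a rough illustration of why the link-to-domain disparity flagged above matters, here's how the averages work out from the approximate figures quoted ("1,200+" links over 123 domains from Moz, "2,110+" over 149 from Ahrefs):

```python
def links_per_domain(total_links: int, referring_domains: int) -> float:
    """Average number of inbound links per referring domain.
    A high ratio can indicate a narrow backlink profile (many links
    from the same few sites) rather than broad, diverse endorsement."""
    return total_links / referring_domains

# Approximate figures from the tool exports above
moz_ratio = links_per_domain(1200, 123)      # roughly 10 links per domain
ahrefs_ratio = links_per_domain(2110, 149)   # roughly 14 links per domain
print(round(moz_ratio, 1), round(ahrefs_ratio, 1))
```

Ten-plus links per referring domain is the "too many links from too few domains" pattern: diversity of linking domains generally matters more than raw link count.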
What do I get from all of this info?
I don't think your site is doing enough digital PR, or making 'enough of a difference to the web' to rank highly for category related terms. Certainly the site looks very drab and 'cookie-cutter' in terms of the template. It doesn't instil a sense of pride in the business behind the website. That can put people off linking to you, which can cause your SEO authority to fall flat on its face leaving you with no ranking power.
A lot of the product images look as if they are fake, which probably isn't helping. They actually look a lot like ads, which often look a bit cartoony or CGI-generated, with a balance between blue and white (colour deployment). Maybe they're being misinterpreted as spam due to Google's Page Layout Algorithm. Design is not helping you out at all, I'm afraid!
So who is ranking for MDF skirting board? The top non-PPC (ad-based) result on Google.co.uk is this one:
https://skirtingboardsdirect.com/products/category/mdf-skirting-boards/
Ok so their content is better and deeper than yours (bullet-pointed specs or stats often imply 'granular' content to Google, which Google really likes - your content is just one solid paragraph). Overall though, I'd actually say their design is awful! It's worse than the design of your site (so maybe design isn't such a big factor here after all).
Let's compare some top-line SEO authority metrics on your site against those earned by this competitor
- Domain Authority from Moz: 24
- Referring Domains from Moz: 123
- Ahrefs Homepage URL Rating: 11
- Ahrefs Domain Rating: 11
- Ahrefs Referring Domains: 149
- Majestic SEO Trust Flow: 5
- Majestic SEO Citation Flow: 24
Now the other site...
- Domain Authority from Moz: 33 (+9)
- Referring Domains from Moz: 464 (+341)
- Ahrefs Homepage URL Rating: 31 (+20)
- Ahrefs Domain Rating: 65 (+54)
- Ahrefs Referring Domains: 265 (+116)
- Majestic SEO Trust Flow: 29 (+24)
- Majestic SEO Citation Flow: 30 (+6)
They beat you in all the important areas! That's not good.
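For what it's worth, the gap in each metric can be tallied with a few lines of Python (figures copied from the tool exports above, ordered by the size of the deficit):

```python
# Snapshot of the third-party metrics quoted above
yours = {
    "Moz Domain Authority": 24,
    "Moz Referring Domains": 123,
    "Ahrefs URL Rating": 11,
    "Ahrefs Domain Rating": 11,
    "Ahrefs Referring Domains": 149,
    "Majestic Trust Flow": 5,
    "Majestic Citation Flow": 24,
}
competitor = {
    "Moz Domain Authority": 33,
    "Moz Referring Domains": 464,
    "Ahrefs URL Rating": 31,
    "Ahrefs Domain Rating": 65,
    "Ahrefs Referring Domains": 265,
    "Majestic Trust Flow": 29,
    "Majestic Citation Flow": 30,
}

# Positive gap = competitor is ahead on that metric
gaps = {metric: competitor[metric] - yours[metric] for metric in yours}
for metric, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{metric}: +{gap}")
```

Every gap comes out positive, which is exactly the "beaten in all the important areas" picture.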
Your category-level URLs aren't meta noindexed or blocked in the robots.txt file. Since we have found evidence that Google are in fact indexing your category-level URLs, it's actually a ranking / authority problem, cleverly disguised as an indexation issue (I can see why you assumed that). These pages aren't **good enough** to rank prominently on Google for keywords which they know hold lucrative financial value. Only the better sites (or the more authoritative ones) will rank there.
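For completeness, the two "is it blocked?" checks described above (robots.txt and the robots meta tag) can be sketched like this. It's a simplified illustration only; a real audit would also check X-Robots-Tag HTTP headers and canonical tags:

```python
import re
import urllib.robotparser

def is_blocked_by_robots(robots_txt: str, path: str, agent: str = "Googlebot") -> bool:
    """Check a path against a robots.txt body (fetched separately)."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, path)

def has_noindex_meta(html: str) -> bool:
    """Crude regex check for a robots meta noindex directive in page HTML."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Hypothetical robots.txt that only blocks the checkout area
robots = "User-agent: *\nDisallow: /checkout/\n"
print(is_blocked_by_robots(robots, "/product-category/mdf-skirting-board/"))  # False
print(has_noindex_meta('<meta name="robots" content="noindex,follow">'))      # True
```

If both checks come back clean, as they do for these category URLs, the problem lies in ranking signals, not in crawl or index directives.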
A main competitor has similar design standards but has slightly deeper content and much more SEO authority than you do. The same is probably true for other competing sites. In SEO, you have to fight to maintain your positions. Sitting back is equivalent to begging your competitors to steal all of your traffic...
Hope this analysis helps!