My product category pages are not being indexed on Google - can someone help?
-
My website has been indexed on Google and all of its pages can be found there except for the product category pages - which are exactly where we want our traffic to land, so this is a big problem for us.
Our website is www.skirtinguk.com
And an example of a page that isn't being indexed is https://www.skirtinguk.com/product-category/mdf-skirting-board/
-
Hi,
I'm also having the same issue with this category, please:
https://artistsbloc.org/celebrity-biographies/
-
This is probably more of a ranking authority problem than an indexation problem. If you can get Google to return one of your category URLs within its search results, then it's highly likely the page is indeed indexed (it's just not ranking very well for its associated keywords).
Follow this link:
https://www.google.co.uk/search?q=site%3Askirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
As you can see, the category URL you referenced is indexed - Google returns it in their search results!
Although Google know the page exists and it is in their index, they don't bother to keep a cache of the URL: http://webcache.googleusercontent.com/search?q=cache:https%3A%2F%2Fwww.skirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
This probably means that they don't think many people use the page or that it is of low value.
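If you want to repeat that manual check across a batch of category URLs, here's a minimal Python sketch (my own illustration, not a Moz or Google tool) that simply builds the same site: and cache: lookup URLs for you to open in a browser. It deliberately stops at constructing the URLs, since scraping Google's results programmatically is against their terms of service:

```python
from urllib.parse import quote

def google_check_urls(page_url):
    """Build the manual 'is it indexed?' check URLs used above.

    These only construct query URLs for you to open in a browser;
    scraping Google results programmatically is against their ToS.
    """
    return {
        "site_query": "https://www.google.co.uk/search?q=" + quote("site:" + page_url, safe=""),
        "cache_lookup": "http://webcache.googleusercontent.com/search?q=" + quote("cache:" + page_url, safe=""),
    }

for name, url in google_check_urls(
        "https://www.skirtinguk.com/product-category/mdf-skirting-board/").items():
    print(name, url)
```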
What you have to keep in mind is that lower-value long-tail terms (like product keywords or part-number keywords) are much easier to rank for. Category terms are worth more in terms of search volume, so competition for them is higher. If your site ranks for product terms but not for category terms, it probably means your authority and/or trust metrics (as well as UX metrics) are lower. Remember: Google don't consider their ranking results to be advertising space for lots of companies. They want to return the best results possible for the end user (that way people keep 'Googling', and Google continue to earn revenue from Google AdWords etc.).
Let's look at your site's domain-level metrics and see if they paint a picture of an 'authoritative' site which should be ranking for such terms...
Domain Level Metrics from Moz
Domain Authority: 24 (low)
Total Inbound Links: 1,200+
Total Referring Domains (much more important than total link count!): 123 - This is too many links from too few domains IMO
Ranking keywords: 38
Domain Level Metrics from Ahrefs
Homepage URL Rating: 11 (very low)
Domain Rating: 11 (very low)
Total Inbound Links: 2,110+
Referring Domains: 149 - Again, the disparity here could be causing problems! Not a diverse backlink profile
Ranking Keywords: 374 (Ahrefs usually finds more, go with this figure)
SEO Traffic Insights: between 250 and 380 visits a day from SEO on average; not much traffic at all from SEO before November 2016, when things improved significantly
SEMrush Traffic Insights (to compare against Ahrefs): estimates between 100 and 150 visits from SEO per day, although this is narrowed to the UK only. It seems to tally with what Ahrefs is saying; the Ahrefs data is probably more accurate
Domain Level Metrics from Majestic SEO
Trust Flow: 5 - This is extremely low and really bad! Basically Majestic measure how close your site is, in link terms, to a seed set of trusted sites. A low number (it's on a scale of 0 to 100) indicates that trustworthy seed sites aren't linking to you, either directly or through the sites they link on to
Citation Flow: 24 - low, but not awful
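To make the "too many links from too few domains" point concrete, here's a quick back-of-the-envelope calculation (my own illustration, using the rounded figures quoted above). A high links-per-domain ratio usually means the backlink profile isn't very diverse:

```python
# Figures rounded from the Moz and Ahrefs metrics quoted above
moz_links, moz_domains = 1200, 123
ahrefs_links, ahrefs_domains = 2110, 149

# Roughly 10-14 links for every referring domain - not a diverse profile
print(f"Moz:    {moz_links / moz_domains:.1f} links per referring domain")
print(f"Ahrefs: {ahrefs_links / ahrefs_domains:.1f} links per referring domain")
```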
What do I get from all of this info?
I don't think your site is doing enough digital PR, or making 'enough of a difference to the web', to rank highly for category-related terms. Certainly the site looks very drab and 'cookie-cutter' in terms of the template. It doesn't instil a sense of pride in the business behind the website. That can put people off linking to you, which can cause your SEO authority to fall flat on its face, leaving you with no ranking power.
A lot of the product images look as if they are fake, which probably isn't helping. They actually look a lot like ads, which often appear a bit cartoony or CGI-generated, with a balance between blue and white (colour deployment). Maybe they're being misinterpreted as spam under Google's PLA (Page Layout Algorithm). Design is not helping you out at all, I'm afraid!
So who is ranking for MDF skirting board? The top non-PPC (ad-based) result on Google.co.uk is this one:
https://skirtingboardsdirect.com/products/category/mdf-skirting-boards/
OK, so their content is better and deeper than yours (bullet-pointed specs or stats often signal 'granular' content to Google, which Google really likes - your content is just one solid paragraph). Overall, though, I'd actually say their design is awful! It's worse than the design of your site (so maybe design isn't such a big factor here after all).
Let's compare some top-line SEO authority metrics for your site against those earned by this competitor:

| Metric | skirtinguk.com | skirtingboardsdirect.com | Difference |
| --- | --- | --- | --- |
| Moz Domain Authority | 24 | 33 | +9 |
| Moz Referring Domains | 123 | 464 | +341 |
| Ahrefs Homepage URL Rating | 11 | 31 | +20 |
| Ahrefs Domain Rating | 11 | 65 | +54 |
| Ahrefs Referring Domains | 149 | 265 | +116 |
| Majestic Trust Flow | 5 | 29 | +24 |
| Majestic Citation Flow | 24 | 30 | +6 |
They beat you in all the important areas! That's not good.
Your category-level URLs aren't meta noindexed or blocked in the robots.txt file. Since we have found evidence that Google are in fact indexing your category-level URLs, it's actually a ranking / authority problem cleverly disguised as an indexation issue (I can see why you assumed that). These pages aren't **good enough** to rank well for keywords which Google know hold lucrative financial value. Only the better (or more authoritative) sites will rank there.
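For anyone who wants to verify the same thing on their own category URLs, here's a rough Python sketch (standard library only, my own illustration - it doesn't check the X-Robots-Tag HTTP header and uses a simplified regex rather than a proper HTML parser) that reports whether a URL is blocked in robots.txt or carries a meta robots noindex tag:

```python
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlsplit

def check_indexability(url, user_agent="Googlebot"):
    """Rough check: is the URL blocked by robots.txt or a meta robots noindex tag?"""
    parts = urlsplit(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"

    # robots.txt check
    rp = urllib.robotparser.RobotFileParser(robots_url)
    rp.read()
    allowed = rp.can_fetch(user_agent, url)

    # meta robots check (simplified: assumes name="robots" comes before content="...")
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0 (indexability check)"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    noindex = bool(meta and "noindex" in meta.group(1).lower())

    return {"robots_txt_allows_crawl": allowed, "meta_noindex": noindex}

print(check_indexability("https://www.skirtinguk.com/product-category/mdf-skirting-board/"))
```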
A main competitor has similar design standards but has slightly deeper content and much more SEO authority than you do. The same is probably true for other competing sites. In SEO, you have to fight to maintain your positions. Sitting back is equivalent to begging your competitors to steal all of your traffic...
Hope this analysis helps!