Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
My product category pages are not being indexed on Google. Can someone help?
My website has been indexed on Google and all of its pages can be found there, except for the product category pages - which are exactly where we want our traffic heading, so this is a big problem for us.
Our website is www.skirtinguk.com
And an example of a page that isn't being indexed is https://www.skirtinguk.com/product-category/mdf-skirting-board/
Hi,
I'm also having the same issue on this category, please:
https://artistsbloc.org/celebrity-biographies/
This is probably a ranking authority problem rather than an indexation problem. If you can get Google to render one of your category URLs within its search results, then it's highly likely the page is indeed indexed (it's just not ranking very well for associated keywords).
Follow this link:
https://www.google.co.uk/search?q=site%3Askirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
As you can see, the category URL which you referenced is indexed. Google can render it within their search results!
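If you want to repeat that check for other URLs, the `site:` query above is just a percent-encoded search URL. Here's a minimal Python sketch that builds one (the encoding is standard; only the example path comes from this thread):

```python
from urllib.parse import quote

def site_query_url(path):
    """Build a Google site: search URL to check whether a URL is indexed.

    If Google returns the page for this query, it is indexed; ranking for
    actual keywords is a separate question.
    """
    return "https://www.google.co.uk/search?q=" + quote("site:" + path, safe="")

# Reproduces the link above
print(site_query_url("skirtinguk.com/product-category/mdf-skirting-board/"))
```

Swap in any category path on your own site to spot-check it the same way.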
Although Google know the page exists and it is in their index, they don't bother to keep a cache of the URL: http://webcache.googleusercontent.com/search?q=cache:https%3A%2F%2Fwww.skirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
This probably means that they don't think many people use the page or that it is of low value.
What you have to keep in mind is that lower-value long-tail terms (like product keywords or part-number keywords) are much easier to rank for. Category terms are worth more in terms of search volume, so competition for them is higher. If your site ranks for product terms but not for category terms, it probably means your authority and/or trust metrics (as well as UX metrics) are lower. Remember: Google don't consider their ranking results to be a space to advertise lots of companies. They want to render the best results possible for the end-user (that way people keep 'Googling' and Google continue to leverage revenue from Google AdWords etc.).
Let's look at your site's domain-level metrics and see if they paint a picture of an 'authoritative' site which should be ranking for such terms...
Domain Level Metrics from Moz
Domain Authority: 24 (low)
Total Inbound Links: 1,200+
Total Referring Domains (much more important than total link count!): 123 - This is too many links from too few domains IMO
Ranking keywords: 38
Domain Level Metrics from Ahrefs
Homepage URL Rating: 11 (very low)
Domain Rating: 11 (very low)
Total Inbound Links: 2,110+
Referring Domains: 149 - Again, the disparity here could be causing problems! Not a diverse backlink profile
Ranking Keywords: 374 (Ahrefs usually finds more, go with this figure)
SEO Traffic Insights: between 250 and 380 visits a day on average from SEO; there was not much traffic at all from SEO before November 2016, when things improved significantly
SEMrush Traffic Insights (to compare against Ahrefs): estimates between 100 and 150 visits from SEO per day, though this is narrowed to the UK only. It seems to tally with what Ahrefs is saying; the Ahrefs data is probably more accurate
Domain Level Metrics from Majestic SEO
Trust Flow: 5 - This is extremely low and really bad! Basically Majestic track the number of clicks from a seed set of trusted sites, to your site. A low number (it's on a scale of 0 to 100 I think) indicates that trustworthy seed sites aren't linking to you, or that where you are linked - people avoid clicking a link to your site (or visiting it)
Citation Flow: 24 - low, but not awful
What do I get from all of this info?
I don't think your site is doing enough digital PR, or making 'enough of a difference to the web' to rank highly for category related terms. Certainly the site looks very drab and 'cookie-cutter' in terms of the template. It doesn't instil a sense of pride in the business behind the website. That can put people off linking to you, which can cause your SEO authority to fall flat on its face leaving you with no ranking power.
A lot of the product images look as if they are fake, which probably isn't helping. They actually look a lot like ads, which often appear a bit cartoony or CGI-generated, with a balance of blue and white (colour deployment). Maybe they're being misinterpreted as spam due to Google's PLA (Page Layout Algorithm). Design is not helping you out at all, I'm afraid!
So who is ranking for MDF skirting board? The top non-PPC (ad-based) result on Google.co.uk is this one:
https://skirtingboardsdirect.com/products/category/mdf-skirting-boards/
OK, so their content is better and deeper than yours (bullet-pointed specs or stats often imply 'granular' content to Google, which Google really likes - your content is just one solid paragraph). Overall, though, I'd actually say their design is awful! It's worse than the design of your site (so maybe design isn't such a big factor here after all).
Let's compare some top-line SEO authority metrics on your site against those earned by this competitor
- Domain Authority from Moz: 24
- Referring Domains from Moz: 123
- Ahrefs Homepage URL Rating: 11
- Ahrefs Domain Rating: 11
- Ahrefs Referring Domains: 149
- Majestic SEO Trust Flow: 5
- Majestic SEO Citation Flow: 24
Now the other site...
- Domain Authority from Moz: 33 (+9)
- Referring Domains from Moz: 464 (+341)
- Ahrefs Homepage URL Rating: 31 (+20)
- Ahrefs Domain Rating: 65 (+54)
- Ahrefs Referring Domains: 265 (+116)
- Majestic SEO Trust Flow: 29 (+24)
- Majestic SEO Citation Flow: 30 (+6)
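If you track these numbers over time, the side-by-side deltas above are easy to recompute. A small sketch, using the figures from this answer as hypothetical hard-coded data (the real tools have APIs, which aren't used here):

```python
# Figures transcribed from the comparison above, not fetched live.
your_site = {
    "Moz Domain Authority": 24,
    "Moz Referring Domains": 123,
    "Ahrefs URL Rating": 11,
    "Ahrefs Domain Rating": 11,
    "Ahrefs Referring Domains": 149,
    "Majestic Trust Flow": 5,
    "Majestic Citation Flow": 24,
}
competitor = {
    "Moz Domain Authority": 33,
    "Moz Referring Domains": 464,
    "Ahrefs URL Rating": 31,
    "Ahrefs Domain Rating": 65,
    "Ahrefs Referring Domains": 265,
    "Majestic Trust Flow": 29,
    "Majestic Citation Flow": 30,
}

# Print each metric with the competitor's lead (a positive delta means they win)
for metric, ours in your_site.items():
    theirs = competitor[metric]
    print(f"{metric}: {ours} vs {theirs} ({theirs - ours:+d})")
```

Rerun it monthly with fresh exports and you can watch whether the gap is closing.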
They beat you in all the important areas! That's not good.
Your category-level URLs aren't meta noindexed or blocked in the robots.txt file. Since we have found evidence that Google are in fact indexing your category-level URLs, it's actually a ranking/authority problem cleverly disguised as an indexation issue (I can see why you assumed that). These pages aren't **good enough** to rank well for keywords which Google knows hold lucrative financial value. Only the better sites (or the more authoritative ones) will rank there.
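For anyone who wants to verify those two things on their own site, the checks are mechanical: does robots.txt disallow the URL, and does the page's HTML carry a robots meta tag with `noindex`? A rough Python sketch using only the standard library (it checks the meta tag and robots.txt as described; real crawlers also honour the `X-Robots-Tag` HTTP header and meta tags with reversed attribute order, which this simple regex doesn't cover):

```python
import re
import urllib.robotparser

def has_noindex_meta(html):
    """True if the HTML contains <meta name="robots" ...> with 'noindex'.

    Assumes the name attribute comes before content; a sketch, not a parser.
    """
    return bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        html, re.IGNORECASE))

def blocked_by_robots(robots_txt, url, user_agent="Googlebot"):
    """True if the given robots.txt text disallows crawling the URL."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(user_agent, url)

# Hypothetical inputs for illustration:
page = '<html><head><meta name="robots" content="index,follow"></head></html>'
robots = "User-agent: *\nDisallow: /checkout/\n"

print(has_noindex_meta(page))   # False - no noindex directive
print(blocked_by_robots(robots,
    "https://www.skirtinguk.com/product-category/mdf-skirting-board/"))  # False
```

If both come back `False` for a category URL, indexation isn't being blocked, and you're back to the authority problem described above.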
A main competitor has similar design standards but has slightly deeper content and much more SEO authority than you do. The same is probably true for other competing sites. In SEO, you have to fight to maintain your positions. Sitting back is equivalent to begging your competitors to steal all of your traffic...
Hope this analysis helps!