After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), both new posts and new replies have been locked.
How to fully index big ecommerce websites (that have deep catalog hierarchy)?
-
When building very large ecommerce sites, the catalog can contain millions of product SKUs and many hierarchical navigation layers (say 7-10) between the home page and those SKUs. Sites like this can be difficult to get substantially indexed.

The issue doesn't appear to be with the product page content. The concern is the 'intermediate' pages: the many navigation layers between the home page and the product pages that a user needs in order to funnel down and find the desired product. There are a lot of these intermediate pages, and they commonly contain just a few menu links and thin or no content. (It's tough to put fresh, unique, quality content on every intermediate page whose only purpose is helping the user navigate a big catalog.)

We've experimented with a noindex, follow directive on these pages. But structurally, a site with many thin intermediate pages seems prone to issues such as shallow site indexing, weak PageRank flow, and crawl budget problems. Any creative suggestions on how to tackle this?
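For reference, the noindex, follow experiment described above is typically implemented with a robots meta tag in the head of each thin intermediate page. A minimal sketch:

```html
<!-- Placed in the <head> of a thin intermediate navigation page:
     asks engines not to index this page, but still to follow its
     links, so discovery and link equity can flow to deeper pages. -->
<meta name="robots" content="noindex, follow">
```

Note that pages excluded this way still consume crawl budget when fetched, which is part of the structural concern raised here.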
-
Yes, the links should come from your own website.
If you have a powerful site, adding sitewide links from your product pages to several logical category pages can be adequate.
If your site is new or not very strong yet, it may be best to grow the number of product pages in steps, as your site becomes able to get them into the index and hold them there. A weak site will probably not be able to get 5,000,000 pages indexed, and if your site is not powerful, attempting it usually results in a ranking decline for the original part of the site.
-
Thanks for the response. To clarify: you're suggesting we link internally from our highest-PageRank pages to pages deep inside the catalog (i.e. product pages)?
-
Link deep into the site, to many different internal hubs, from high-PR pages. That forces spiders into the depths of the site and makes them chew their way out through unindexed pages. These links must remain in place permanently if you want the site to stay in the index: if Google goes too long without spidering a page, it will forget about it.
A mistake people often make is trying to place five million pages on a PR3 website. That will not work; there are not enough spiders coming in. For a site like the one you're describing, you might need many dozens of healthy PR6 links, or hundreds of PR5 links, and quite a bit of prayer. For a site as deep as yours, you might need to link to hubs at multiple depths, because Google does budget the amount of crawling it will perform. The spiders will die down there.
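One way to decide where those deep hub links should point is to model the site's internal links as a graph and compute each page's click depth from the home page with a breadth-first search; pages beyond some depth threshold are candidates for a direct link from a strong page. This is an illustrative sketch over a toy in-memory link graph (the paths and the threshold of 3 are hypothetical), not a real crawler:

```python
from collections import deque

def click_depths(link_graph, start="/"):
    """BFS over internal links; returns each page's click depth from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy catalog: home -> category -> subcategory -> sub-subcategory -> product.
site = {
    "/": ["/mens", "/womens"],
    "/mens": ["/mens/shoes"],
    "/mens/shoes": ["/mens/shoes/running"],
    "/mens/shoes/running": ["/product/sku-123"],
    "/womens": [],
}

depths = click_depths(site)
# Pages deeper than 3 clicks are candidates for a direct hub link.
deep_pages = [p for p, d in depths.items() if d > 3]
print(deep_pages)  # ['/product/sku-123']
```

On a real site, the graph would come from a crawl or a CMS export; the same traversal then shows which parts of the catalog sit too many clicks from any well-linked page.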