How to index e-commerce marketplace product pages
-
Hello!
We are an online marketplace that submitted our sitemap through Google Search Console 2 weeks ago. Although the sitemap was submitted successfully, out of ~10,000 links (we have ~10,000 product pages), only 25 have been indexed.
I've attached images of the reasons Search Console gives for not indexing the pages.
How would we go about fixing this?
-
To get your e-commerce marketplace product pages indexed, make sure your pages include unique and descriptive titles, meta descriptions, relevant keywords, and high-quality images. Additionally, optimize your URLs, leverage schema markup, and prioritize user experience for increased search engine visibility.
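To make the schema markup suggestion concrete, below is a minimal sketch that builds schema.org Product structured data as a JSON-LD script tag you could render into each product page template. All product values here are placeholders; substitute your real product attributes.

```python
import json

def product_jsonld(name, description, sku, price, currency, image_url,
                   availability="https://schema.org/InStock"):
    """Build a schema.org Product JSON-LD block for one product page."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "sku": sku,
        "image": [image_url],
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
            "availability": availability,
        },
    }
    # Wrap in a <script> tag so it can be dropped into the page template's <head>.
    return '<script type="application/ld+json">' + json.dumps(data, indent=2) + "</script>"

print(product_jsonld(
    name="Example Widget",  # placeholder product data
    description="A short, unique description of this specific widget.",
    sku="WIDGET-001",
    price=19.99,
    currency="USD",
    image_url="https://www.example.com/images/widget.jpg",
))
```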
-
@fbcosta I have this problem too, but it affects far fewer pages on my site.
پوشاک پاپیون -
I'd appreciate it if someone who has faced the same indexing issue could come forward and share a case study with fellow members. Pinpoint the steps someone should take to overcome the indexing dilemma. What actionable steps enable quick product indexing? How can we get Google's attention so it starts indexing pages at a faster pace? Actionable advice, please.
-
There could be several reasons why only 25 out of approximately 10,000 links have been indexed by Google, despite successfully submitting your sitemap through Google Search Console:
Timing: It is not uncommon for indexing to take some time, especially for larger sites with many pages. Although your sitemap has been submitted, it may take several days or even weeks for Google to crawl and index all of your pages. It's worth noting that not all pages on a site may be considered important or relevant enough to be indexed by Google.
Quality of Content: Google may not index pages that it considers low-quality, thin or duplicate content. If a significant number of your product pages have similar or duplicate content, they may not be indexed. To avoid this issue, make sure your product pages have unique, high-quality content that provides value to users.
Technical issues: Your site may have technical issues that are preventing Google from crawling and indexing your pages, such as problems with your site's architecture or other barriers that affect how pages are crawled and indexed.
Inaccurate Sitemap: There is also a possibility that there are errors in the sitemap you submitted to Google. Check the sitemap to ensure that all the URLs are valid, the sitemap is up to date and correctly formatted.
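As a quick way to act on this point, the sketch below (standard library only; the sitemap URL is a placeholder) downloads the sitemap, counts the listed URLs, and flags entries that are not absolute https URLs.

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder - use your own sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def load_sitemap_urls(sitemap_url):
    """Download a standard XML sitemap and return the <loc> values it lists."""
    with urllib.request.urlopen(sitemap_url) as resp:
        root = ET.fromstring(resp.read())
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

urls = load_sitemap_urls(SITEMAP_URL)
print(f"{len(urls)} URLs listed in the sitemap")

# Flag obviously malformed entries (relative paths, wrong scheme, stray whitespace).
for url in urls:
    if not url.startswith("https://"):
        print("Suspicious entry:", url)
```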
To troubleshoot this issue, you can check your site's coverage report on Google Search Console, which will show you which pages have been indexed and which ones haven't. You can also check your site's crawl report to see if there are any technical issues that may be preventing Google from crawling your pages. Finally, you can also run a site audit to identify and fix any technical issues that may be impacting indexing.
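For the site-audit side, a rough spot-check like the following (using the requests library; the sample URLs are placeholders) fetches a few product pages and reports the signals that most commonly block indexing: a non-200 status code, a robots noindex directive, or a canonical tag pointing at a different URL.

```python
import re
import requests

SAMPLE_URLS = [
    "https://www.example.com/product/1",  # placeholders - substitute real product URLs
    "https://www.example.com/product/2",
]

for url in SAMPLE_URLS:
    r = requests.get(url, timeout=15, headers={"User-Agent": "index-spot-check/0.1"})
    html = r.text

    # Rough regex checks only; attribute order can vary, so a real crawler or
    # HTML parser is more reliable for a full audit.
    noindex = bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))
    canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)

    print(url)
    print("  status:", r.status_code)
    print("  X-Robots-Tag header:", r.headers.get("X-Robots-Tag", "none"))
    print("  meta noindex:", noindex)
    print("  canonical:", canonical.group(1) if canonical else "none")
```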
-
@fbcosta In my experience, if your site is new it will take some time for all of the URLs to be indexed. And even if you have hundreds of URLs, that doesn't mean Google will index all of them.
You can try these steps, which can help speed up indexing:
- Sharing on Social Media
- Interlinking from already indexed Pages
- Sitemap (see the sketch below)
- Share the link on your verified Google Business Profile (one of the fastest ways to get a page indexed). You can add products or create a post that links to the website.
- Guest post
This is my first time writing here; I hope it helps.
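Following up on the "Sitemap" item in the list above: if you want to confirm from code that your sitemap is actually registered and processed rather than just uploaded once, a sketch along these lines uses the Search Console API. This assumes the google-api-python-client and google-auth packages and a service account that has been added as a user on the Search Console property; the site and sitemap URLs are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Service-account JSON key; the account's email must be added as a user on the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://www.example.com/"                 # placeholder Search Console property
SITEMAP = "https://www.example.com/sitemap.xml"   # placeholder sitemap URL

# (Re)submit the sitemap, then list what Search Console currently knows about it.
service.sitemaps().submit(siteUrl=SITE, feedpath=SITEMAP).execute()
for sm in service.sitemaps().list(siteUrl=SITE).execute().get("sitemap", []):
    print(sm.get("path"), "pending:", sm.get("isPending"), "errors:", sm.get("errors"))
```

The listing also exposes per-sitemap warning and error counts, which is a quick way to spot formatting problems.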
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. (HTTP/2) version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to only access via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still has the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to only go through HTTP/1.1 and not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
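A quick way to verify the server side of this, independent of what Googlebot chooses to do, is to check which protocol your server actually negotiates. A minimal sketch using the httpx library (installed with HTTP/2 support, e.g. pip install "httpx[http2]"; the URL is a placeholder):

```python
import httpx

URL = "https://www.example.com/"  # placeholder - use your homepage URL

# http2=True lets the client offer HTTP/2 during the TLS handshake (ALPN);
# the server picks the protocol, which httpx then reports.
with httpx.Client(http2=True) as client:
    response = client.get(URL)

print("Negotiated protocol:", response.http_version)  # e.g. "HTTP/2" or "HTTP/1.1"
print("Status:", response.status_code)
```

Note that even with HTTP/2 confirmed working, Google has said Googlebot decides per site whether crawling over HTTP/2 is worthwhile, so a correctly configured server does not guarantee h2 requests in your logs.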
Technical SEO | AKCAC
-
Unsolved Capturing Source Dynamically for UTM Parameters
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!
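One server-side way to sketch this, assuming a Python/Flask stack (the framework, route name and parameter handling here are illustrative assumptions, and a tag-manager or client-side approach reading document.referrer is the more common route): fill utm_source from the Referer header with a single redirect before the analytics tag fires.

```python
from urllib.parse import urlencode, urlsplit
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/syndicated-article")  # hypothetical landing page for syndicated content
def syndicated_article():
    # If campaign tagging is present but utm_source is missing, fill it from the
    # referring site's hostname and redirect once so analytics sees the full URL.
    if request.args.get("utm_medium") and not request.args.get("utm_source"):
        referrer_host = urlsplit(request.referrer or "").netloc or "direct"
        params = request.args.to_dict()
        params["utm_source"] = referrer_host
        return redirect(f"{request.path}?{urlencode(params)}", code=302)
    return "Article content goes here."  # placeholder for the real page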
Technical SEO | peteboyd
-
301 Redirects from example.com to store.example.com and then removing store.example.com subdomain
Hi, I'm trying to wrap my head around the best approach for migrating our website. We're migrating from our example.com (Joomla) site to our existing store.example.com (Shopify) site... with the plan to finish the redirects/migration, then remove the subdomain from Shopify and use example.com moving forward. I've never done this and am asking here to see if any harm will come from redirecting example.com URLs to store.example.com URLs, then changing the store.example.com URLs to example.com. Right now my plan would run like this:
- redirect example.com URLs to store.example.com
- remove the subdomain on store.example.com
- use example.com moving forward
- wonder what happens next?
Are there going to be any issues here, or possible harm to the URLs?
Technical SEO | Minarets
-
Blogs Not Getting Indexed Intermittently - Why?
Over the past 5 months many of our clients are having indexing issues for their blog posts.
A blog from 5 months ago could be indexed, and a blog from 1 month ago could be indexed, but blogs from 4, 3 and 2 months ago aren't indexed. It isn't consistent and there is no commonality across all of these clients that would point to why this is happening. We've checked sitemaps, robots, canonical issues and internal linking, combed through Search Console, run Moz reports, run SEMrush reports (sorry Moz), but can't find anything. We are now manually submitting URLs to be indexed to try and ensure they get into the index. Search Console reports for many of the URLs will show that the blog has been fetched and crawled, but not indexed (with no errors). In some cases we find that the blog's paginated pages (i.e. blog/page/2, blog/page/3, etc.) are getting indexed but not the blogs themselves. There aren't any nofollow tags on the links going to the blogs either. Any ideas? (I've added a screenshot of one of the URL inspection reports from Search Console.)
Technical SEO | JohnBracamontes
-
Optimizing shop content for desktop and mobile users
When arranging content on a shop category page I place a descriptive, optimized opening paragraph of text above the products. On desktop this shows both the opening text and the products above the fold (visible here https://www.scamblermusic.com/royalty-free-music-downloads/ - also shown on the screen grab below). The text may well be ignored by most visitors (who will likely be drawn straight to product images) but it still serves a purpose. When it comes to smaller mobile screens I have started to disable the opening paragraph of text (above the products) and instead place a copy of it below the products (screen grab below). This keeps the optimized text on the page, but it means that mobile users instantly see products rather than having to scroll past text that they may see as inconvenient. I'm conscious of the fact that Google indexes mobile content first, and it also doesn't like duplicate content. I therefore have three questions relating to this:
- Will moving the optimized text content below all the products to the bottom of the page devalue it (I understand important content should be as near to the top of the page as possible)?
- Although the optimized paragraph of text only displays once on desktop (at the top of the page) and once on mobile (at the bottom of the page), it is actually visible twice in the source code - does this count as duplication, and could it therefore hurt the performance of the page in SERPs?
- If this practice does cause issues, is there an ideal way to optimize content on pages (especially shop category pages) that doesn't require mobile users to scroll through text before seeing products?
Lastly, on topic-optimized landing pages that feature product promotions, such as this one - https://www.scamblermusic.com/royalty-free-music-downloads/music-licensing-scotland/ - I wonder if it is best to lead with an optimized text introduction above product images, or better to place the products right at the top of the page for immediate impact, then follow this with the content/article/blog post? Many thanks for any advice offered.
On-Page Optimization | JCN-SBWD
-
Could a dropdown list of products dilute the page content?
Hi all, on our site, because we only have some 120 or so products split across 5 different categories, we have a dropdown menu that displays all of the products in the menu. Forgetting usability for a moment, my question is whether, by having links to all of the products appear on each and every page (because they are in the main menu), we are diluting the content on the page. For example, if I take a particular product, the main phrase I want that page to be discovered for is "perspex sheet". This phrase does appear in the H1, H2 and within the main description of the product - but, as mentioned, each of our pages has some 120+ internal links due to the menu, which contain all sorts of product names that aren't relevant to "perspex sheet". The Moz report does flag a Medium issue on every page due to the number of internal links. I don't know whether I'm making a fuss about nothing, or whether this does have some serious side effects. It's an eCommerce site so of course I'm nervous of making changes that could have an adverse effect on our rankings. I thought there used to be a tool on Moz that showed what phrases a page was optimised for but I can no longer find that tool. Any help would be greatly appreciated. Regards,
Al
Technical SEO | SimplyPlastic
-
Pros and Cons of Rel Author on Product Pages
I've heard that having rel=author enabled on your pages can be great for increasing click through rate but you should not use it on every page on your site. What are the pros and cons of using rel=author on product pages? Do you use rel=author on your product pages or just on your blog articles?
Technical SEO | Charlessipe
-
How to verify a page-by-page level 301 redirect was done correctly?
Hello, I told some tech guys to do a page-by-page relevant 301 redirect (as talked about in Matt Cutts' video https://www.youtube.com/watch?v=r1lVPrYoBkA) when a company wanted to move to a new domain while their site was getting redesigned. I found out they did a 302 redirect by accident and had to fix that, so now I don't trust that they did the page-by-page relevant redirect. I have a feeling they just redirected all of the pages on the old domain to the homepage of the new domain. How could I confirm this suspicion? I ran the old domain through Screaming Frog and it only shows 1 URL - the homepage. Does that mean they took all of the pages on the old domain offline? Thanks!
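One way to confirm the suspicion is to request each old URL without following redirects and compare the status code and Location header against the page it should map to. A rough sketch with the requests library (the URL pairs are placeholders for your own old-to-new mapping):

```python
import requests

# Placeholder mapping of old URLs to the new URLs they should 301 to.
EXPECTED = {
    "https://old-domain.com/widgets/blue-widget": "https://new-domain.com/widgets/blue-widget",
    "https://old-domain.com/about-us": "https://new-domain.com/about-us",
}

for old_url, expected_new in EXPECTED.items():
    r = requests.get(old_url, allow_redirects=False, timeout=15)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location.rstrip("/") == expected_new.rstrip("/")
    print(f"{old_url} -> {r.status_code} {location or '(no Location header)'} "
          f"{'OK' if ok else 'CHECK MANUALLY'}")
```

Screaming Frog's list mode can do the same check if you paste in the old URLs; a normal crawl of the old domain only finds the homepage because every request is redirected off the domain before any internal links can be discovered.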
Technical SEO | EvolveCreative