Robots.txt file in Shopify - Collection and Product Page Crawling Issue
-
Hi, I'm working on a big eCommerce store with more than 1,000 products. We just moved from WordPress to Shopify and are now seeing noindex issues. When I checked robots.txt I found the code below, which is very confusing to me. **I don't understand what the directives below mean.**
- Disallow: /collections/+
- Disallow: /collections/%2B
- Disallow: /collections/%2b
- Disallow: /blogs/+
- Disallow: /blogs/%2B
- Disallow: /blogs/%2b
As I understand it, my robots.txt disallows search engines from crawling and indexing all my product pages ( collections/*+* ). Is this the rule that is affecting the indexing of the product pages?
Please explain how this robots.txt works in Shopify, and once my page has been crawled and indexed by Google, what is the purpose of Disallow:?
Thanks.
-
Make sure your products are in your sitemap and that it has been re-submitted. You can also submit your product URLs in Google Search Console to request indexing for them.
-
Thank you for replying.
Our main issue is that all the collection pages have already been crawled, but the product pages haven't been crawled yet. We can't figure out whether it's a robots.txt issue or some other crawling issue.
For example, "www.abc.com/collection/" has been crawled, but "www.abc.com/collection/product1/" hasn't.
Any tips would be appreciated.
-
While you may not want certain content indexed, it's still valuable for crawlers to be able to reach your most important content, like products.
If you are blocking your /collections pages, Google will not be able to see those pages' meta robots noindex tag, which causes a problem for you. Consider allowing robots to crawl your /collections pages but noindexing them if they are low-value or duplicative.
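To see what those directives actually match, you can test them locally. This is a minimal sketch using Python's standard urllib.robotparser with hypothetical shop URLs; note that as written the rules are plain path prefixes (no * wildcards), so they only block paths where a literal + (or its %2B encoding) comes immediately after /collections/ or /blogs/, i.e. tag-filtered pages, not ordinary collection or product pages:

```python
from urllib.robotparser import RobotFileParser

# The rules from the question: plain prefix rules with no wildcards,
# so Python's spec-compliant prefix matcher behaves like Googlebot here.
rules = """\
User-agent: *
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Tag-filtered URLs ("+" right after /collections/) are blocked...
print(rp.can_fetch("*", "https://shop.example.com/collections/+red"))        # -> False
print(rp.can_fetch("*", "https://shop.example.com/collections/%2Bred"))      # -> False
# ...but plain collection and product pages are not.
print(rp.can_fetch("*", "https://shop.example.com/collections/floor-lamps")) # -> True
print(rp.can_fetch("*", "https://shop.example.com/products/bronze-lamp"))    # -> True
```

If product URLs still aren't being crawled, a check like this helps rule robots.txt in or out before digging into sitemaps or internal linking.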
Related Questions
-
Duplicate Content Product Descriptions - Technical List Supplier Gave Us
Hello, our supplier gives us a small paragraph and a list of technical features for our product descriptions. My concern is duplicate content. Here's my current plan: 1. Write as much unique content (rewriting the paragraph and adding to it) as there are words in the technical description list, so half unique content and half duplicate content. 2. Reword the technical descriptions (though this is not always possible). 3. Use a custom H1, title tag and meta description. My question is: will the list of technical specifications create a duplicate content issue? In other words, how much unique content has to be on the page so that a list that is identical across the internet does not hurt us? Or do we need to rewrite every technical list? Thanks.
White Hat / Black Hat SEO | | BobGW0 -
Moving Pages Up a Folder to come off root domain
Good morning. I've been doing some competitor research to see why they're ranking higher than us, and noticed that one competitor who seems to be doing well has changed their URL structure to remove levels. Rather than www.domain.com/product-category/product-subcategory/product-info-page/, they now have www.domain.com/product-subcategory/ and www.domain.com/product-info-page/; basically everything comes off the root domain rather than following the traditional structure. Our rankings for the product-subcategory pages, which are probably what most people would search for, have been sitting just below the first page in most instances for a while. I'm interested to know other people's thoughts, and whether this is an approach they've taken with good results.
White Hat / Black Hat SEO | | Ham19790 -
How to stop Googlebot from crawling spammy pages injected by a hacker?
Hello, please help me. One of our websites is under attack by a hacker once again. They have injected spammy URLs and Google is indexing them, but we cannot find these pages on our website; they are all 404 pages. Our website is not secured (no HTTPS) and runs on the WordPress CMS. Thanks
White Hat / Black Hat SEO | | ShahzadAhmed0 -
Duplicate Content issue in Magento
I am getting a duplicate content issue because of the following product URLs in my Magento store. http://www.sitename.com/index.php/sports-nutritions/carbohydrates http://www.sitename.com/sports-nutritions/carbohydrates Please can someone guide me on how to solve it. Thanks guys
White Hat / Black Hat SEO | | webteamBlackburn0 -
When you get a new inbound link, do you submit a request to Google to reindex the page pointing at you?
I'm just starting my link building campaign in earnest, and received my first good-quality inbound link less than an hour ago. My initial thought was to go directly to Google and ask them to reindex the page that linked to me. If I make a habit of that (getting a new link, then submitting that page directly to Google), would that signal to Google that this might not be a natural link building campaign? The links are from legitimate (non-paid, non-exchange) partners, which Google could probably figure out, but I'm interested to hear opinions on this. Thanks, -Eric
White Hat / Black Hat SEO | | ForForce0 -
Will "Hits" in the H1 improve ranking through regular crawling?
Hello! I was wondering whether it's a good idea to keep the "Hits" counter in the H1: http://www.ibremarketing.com/item/netapp-e5400-storage-system.html Will Google come back regularly to check the update (new information, if I'm right), or will it dislike coming back just for a hits update? As I have very good results on this part of the website, I do not want to take any risk. Thanks a lot!
White Hat / Black Hat SEO | | AymanH0 -
Where can I see examples of disavow files to adapt mine before sending it to Google?
Can I send a disavow file to Google as a CSV file? Where can I see examples of disavow files so I can adapt mine before submitting it to Google?
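For what it's worth, Google's disavow tool expects a plain UTF-8 .txt file rather than CSV: one URL or one domain: entry per line, with # starting a comment. A minimal sketch with placeholder domains:

```text
# Individual spammy pages
http://spam.example.com/paid-links/page1.html
http://spam.example.com/paid-links/page2.html

# Disavow everything from an entire domain
domain:low-quality-directory.example.net
```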
White Hat / Black Hat SEO | | maestrosonrisas0 -
How to Solve Mysteries for Disabled Products?
I want to solve some mysteries regarding disabled products on my eCommerce website, and I'll give one example for one of my products to explain. Product URL: http://www.vistastores.com/indoorlighting-patiolivingconcepts-20947.html Product name: Floor Lamp in Monterey Bronze Finish. Three months ago, this product was live on my website with In Stock status. Google had crawled the product; it was in the XML sitemap, in Google Merchant Center, and on many external websites from a link building campaign. Fifteen days ago, the product was live with Out of Stock status: visitors could view the page but could not add the product to the shopping cart. Now the product is disabled from the website and not available for sale. I did a lot of work compiling content, images, page rank and other SEO elements to rank it for a specific long-tail keyword. Since the product was suddenly disabled, it shows a 404 error and redirects to a custom 404 error page, but I am not sure whether to set a 301 or a 302 redirect instead. Is that really a good idea? Is a 301 or 302 redirect required for disabled products? I will never sell this product again on the website, but what about my indexing, external links and page authority? This is creating a lot of ups and downs in my Webmaster Tools data, Merchant Center data, XML sitemap data and impression data. What is the best solution? Can anyone share a good example from an eCommerce website?
White Hat / Black Hat SEO | | CommercePundit0