Reason for robots.txt file blocking products on category pages?
-
Hi
I have a website with thousands of products. On the category pages, all the products are linked to with the parameter “?cgid” in the URL. But “?cgid” is also blocked in the robots.txt file for some reason, so I'm thinking it's stopping all my products getting crawled by Google.
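The blocking rule looks something like this (I'm paraphrasing from memory, so the exact pattern in the live file may differ):

    # Assumed pattern for illustration - blocks any URL containing ?cgid
    User-agent: *
    Disallow: /*?cgid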
Am I right here? Is there any reason why a website would want to block so many URLs? I'm only here a week and the site's getting great traffic, so I don't want to go breaking it!
Thanks
-
Thanks again AL123al!
I would be concerned about my internal linking because of this problem. I've always tried to keep important pages within three clicks of the homepage. My worry here is that while a user can reach these products within three clicks of the homepage, they're blocked to Googlebot.
So the product URLs are only being discovered through the sitemap, which seems hugely inefficient? I think I have to decide whether opening up these pages would improve my linking structure enough to help Google crawl the product pages, or whether that matters less than the extra pages it would then crawl, wasting crawl budget.
-
Hello,
The canonical product URLs will be getting crawled just fine, as they are not blocked in the robots.txt. Without understanding your setup completely, I think the people before you were trying to stop all the duplicate parameter URLs from being crawled and leave Google to crawl just the canonicals - which is what you want.
If you remove the parameter rule from robots.txt, then Google will crawl everything, including the parameter URLs. This will waste crawl budget, so it's better that Google only crawls the canonicals.
Regarding the sitemap: being present in the sitemap helps Googlebot decide what to prioritise crawling, but it won't stop it finding other URLs if there is good internal linking.
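If you want to sanity-check which URLs are blocked, a quick script along these lines will do it (a rough sketch - the domain and paths are placeholders, and note that Python's parser follows the basic robots.txt spec, so Google-style wildcard rules may not be interpreted exactly the way Googlebot does; Search Console's robots.txt tester is the authoritative check):

    from urllib.robotparser import RobotFileParser

    # Placeholder domain and URLs - swap in your real site
    robots_url = "https://www.example.com/robots.txt"
    urls = [
        "https://www.example.com/product-category/ladies-shoes",           # canonical
        "https://www.example.com/product-category/ladies-shoes?cgid=123",  # parameter URL
    ]

    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches and parses the live robots.txt

    for url in urls:
        # can_fetch() applies the Disallow/Allow rules for the given user agent
        status = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
        print(url, "->", status)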
-
Thanks AL123al! The base URLs (www.example.com/product-category/ladies-shoes) do seem to be getting crawled here and there, and some are ranking, which is great. But I think the only place they can be discovered is the sitemap, which has over 28,000 URLs on one page (another thing I need to fix)!
So if Googlebot gets to a parameter URL through the category pages (www.example.com/product-category/ladies-shoes?cgid...) and sees it's blocked, I'm guessing it can't see how important that page is to us (from the website hierarchy) or see the canonical tag, so I'm presuming it's seriously damaging our power in getting products ranked.
In Screaming Frog, 112,000 URLs get crawled and 68% are blocked by robots.txt. 17,000 are URLs which contain "?cgid", which I don't think is too much for Googlebot to crawl; the website has pretty good authority, so I think we get a fairly deep crawl.
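For what it's worth, I got those counts from the crawl export with a quick script along these lines (the file name is a placeholder, and I'm assuming the URL column is called "Address" as it is in my export):

    import csv

    # "internal_all.csv" is a placeholder name for the Screaming Frog internal export
    with open("internal_all.csv", newline="", encoding="utf-8") as f:
        urls = [row["Address"] for row in csv.DictReader(f)]  # "Address" = crawled URL

    cgid_urls = [u for u in urls if "?cgid" in u]
    print(f"{len(urls)} URLs crawled, {len(cgid_urls)} contain ?cgid")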
So I suppose what I really want to know is: will removing "?cgid" from the robots.txt file really damage the site? In my opinion, I think it'll really help.
-
It looks like the products are being appended with a parameter, ?cgid - and there may be other stuff attached to the end of each URL, like this:
e.g. www.example.com/product-category/ladies-shoes?cgid-product=19&controller=product etc
but canonical URL is www.example.com/product-category/ladies-shoes
These products may have a canonical pointing to the base URL, which means there won't be any problem with duplicates being indexed. So all well and good.
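Assuming the typical implementation, each parameter URL would carry a tag like this in its <head>, pointing back at the clean category URL:

    <link rel="canonical" href="https://www.example.com/product-category/ladies-shoes" />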
Except... Google still has to crawl each of these parameter URLs to find the canonical tag. On a huge website, this means crawl budget is being consumed by unnecessary crawling of these parameterised URLs.
You can tell Google not to crawl the parameter URLs in Search Console (at least in the old version you can). But you can also stop Google crawling these URLs unnecessarily by blocking them in robots.txt, provided you are sure the parameters don't change how the page appears in search.
So, long story short, that is why you may see the URLs with parameters being blocked in robots.txt. The canonical URLs will be getting crawled just fine, since they don't have any parameters and so aren't being blocked.
Hope that makes sense?
-
Yes, it's in the robots.txt - that's the problem. Someone had to put it in there deliberately, but I've no idea why they would.
-
Did you check your robots.txt file? Or check whether a plugin is creating this problem.