Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), both new posts and new replies are locked.
Blocking Standard pages with Robots.txt (t&c's, shipping policy, pricing & privacy policies etc)
-
Hi
I've just had a best-practice site migration completed, moving my old e-commerce store into a Shopify environment, and I see in GSC that my standard pages are reported as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely down to my migrators, or a default setting in Shopify? Does anyone know?
t&c's
shipping policy
pricing policy
privacy policy
etc
So in summary:
- Shall I unblock these?
- What caused it: Shopify's default settings, or more likely my migration team?
All Best
Dan
-
-
Thanks for your advice Alex. Yes, I agree; I will ask Shopify whether this was them (a default setting) or whether my migrators have been over-enthusiastic, contrary to best practice.
Have a great BH weekend!
All Best
Dan
-
I wouldn't block them. While it's unlikely to affect the rank of your other pages, it may result in a poorer user experience: if someone searched Google for one of your policies, for example, it would not be returned.
I'm afraid I'm not an expert on Shopify at all, so I can't answer why they would have been blocked.
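If you do decide to unblock them, it's worth checking exactly which URLs your robots.txt blocks before and after the change. A minimal sketch using only Python's standard library; the rules and /policies/ paths below follow Shopify's typical URL layout but are illustrative assumptions, not your store's actual file:

```python
# Check which paths a robots.txt blocks, using only the standard library.
# The rules below mimic a Shopify-style disallow of /policies/ -- an
# assumed example, not taken from any real store's robots.txt.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /policies/
Disallow: /checkout
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ("/policies/privacy-policy",
             "/policies/shipping-policy",
             "/pages/about-us"):
    allowed = parser.can_fetch("*", path)
    print(path, "->", "allowed" if allowed else "blocked")
```

Pointing the same check at your live file (via `parser.set_url(...)` and `parser.read()`) would show whether the policy pages are caught by a blanket rule like the one above.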
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time marking false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. In theory, I could block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
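Rogerbot does obey robots.txt, and it can be targeted by name with its own user-agent group. A sketch of what that directive might look like; the /tag/ path is a hypothetical example, not taken from any real site:

```
# Applies only to Moz's crawler; other bots ignore this group
User-agent: rogerbot
Disallow: /tag/
```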
Reporting & Analytics | awilliams_kingston
-
Fire a tag when element is loaded on page (Google Tag Manager)
I'm using an Element Visibility trigger to track a value that appears on a page. However, I want to track this value even when the user doesn't scroll to the area of the page where the element is (i.e. when the page is loaded and the value is displayed below the fold, but the user doesn't scroll down to it). Is there a way of doing this?
Reporting & Analytics | RWesley
-
Will noindex pages still get link equity?
We think we get link equity from some large travel domains to white-label versions of our main website. These pages are noindexed because they have the same URLs and content as our main B2C website, and they have canonicals to the pages we want indexed. The question is: is there REALLY link equity passed to pages on our domain that have "noindex,nofollow" on them? Secondly, we're looking to put all these white-label pages on a separate structure, to better protect our main indexed pages from duplicate-content risks. The best bet would be to put them in a subfolder rather than on a subdomain, yes? That way, even though the pages are still noindexed, we'd get link equity from these big domains to www.ourdomain.com/subfolder, where we wouldn't to subdomain.ourdomain.com? Thank you!
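One thing worth pinning down here is which directive is actually on the pages, since "noindex,follow" and "noindex,nofollow" behave differently: nofollow asks crawlers not to pass equity through the page's links at all, while noindex,follow keeps the page out of the index but still lets links be followed (though Google has said long-standing noindexed pages are eventually treated as nofollow too). A hypothetical head snippet for one of the white-label pages, using the ourdomain.com placeholder from the question:

```html
<!-- Keeps the page out of the index but still allows links to be followed -->
<meta name="robots" content="noindex,follow">
<!-- Points consolidation signals at the indexed B2C page (hypothetical URL) -->
<link rel="canonical" href="https://www.ourdomain.com/subfolder/some-page">
```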
Reporting & Analytics | HTXSEO
-
Difference Between Android Browser & Android Webview
Hello All, In Google Analytics I can see traffic from both Android Browser and Android WebView. So is Android WebView also a browser? Thanks!
Reporting & Analytics | dhisman
-
Google Analytics - Organic Search Traffic & Queries - What caused the huge difference?
Our website traffic dropped a little during the last month, but it's getting better now, almost the same as the previous period. But our conversion rate has dropped by 50% over the last three weeks. What could cause this huge drop in conversion rate? In Google Analytics I compared the Organic Search Traffic with the previous period, and the result is similar. But the Search Engine Optimization -> Queries report shows that clicks for the last month are almost zero. What could be the cause of this huge difference?
Reporting & Analytics | joony
-
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings Moz Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851. No new pages had been added to the domain in the previous month. The number of pages blocked by robots.txt increased over that time from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still increased to 851. The following changes occurred between June 5th and June 15th:
- A new, redesigned version of the site was launched on June 4th, with links to social media and the blog removed from some pages, but with no new URLs added. The design platform was and is WordPress.
- Google Tag Manager (GTM) code was added to the site.
- Our hosting company made an exception to ModSecurity on our server (for iframes) to allow GTM to function.
In the last ten days my web traffic has declined about 15%; the quality of traffic has declined enormously, and the number of new inquiries we get is down by around 65%. Pages per visit have declined from about 2.55 to about 2. Obviously this is not a good situation. My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They have noticed an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time. Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Does anyone have experience with GTM causing issues like this? Thanks everyone!!!
Reporting & Analytics | Kingalan1
-
Switch to www from non www preference negatively hit # pages indexed
I have a client whose site did not use the www preference but rather the non-www form of the URL. We were having trouble seeing some high-quality inlinks, and I wondered if the redirect from those links to the non-www site was making it hard for us to track them. After some reading, it seemed we should be using the www version for better SEO anyway, so I made the change on Monday, but saw a major hit to the number of pages being indexed by Thursday. It's freaking me out mildly. What are people's thoughts? I think I should roll back the www change ASAP, or am I jumping the gun?
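For reference, the non-www to www switch is usually implemented as a single sitewide 301 rule. A sketch of what that can look like in an Apache .htaccess, with example.com standing in for the real host:

```apache
RewriteEngine On
# Redirect any request on the bare host to www, preserving the path (301)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

If the change was made this way (a permanent redirect plus consistent canonicals), a temporary dip in the indexed-page count while Google re-crawls is common, and rolling back can reset that process.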
Reporting & Analytics | BrigitteMN
-
Is it possible to use Google Tag Manager to pass a user's text input in a form field to Google Analytics?
Hey Everyone, I finally figured out how to use auto-event tracking with Google Tag Manager, but didn't get the data I wanted. I want to see what users are typing into the search field on my site (the URL structure of my site isn't set up properly to use GA's built-in site search tracking). So I set up form-submit event tracking in Google Tag Manager and used the following event tracking parameters: Category: Search, Action: Search Value. When I test and look in Google Analytics I just see "Search" and "Search Value". I wanted to see the text that I searched on my site, not just the Action and Category of the event. Is what I'm trying to do even possible? Do I need to set up a different event tracking parameter? Thanks everyone!
Reporting & Analytics | DaveGuyMan