Blocking Standard pages with Robots.txt (t&c's, shipping policy, pricing & privacy policies etc)
-
Hi
I've just had a best-practice site migration completed, moving my old e-commerce store into a Shopify environment, and I see in GSC that my standard pages are being reported as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Does anyone know whether that's likely down to my migrators or to a default setting in Shopify? The affected pages include:
t&c's
shipping policy
pricing policy
privacy policy
etc
So in summary:
- Shall I unblock these?
- What caused it: Shopify's default settings, or more likely my migration team?
All Best
Dan
-
Thanks for your advice, Alex. Yes, I agree; I will ask Shopify whether this was them (re default settings) or whether my migrators have been over-enthusiastic and gone against best practice.
Have a great bank holiday weekend!
All Best
Dan
-
I wouldn't block them. While it's unlikely to affect the ranking of your other pages, it may result in a poorer user experience: for example, if someone searched Google for one of your policies, it would not be returned.
I'm afraid I'm not an expert on Shopify at all, so I can't say why they would have been blocked.
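For what it's worth, Shopify generates a robots.txt file for every store automatically, and its default template has (at least in recent years) disallowed a handful of utility paths, including /policies/, which is where the terms of service, privacy, shipping and refund policy pages are served; that would line up with what GSC is reporting here. A sketch of the kind of rules involved, and of Shopify's documented override mechanism (a robots.txt.liquid theme template), is below. Treat both as illustrative rather than a copy of any particular store's file, and check Shopify's current documentation before editing anything.

```
# Illustrative excerpt of a Shopify-style default robots.txt (not copied from any store)
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /account
Disallow: /policies/   # terms, privacy, shipping and refund policy pages live under this path
```

If you decide the policy pages should be crawlable, Shopify lets the default file be overridden by adding a robots.txt.liquid template to the theme. A sketch based on the documented pattern, reproducing the default rules while skipping the /policies/ disallow:

```liquid
{%- comment -%}
  templates/robots.txt.liquid - output Shopify's default rules, but skip the
  Disallow rule for /policies/ so the policy pages can be crawled.
  (Sketch only: verify the objects and defaults against Shopify's docs.)
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' %}
      {{ rule }}
    {%- endunless -%}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Whether unblocking is worthwhile is the judgement call discussed above; the template simply removes that one rule without touching Shopify's other defaults.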
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time marking false positives in my reports to be ignored. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. Theoretically, I could block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
Reporting & Analytics | awilliams_kingston
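For anyone looking for the syntax, a hedged sketch: Moz documents Rogerbot as obeying robots.txt under the user-agent token "rogerbot", so directives scoped to that token affect Moz's crawler without touching Googlebot or other bots. The paths below are placeholders rather than anything from the original question.

```
# Illustrative robots.txt rules scoped to Moz's crawler only (paths are placeholders)
User-agent: rogerbot
Disallow: /tag/
Disallow: /print/
```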
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity, but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities:
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could have a negative impact on SEO. Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18-second average time on site.
The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this, but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
Reporting & Analytics | EricFish
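On the blocking side, if log access ever becomes available and a specific user-agent can be identified, a common approach on Apache-based WordPress hosting is an .htaccess rule that returns 403 for that agent. The agent name below is a placeholder, and this is only a sketch, not a recommendation to block blindly; it also won't help if the hits are Analytics ghost/spam hits that never reach the server, for which the "exclude known bots and spiders" view setting and view filters in Analytics are the usual tools.

```
# Illustrative .htaccess snippet (Apache): return 403 Forbidden to a suspected scraper.
# "BadScraperBot" is a placeholder; identify the real User-Agent from logs first,
# and never block legitimate search engine crawlers this way.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} BadScraperBot [NC]
  RewriteRule .* - [F,L]
</IfModule>
```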
Migrated website but Google Analytics still displays old URLs and none of the new ones?!
I migrated a website from .aspx to .php and hence had to 301 all the old URLs to the new .php ones. It's been months now, and I'm not seeing any of the .php pages showing in results, but I'm still getting results from the old .aspx pages. Has anyone had any experience with this issue, or know what to do? Many thanks,
Reporting & Analytics | CoGri
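As a sanity check on the redirect side (a hedged sketch, assuming Apache): if the old .aspx URLs map one-to-one onto .php equivalents, a single rewrite rule can handle them, and it's worth confirming the rule really returns 301 rather than 302, since temporary redirects are a common reason old URLs linger in reports.

```
# Illustrative Apache rules: permanently redirect legacy .aspx URLs to .php equivalents.
# Assumes the path and slug are unchanged; URLs whose slugs changed need their own rules.
RewriteEngine On
RewriteRule ^(.*)\.aspx$ /$1.php [R=301,L]
```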
Google Analytics and Bounce Rates Query - Should I block access from foreign countries ?
Hi, when I look at Google Analytics for my UK website, I can see a lot of visits come from outside the UK, e.g. Brazil and the USA, both of which give me almost 100% bounce rates from people visiting from there. I am wondering whether Google looks at bounce rates as a ranking factor, and whether I should therefore block access to my site from visitors outside the UK... Would this help increase my rankings? Given that we only serve UK customers, I can't see any benefit in allowing non-UK customers to see the site. What do people think? Thanks, Pete
Reporting & Analytics | PeteC12
Google Analytics - Next Page Path is the Same URL?
Hey everyone, I have a Google Analytics question. I'm looking through a client's site, and when I look at the next page path, I get the same URL as the next path. For example, on the homepage, the next page path I get is the homepage again. This happens for all URLs. Is this an implementation error? Is there a way to fix this? Thanks!
Reporting & Analytics | EvansHunt
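A self-referring next page path usually means the same page is sending more than one pageview hit, for example because the tracking snippet is included both in the theme and via a plugin or Tag Manager. A quick, hedged way to check from the browser console:

```ts
// Browser-console check: count copies of the common Google Analytics loaders on the page.
// More than one copy per loader (or a loader plus a GTM container that also fires a
// pageview) often explains duplicate pageviews and a self-referring "next page path".
const loaders = [
  'script[src*="google-analytics.com/analytics.js"]',
  'script[src*="googletagmanager.com/gtag/js"]',
  'script[src*="googletagmanager.com/gtm.js"]',
];
loaders.forEach((sel) => console.log(sel, document.querySelectorAll(sel).length));
```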
Does Analytics track an order two times on refresh of the confirmation page?
Hi there,
I have a quick question. Does Google Analytics track an order two times if the user buys a product, sees the confirmation page, and then refreshes, or clicks back and forward again? The order/tracking data would be the same, but I guess the tracking code runs on every refresh and therefore tracks the order two times in Analytics, or does Analytics know that it is the same order? Can someone clarify this? Thanks!
Regards
Kasper
Reporting & Analytics | Webdannmark
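Analytics itself is not guaranteed to deduplicate a transaction that gets sent again when the confirmation page is refreshed, so a common safeguard is a client-side guard keyed on the order ID. A hedged sketch assuming gtag.js; the order ID, value, currency and storage key are placeholders for whatever the confirmation page actually exposes:

```ts
// Hedged sketch (assumes the standard gtag.js snippet is already on the page).
// Send the purchase event only once per order, using localStorage as a guard so a
// refresh of the confirmation page does not report the transaction a second time.
declare function gtag(...args: unknown[]): void; // provided by the gtag.js snippet

function trackPurchaseOnce(orderId: string, value: number, currency: string): void {
  const key = `purchase_tracked_${orderId}`;
  if (localStorage.getItem(key) !== null) {
    return; // already reported, e.g. the visitor refreshed or clicked back and forward
  }
  gtag("event", "purchase", {
    transaction_id: orderId,
    value: value,
    currency: currency,
  });
  localStorage.setItem(key, "1");
}

// Example: trackPurchaseOnce("ORDER-1234", 499.0, "DKK");
```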
How to get a list of the URLs blocked by robots.txt
This is my site: http://muslim-academy.com/ It's in WordPress. I just want to know whether there is any way I can get the list of URLs blocked by robots.txt. In Google Webmaster Tools it's not showing up; it just gives the number of blocked URLs. Is there any plugin or software to extract the list of blocked URLs?
Reporting & Analytics | csfarnsworth
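There is no built-in export of "all blocked URLs": robots.txt only lists path rules, so a blocked-URL list has to be computed by testing known URLs (for example from the sitemap or a crawl) against those rules. A hedged sketch of that idea, with deliberately simplified matching:

```ts
// Hedged sketch: test a known URL list against the site's robots.txt rules.
// Simplified matching only: "User-agent: *" groups, prefix Disallow rules, no wildcards.
async function findBlockedUrls(siteRoot: string, urls: string[]): Promise<string[]> {
  const response = await fetch(new URL("/robots.txt", siteRoot));
  const lines = (await response.text()).split("\n");

  const disallows: string[] = [];
  let appliesToAll = false;
  for (const raw of lines) {
    const line = raw.split("#")[0].trim(); // strip comments and whitespace
    if (/^user-agent:/i.test(line)) {
      appliesToAll = line.slice(line.indexOf(":") + 1).trim() === "*";
    } else if (appliesToAll && /^disallow:/i.test(line)) {
      const path = line.slice(line.indexOf(":") + 1).trim();
      if (path) disallows.push(path);
    }
  }

  // A URL counts as blocked if its path starts with any Disallow prefix.
  return urls.filter((u) => {
    const path = new URL(u).pathname;
    return disallows.some((rule) => path.startsWith(rule));
  });
}

// Example (hypothetical URL list):
// findBlockedUrls("http://muslim-academy.com/", ["http://muslim-academy.com/wp-admin/options.php"])
//   .then((blocked) => console.log(blocked));
```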
Time on page: What happens when I open many tabs?
Hello everyone, I was studying Analytics and saw that time on page is calculated as the difference between the time you entered the page and the time you click to go to another one. But how is the time calculated when I open several links in new tabs at different moments? Does Google count the last tab? Just a guess... Thanks!
Reporting & Analytics | seomasterbrasil
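For what it's worth, the usual description for Universal Analytics (hedged, since Google has changed the mechanics over the years) is that time on page is the gap between the timestamps of consecutive hits in the session, in the order the hits were sent, regardless of which tab they belong to. Worked example, with assumed timestamps: if page A's pageview is recorded at 12:00:00 and the visitor opens three links in new tabs whose pageviews fire at 12:00:40, 12:00:45 and 12:00:50, then page A gets 40 seconds (12:00:40 minus 12:00:00), the first new tab gets 5 seconds, the second gets 5 seconds, and the last page contributes no time on page unless a later interaction hit (such as an event) follows it.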