Blocking standard pages with robots.txt (T&Cs, shipping policy, pricing & privacy policies, etc.)
-
Hi
I've just had a best-practice site migration completed, moving my old e-commerce store into a Shopify environment, and GSC is now reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Does anyone know whether that's likely down to my migrators or a default setting within Shopify?
t&c's
shipping policy
pricing policy
privacy policy
etc
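
For reference, Shopify's auto-generated robots.txt has historically included disallow rules along these lines, which would cover the policy pages above (this is an illustrative sketch, not necessarily your exact file; check yours at yourstore.com/robots.txt):

```
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /policies/
```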
So in summary:
-
Shall I unblock these?
-
What caused it: Shopify's default settings, or (more likely) my migration team?
All Best
Dan
-
-
Thanks for your advice Alex, yes I agree. I'll ask Shopify whether this was them (re: default settings) or whether my migrators have been over-enthusiastic, contrary to best practice.
Have a great BH weekend!
All Best
Dan
-
I wouldn't block them. While it's unlikely to affect the rank of your other pages, it may result in a poorer user experience: if someone were searching for one of your policies in Google, for example, it would not be returned.
I'm afraid I'm not an expert on Shopify at all, so I can't say why they would have been blocked.
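
If it helps, you can verify exactly what a given robots.txt rule blocks with Python's standard-library parser. The `Disallow: /policies/` rule and the store URL below are assumptions for illustration, not necessarily what your file contains:

```python
# Check which pages a robots.txt rule set blocks for Googlebot.
# The rules and URLs below are placeholders -- substitute your own
# robots.txt contents and the URLs GSC flagged.
from urllib.robotparser import RobotFileParser

robots_txt = """User-agent: *
Disallow: /policies/"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ("/policies/privacy-policy",
             "/policies/shipping-policy",
             "/pages/about-us"):
    allowed = parser.can_fetch("Googlebot", f"https://example-store.myshopify.com{path}")
    print(path, "allowed" if allowed else "blocked")
```

Running this against your real robots.txt (fetched from yourstore.com/robots.txt) tells you immediately whether the block comes from the file itself or from something else, such as a noindex header.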
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time marking false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. In theory, I could block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
Reporting & Analytics | awilliams_kingston
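
On the question above: a user-agent-specific block is the usual approach, along these lines (the paths here are hypothetical examples; whether a crawler honours them depends on the crawler, so check Moz's own documentation for Rogerbot's behaviour):

```
User-agent: rogerbot
Disallow: /tag/
Disallow: /archive/
```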
GA Landing Page Inaccuracies
I had seen a thread on this a while back but no solution posted. There was a link posted to someone else explaining the issue but I got a 404 when clicking. Have a client that does mostly PPC and they are getting their conversion page showing up as landing page from paid many times. This is definitely not a sitelink, etc. The only way you get to this page is if you filled out the form. There are a few other pages showing up as landing pages that don't make sense too. Can this be attributed to someone being "inactive" for 30 minutes and then coming back and performing an action on this page (leaving)? If so, does this double count the conversion if a page visit here is a conversion? Just trying to make sense of the landing page report showing so many instances of our conversion page. Thanks in advance!
Reporting & Analytics | jeremyskillings
Track PDFs in Google Analytics
Hi Mozzers, Is it possible to track PDFs via Google Analytics / Google Tag Manager? I'm not only looking for PDF downloads but for the actual activity when someone opens an interactive PDF document. So would it be possible to have onclick events on buttons in the PDF, etc.? Many thanks!
Sander
Reporting & Analytics | WeAreDigital_BE
Migrated website, but Google Analytics still displays old URLs and no new ones?!
I migrated a website from .aspx to .php, and hence had to 301 all the old URLs to the new .php ones. Months later, I'm not seeing any results for the .php pages, but I'm still getting results from the old .aspx pages. Has anyone had experience with this issue, or know what to do? Many thanks,
Reporting & Analytics | CoGri
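
For an .aspx-to-.php migration like the one above, if the new host runs Apache, a pattern redirect in .htaccess is one common way to 301 every old URL in a single rule. This is a sketch only; the exact rule depends on your URL structure and server configuration:

```apache
# Hypothetical blanket redirect: /any/page.aspx -> /any/page.php
RewriteEngine On
RewriteRule ^(.*)\.aspx$ /$1.php [R=301,L]
```

Note that Google can take weeks or months to drop old URLs from its reports even when the 301s are working, so it's worth spot-checking a few old URLs return a 301 status before assuming the redirects are broken.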
Get anchor text, nofollow info, etc. from a list of links
Hi everybody. I'm currently doing a backlink audit for a client and I've hit a small problem. I'm combining data from Ahrefs, OSE, Webmaster Tools, and Link Detox. I've got around 27k links in total now, but the issue is that WMT does not provide data on target page, anchor text, or nofollow/dofollow. This means I have around 1k links with only partial information. Does anyone know of a way I can get this data automatically? Thanks!
Reporting & Analytics | Blink-SEO
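
One way to fill gaps like those above programmatically is to join the exports on the linking URL each tool shares. A minimal Python sketch, assuming each export has been normalised to a common `source_url` field (the field names are assumptions, not any tool's actual export format):

```python
# Enrich WMT links (URL only) with anchor text and follow status from
# a fuller export such as Ahrefs, matching on the linking URL.
def enrich(wmt_urls, ahrefs_rows):
    by_url = {row["source_url"]: row for row in ahrefs_rows}
    enriched = []
    for url in wmt_urls:
        match = by_url.get(url, {})
        enriched.append({
            "source_url": url,
            "anchor": match.get("anchor", ""),      # blank where no tool has data
            "nofollow": match.get("nofollow", ""),
        })
    return enriched

rows = enrich(
    ["https://a.example/post", "https://b.example/page"],
    [{"source_url": "https://a.example/post", "anchor": "great shoes", "nofollow": "false"}],
)
print(sum(1 for r in rows if not r["anchor"]), "links still missing anchor data")
```

Any links left blank after joining against all four exports would need to be crawled directly to recover their anchor and rel attributes.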
Longevity of robots.txt effects on Google rankings
This may be a difficult question to answer without a ton more information, but I'm curious if there's any general thought that could shed some light on the following scenario I've recently heard about and wish to be able to offer some sound advice on: An extremely reputable non-profit site with excellent rankings had gone through a re-design and change-over to WordPress. A blocking robots.txt file was used during development on the dev server. Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed, and the new site (same content, same nav) had gone live with it in place. It was removed and a site index forced. How long might it take for the site to re-appear and regain its past standing in the SERPs if rankings have been damaged? What would the expected recovery time be?
Reporting & Analytics | gfiedel
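
For what it's worth, the dev-server file behind this kind of traffic drop is usually the standard block-everything robots.txt, which is easy to miss at launch precisely because it is only two lines:

```
User-agent: *
Disallow: /
```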
Increase in 'Googlebot-Image' visits in analytics
Hi, I noticed a substantial increase in 'Googlebot-Image' visits under Technology > Browser & OS in Google Analytics for a few clients. Is this a bug? Are there any known fixes, apart from just adding a filter to exclude the data? Regards, Niladri
Reporting & Analytics | neildomain
Analytics goals & funnels - troubleshooting 100% proceed to next step
As you can see from the screenshots, 100% of the visitors in the checkout proceed from step 2 (Delivery Date Selected) to step 3 (Enter Delivery Details). This sample covers a month, but the same pattern applies going all the way back. Can you tell me if some RegEx is missing from the goal setup? Thanks
Reporting & Analytics | jdeb