Blocking standard pages with robots.txt (T&Cs, shipping policy, pricing & privacy policies, etc.)
-
Hi
I've just had a best-practice site migration completed for my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely down to my migrators, or a default setting within Shopify? Does anyone know?
- T&Cs
- Shipping policy
- Pricing policy
- Privacy policy
- etc.
So in summary:
- Shall I unblock these?
- What caused it: Shopify default settings, or more likely my migration team?
All Best
Dan
-
Thanks for your advice, Alex. Yes, I agree; I'll ask Shopify whether this was them (re default settings) or whether my migrators have been over-enthusiastic, contrary to best practice.
Have a great BH weekend!
All Best
Dan
-
I wouldn't block them. While it's unlikely to affect the ranking of your other pages, it may result in a poorer user experience; for example, if someone were searching for one of your policies in Google, it would not be returned.
I'm afraid I'm not an expert on Shopify at all, so I can't answer why they would have been blocked.
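For reference, Shopify stores serve an auto-generated robots.txt, and blocked policy pages usually trace back to a Disallow rule in that file. Below is a minimal sketch of the kind of rules to look for; the exact paths are an assumption, so compare against your own store's /robots.txt before changing anything. Shopify also lets you override the generated file with a robots.txt.liquid template if you decide these pages should be crawlable.
```
# Sketch of the sort of generated rules that would block policy pages.
# Paths are illustrative - check https://yourstore.com/robots.txt for the real file.
User-agent: *
Disallow: /policies/        # shipping, privacy, terms and refund policies
Disallow: /checkout
Disallow: /cart

Sitemap: https://yourstore.com/sitemap.xml
```
If the migration team added rules on top of the defaults, they will show up in the same file, which should settle the "Shopify or migrators?" question.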
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time marking false positives in my reports to be ignored. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. Theoretically, I could block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
Reporting & Analytics | awilliams_kingston
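A minimal robots.txt sketch for the situation above, assuming the goal is only to keep Moz's crawler (user-agent "rogerbot") away from the noindexed page types; the path is purely illustrative, and note that this only stops Rogerbot from crawling and reporting on those pages, it does not change how Google treats them.
```
# Sketch: keep Moz's Rogerbot out of sections that only generate false positives.
# The path below is a placeholder for whatever page types carry meta noindex.
User-agent: rogerbot
Disallow: /example-noindexed-pages/
```
-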
How important is Lighthouse page speed measurement?
Hi, many experts cite Lighthouse speed as an important factor for search ranking. It's confusing, because several top sites have a Lighthouse speed score of 30-40, yet they rank well. Also, some sites that load quickly have a low Lighthouse speed score (when I test on mobile/desktop they load much quicker than Lighthouse suggests). When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can be 30-40. Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but the Lighthouse score is similar to Airbnb and so forth. We have many photos of this kind, probably 30-40, many of which load async. Should we spend more time optimizing for Lighthouse, or is it OK? Are large images fine to load async? Thank you, Dave
Reporting & Analytics | erdev
-
Strange landing page in Google Analytics
Hello Moz Community, the website in question is https://x-y.com/. When I look at the landing pages report in GA, x-y.com is appended to the end of every URL, like this: https://x-y.com/x-y.com. When I open that URL from the GA interface, it shows "page not found", which is obvious, as there is no such URL. Metrics like sessions, users, and bounce rate all look fine. In the property settings, the default URL is written like this: http:// cell-gate.com (please note that the "s" is missing in the property settings). But how is traffic still tracked correctly? How do I solve this problem, and what settings should we change to make the landing pages report look OK? Thanks
Reporting & Analytics | Johnroger
-
Hello, our domain authority dropped significantly overnight from 37 to 29. We have been building good links from high DA pages and producing quality, regular content.
Hello, our domain authority dropped significantly overnight from 37 to 29. We have been building good links from high DA sites and producing regular, good-quality content. Anyone able to offer any ideas why? Thanks
Reporting & Analytics | ProMOZ123
-
Google Analytics Goal/Event/SOMETHING to show only Wordpress "Posts", not pages, etc
Hi all, our site is built on WordPress, and formerly the post URLs had the typical date format at the beginning. This made it easy for me to look at, for example, all search traffic to the blog: I would just view URLs containing /2014/ and /2015/ and boom. We have since removed the dates from the URLs with proper redirects etc., which is great, but now I can't figure out a way to look at ONLY the blog in GA. I like to track a KPI of "search visits to blog posts" and I can't figure out how to do that now. Can I set up a GA event that only fires when the post type template for blog posts loads? Some other solution? I'm lost here, and there's gotta be a good way to do it...
Reporting & Analytics | 3DR
-
Google Analytics - Organic Search Traffic & Queries - What caused the huge difference?
Our website traffic dropped a little during the last month, but it's getting better now, almost the same as the previous period. But our conversion rate dropped by 50% over the last three weeks. What could cause this huge drop in conversion rate? In Google Analytics, I compared the organic search traffic with the previous period, and the result is similar. But the Search Engine Optimization -> Queries report shows that the clicks for the last month are almost zero. What could be the cause of this huge difference?
Reporting & Analytics | joony
-
How can I see what Google sees when it crawls my page?
In other words, how can I see the text and whatnot it sees, from start to finish, on each page? I know there was a site for this, but I can't remember it.
Reporting & Analytics | tiffany1103
-
Set Up of Goal Tracking with Google Analytics - $750 a Fair Price?
Greetings Moz Community! My firm operates a commercial real estate website that contains 3-4 forms. Each form represents a goal. Google Analytics has been set up for years, but it does not track these form completions/goals properly. My SEO firm has offered to configure goals in Google Analytics for $750. Is this a fair price? If the setup takes one hour, I am really overpaying. But if this is a complex project that may take 7-9 hours, the pricing seems OK. Also, the SEO firm will require an additional $750 in the future to set up event tracking. Is this excessive? I might add that my developer will need to add code to my website. My SEO company has proven reliable and accurate; I can go to sleep at night knowing they are doing a good job. Whereas my Argentinian developers really try their best, but perhaps because of the language barrier, they can make mistakes from time to time. I am willing to pay a premium to ensure that the job is done correctly domestically; however, I don't appreciate overpaying. Is the $750 payment for setting up Google Analytics reasonable, assuming the job is done well? Thanks,
Alan
Reporting & Analytics | Kingalan1