RegEx help needed for robots.txt potential conflict
-
I've created a robots.txt file for a new Magento install, based on an example posted on the Magento help forums, but there's something I can't decipher. It seems that I am allowing and disallowing the same pagination pattern. My robots.txt file (like a lot of other Magento examples, it seems) includes both:
Allow: /*?p=
and
Disallow: /?p=&
I've searched several regex references and can't find what "&" does. My reading is that I'm allowing crawler access to all pagination URLs, but then possibly disallowing access to any pagination URL that includes anything other than just the page number?
Can anyone shed any light on this, so I can be sure I'm allowing suitable crawler access to the shop?
Thanks in advance for any assistance
-
Hey James
It looks to me like you are just disallowing access to any URL that carries more than the initial p= parameter. So you are reducing the impact of potential duplication from searches and the like.
Good
?p=1
Bad
?p=1&q=search string
I am no Magento expert, but this looks like a simple attempt to reduce the myriad duplicate URLs that search pages and the like can generate inside a complex CMS like Magento.
The SEOmoz crawler tool should give you some good insight. To be sure, try removing the 'Disallow: /?p=&' line and see if you get a bucketload of duplicate content warnings.
Ultimately, the thing to remember here is that the & is part of the URL being matched, not part of the regex: in robots.txt patterns only * and $ are special, so ? and & are matched literally.
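To make the matching concrete, here is a minimal sketch (in Python, with hypothetical URLs) of how Google-style robots.txt patterns work: each rule is a prefix match against the path plus query string, * matches any run of characters, and ? and & are literal. The disallow rule is shown here in its wildcard form /*?p=*&, the variant many Magento robots.txt templates use, since that makes the effect of the & visible:

```python
import re

def robots_pattern(pattern: str) -> re.Pattern:
    # Google-style robots.txt rules are prefix matches: '*' matches any
    # run of characters; '?' and '&' are ordinary literal characters.
    # (This sketch ignores the '$' end-of-URL anchor.)
    parts = [re.escape(part) for part in pattern.split("*")]
    return re.compile(".*".join(parts))

allow = robots_pattern("/*?p=")       # Allow: /*?p=
disallow = robots_pattern("/*?p=*&")  # wildcard form of the Disallow rule

# Hypothetical Magento-style URLs
for url in ["/shoes?p=1", "/shoes?p=1&q=search+string"]:
    verdict = "blocked" if disallow.match(url) else "crawlable"
    print(url, "->", verdict)
```

Running this shows /shoes?p=1 as crawlable and /shoes?p=1&q=search+string as blocked, which is exactly the Good/Bad split above. Note that the literal /?p=& from the question, having no leading *, would only match URLs whose path starts with exactly /?p=&, so it is worth checking the actual behaviour in the robots.txt tester in Google Webmaster Tools.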
Hope that helps!
Marcus