Overly-Dynamic URL
-
Hi,
We have over 5,000 pages showing under the Overly-Dynamic URL error.
Our ecommerce site uses Ajax and we have several different filters, like Size, Color, and Brand, and we therefore have many different URLs, like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there?
like:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should we disallow?
So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct.
Any help would be greatly appreciated.
Thank you,
Kami
-
Hi Kami,
It's unfortunate, but a number of modern-day e-commerce platforms still suffer from poor canonicalisation and multiple URLs.
If possible, rather than blocking off access to those queries via robots.txt or meta, I'd start by trying to specify a canonical URL when a query is created.
E.g.
Query: http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
Canonical: http://www.dellamoda.com/Designer-Accessories.html
Failing that, I'd try to implement a "follow,noindex" meta tag, or do it via the X-Robots-Tag header if you're any good with Apache.
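As a rough sketch (using your Designer-Accessories URLs from above, and assuming the filtered views render the same template as the clean category page), the tags would sit in the <head> of each filtered URL:

<!-- point every filtered/sorted variation at the clean category URL -->
<link rel="canonical" href="http://www.dellamoda.com/Designer-Accessories.html" />

<!-- or, if you go the meta route instead: keep the page out of the index but still let crawlers follow its links -->
<meta name="robots" content="follow, noindex" />

You'd generally pick one of the two per page type rather than stacking both on the same URL, as they send conflicting signals.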
If that's still a no-go, then try GWT (Google Webmaster Tools). Google is getting much better at handling dynamic URLs within e-commerce platforms, and you can specify which query parameters Google should ignore directly within GWT.
There's a great post on Moz that deals with e-commerce and canonicalisation - http://www.seomoz.org/blog/qa-from-ecommerce-seo-fix-and-avoid-common-issues-webinar - I'd suggest starting there!
As a last resort I'd look to block the URLs within robots.txt, but this prevents crawlers from flowing freely within your site and can result in poor indexation.
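One note on the rules you've already added: a line like Disallow: /?sort=title only matches URLs directly off the root (dellamoda.com/?sort=title), not /Designer-Pumps.html?sort=..., so they probably aren't catching much. If you do end up blocking, here's a rough sketch using the * wildcard (supported by Google and Bing, though not guaranteed for every crawler), based on the parameters in your example URLs:

User-agent: *
# block any URL whose query string contains these sort/filter parameters
Disallow: /*?*sort=
Disallow: /*?*use_selected_filter=
Disallow: /*?*clearall=

You don't need the whole query string, just enough of a pattern to match the parameter. And bear in mind robots.txt only stops crawling; URLs that are already indexed or linked to externally can still show up in results, which is why the canonical route above is usually cleaner.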
Related Questions
-
Wildcarding Robots.txt for Particular Word in URL
Hey All, So I know that this isn't a standard robots.txt; I'm aware of how to block or wildcard certain folders, but I'm wondering whether it's possible to block all URLs with a certain word in them? We have a client that was hacked a year ago and now they want us to help remove some of the pages that were being autogenerated with the word "viagra" in them. I saw this article and tried implementing it https://builtvisible.com/wildcards-in-robots-txt/ and it seems that I've been able to remove some of the URLs (although I can't confirm yet until I do a full pull of the SERPs on the domain). However, when I test certain URLs inside of WMT it still says that they are allowed, which makes me think that it's not working fully or working at all. In this case these are the lines I've added to the robots.txt:
Disallow: /*&viagra
Disallow: /*&Viagra
I know I have the solution of individually requesting URLs to be removed from the index, but I want to see if anybody has ever had success with wildcarding URLs with a certain word in their robots.txt? The individual URL route could be very tedious. Thanks! Jon
Intermediate & Advanced SEO | EvansHunt0
-
Attack of the dummy urls -- what to do?
It occurs to me that a malicious program could set up thousands of links to dummy pages on a website:
www.mysite.com/dynamicpage/dummy123
www.mysite.com/dynamicpage/dummy456
etc. How is this normally handled? Does a developer have to look at all the parameters to see if they are valid and, if not, automatically create a 301 redirect or 404 Not Found? This requires a table lookup of acceptable URL parameters for all new visitors. I was thinking that bad URL names would be rare, so it would be OK to just stop the program with a message, until I realized someone could intentionally set up links to non-existent pages on a site.
Intermediate & Advanced SEO | friendoffood1
-
How to deal with URLs and tabbed content
Hi All, We're currently redesigning a website for a new home developer and we're trying to figure out the best way to deal with tabbed content in the URL structure. The design of the site at the moment will have a page for a development and within that you can select your house type, then when on the house type page there will be tabs displayed for the user to see things like the plot map, availability and pricing, specifications, etc. The way our development team are looking at handling this is for the URL to use a hashtag or a query string at the end of it so we can still land users on these specific tabs for PPC for example. My question is really, has anyone had any experience with this? Any recommendations on how to best display the urls for SEO? Thanks
Intermediate & Advanced SEO | J_Sinclair0
-
What is the best URL structure for categories?
A client's site currently uses the URL structure:
www.website.com/%category%/%postname%
Which I think is optimised fairly well, as the categories are keywords being targeted. However, as they are using a category hierarchy, oftentimes the URL looks like this:
www.website.com/parent-category/child-category/some-post-titles-are-quite-long-as-they-are-long-tail-terms
Best practice often dictates (such as point 3 in this Moz article) that shorter URLs are better for several reasons. So I'm left with a few options:
1. Remove the category from the URL
2. Flatten the category hierarchy
3. Shorten post titles to a word or two - which would hurt my long-tail search term traffic.
4. Leave it as it is
What do we think is the best route to take? Thanks in advance!
Intermediate & Advanced SEO | underscorelive0
-
Crazy long weird URLs... help
I have an HTML website, mysite1.com, and I placed a link on the home page to another one of my sites, mysite2.com. Today I checked the links to mysite2.com in Majestic and noticed 24 links coming from mysite1.com instead of just one link. The URLs from mysite1.com that are showing in Majestic look like this:
mysite1.com/?epl=4donafvFK3fMXxZXMWQRQLodmPchoXCK5C7-kbBv_agkwlkJrZAoaSDVUlhqFmUqt0f8c2Q6jF6GO6DNMnbidqRsikriF-IEBEt5okmICLEB0FxP36GrsxoPGQ3SGBo1PVR7itDUA4CYmjypn5gi
mysite1.com was inherited from a friend and I believe that it was originally built in FrontPage. Can you tell me how I can get rid of these multiple links, as I only want one showing from the home page? Thanks in advance
Intermediate & Advanced SEO | JohnPeters0
-
Changing a url from .html to .com
Hello, I have a client that has a site with a .html plugin, and I have read that it's best not to have this. We currently have pages ranking with this .html plugin. However, if we take the plugin out, will we lose rankings? Would we need a 301 or something?
Intermediate & Advanced SEO | SEODinosaur0
-
URL Shorteners. Are they SEO Friendly?
Do URL shortener services like bit.ly act as 301 redirects? I was thinking about utilizing one for longer query based URLs and didn't want to risk losing link juice. Thanks for the insight! Regards - Kyle
Intermediate & Advanced SEO | kchandler0
-
Blocking Dynamic URLs with Robots.txt
Background: My e-commerce site uses a lot of layered navigation and sorting links. While this is great for users, it ends up in a lot of URL variations of the same page being crawled by Google. For example, a standard category page:
www.mysite.com/widgets.html
...which uses a "Price" layered navigation sidebar to filter products based on price, also produces the following URLs which link to the same page:
http://www.mysite.com/widgets.html?price=1%2C250
http://www.mysite.com/widgets.html?price=2%2C250
http://www.mysite.com/widgets.html?price=3%2C250
There are literally thousands of these URL variations being indexed, so I'd like to use robots.txt to disallow these variations. Question: Is this a wise thing to do? Or does Google take layered navigation links into account by default, so I don't need to worry? To implement, I was going to do the following in robots.txt:
User-agent: *
Disallow: /*?
Disallow: /*=
...which would prevent any dynamic URL with a '?' or '=' from being indexed. Is there a better way to do this, or is this a good solution? Thank you!
Intermediate & Advanced SEO | AndrewY1