Hi,
We have over 5,000 pages flagged with the "Overly-Dynamic URL" error.
Our e-commerce site uses Ajax, and we have several different filters (Size, Color, Brand), so we end up with many different URLs, such as:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to keep these from being crawled as duplicate content? And do we need to put the whole URL in there, like this:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should the Disallow pattern go?
So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
I'm just not sure whether they are correct.
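One way I tried to sanity-check the rules is with Python's urllib.robotparser. Caveat: that module only implements the original robots.txt prefix-matching behavior, not Googlebot's `*`/`$` wildcard extensions, so it can only tell us what the literal prefix semantics of these Disallow lines are (the rules, URLs, and expected behavior here are just my assumptions):

```python
import urllib.robotparser

# A hypothetical robots.txt containing the rules we added so far.
rules = """\
User-agent: *
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Sample URLs: the filtered category pages from above, plus a
# root URL that actually begins with one of the Disallow prefixes.
urls = [
    "http://www.dellamoda.com/?sort=title",
    "http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y",
    "http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3",
]
for u in urls:
    # can_fetch is True when no Disallow rule's prefix matches the URL path+query.
    print(u, "->", "allowed" if rp.can_fetch("*", u) else "blocked")
```

Under plain prefix matching, only the root URL (`/?sort=title`) comes back blocked; the category pages like `/Designer-Pumps.html?sort=price...` are still reported as allowed, since they don't start with `/?...`. Whether Googlebot's wildcard matching behaves differently is exactly what I'm unsure about.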
Any help would be greatly appreciated.
Thank you,
Kami