Overly-Dynamic URLs: how to fix in SEOmoz?
-
Hello.
I have about 300 warnings for overly-dynamic URLs.
They appear on URLs like this:
http://www.theprinterdepo.com/clearance?dir=asc&order=price&p=10
As you can see, all of the parameters are needed, and my ecommerce solution generates them automatically.
How can I get rid of these warnings? I suppose it can be done with robots.txt, but I have no idea how to use it.
In Google Webmaster Tools I have already configured these parameters so that the crawler should not index them.
Check the image here:
-
Hi Kami,
You might want to ask this in a new post. New replies on old threads don't bump a post in the Q&A forums, so not many people will see this (I only knew about it because I was notified, having subscribed to the thread last year).
-
Hi, we sort of have the same issue; actually, we have over 5,000 pages with it. Our ecommerce site uses several different filters (via Ajax), so we end up with many different URLs like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/Designer-Accessories.html?sort=title&sort_direction=1&use_selected_filter=Y
Could we use the robots.txt file to disallow these as well? And do we need to put the whole URL in there,
like:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should the Disallow pattern go?
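Or would it be enough to match on just the parameter names? Something like this, perhaps (just guessing here, so treat it as a sketch to be tested rather than a working config):
Disallow: /*?sort=
Disallow: /*&sort_direction=
Disallow: /*&use_selected_filter=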
Any help would be greatly appreciated.
Thank you, Tony
-
John, having a ton of URLs indexed for the same page will actually dilute things, not help your rankings. Dr. Pete wrote a great post at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world explaining duplicate content that should help give you a better understanding of things.
-
Hello there,
I have encountered a similar problem in a similar scenario. If we do not allow these pages to be crawled, wouldn't it reduce the number of pages indexed, resulting in lower Google rankings?
-
Hi! Did Rasmus answer your question, or are you looking for some more help?
-
Disallow: /*?dir=desc
Disallow: /*&order=
I think you should try these lines and test them in Google Webmaster Tools. That should leave only this page:
https://www.theprinterdepo.com/clearance
That is what you want, right?
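For context, this is how those lines would sit in the robots.txt file (a minimal sketch assuming you don't already have other rules; please run it through the robots.txt tester in Google Webmaster Tools before going live):
User-agent: *
# block the descending-sort and order variations of the category pages
Disallow: /*?dir=desc
Disallow: /*&order=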
-
Could you give me the robots.txt syntax to ignore those?
-
Sure thing!
For example:
<a <span="">href</a><a <span="">="</a>https://www.theprinterdepo.com/clearance?dir=desc&order=price" title="Set Descending Direction">src="https://www.theprinterdepo.com/skin/frontend/default/MAG060062/images/i_asc_arrow.gif" alt="Set Descending Direction" class="v-middle" />
This is the code for the link that reverses the sort order. You can check whether the page's URL query contains "dir=asc" (the default); if it does, the code above should instead be:
<a rel="nofollow" <span="">href</a><a rel="nofollow" <span="">="</a>https://www.theprinterdepo.com/clearance?dir=desc&order=price" title="Set Descending Direction">src="https://www.theprinterdepo.com/skin/frontend/default/MAG060062/images/i_asc_arrow.gif" alt="Set Descending Direction" class="v-middle" />However, I believe the best approach will be to change the meta tag for robots for the page.
If the URL query is dir=asc&order=price, output robots="index, follow". If dir is not asc or order is not price, output robots="noindex, follow".
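As a rough sketch, the two states of that meta tag would look like this (assuming your template lets you set it per request):
<!-- default sort (dir=asc, order=price): allow indexing -->
<meta name="robots" content="index, follow" />
<!-- any other dir/order combination: keep it out of the index -->
<meta name="robots" content="noindex, follow" />
With "noindex, follow" the sorted variations stay out of the index, but the product links on them are still crawled.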
-
I am really a newbie to SEO; can you please explain to me how to do it?
-
Sure. You can either set rel="nofollow" on the links on the page that change the sorting, so that Moz and Google do not follow them to those pages, or you can set robots="noindex, follow" on the pages that are not the "standard" sort.