Webmaster tool parameters
-
Hey forum,
About my site, idealchooser.com: a few weeks ago I defined a parameter "sort" in Google Webmaster Tools with Effect: "Sorts" and Crawl: "No URLs". The logic is simple: I don't want Google to crawl and index the same pages with a different sort parameter, only the default page without this parameter.
The weird thing is that under "HTML Improvements" Google keeps flagging "Duplicate title tags" for the exact same pages with different sort parameters. For example:
/shop/Kids-Pants/16/
/shop/Kids-Pants/16/?sort=Price
/shop/Kids-Pants/16/?sort=PriceHi
These aren't old pages; they were flagged by Google as duplicates weeks after the sort parameter was defined.
Any idea how to solve this? It seems like Google is ignoring my parameter handling settings.
Thank you.
-
I just thought of something else: if I disallow this parameter in robots.txt, wouldn't it kill all the ranking value from external links to pages that contain this parameter?
-
It's been at least 6 weeks since I set it up, probably more like 2 months.
Thanks for the robots.txt idea, I'll give it a shot. However, I would still like to figure out what is going on with the Google parameter definition.
-
How long ago did you set this? It may take some time before you see changes in Google Webmaster Tools. That said, I personally don't like the Google parameter handling tool because it only applies to Google, so your site will still appear duplicated to other search engines. I would rather use the robots.txt file to fix this, with a simple rule like Disallow: /*?sort=
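A minimal sketch of that robots.txt approach, assuming the sort parameter can appear either as the first query parameter or after other parameters:
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
Keep in mind that robots.txt only stops crawling; URLs that are linked from elsewhere can still be indexed without their content, which is the link equity concern raised above.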
Related Questions
-
Block primary .xxx domain with disavow tool
Hi friends, I discovered a spam URL attack on top sites with good Google positions for a specific keyword. Can I block a primary domain like .xxx with the disavow tool? There are hundreds of different domains, but the primary domain is always the same, for example like this: domain:xxx? Thanks
-
Best tools for submitting contact forms of 1000 websites?
For a new B2B service we have identified websites that we would like to make aware of our service.
There are about 1000 websites for which it was not possible to retrieve emails, and where we need to do the outreach using the websites' contact pages. Do you know of any tools that save time, or outsourcing companies specialized in such a service? We do not want to fully automate the process; a human should do a visual check that the form is properly filled. What I imagine could save time would be tools that preload the next pages from a list of URLs in the background of the browser, and good auto form fillers. Any recommendations?
-
URL Parameter Setting Recommendation - Webmaster Tools, Breadcrumbs & 404s
Hi All, We use a parameter called "breadCrumb" to drive the breadcrumbs on our ecommerce product pages that are categorized in multiple places. For example, our "Blue Widget" product may have the following URLs: http://www.oursite.com/item3332/blue-widget
http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree1
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree2
We use a canonical tag pointing back to the base product URL; the parameter only changes the breadcrumbs. Which of the following settings, if any, would you recommend for such a parameter in GWT?
Does this parameter change page content seen by the user? Options: Yes/No
How does this parameter affect page content? Options: Narrows/Specifies/Other
Currently, Google decided to automatically assign the parameter as "Yes/Other/Let Googlebot decide" without notifying us. We noticed a drop in rankings around the suspected time of the assignment. Lastly, we have a consistent flow of discontinued products that we 404. As a result of the breadcrumb parameter, our 404s increase significantly (one for each path). Would 800 404 crawl errors out of 18k products cause a penalty on a young site? We got an "Increase in '404' pages" email from GWT shortly after our rankings seemed to drop. Thank you for any advice or suggestions! Doug
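For reference, the canonical on each of those breadCrumb variations is just a link element in the page head pointing at the base product URL, along these lines (using the example product above):
<link rel="canonical" href="http://www.oursite.com/item3332/blue-widget" />
-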
Anyone know of a free keyword monitoring tool?
I need a free hosted keyword monitoring tool to track several thousand keywords. It would also be helpful to have domain monitoring for keeping an eye on competitors. All ideas very welcome...
-
Site wide external links analysis tool?
Hi guys, I just got a link removal email from someone asking us to remove their link. What website or tool is best for seeing ALL of the external links from your website, site-wide? And as a bonus, columns for "nofollow" and "follow". Thank you!
-
Long URLs created by filters (not with query parameters)
A website adds subfolders to a category URL for each filter that's selected. In a crawl of the website some of these URLs reach over 400 characters. For example, if I select shoe size 5, 5.5 and 6, white and blue colour, price $70-$100, heel and platform styles, the URL will be as follows:
www.example.com/shoes/womens/filters/shoe-size--5--5.5--6/color--white--blue/price--70-100/style--heel--platform
There is a canonical that points to www.example.com/shoes/womens/ so it isn't a duplicate content issue. But these URLs still get crawled. How would you handle this? It's not a great system so I'm tempted to tell them to start over with best practice recommendations, but maybe I should just tell them to block the "/filters/" folder from crawlers? For some products however, filtered content would be worth having in search indexes (e.g. colour).
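For reference, the robots.txt rule for that would be a one-liner along these lines (a sketch, assuming every filtered URL keeps the /filters/ segment shown above):
User-agent: *
Disallow: /*/filters/
Note that blocked URLs can no longer be crawled, so Googlebot would never see the canonical tag on them.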
-
Why do pages with a 404 error drop out of webmaster tools only to reappear again?
I have noticed that a lot of pages which had been 404'ing and had fallen out of the Webmaster Tools crawl error log are reappearing again. Any suggestions as to why this might be the case? How can I make sure they don't reappear again?
-
Google Webmaster Tools sitemap errors for phantom URLs?
Two weeks ago we changed our URLs so the correct addresses are all lowercase. Everything else 301 redirects to those. We have submitted our updated sitemap and made sure that Google has downloaded it several times since. Even so, Webmaster Tools is reporting 33,000+ errors in our sitemap for URLs that are no longer in our sitemap and haven't been for weeks. It claims to have found the errors within the last couple of days, but the sitemap has been updated for a couple of weeks and has been downloaded by Google at least three times since. Here is our sitemap: http://www.aquinasandmore.com/urllist.xml
Here are a couple of URLs that Webmaster Tools says are in the sitemap:
http://www.aquinasandmore.com/catholic-gifts/Caroline-Gerhardinger-Large-Sterling-Silver-Medal/sku/78664
Redirect error unavailable
Oct 7, 2011
http://www.aquinasandmore.com/catholic-gifts/Catherine-of-Bologna-Small-Gold-Filled-Medal/sku/78706
Redirect error unavailable
Oct 7, 2011