Webmaster Tools parameters
-
Hey forum,
About my site, idealchooser.com: a few weeks ago I defined a parameter "sort" in Google Webmaster Tools with Effect: "Sorts" and Crawl: "No URLs". The logic is simple: I don't want Google to crawl and index the same pages with a different sort parameter, only the default page without this parameter.
The weird thing is that under "HTML Improvements" Google keeps flagging "Duplicate Title Tags" for the exact same pages with a different sort parameter. For example:
/shop/Kids-Pants/16/
/shop/Kids-Pants/16/?sort=Price
These aren't old pages and were flagged by Google as duplicates weeks after the sort parameter was defined.
Any idea how to solve it? It seems like Google ignores my parameter-handling settings.
Thank you.
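A rel=canonical tag on the sorted variants is a common complement to parameter handling, and unlike the Webmaster Tools setting it works for all search engines. A minimal sketch, assuming the unsorted category page from the example above is the preferred URL (the http scheme is an assumption):

```html
<!-- In the <head> of /shop/Kids-Pants/16/?sort=Price (and every other
     sort variant), point search engines at the unsorted page: -->
<link rel="canonical" href="http://idealchooser.com/shop/Kids-Pants/16/" />
```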
-
I just thought of something else: if I disallow this parameter in robots.txt, wouldn't that kill all the rankings from external links to pages that contain this parameter?
-
It's been at least 6 weeks since I set it up, probably more like 2 months.
Thanks for the robots.txt idea, I'll give it a shot. However, I would still like to figure out what is going on with Google's parameter handling.
-
How long ago did you set this up? I think it may take some time before you see changes in Google Webmaster Tools. Then again, I personally don't like Google's parameter-handling tool because it only applies to Google, so your website will still look duplicated to other search engines. I would rather use a robots.txt file to fix this, with a directive like Disallow: /*?sort=
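A minimal robots.txt sketch along those lines, assuming every sorted URL carries sort as a query-string parameter. Note that Google and Bing honor the * wildcard, and that a pattern anchored as /?sort= would only match the homepage:

```
User-agent: *
# Block crawling of any URL with a "sort" query parameter,
# whether it is the first parameter or a later one.
Disallow: /*?sort=
Disallow: /*&sort=
```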
Related Questions
-
Batch 301 redirects with an external tool
Hi, I am migrating my e-commerce site to another platform from an internet company in Brazil (Tray) that has no way to do 301 redirects in batch, and I also do not have access to its files or FTP. As I have hundreds of URLs, I would like to know if there is any way to do batch redirects with an external tool? Thank you very much in advance.
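The thread names no specific tool, but several CDN/edge services (for example Cloudflare's Bulk Redirects) sit in front of a site and can import a redirect list, which sidesteps the missing FTP access entirely. A minimal sketch of preparing such a list in Python; the paths and CSV column names are illustrative assumptions, so check them against whatever service you choose:

```python
import csv
import io

# Hypothetical mapping of old paths to new URLs; replace with your real list.
REDIRECTS = {
    "/old-product-1": "https://example.com/new-product-1",
    "/old-category/item": "https://example.com/new-category/item",
}

def build_redirect_csv(mapping):
    """Render a source,target,status CSV that a bulk-redirect importer
    (e.g. an edge service's dashboard upload) could consume."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["source", "target", "status"])
    for old_path, new_url in sorted(mapping.items()):
        writer.writerow([old_path, new_url, 301])
    return buf.getvalue()

if __name__ == "__main__":
    print(build_redirect_csv(REDIRECTS))
```

The same mapping can usually be exported once and re-uploaded whenever URLs change, keeping the redirects out of the (inaccessible) origin platform.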
Intermediate & Advanced SEO | didi090
-
How to handle broken links to phantom pages appearing in webmaster tools
Hi, Would love to hear different experiences and thoughts on this one. We have a site that is plagued with 404s in Webmaster Tools. A significant number of them have never existed; for instance, affiliates have linked to them with the wrong URL, or scraper sites have linked to them with a truncated version of the URL and an ellipsis, e.g. /my-nonexistent... What's the best way to handle these? If we do nothing and mark them as fixed, they reappear in the broken-links report. If we 301 redirect and mark as fixed, they reappear. We tried 410 (gone forever) and marking as fixed; they reappeared. We have a lot of legacy broken links and we would really like to clean up our WMT broken-link profile. Does anyone know of a way we can make these links to non-existent pages disappear once and for all? Many thanks in advance!
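For the truncated scraper URLs specifically, one server-level option is to answer them with 410 in bulk rather than one by one; a minimal Apache sketch, assuming mod_alias is available and that the junk URLs end in a literal ellipsis:

```
# .htaccess: return "410 Gone" for any URL ending in "..."
RedirectMatch gone \.\.\.$
```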
Intermediate & Advanced SEO | dancape0
-
Long urls created by filters (not with query parameters)
A website adds subfolders to a category URL for each filter that's selected. In a crawl of the website some of these URLs reach over 400 characters. For example, if I select shoe size 5, 5.5 and 6, white and blue colour, price $70-$100, heel and platform styles, the URL will be as follows: www.example.com/shoes/womens/filters/shoe-size--5--5.5--6/color--white--blue/price--70-100/style--heel--platform There is a canonical that points to www.example.com/shoes/womens/ so it isn't a duplicate content issue. But these URLs still get crawled. How would you handle this? It's not a great system so I'm tempted to tell them to start over with best practice recommendations, but maybe I should just tell them to block the "/filters/" folder from crawlers? For some products however, filtered content would be worth having in search indexes (e.g. colour).
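If you do go the robots.txt route, a sketch of that block, assuming every filtered URL contains the /filters/ path segment as in the example above (bear in mind robots.txt only stops crawling; URLs already known to Google can remain indexed):

```
User-agent: *
# Keep crawlers out of the filter-generated subfolders,
# e.g. /shoes/womens/filters/shoe-size--5/...
Disallow: /*/filters/
```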
Intermediate & Advanced SEO | Alex-Harford0
-
GWT URL Removal Tool Risky to Use for Duplicate Pages?
I was planning to remove lots of URLs via GWT that are highly duplicated pages (similar pages exist on other websites across the web). However, this Google article had me a bit concerned: https://support.google.com/webmasters/answer/1269119?hl=en I already have "noindex, follow" on the pages I want to remove from the index, but Google seems to take ages to remove pages from the index, which appears to drag down the unique-content pages on my site.
Intermediate & Advanced SEO | khi50
-
URL Parameter & crawl stats
Hey guys, I recently used the URL parameter tool in WMT to mark different URLs that offer the same content. I have the parameters "?source=site1", "?source=site2", etc. It looks like this: www.example.com/article/12?source=site1. The source parameters are feeds that we provide to partner sites, so we can track the referring site with our internal analytics platform. Although pages like www.example.com/article/12?source=site1 have a canonical to the original page www.example.com/article/12, Google indexed both of the URLs: www.example.com/article/12?source=site1 and www.example.com/article/12. Last week I used the URL parameter tool to mark the "source" parameter as "No, this parameter doesn't affect page content (tracks usage)", and today I see a 40% decrease in my crawl stats. On the one hand, it makes sense that Google is no longer crawling the repeated URLs with different sources, but on the other hand I thought that efficient crawlability would increase my crawl stats. In addition, Google is still indexing the same pages with different source parameters. I would like to know if someone has experienced something similar: by increasing crawl efficiency, should I expect my crawl stats to go up or down? I really appreciate all the help! Thanks!
Intermediate & Advanced SEO | Mr.bfz
-
Can you recommend a tool to identify contact email for list of 1000 domains?
We researched a list of about 1000 domains, which are all in one industry segment. Can you recommend a tool to identify corresponding contact emails, based on domain WHOIS, an email on the website, or the contact page? What is your experience with sending emails just to info@DOMAIN_NAME?
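As a rough illustration of what such tools do under the hood, here is a sketch of the email-harvesting step in Python; the helper and the sample page are hypothetical, and actually fetching each domain's contact page (plus respecting each site's terms) is left out:

```python
import re

# Hypothetical helper: pull candidate contact addresses out of a fetched
# contact page's HTML (the fetching itself, e.g. with urllib, is omitted).
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html):
    """Return unique email-looking strings found in a page, in order seen."""
    seen = []
    for match in EMAIL_RE.findall(html):
        if match.lower() not in (s.lower() for s in seen):
            seen.append(match)
    return seen

page = '<a href="mailto:info@example.com">info@example.com</a> or sales@example.com'
print(extract_emails(page))
```

Running this over a CSV of 1000 domains is a loop around the same function; WHOIS-based lookup would need a separate data source, since many registrants privacy-protect those records.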
Intermediate & Advanced SEO | lcourse0
-
Keyword Research Tool For Local Customers
Hi all, and thanks in advance for your input. I help mostly small local businesses with SEO and other IM strategy, but I am having a hard time finding a good tool for local SEO searches. For instance, I have a small plumbing client that covers Denver but really wants to market to some of the suburbs. What is a good tool to find search volume for "littleton plumbers" or similar searches? By the way, Littleton is a suburb of Denver. Thanks again. Chris
Intermediate & Advanced SEO | iFuseInternetMarketing0
-
Should I use both Google and Bing's Webmaster Tools at the same time?
Hi All, Up till now I've been registered only with Google WMT. Do you recommend using Bing's WMT at the same time? Thanks
Intermediate & Advanced SEO | BeytzNet0