Webmaster Tools parameters
-
Hey forum,
About my site, idealchooser.com: a few weeks ago I defined a "sort" parameter in Google Webmaster Tools with Effect: "Sorts" and Crawl: "No URLs". The logic is simple: I don't want Google to crawl and index the same pages with a different sort parameter, only the default page without this parameter.
The weird thing is that under "HTML Improvements" Google keeps finding duplicate title tags for the exact same pages with different sort parameters. For example:
/shop/Kids-Pants/16/
/shop/Kids-Pants/16/?sort=Price
/shop/Kids-Pants/16/?sort=PriceHi
These aren't old pages; they were flagged by Google as duplicates weeks after the sort parameter was defined.
Any idea how to solve this? It seems like Google is ignoring my parameter handling settings.
Thank you.
-
I just thought of something else, if I do a robots.txt disallow to this parameter, wouldn't it kill all the rankings from external links to pages that contain this parameter?
-
It's been at least 6 weeks since I set it up, probably more like 2 months.
Thanks for the robots.txt idea, I'll give it a shot. However, I would still like to figure out what is going on with the Google parameter settings.
-
How long ago did you set this up? It may take some time before you see changes in Google Webmaster Tools. That said, I personally do not like Google's parameter handling tool because it only applies to Google, so your website will still look duplicated to other search engines. I would rather use a robots.txt file to fix this, with a simple directive like Disallow: /*?sort= (see the sketch below).
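A minimal robots.txt sketch along those lines (the wildcard pattern is this edit's assumption, built from the parameter name in the question; Googlebot supports the * wildcard):

User-agent: *
# Block crawling of any URL whose query string starts with sort=
Disallow: /*?sort=

One caveat: blocking crawling this way also stops link signals from flowing through those URLs, which is exactly the concern raised in the follow-up above; a rel="canonical" from each sorted URL to its unparameterized page is the usual way to consolidate signals without that trade-off.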
Related Questions
-
Batch 301 redirects with an external tool
Hi, I am migrating my e-commerce site to another platform, from an internet company in Brazil (Tray), that has no way to do 301 redirects in batch, and I also do not have access to its files or FTP. As I have hundreds of URLs, I would like to know if there is any way to do batch redirects with an external tool? Thank you very much in advance.
Intermediate & Advanced SEO | didi090 -
301 vs Canonical - With A Side of Partial URL Rewrite and Google URL Parameters - OH MY
Hi Everyone, I am in the middle of an SEO contract with a site that is partially HTML pages; the rest are PHP and part of an ecommerce system for digital delivery of college classes. I am working with a web developer who has worked with this site for many years. On the PHP pages there are also six different parameters that are currently filtered by Google URL parameters in the old Google Search Console. When I came on board, part of the site was https and the remainder was not. Our first project was to move completely to https and it went well. 301 redirects were already in place from a few legacy sites they owned, so the developer expanded the 301 redirects to move everything to https. Among those legacy sites is an old site that we don't want visible, but it is extensively linked to the new site, and some of our top keywords are branded keywords that originated with that site. The developer says the old site can go away, but people searching for it are still prevalent in search. The biggest part of this project is now to rewrite the dynamic URLs of the product pages and the entry pages to the class pages. We attempted to use 301 redirects to the new URLs to prevent the draining of link juice. In the end, according to the developer, it just isn't going to be possible without losing all the existing link juice. So it's lose all the link juice at once (a scary thought) or try canonicals. I am told canonicals would work, and we can switch to that. My questions are the following:
1. Does anyone know of a way that might make the 301s work with the URL rewrite?
2. With canonicals and Google parameters, are we safe to delete the parameters after we have ensured everything has a canonical URL (parameter pages included)?
3. If we continue forward with 301s and lose all the existing links, since this is only half of the pages in the site (if you don't count the parameter pages) and there are only a few links per page if that, how much of an impact would it have on the site, and how can I avoid that impact?
4. Canonicals seem to be recommended heavily these days; would canonical URLs be a better way to go than sticking with 301s? (A sketch of the canonical markup follows this question.)
Thank you all in advance for helping! I sincerely appreciate any insight you might have. Sue (aka Trudy)
Intermediate & Advanced SEO | TStorm1 -
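For reference, a hedged sketch of the canonical option weighed in the question above (the URLs are hypothetical placeholders, not taken from the thread): each parameterized page declares its clean equivalent in the <head>, asking Google to consolidate indexing and link signals there without a redirect.

<!-- On a hypothetical parameterized page such as /classes.php?sort=price -->
<link rel="canonical" href="https://example.com/classes/" />

Unlike a 301, visitors still land on the old URL; only the search signals are consolidated, which is why canonicals are often suggested when redirects are technically impossible.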
Search Console Change of Address Tool Issue
We're currently migrating a few "event" mini sites to a main site that will have a subfolder for each event. Example:
Intermediate & Advanced SEO | RichardUK
newsite.com/event1 The issue is that Search Console is not able to verify this kind of redirect:
example.com --> 301 --> newsite.com/event Do you know any workaround for this? I was thinking of using a subdomain instead, which will in turn redirect to the /event subfolder, but with each hop it will diminish the link's strength. I prefer not to leave it as a subdomain, as data gets mashed up in Google Analytics with subdomains and we have seen worse ranking results with subdomains. Any help is greatly appreciated. (A redirect sketch follows below.)
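For context, a minimal Apache .htaccess sketch (this edit's assumption, not from the thread) of the domain-to-subfolder 301 described above, using the placeholder names from the question:

# In the old event domain's .htaccess
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
# Send every path to the matching path under the new event subfolder
RewriteRule ^(.*)$ https://newsite.com/event1/$1 [R=301,L]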
Do you know if there is a tool that can tell you if a URL has backlinks?
Hi, do you know if there is a tool I can use to check backlinks for thousands of URLs? Thanks, Roy
Intermediate & Advanced SEO | kadut0
Tool to bulk check outbound links
Hi. I have a list of 50 domains I need to check for links to three different sites. Does anybody know an easy way to do this? The best solution I have found so far is to crawl each with Screaming Frog and search for the domains, but I can only do one at a time this way. Some way to speed it up would be great!
Intermediate & Advanced SEO | Blink-SEO0
Google Disavow Tool - Waste of Time
My humble opinion is that Google's disavow tool is an utter waste of your time! My site, http://goo.gl/pdsHs, was penalized over a year ago after the SEO we hired used black hat techniques to increase rankings. Ironically, while we had visibility, Google itself had become a customer. (I guess the site was pretty high quality, trustworthy, and user friendly enough for Google employees to purchase from.) Soon enough the message about detecting unnatural links showed up in Webmaster Tools and, as expected, our rankings sank out of view. For a year we contacted webmasters, asking them to remove links pointing back to us (90% didn't respond, the other 10% complied). Work on our site continued, adding high quality, highly relevant, unique content.
Intermediate & Advanced SEO | Prime85
Rankings never recovered, and neither did our traffic or business. Earlier this month, we learned about Google's "link disavow tool" and were excited! We had hoped that by following the cleanup instructions and using the "link disavow tool", we would get a chance at recovery!
We watched Matt Cutts' video, read the various forums/blogs/topics written about it, and then felt comfortable enough to use it. We went through our backlink profile, determining which links were spammy, seemed to be a result of black hat practices, or were added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed with another reconsideration request. The result came a couple of weeks later: the same cookie-cutter email in WMT suggesting that there are "unnatural links" to the site. Hope turned to disappointment and frustration. Looks like the big box companies will continue to populate the top 100 results of ANY search; the rest will help Google's shareholders. If your site has gotten in the algorithm's crosshairs, you have a better chance of recovering by changing your URL than messing around with this useless tool.
How to remove an entire subdomain from the Google index with the URL removal tool?
Does anyone have clear instructions for how to do this? Do we need to set up a separate GWT account for each subdomain? I've tried using the URL removal tool, but it will only allow me to remove URLs indexed under my domain (i.e. domain.com, not subdomain.domain.com). Any help would be much appreciated! (One possible approach is sketched below.)
Intermediate & Advanced SEO | nicole.healthline0
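One commonly described approach (stated here as an assumption; the thread itself does not confirm it) is to verify the subdomain as its own property in GWT, then serve a robots.txt at the subdomain's root that blocks everything, after which the removal tool will accept a site-wide request for that property:

# Served at subdomain.domain.com/robots.txt
User-agent: *
Disallow: /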
Is User Agent Detection still a valid method for blocking certain URL parameters from search engines?
I'm concerned about the cloaking issue. Has anyone successfully implemented user-agent detection to provide search engines with "clean" URLs?
Intermediate & Advanced SEO | MyaRiemer0