Query parameters for normalization in Bing
-
Almost every day I get this:
Query parameters for normalization found on www.sitename.com
Site: www.sitename.com
Date: 3/26/2013
Priority: Low
Bing has detected new parameters in your URLs
Anyone know why? We aren't changing anything. I have read it has to do with internal URLs, but I can't find out which internal URLs this is a problem with.
-
We see this too. We have canonicals in place, and we still see the error. And there's no insight into which parameters are causing issues.
-
Here's an answer straight from Duane Forrester of Bing:
"It means that those parameters may be causing Bing to think you have duplicate content issues. If your content can appear on two individual URLs, that can be an issue, as we don't know which one you want indexed, ranked, etc. So, the tools we offer allow you to control this by telling us to ignore a parameter. We can suggest parameters we find, but it's your choice on if you want to tell us to ignore them (and the attendant URLs) or not.
For example, if you have a /print/ folder on your site, you can tell us to ignore everything under the "print parameter". By entering "print" as the parameter to be ignored, we'll skip indexing the content held in the print folder on your site."
Duane doesn't say so in the article, but you can adjust your parameter settings in Bing Webmaster Tools. Info here:
http://www.bing.com/webmaster/help/ignore-url-parameters-d7496c65
Hope this helps! Best of luck with your SEO.
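The parameter-ignoring behaviour Duane describes boils down to URL normalization: given a set of parameters you've told the engine to ignore, variant URLs collapse to one representative URL. A minimal sketch in Python (the parameter names here are hypothetical examples, not a list Bing actually uses):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of parameters we've told Bing to ignore.
IGNORED_PARAMS = {"print", "sessionid", "utm_source"}

def normalize(url: str) -> str:
    """Drop ignored query parameters so duplicate URL variants
    collapse to a single representative form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

# Both variants normalize to the same URL:
print(normalize("http://www.sitename.com/page?id=7&print=1"))
print(normalize("http://www.sitename.com/page?id=7"))
```

This is the same idea as the rel=canonical tag, just applied at crawl time instead of in the page markup.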
Related Questions
-
301 vs Canonical - With A Side of Partial URL Rewrite and Google URL Parameters - OH MY
Hi Everyone, I am in the middle of an SEO contract with a site that is partially HTML pages, with the rest being PHP pages that are part of an ecommerce system for digital delivery of college classes. I am working with a web developer who has worked with this site for many years. In the PHP pages, there are also 6 different parameters that are currently filtered by Google URL Parameters in the old Google Search Console. When I came on board, part of the site was https and the remainder was not. Our first project was to move completely to https, and it went well. 301 redirects were already in place from a few legacy sites they owned, so the developer expanded the 301 redirects to move everything to https. Among those legacy sites is an old site that we don't want visible, but it is extensively linked to the new site, and some of our top keywords are branded keywords that originated with that site. The developer says the old site can go away, but people searching for it are still prevalent in search. The biggest part of this project is now to rewrite the dynamic URLs of the product pages and the entry pages to the class pages. We attempted to use 301 redirects to redirect to the new URLs and prevent the draining of link juice. In the end, according to the developer, it just isn't going to be possible without losing all the existing link juice. So it's lose all the link juice at once (a scary thought) or try canonicals. I am told canonicals would work, and we can switch to that. My questions are the following:
1. Does anyone know of a way that might make the 301s work with the URL rewrite?
2. With canonicals and Google parameters, are we safe to delete the parameters after we have ensured everything has a canonical URL (parameter pages included)?
3. If we continue forward with 301s and lose all the existing links, since this is only half of the pages in the site (if you don't count the parameter pages), and there are only a few links per page if that, how much of an impact would it have on the site, and how can I avoid that impact?
4. Canonicals seem to be recommended heavily these days; would canonical URLs be a better way to go than sticking with 301s?
Thank you all in advance for helping! I sincerely appreciate any insight you might have. Sue (aka Trudy)
Intermediate & Advanced SEO | TStorm1 -
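On question 1 above: a 301 layer is typically just a lookup from old URL to new URL performed before normal page serving, so a rewrite and 301s are not inherently incompatible. A minimal sketch, assuming hypothetical old/new URL pairs (these paths are made-up examples, not from the original post):

```python
# Hypothetical mapping from old dynamic product URLs to rewritten ones.
REDIRECT_MAP = {
    "/product.php?id=101": "/classes/intro-to-biology",
    "/product.php?id=102": "/classes/college-algebra",
}

def lookup_redirect(path: str):
    """Return a (status, location) pair for a 301 to the rewritten URL
    when we have one; otherwise None, meaning serve the page normally
    and rely on its rel=canonical tag instead."""
    target = REDIRECT_MAP.get(path)
    return (301, target) if target else None

print(lookup_redirect("/product.php?id=101"))
print(lookup_redirect("/classes/intro-to-biology"))
```

The design choice the poster faces is exactly this branch: a populated map (301s, link equity passed but old URLs gone at once) versus the None path (canonicals, both URLs live but one designated).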
Sitemap Query
I've decided to write my own sitemap because, frankly, the automated ones pull all kinds of things out of I don't know where. So to get around that, manual it is. But I have some products that appear in various categories. Should I still list every product in each category in the sitemap, regardless of some being duplicates, or should I choose the most relevant category and list them there? I do have a canonical URL extension, which should resolve any duplicate content I have.
Intermediate & Advanced SEO | moon-boots0 -
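The usual answer to the sitemap question is to list each product once, under whichever category you've chosen as canonical. A sketch of that approach (product slugs, categories, and the domain below are made-up examples):

```python
from xml.sax.saxutils import escape

# Hypothetical product -> canonical category mapping. Each product
# appears once, under the one category chosen as canonical, even if
# it is browsable under several categories on the site.
PRODUCTS = {
    "red-widget": "widgets",
    "blue-widget": "widgets",
    "red-gadget": "gadgets",
}

def build_sitemap(base: str = "https://www.example.com") -> str:
    """Emit a minimal sitemap with one <url> entry per product."""
    urls = sorted(f"{base}/{cat}/{slug}" for slug, cat in PRODUCTS.items())
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

print(build_sitemap())
```

Keeping the sitemap to canonical URLs only means it agrees with the rel=canonical tags the poster already has, rather than advertising the duplicates.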
Do I miss traffic (thus, page value) by using the GWMT Parameter Handling Tool?
I'm working through duplicate content issues. The tracking code or the session id in the URL is being recognized as a different page than the original. Example: www.example.com is dup content to www.example.com?_nk=x&ad=y&_ga=z, which is tied to a marketing campaign. If my setup in the URL parameter tool is set to Effect = None, Crawl = Representative URL, then do I:
1. Miss all the traffic being driven to the ?_nk page?
2. With a Rep URL, there still would be two indexed listings: the .com & the .com?_nk... right? Neither is good. Redirects of all the URLs is not an option b/c there are hundreds of these that would need to be redirected. And I also don't want to slow down page load time with excessive redirects, which has been the case when adding 100+ redirects for the recent website migration we did.
Intermediate & Advanced SEO | johnnybgunn0 -
REL prev/next on pages with additional sort parameters
We need a bit of advice on a site we are working on. Currently, the site displays items in the categories in order of date and all of the pages of the category listing are rel prev/next tagged correctly. This is great, and works really well - however we want to include some more sorting options (by popularity, name, file size... etc) into the mix. What's the best way to go about this using the correct tags? Is it better to NOINDEX all of the sorting options and just leave the default by date listings indexed? Also, we cannot canonical the sorted options to their counterparts because the page content would be different. Any ideas? Any help is greatly appreciated. Thanks.
Intermediate & Advanced SEO | Peter2640 -
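One common pattern for the sorting question above is to noindex any URL carrying a sort parameter while leaving the default date-ordered, rel=prev/next-tagged pages indexable. A sketch of the per-request decision, with hypothetical parameter names:

```python
from urllib.parse import parse_qs

# Hypothetical sort parameters; only the default date-sorted listing
# pages (which carry the rel=prev/next tags) should be indexed.
SORT_PARAMS = {"sort", "order"}

def robots_meta(query_string: str) -> str:
    """Choose the robots meta tag for a category listing request."""
    params = set(parse_qs(query_string))
    if params & SORT_PARAMS:
        # Sorted variant: keep it crawlable/followable but out of the index.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta("page=2&sort=popularity"))
print(robots_meta("page=2"))
```

Using "noindex, follow" (rather than blocking crawling) lets link equity still flow through the sorted variants, which matters since, as the poster notes, they can't be canonicalized to the date-sorted pages.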
Front end content optimisation query
One of my sites is installing Strange Loop, a front-end content optimisation platform. Does anyone have any advice on dealing with this type of implementation, or pitfalls that I need to look out for? Even just a heads-up on some reading material would be good. Thanks
Intermediate & Advanced SEO | BenFox0 -
Bing/Yahoo! Updates
On March 27th I noticed a huge rankings drop across the board on a client site in Bing and Yahoo! After some research, I found this article on SEOroundtable (it also links back to a Webmaster World discussion). For this particular site we're talking a few dozen keywords dropping off the first page, or even dropping from the first page out of the top 50. The only things not affected were brand keywords. The site was recently relaunched and has a fairly weak backlink profile right now. It doesn't use keywords in the domain (which was one of the things identified in the SEOroundtable article). Has anyone else noticed changes? If so, what do you attribute them to, and how are you combating them?
Intermediate & Advanced SEO | BedeFahey0 -
Best way to de-index content from Google and not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not from Bing. What is the best way to go about doing this?
Intermediate & Advanced SEO | nicole.healthline0 -
Query deserves freshness
There was an SEOmoz article - http://www.seomoz.org/blog/does-query-deserves-diversity-algorithm-exist-at-google . I would like to point out a specific part of it: "So - because a lot of searchers express a preference for more diverse results than just those pages that ordinarily would "make the cut," Google provides an extra helping hand to pages they feel help to satisfy those searchers. This data could be gleaned from lower CTRs in the SERPs, greater numbers of query refinements, and even a high percentage of related searches performed subsequently." I don't understand how this data could be gleaned from lower CTRs; don't you think it should have been higher CTRs?
Intermediate & Advanced SEO | seoug_20050