Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Query parameters for normalization in Bing
-
Almost every day I get this:
Query parameters for normalization found on www.sitename.com
Site: www.sitename.com
Date: 3/26/2013
Priority: Low
Bing has detected new parameters in your URLs

Anyone know why? We aren't changing anything. I have read it has to do with internal URLs, but I can't find out which internal URLs this is a problem with.
-
We see this too. We have canonicals in place, and we still see the error. And there's no insight into which parameters are causing issues.
-
Here's an answer straight from Duane Forrester of Bing:
"It means that those parameters may be causing Bing to think you have duplicate content issues. If your content can appear on two individual URLs, that can be an issue, as we don't know which one you want indexed, ranked, etc. So, the tools we offer allow you to control this by telling us to ignore a parameter. We can suggest parameters we find, but it's your choice on if you want to tell us to ignore them (and the attendant URLs) or not.
For example, if you have a /print/ folder on your site, you can tell us to ignore everything under the "print parameter". By entering "print" as the parameter to be ignored, we'll skip indexing the content held in the print folder on your site."
Duane doesn't say so in the article, but you can adjust your parameter settings in Bing Webmaster Tools. Info here:
http://www.bing.com/webmaster/help/ignore-url-parameters-d7496c65
Hope this helps! Best of luck with your SEO.
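In practice, telling Bing to "ignore" a parameter amounts to URL normalization: every variant of a URL that differs only by ignorable parameters collapses to one canonical form before indexing. A minimal sketch of that idea in Python (the parameter names here are hypothetical examples, not taken from any real Bing configuration):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters a search engine has been told to ignore.
IGNORED_PARAMS = {"print", "sessionid", "utm_source"}

def normalize(url: str) -> str:
    """Drop ignored query parameters so duplicate URLs collapse to one."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

With this, `?print=1` variants of a page all normalize to the same URL, which is exactly the duplicate-content collapse Duane describes.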
Related Questions
-
Ranking #1 in Bing & DuckDuckGo, not at all for Google - where am I going wrong?
According to the Moz rank checking tool, my blog ranks in the top 3 for my name "James Crowley" on Bing and Yahoo (both in the US and UK), and also on DuckDuckGo (though Moz can't tell me that). And yet it doesn't rank anywhere on Google. I don't have any penalties, and for other keywords the blog appears fine on Google. Does this seem strange to you? Am I going wrong somewhere? The blog is https://www.jamescrowley.net/. Many thanks, James
Intermediate & Advanced SEO | james.crowley
-
SEO effect of URL with subfolder versus parameters?
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations). For SEO purposes, is it better for the URL to be a subfolder or a parameter off the home page URL?
https://www.mysite.com/dallas (essentially https://www.mysite.com/dallas/index.php)
or
http://www.mysite.com/?city=dallas (essentially https://www.mysite.com/index.php?city=dallas)
Intermediate & Advanced SEO | Searchout
-
Pagination parameters and canonical
Hello, we have a site that manages pagination through URL parameters, this way:
friendly-url.html
friendly-url.html?p=2
friendly-url.html?p=3
...
We've recently added a canonical tag pointing to friendly-url.html for all paginated results. In Search Console, we have the "p" parameter identified by Google. Now that the canonical has been added, should we still configure the parameter in Search Console and tell Google that it is being used for pagination? Thank you!
Intermediate & Advanced SEO | teconsite
-
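The canonical arrangement described in the pagination question above (every `?p=N` variant declaring the unparameterized URL as canonical) can be sketched in Python. This only illustrates the setup the poster describes, assuming the pagination parameter is named `p` as in their question:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

PAGINATION_PARAM = "p"  # the parameter named in the question

def canonical_url(url: str) -> str:
    """Strip the pagination parameter so every paginated variant
    resolves to the base friendly-url.html."""
    s = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(s.query) if k != PAGINATION_PARAM]
    return urlunsplit((s.scheme, s.netloc, s.path, urlencode(kept), ""))

def canonical_link_tag(url: str) -> str:
    """Render the <link> element to place in each paginated page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```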
Should I disallow all URL query strings/parameters in Robots.txt?
Webmaster Tools correctly identifies the query strings/parameters used in my URLs, but still reports duplicate title tags and meta descriptions for the original URL and the versions with parameters. For example, Webmaster Tools would report duplicates for the following URLs, despite correctly identifying the "cat_id" and "kw" parameters:
/Mulligan-Practitioner-CD-ROM
/Mulligan-Practitioner-CD-ROM?cat_id=87
/Mulligan-Practitioner-CD-ROM?kw=CROM
Additionally, these pages have self-referential canonical tags, so I would think I'd be covered, but I recently read that another Mozzer saw a great improvement after disallowing all query/parameter URLs, despite Webmaster Tools not reporting any errors. As I see it, I have two options:
1. Manually tell Google that these parameters have no effect on page content via the URL Parameters section in Webmaster Tools (in case Google is unable to detect this automatically, and I am being penalized as a result).
2. Add "Disallow: *?" to hide all query/parameter URLs from Google. My concern here is that most backlinks include the parameters, and in some cases these parameter URLs outrank the original.
Any thoughts?
Intermediate & Advanced SEO | jmorehouse
-
How do I add https version of site to Bing webmaster tools?
I could add my site to Google Webmaster Tools with no problems, but when I try to add it in Bing Webmaster Tools it just redirects me to what I already have. Everything is staying the same but the switch from http to https. Anyone else experienced this? This is what I just received back from Bing, and it doesn't seem right:
"I understand that you switched to the https version of your site and you're now trying to use the Site Move tool. However, in order to do this, you must verify the https version of your site first. You cannot do this because it just redirects you to the dashboard. We thank you for reporting this to us. We've investigated this matter and can see that you've already put a redirect from the http to the https version of your site. We also checked the /BingSiteAuth.xml file and this also redirects to the https version. At this point, we suggest that you remove the current website (http version) that is verified through Bing Webmaster Tools and add your https domain. When done, use the Site Move tool."
Thoughts?
Intermediate & Advanced SEO | EcommerceSite
-
How to fix issues regarding URL parameters?
Today I was reading Google's help article on URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687
I've learned that Google gives value to URLs with parameters that change or determine the content of a page. There are too many pages on my website with similar values for name, price, and number of products, but I have blocked all such pages in robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /?dir=
Disallow: /?p=
Disallow: /*?limit=
Now I am confused. Which is the best solution to get maximum SEO benefit?
Intermediate & Advanced SEO | CommercePundit
-
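A side note on the robots.txt patterns in the question above: robots.txt rules are matched as prefixes of the URL path, and Google additionally honors `*` as a wildcard. As written, `Disallow: /?dir=` only matches URLs whose path starts literally with `/?dir=`, so it would not block `/table-lamps?dir=asc&order=name`; the `/*?dir=` form would. (The missing asterisks in the first two rules may simply be a formatting artifact of the original post.) A minimal sketch of Google-style pattern matching in Python:

```python
import re

def robots_match(pattern: str, path_and_query: str) -> bool:
    """Google-style robots.txt matching: patterns are anchored at the
    start of the URL path, '*' matches any run of characters, and a
    trailing '$' anchors the end of the URL."""
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("*"))
    if regex.endswith(re.escape("$")):
        regex = regex[: -len(re.escape("$"))] + "$"
    return re.match(regex, path_and_query) is not None
```

Against the URLs listed in the question, only the `/*?limit=` rule would actually block anything.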
Query deserves freshness
There was an SEOmoz article - http://www.seomoz.org/blog/does-query-deserves-diversity-algorithm-exist-at-google - and I would like to point out a specific part of it: "So - because a lot of searchers express a preference for more diverse results than just those pages that ordinarily would "make the cut," Google provides an extra helping hand to pages they feel help to satisfy those searchers. This data could be gleaned from lower CTRs in the SERPs, greater numbers of query refinements, and even a high percentage of related searches performed subsequently." I don't understand how data could be gleaned from lower CTRs; don't you think it should have been higher CTRs?
Intermediate & Advanced SEO | seoug_2005