Why is rel="canonical" pointing at a URL with parameters bad?
-
Context
Our website has a large number of crawl issues stemming from duplicate page content (source: Moz).
According to an SEO firm that recently audited our website, some of these crawl issues are due to URL parameter usage. They have recommended that we "make sure every page has a Rel Canonical tag that points to the non-parameter version of that URL…parameters should never appear in Canonical tags."
Here's an example URL where we have parameters in our canonical tag...
http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/
<link rel="canonical" href="http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/?pageSize=0&pageSizeBottom=0" />
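If we follow the firm's recommendation, the tag would presumably instead point to the parameter-free version of the same URL, something like:

```html
<!-- Hypothetical corrected tag: canonical points to the clean, parameter-free URL -->
<link rel="canonical" href="http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/" />
```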
Our website runs on IBM WebSphere v 7.
Questions
- Why is it important that the rel canonical tag points to a non-parameter URL?
- What is the extent of the negative impact from having rel canonicals pointing to URLs including parameters?
- Any advice for correcting this?
Thanks for any help!
-
Thanks for the response, Eric.
My research suggested the same plan of attack: 1) fixing the canonical tags and 2) excluding the parameters via Google Search Console's URL Parameters tool. It's helpful to get your confirmation.
My best guess is that the parameters you've cited above are not needed for every URL. I agree that this looks like something WebSphere Commerce probably controls. I'm a few organizational layers removed from whoever set this up for us. I'll try to track down where we can control that.
-
Thanks Peter!
-
Peter has a great answer with some good resources referenced, and I'll try to add on a little bit:
1. Why is it important that the rel canonical tag points to a non-parameter URL?
It's important to use clean URLs so search engines can understand the site structure (like Peter mentioned), which helps reduce the potential for index bloat and ranking issues. The more pages out there containing the same content (i.e., duplicate content), the harder it is for search engines to determine which is the best page to show in search results. While there is no "duplicate content penalty," you can create a self-inflicted wound by providing too many similar options. The canonical tag is meant to give you a level of control, telling Google which page is the most appropriate version. In this case that should be the clean URL, since that is where you want people to start; users can customize from there using faceted navigation or custom options.
2. What is the extent of the negative impact from having rel canonicals pointing to URLs including parameters?
Basically, duplicate content and indexing issues. Both are things you really want to avoid when running an e-commerce shop, since they make your pages compete with each other for rankings. Done wrong, that can cost rankings, visits, and revenue.
3. Any advice for correcting this?
Fixing the canonical tags on the site would be your first step. Next, you would want to exclude those parameters in the URL Parameters section of Google Search Console. That helps by telling Google to ignore URLs containing the elements you add in that section. It's another step toward getting clean URLs showing up in search results.
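To illustrate that first step, here's a minimal sketch in Python of how a template or middleware layer might derive the clean canonical URL by stripping display-only parameters. The parameter names are hypothetical, taken from the example URL in the question; whatever actually generates the tag on WebSphere Commerce would need its own equivalent.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Display-only parameters that should never appear in a canonical URL.
# (Hypothetical list, based on the example URL in the question.)
DISPLAY_PARAMS = {"pageSize", "pageSizeBottom"}

def clean_canonical(url):
    """Return the URL with display-only query parameters removed."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    # Keep only parameters that actually change the content.
    kept = [(k, v) for k, v in parse_qsl(query) if k not in DISPLAY_PARAMS]
    # Drop the fragment entirely; it never belongs in a canonical URL.
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

print(clean_canonical(
    "http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/"
    "?pageSize=0&pageSizeBottom=0"
))
# http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/
```

Note the allow-list-in-reverse design: parameters that do change content (a real filter or category, say) pass through untouched, so the same helper works for every URL on the site.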
I tried getting to http://www.chasing-fireflies.com/costumes-dress-up/mens-costumes/ and realized the parameters are showing up by default, like: http://www.chasing-fireflies.com/costumes-dress-up/mens-costumes/#w=*&af=cat2:costumedressup_menscostumes%20cat1:costumedressup%20pagetype:products
Are the parameters needed for every URL? Seems like this is a WebSphere Commerce setup kind of thing.
-
A clean (parameter-free) canonical URL helps Google better understand your URL structure and avoids several common mistakes:
https://googlewebmastercentral.blogspot.bg/2013/04/5-common-mistakes-with-relcanonical.html <- mistake #1
http://www.hmtweb.com/marketing-blog/dangerous-rel-canonical-problems/ <- mistake #4
So the firm giving you this advice is correct! You should use naked (parameter-free) URLs wherever possible.