Duplicate Page Content & Rel-Canonicals
-
The SEOmoz duplicate page content tool lists the following URLs as having duplicate content:
http://www.savvyboater.com/1988-newer-8-tooth-15-hp-honda-outboard-props.aspx
http://www.savvyboater.com/1988-newer-8-tooth-15-hp-honda-outboard-props.aspx?sort=PriceAsc&pi=2
The second URL is the price-sorted second page of the category and contains the following rel-canonical:
<link rel="canonical" href="http://www.savvyboater.com/1988-newer-8-tooth-15-hp-honda-outboard-props.aspx">
Are we using the rel-canonical correctly in this case? If so, why does it continue to show up as duplicate content in our SEOmoz report? There are over 1,000 URLs listed in the report with the exact same issue.
-
You should use a robots.txt file to disallow "sort" parameters, such as:
Disallow: /*sort=
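A minimal sketch of the full file, assuming the rule should apply to all crawlers (Google treats * as a wildcard, so this matches any URL containing "sort="):
# Keep crawlers out of sorted/filtered URL variants
User-agent: *
Disallow: /*sort=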
-
Hi Jim-
Yes, your canonical usage is correct, though you should also consider using a meta robots tag with "noindex" so your filtered pages don't show in the search results.
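A sketch of that tag, placed in the <head> of each filtered page ("follow" is optional; it lets link equity flow while keeping the page itself out of the index):
<meta name="robots" content="noindex, follow">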
-John
Related Questions
-
What to do with outdated and irrelevant content on a website?
Hi everyone, On our corporate website we have a blog where we publish articles that are directly related to our company (house heating systems and gas cylinders) and some articles that are completely irrelevant to our core business, but which might be of interest to our potential clients. Recently I've been told that it is not a good idea to keep these posts that aren't directly related to our core business, because Google might be somewhat confused as to what our core business is all about. I was advised to research this topic and to consider completely removing blog posts that are irrelevant to our core business from our blog. By removing I mean completely removing the pages and setting a 410 status to tell Google that it is not a 404 error but that these pages were intentionally removed. I would like to hear some independent advice from the Moz community as to what I should do. Thank you very much in advance.
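As a side note on mechanics (an editorial addition, not from the original question): a 410 can be returned per-URL from the web server; a minimal .htaccess sketch, assuming an Apache server and a hypothetical post path:
# Return "410 Gone" for an intentionally removed post
Redirect gone /blog/irrelevant-post-slug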
Content Development | Intergaz0
-
Duplicate Content
I have a client based in the UK, and one of their distributors, based in the UAE, has copied the content for their own website. Will this affect my client's rankings because of duplicate content?
Content Development | CreativeCow0
-
Writing <200-word pieces of content in a 7.5-hour day
My employer has a content writer who is currently working on writing unique descriptions for many pages, on the order of around 150-200 words per piece of content. A recurring theme in this content is to write a list of features such as "it does X, X, X, X, X and X", which can sometimes happen a couple of times during the content and takes up a decent chunk of the wording. This content does not require in-depth research over and above reading the About Us page of some sites and looking at what services they provide, as well as some quick details like their payment and delivery methods, etc. The writer also writes the meta description and then uploads these to a CMS. There are no other tasks. Considering the writer is doing this 5 days a week, 7.5 hours a day, and isn't being paid a poor or trainee-type wage, what would you say would be an acceptable amount to achieve on an average day? The current average works out to 8, or slightly fewer, of these pieces of content each day. Thoughts?
Content Development | crystal.fde1
-
Project content marketing SEO value and Traffic
I'm creating a content marketing plan and need to project the visitor impact it could have on an exact-keyword domain name. The plan will produce at least 10 quality, original articles per day. Are there any common metrics or ways to guess what the visitor impact could be?
Content Development | msantore821
-
Content Marketing - Car Space
Hey, I'm looking for cool content marketing examples in the car industry, like major car companies leveraging their resources to develop awesome, viral content. Is anyone aware of any cool campaigns? Cheers, Mark
Content Development | MBASydney0
-
Content Duplication for Job Posting
Hello! I am responsible for the SEO of a job portal run by a recruitment agency. We get 30-40 jobs every month, which we post on a) our job portal and b) other job portals (like Monster, CareerBuilder, Naukri, etc.). How do we avoid content duplication? We can only post the same job descriptions everywhere. We always post the job on our site first and then on the other job portals. How do we ensure that Google knows our portal is the original job poster and not the other job portals? Thank you.
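One technique often suggested for this scenario (an editorial addition, not from the original question) is a cross-domain rel-canonical on the syndicated copies pointing back to the original posting, where the third-party portals allow it; a sketch with a hypothetical URL:
<link rel="canonical" href="http://www.example-jobportal.com/jobs/senior-accountant-123">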
Content Development | peoplesutra0
-
How can I solve a duplicate content problem when different URLs are needed?
My client is a big international firm with 10 websites on different URLs (.co.uk, .com, .com.au, .pl, etc.). All the websites are exactly the same except for the price. I suggested they only use .com and serve each region as a subdomain, like au.xxx.com instead of xxx.com.au. However, they cannot do that for some reason. I am trying to solve the duplicate issue. I don't think I can use a 301 redirect or a canonical link because all the regions are drawing comparable traffic. Any suggestions?
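One option worth sketching here (an editorial addition, not part of the original question) is hreflang annotations, which tell Google the regional sites are alternates of one another rather than duplicates; a minimal sketch with hypothetical domains, placed in the <head> of every version:
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="pl" href="http://www.example.pl/" />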
Content Development | ringochan0
-
Please help me stop Google indexing https pages on my WordPress site
I added SSL to my WordPress blog because that was the only way to get a dedicated IP address for my site at my host. Now I am noticing Google has started indexing posts as both http and https. Can someone please help me force Google not to index the https versions, as I am sure it's like having duplicate content. All help is appreciated. So far I have added this to the top of my .htaccess file:
RewriteEngine on
Options +FollowSymlinks
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots.txt$ robots_ssl.txt
And added robots_ssl.txt with the following:
User-agent: Googlebot
Disallow: /
User-agent: *
Disallow: /
But https pages are still being indexed. Please help.
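One commonly suggested alternative (an editorial addition, not from the original post): rather than serving a blocking robots.txt over https, 301-redirect every https request to its http equivalent so Google consolidates the two; a minimal .htaccess sketch with a hypothetical domain, assuming mod_rewrite is enabled:
RewriteEngine on
# Send all https traffic back to the http version of the same URL
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]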
Content Development | rookie1230