Duplicate content error?
-
I am seeing an error for duplicate content for the following pages:
http://www.bluelinkerp.com/contact/
http://www.bluelinkerp.com/contact/index.asp
Doesn't the first URL just automatically redirect to the default page in that directory (index.asp)? Why is it showing up as separate duplicate pages?
-
@Streamline is right - as soon as the engines encounter both versions, they treat them as two separate pages. It's no problem for human visitors, but it can create duplicate-URL issues in the Google index. You can either 301-redirect the "index.asp" version back to the cleaner root URL or use a canonical tag. ASP/.NET can be awkward about 301s, so the canonical is probably easier.
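If you do go the redirect route on IIS, a rewrite rule is one common way to handle it. This is only a sketch - it assumes IIS 7+ with the URL Rewrite module installed, and the rule name is made up:

```xml
<!-- web.config fragment: 301 any ".../index.asp" URL to its directory root -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Hypothetical rule name; requires the IIS URL Rewrite module -->
        <rule name="StripIndexAsp" stopProcessing="true">
          <match url="^(.*/)?index\.asp$" />
          <action type="Redirect" url="/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With a rule like this, a request for /contact/index.asp would come back as a permanent redirect to /contact/, so the engines consolidate on one URL.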
Generally, we suggest canonicalizing to the shorter/friendlier version, but the trouble here is that you're using the "/index.asp" version in your internal links. If you can change the internal links to the "/contact/" version, I'd prefer that; if not, set the canonical tag to the "/index.asp" version. The most important thing is consistency. If you link to one version but canonical to the other, Google may ignore your canonical tag. Put simply, your canonical URL isn't really canonical in that case.
-
Thanks for the response! Is it best practice to specify the canonical URL as the "unspecific" link? Shouldn't I rather specify the canonical URL as "http://www.bluelinkerp.com/contact/index.asp"?
-
They're two different URLs.
If the URL changes but the content stays the same, it's classed as duplicate content.
I feel your pain though - the number of duplicate pages I've ended up with just because copywriters like to capitalize their words...
-
There are several ways the search engines could have come across both versions of that page. If I had to guess, it's because somewhere on the web, or even on your own website, there are links to both URLs. It's a pretty common issue, but one that is easily resolved with the rel="canonical" tag.
Simply put the following code in the <head> of that page, swapping in whichever URL you designate - for example:
<link rel="canonical" href="http://www.bluelinkerp.com/contact/" />
This tells the search engines to use your designated URL anytime they access that page.
Related Questions
-
How to Fix 404 Errors
Hey Moz'ers - I just added a new site to my Moz Pro account, and when I got the crawl report back there were a ton of 404 errors (see attached). I realize the best way to fix these is to manually go through every single error and see what the issue is... I just don't have time right now, and I don't have a team member who can jump on this either, but I realize this will be a huge boost for this client if/when I get these resolved. So my question is: is there a quicker way to get these resolved? Is there an outsourcing company that can fix my client's errors correctly? Thanks for the help in advance :)
Moz Pro | 2Spurs
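One way to triage a long 404 list before fixing anything is to group the broken URLs by their first path segment, so you can see whether a handful of site sections account for most of the errors. A minimal sketch, assuming you've exported the 404 URLs from the Moz crawl CSV into a list (the function name and sample URLs are illustrative):

```python
from collections import Counter
from urllib.parse import urlparse

def top_404_sections(urls, n=3):
    """Group 404 URLs by their first path segment and return the most common."""
    sections = Counter()
    for u in urls:
        path = urlparse(u).path.lstrip("/")
        first = path.split("/")[0] if path else "(root)"
        sections["/" + first] += 1
    return sections.most_common(n)

# Sample data: most of the 404s here come from the /blog/ section
urls = [
    "http://example.com/blog/old-post",
    "http://example.com/blog/another-old-post",
    "http://example.com/products/retired-item",
]
print(top_404_sections(urls))  # [('/blog', 2), ('/products', 1)]
```

Knowing that, say, 80% of the errors come from one retired blog section often means a single redirect pattern fixes most of them at once.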
Site Crawl Error
In the Moz crawl report this message appears under Most Common Issues: "1. Search Engine Blocked by robots.txt - Error Code 612: Error response for robots.txt". I asked the help staff, but they crawled again and nothing changed. There's only a robots.XML (not TXT) in the root of my webpage. It contains:
User-agent: *
Allow: /
Allow: /sitemap.htm
Can anyone please help me? Thank you
Moz Pro | nopsts
Error in Moz duplicate content reports
Hi - I've run the Moz campaign on a client's site. Moz is saying that there are duplicate content errors, and when I look at them it shows they are all cases of the non-www URLs being duplicated in the www form of the URLs. However, this is not the case - all the non-www URLs are 301 redirected to the www URLs. Is this an error in the Moz tool? Has anybody experienced something similar?
Moz Pro | rorynatkiel
Increase of 404 error after change of encoding
Hello, we just launched a new version of our website with new UTF-8 encoding. Thing is, we use a comma as a separator, and since the new website went live I have seen a massive increase in 404 errors for comma-encoded URLs. Here is an example: http://web.bons-de-reduction.com/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html instead of: http://web.bons-de-reduction.com/annuaire,321-sticker,site,promotions,5941.html I checked with Screaming Frog SEO Spider and Xenu, and I can't manage to find any encoded URLs. Does anyone have a clue on how to fix that? Thanks
Moz Pro | RetailMeNotFr
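The %2C in those broken URLs is just the percent-encoded form of the comma, so somewhere a link generator or redirect is encoding the separator before it is written into the URL. A quick sketch of the round trip using Python's standard library shows the mechanism:

```python
from urllib.parse import quote, unquote

encoded = "annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html"
decoded = unquote(encoded)
print(decoded)  # annuaire,321-sticker,site,promotions,5941.html

# quote() percent-encodes commas by default - the likely culprit if
# some templating or redirect code runs URLs through an encoder
print(quote("annuaire,321"))  # annuaire%2C321
```

Since crawlers like Screaming Frog only see the final rendered links, it may be a redirect rule or sitemap generator (rather than on-page HTML) doing the encoding, which would explain why the encoded URLs don't show up in a crawl.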
Duplicate Content
I have tried searching for an exact example of the issues I am seeing but didn't come up with anything, so I decided to post my own question to get a direct answer on what I am experiencing. I recently took over a website and its existing SEO practices. Upon adding the site to SEOmoz, I received many (LOTS of) duplicate content warnings. Pretty much, this is how the website is set up: domain.com/keyword-is-here/ but it is also coming up as domain.com/keyword-is-here/index.htm. Should I set up a redirect so domain.com/keyword-is-here/index.htm points to domain.com/keyword-is-here.htm, or should I just leave it alone since it's pointing to the exact same content? Any information on this question is greatly appreciated in advance.
Moz Pro | EQ-Richie
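If the server turns out to be Apache, one common pattern is a mod_rewrite rule that 301-redirects any ".../index.htm" request back to its directory root, consolidating the duplicates. This is a sketch only - it assumes Apache with mod_rewrite enabled, and it redirects to the trailing-slash version rather than a ".htm" file:

```apache
# .htaccess fragment: 301 ".../index.htm" requests to the directory root
RewriteEngine On
# Only act when the client actually requested index.htm directly
RewriteCond %{THE_REQUEST} \s/(.*/)?index\.htm[\s?]
RewriteRule ^(.*/)?index\.htm$ /$1 [R=301,L]
```

The RewriteCond guard keeps the rule from looping when the server internally serves index.htm for directory requests.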
Why am I getting duplicate content errors on same page?
In the SEOmoz tools I am getting multiple errors for duplicate page content and duplicate page titles in one section of my site. When I check which page has the duplicate title/content, the URL listed is exactly the same. All sections are set up the same way, so any ideas on why I would be getting duplication errors in just this one section, and why they would say the errors are on the same page (when I only have one copy uploaded on the server)?
Moz Pro | CIEEwebTeam
Error 403
I'm getting this message "We were unable to grade that page. We received a response code of 403. URL content not parseable" when using the On-Page Report Card. Does anyone know how to go about fixing this? I feel like I've tried everything.
Moz Pro | Sean_McDonnell
Tool for scanning the content of the canonical tag
Hey all, question for you: what is your favorite tool/method for scanning a website for specific tags - specifically (as my situation dictates now) canonical tags? I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, letting you specify the tag to look for). I like the concept of using Google Docs with the importXML feature, but as you can only use 50 of those commands at a time it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/). I do have a campaign set up using the tools, which is great! But I need something that returns a response faster and can get data from more than 10,000 links. Our CMS unfortunately puts out some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VB script to get it into Excel, I suppose we can do that. Cheers, Josh
Moz Pro | prima-253509
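For a quick homegrown check, the tag extraction itself is only a few lines with Python's standard-library HTML parser; you would still need to feed it page source from a crawler export or fetch library, and the class and function names here are made up for illustration:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical":
                self.canonical = a.get("href")

def extract_canonical(html):
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical  # None if no canonical tag was found

page = '<html><head><link rel="canonical" href="http://www.example.com/contact/"></head></html>'
print(extract_canonical(page))  # http://www.example.com/contact/
```

Because you control the parser, swapping in a different tag or attribute to scan for is a one-line change, which addresses the customizability concern; running it over 10,000+ saved pages is just a loop.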