Testing for duplicate content and title tags
-
Hi there,
I have been getting both Duplicate Page Content and Duplicate Page Title warnings on the crawl diagnostics report for one of my campaigns. I did my research and implemented the preferred domain setting in Webmaster Tools. This did not resolve the crawl diagnostics warnings, and on further research I discovered the preferred domain setting is only noted by Google, not by other bots like Roger. The confusing part is that when I ran an SEOmoz crawl test on the same domain I saw none of the duplicate content or title warnings, yet they still appear on my crawl diagnostics report. I have now implemented a fix in my .htaccess file to 301 redirect to the www. domain. I want to check whether it has worked, but since the crawl test did not show the issue last time, I don't think I can rely on it. Can you help please?
Thanks,
Claire
-
Thanks Joseph. Very helpful.
-
Hello Claire,
I really don't think you are going to see those errors show up on the next crawl.
On another note, after seeing your URL...
I would also code all of the links in the site with full absolute links.
I see the links as:
<li><a href="/our-services/diabetes-support/get-started/">Diabetes Support</a></li>
I would add the http://www. domain in front of all of those links so they are absolute rather than relative.
-
Thanks Jared, that's awesome.
-
You can always check by testing in your browser, but the best way is to check the header response to make sure the server is sending the proper response (a 301) - your landing pages look good (see below). I use Live HTTP Headers, which is a Firefox plugin - here's what it tells you:
http://pharmacy777.com.au/our-pharmacies/applecross-village/
GET /our-pharmacies/applecross-village/ HTTP/1.1
Host: pharmacy777.com.au
User-Agent: Mozilla/5.0 (Windows NT 6.0; rv:15.0) Gecko/20100101 Firefox/15.0.1
HTTP/1.1 301 Moved Permanently
Date: Thu, 04 Oct 2012 03:23:17 GMT
Server: Apache/2.2.22 (Ubuntu)
Location: http://www.pharmacy777.com.au/our-pharmacies/applecross-village/
So the redirect is working. The only thing I noticed was that the home page instantly switched to www and didn't even return a 301, so it appears you may have implemented a redirect there outside of .htaccess.
If your report is still showing duplicates, make sure that it's not the trailing slash (there's a rule sketch for that at the end of this post). Your URLs can be loaded both ways:
http://www.pharmacy777.com.au/our-pharmacies/applecross/
http://www.pharmacy777.com.au/our-pharmacies/applecross
The best way to find out if the SEOmoz report is counting these as dupes is to export the crawl report to CSV (top right of the crawl report). Then go all the way to the far-right column, called 'duplicate pages', and sort it alphabetically. This column shows you all of the duplicate URLs for each particular URL row. Lots of times you can find little 'surprises' here - that CSV report is priceless!
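If the trailing slash does turn out to be the culprit, a rewrite rule along these lines can force every URL onto the trailing-slash version - this is only a sketch, assuming Apache with mod_rewrite and that you want the slash version to be canonical, not the exact code for your site:
RewriteEngine On
# If the request is not a real file and does not already end in a slash,
# send a 301 to the same URL with the trailing slash added.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ http://www.pharmacy777.com.au/$1/ [L,R=301]
Whichever version you pick, keep your internal links pointing at that version so the crawler only ever sees one URL per page.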
-
Hi Joseph,
Yes, I have done this test and it appears to be working. I just want to be sure I'm not going to be faced with a load of warnings when my next crawl runs at the weekend, because when I implemented the Webmaster Tools preferred domain setting, what happened was:
- implemented the preferred domain
- ran a crawl test - it looked to have resolved the issue, but then
- 4 days later the scheduled crawl report ran and all the errors were still present.
Luckily I had told the client I needed to wait for the report results, and didn't tell them it was resolved just because the crawl test looked OK!! This time I've run the crawl test and done the manual test you suggested, but I want to be able to feed back to the client today if I can (confidently), and I no longer trust the crawl test on its own.
Thanks very much for your answer, it's always good to have someone validate your own approach.
Cheers,
Claire
-
If I understand correctly, you want to see whether your redirect has fixed your duplicate content issue.
Right now you are still getting the error...
I would simply type the URL into the address bar, with and without the www., and see if both versions show up.
If the redirect is working, only one should show up; the other should redirect immediately.
If they both show up, then your .htaccess code may have a mistake.
Hope that helps.
-
Update: reports can be based on an older crawl - so you are probably looking at results gathered before your fix went in.
-
Hi Claire - we need the URL of your site to check the headers on the 301 redirect!
An .htaccess redirect, as you suggest you have done, is definitely a good way to fix this. When I get a new client it's on the campaign startup list, and it works well. Just make sure there aren't any other issues, like the infamous trailing slash, causing duplication. If you provide the URL, a quick check can be made.
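For reference, the non-www to www redirect usually looks something like the following - this is only a sketch, assuming Apache with mod_rewrite, so your own .htaccess may well differ:
RewriteEngine On
# 301 any request whose host does not start with www. to the www. version of the same URL.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [L,R=301]
Once it's in place, requesting any page without the www. should return a 301 with a Location header pointing at the www. URL, which you can confirm with a header checker such as Live HTTP Headers.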