Testing for duplicate content and title tags
-
Hi there,
I have been getting both Duplicate Page Content and Duplicate Page Title warnings on my crawl diagnostics report for one of my campaigns. I did my research and implemented the preferred domain setting in Webmaster Tools. This did not resolve the crawl diagnostics warnings, and upon further research I discovered that the preferred domain is only noted by Google and not by other bots like Roger. My only issue was that when I ran an SEOmoz crawl test on the same domain, I saw none of the duplicate content or title warnings, yet they still appear on my crawl diagnostics report. I have now implemented a fix in my .htaccess file to 301 redirect to the www domain. I want to check whether it has worked, but since the crawl test did not show the issue last time, I don't think I can rely on it. Can you help please?
Thanks,
Claire
-
Thanks Joseph. Very helpful.
-
Hello Claire,
I really don't think you are going to see those errors show up on the next crawl.
Just one other note, after seeing your URL...
I would also code all of the links on the site as full absolute URLs.
I see the links as
<li><a href="/our-services/diabetes-support/get-started/">Diabetes Support</a></li>
I would add the **http://www.** in front of all those links.
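For example, rewritten as an absolute link it would look something like this (just a sketch - this assumes http://www.pharmacy777.com.au is the preferred domain for your site):
<li><a href="http://www.pharmacy777.com.au/our-services/diabetes-support/get-started/">Diabetes Support</a></li>
That way every internal link points straight at the www version you are redirecting to, instead of relying on the redirect to catch it.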
-
Thanks Jared, that's awesome.
-
You can always check by testing in your browser, but the best way is to check the header response to make sure the server is sending the proper response (a 301) - your landing pages look good (see below). I use Live HTTP Headers, which is a Firefox plugin - here's what it tells you:
http://pharmacy777.com.au/our-pharmacies/applecross-village/
GET /our-pharmacies/applecross-village/ HTTP/1.1
Host: pharmacy777.com.au
User-Agent: Mozilla/5.0 (Windows NT 6.0; rv:15.0) Gecko/20100101 Firefox/15.0.1
HTTP/1.1 301 Moved Permanently
Date: Thu, 04 Oct 2012 03:23:17 GMT
Server: Apache/2.2.22 (Ubuntu)
Location: http://www.pharmacy777.com.au/our-pharmacies/applecross-village/
So the redirect is working. The only thing I noticed was that the home page instantly switched to www and didn't even return a 301, so it appears you may have implemented a redirect there outside of .htaccess.
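If you'd rather not rely on a browser plugin, a quick command-line check does the same job (a sketch - curl's -I flag sends a HEAD request and prints only the response headers):
curl -I http://pharmacy777.com.au/our-pharmacies/applecross-village/
You should see the HTTP/1.1 301 Moved Permanently status and a Location header pointing at the www version, just like the plugin output above.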
If your report is still showing duplicates, make sure that it's not the trailing slash. Your URLs can be loaded both ways:
http://www.pharmacy777.com.au/our-pharmacies/applecross/
http://www.pharmacy777.com.au/our-pharmacies/applecross
The best way to find out if the SEOmoz report is counting these as dupes is to export the crawl report to CSV (top right of the crawl report). Then go all the way to the far-right column called 'duplicate pages' and sort it alphabetically. This column will show you all of the duplicate URLs for each particular URL row. Lots of times you can find little 'surprises' here - that CSV report is priceless!
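If the trailing-slash variants do turn out to be the culprit, one way to normalize them is another rewrite rule in .htaccess (a sketch only - it assumes Apache with mod_rewrite enabled and that all of your page URLs should end in a slash):
RewriteEngine On
# don't touch real files such as images, CSS or JS
RewriteCond %{REQUEST_FILENAME} !-f
# 301 anything without a trailing slash to the slashed version
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
That keeps Roger (and Google) from treating /applecross and /applecross/ as two separate pages.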
-
Hi Joseph,
Yes, I have done this test and it appears to be working. I just want to be sure I'm not going to be faced with a load of warnings when my next crawl runs on the weekend, because when I implemented the Webmaster Tools preferred domain setting, this is what happened:
- implemented the preferred domain
- ran a crawl test - it looked to have resolved the issue, but then
- 4 days later the scheduled crawl report ran, and all the errors were still present.
Luckily I told the client I had to wait for the report results and didn't tell them I thought it was resolved just because the crawl test looked OK!! This time I've run the crawl test and done the manual test you suggested, but I want to be able to feed back to the client today if I can (confidently), and I no longer trust the crawl test on its own.
Thanks very much for your answer, it's always good to have someone validate your own approach.
Cheers,
Claire
-
If I understand correctly, you want to see if your redirect has fixed your duplicate content issue.
Right now you still get the error...
I would simply type the URL in the address bar, with and without the www, and see if both show up.
If the redirect is working then only one should show up; the other should redirect immediately.
If they both show up, then your .htaccess code may have a mistake.
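For reference, the non-www to www rule people usually drop into .htaccess looks roughly like this (a sketch - assuming Apache with mod_rewrite, and using your domain as the example):
RewriteEngine On
# catch requests for the bare domain
RewriteCond %{HTTP_HOST} ^pharmacy777\.com\.au$ [NC]
# and 301 them to the www version, keeping the original path
RewriteRule ^(.*)$ http://www.pharmacy777.com.au/$1 [L,R=301]
If your rule differs in a way that only catches some URLs, that would explain why some pages still load on both hostnames.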
Hope that helps.
-
Update: Reporting can be historic - so you are probably looking at a report from an older crawl.
-
Hi Claire - we need the URL of your site to check the headers on the 301 redirect!
Definitely - fixing this via .htaccess, as you say you did, is a good way to go. When I get a new client it's on the campaign startup list and it works well. Make sure there aren't any other issues, like the infamous trailing slash, causing duplication. If you provide the URL, a quick check can be made.