How do I fix apparent duplicates
-
I'm auditing a site and would appreciate your help with possible explanations and solutions as to why Google Analytics' Content Drilldown report is showing what appear to be duplicate pages (refer to the attached image).
I'm wondering if I've fully got my head around the rel=canonical tag, because the page I'd consider a duplicate, "page/", has a canonical tag pointing to "~/page.html".
This is the tag from the page Locations/:
<link rel="canonical" href="http://www.domain.com/Locations.html" />
So I'm unsure why both versions of the page are generating views. Shouldn't the canonical tag work like a 301 redirect?
I'm also unsure how the pages using the path page/ are generating so many views, because I haven't been able to find them and they aren't indexed by Google.
Unfortunately the site is built on a proprietary CMS I'm not familiar with.
-
Hi Paul
I appreciate your explanation of when to use Canonical tags. I had previously thought they were limited to redirecting www.domain.com to domain.com.
I understand your solution to the dupes problem and will be searching SEOMoz's resources to learn how to write rewrite rules and, for that matter, Search & Replace filters using regex in Analytics.
It's not the first time you've provided a high-quality answer to a question of mine. I very much appreciate your contribution to my growing knowledge and to the SEOMoz community.
Best
Nic
-
A canonical tag is fundamentally different from a 301 redirect, Nic. Nothing about a canonical tag stops a visitor from being able to visit that URL. A 301 redirect, on the other hand, actually forwards the visitor to the target page as if the initial page didn't exist, so there's no way for a visitor to land on it.
Put another way, the source page of a 301-redirected URL doesn't even exist as far as the search engines are concerned (and eventually they'll drop the original URL altogether).
The canonical tag serves a very specific purpose. When a page must remain reachable at two different URLs but the content is essentially identical (e.g. a product page sorted by size or colour), a canonical tag suggests that the search engines consolidate the ranking value into the primary URL. That's it.
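As a sketch of that scenario (the URLs here are hypothetical, not from your site): both sorted variants of the same product page would carry an identical tag pointing at the primary URL.

```html
<!-- Served at both /widgets?sort=size and /widgets?sort=colour -->
<!-- Both variants declare the same primary URL in the <head> -->
<link rel="canonical" href="http://www.domain.com/widgets" />
```

Visitors can still reach either sorted URL; only the ranking signals get consolidated.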
In the case of the /contact+us.html and /contact+us/ pages, that page should only be reachable at one URL or the other. There's no reason or value to the user for the page to be reachable at the second address. The correct way to deal with this is to use a rewrite rule to 301-redirect all the page/ versions of the site's pages to page.html (assuming that's what you've decided the canonical should be).
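On a standard Apache server that honours .htaccess, a rewrite rule for that might look like the sketch below. This is an illustration only: your proprietary CMS may not expose .htaccess at all, and the exact pattern depends on how its URLs are actually structured.

```apache
RewriteEngine On
# Hypothetical sketch: 301-redirect any trailing-slash path
# (e.g. /Locations/) to its .html equivalent (/Locations.html).
RewriteRule ^(.+)/$ /$1.html [R=301,L]
```

The R=301 flag makes it a permanent redirect, and L stops further rule processing for that request.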
The only time to use canonical tags instead of redirects in a case like this is when it's technically impossible to implement the rewrites (for example, a shared server that doesn't allow access to the .htaccess file). But this is sub-optimal and would still leave you with the same Analytics dupe-page problem you're currently running into.
So what to do about the dupes in Analytics, given the site wasn't configured with the rewrites? You can write a custom Search and Replace filter for the site's profile that uses regex to merge both versions of each page into a single line. You'll absolutely want to do this in a new profile created just for this purpose though, keeping the original unfiltered profile for reference and historical data.
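The exact pattern depends on the site's URL scheme, but assuming the duplicates are trailing-slash paths that should report as .html paths, the regex might look like this, demonstrated here in Python (note that Analytics' Replace String field has its own replacement syntax, so translate the backreference accordingly):

```python
import re

# Hypothetical Search & Replace pattern: rewrite trailing-slash request
# paths to their .html equivalents so both versions report as one line.
SEARCH = re.compile(r"^/(.+)/$")   # Search String: any non-root trailing-slash path
REPLACE = r"/\1.html"              # Replace String: same path with .html appended

def normalize(path):
    """Collapse /page/ and /page.html into a single reported path."""
    return SEARCH.sub(REPLACE, path)

print(normalize("/Locations/"))      # -> /Locations.html
print(normalize("/Locations.html"))  # unchanged -> /Locations.html
print(normalize("/"))                # root left alone -> /
```

Test the pattern against the real paths in your report before applying the filter, since a too-greedy pattern can mangle legitimate URLs.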
Note that this will only affect data collected from the date of creation of the new profile/filter. It's not retroactive. If you want to combine results for these pages for the existing data, you'll need to dump it to Excel and use a formula to combine the dupes.
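For that historical merge, the same normalization can be applied after export. A rough sketch, assuming a two-column export of page path and pageviews (the data and column layout here are made up for illustration):

```python
import re
from collections import defaultdict

SEARCH = re.compile(r"^/(.+)/$")

def merge_dupes(rows):
    """Sum pageviews after collapsing each /page/ path into /page.html."""
    totals = defaultdict(int)
    for path, views in rows:
        totals[SEARCH.sub(r"/\1.html", path)] += int(views)
    return dict(totals)

# Example with made-up numbers:
data = [("/Locations.html", 120), ("/Locations/", 30), ("/index.html", 50)]
print(merge_dupes(data))  # {'/Locations.html': 150, '/index.html': 50}
```

A SUMIF-style formula in Excel over a normalized-path helper column accomplishes the same thing.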
Hope that all makes sense?
Paul