Duplicate content and ways to deal with it.
-
Problem
I queried back a year for the portal and you can see below that the SEO juice is split between the uppercase and lowercase versions of the same URLs. You can see the issue in the attached images.
Solutions:
1) Quick fix: change the links on the pages above to be lowercase.
2) Use the canonical link tag: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
The tag goes in the HTML head of a web page, the same section where you'd find the title tag and meta description tag. In fact, this tag isn't new; like nofollow, it simply uses a new rel parameter. For example:
<link rel="canonical" href="http://www.darden.virginia.edu/MBA" />
"This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL http://www.darden.virginia.edu/MBA and that all of the link & content metrics the engines apply should technically flow back to that URL."
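To tie that back to the problem above, a minimal sketch (the domain and paths below are hypothetical placeholders, not the portal's real URLs): the mixed-case duplicate carries a canonical tag in its head pointing at the lowercase version you want the engines to consolidate on.
<!-- In the <head> of the mixed-case duplicate, e.g. http://www.example.com/Portal/News.aspx (hypothetical) -->
<link rel="canonical" href="http://www.example.com/portal/news.aspx" />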
3) See if there are any Google Analytics filters at the site level I can apply. I will check into this and get back to you.
What do you all think?
-
Because that is just filtering the data in your reports: a lowercase filter changes what Analytics records, not the URLs themselves, so it will not stop the duplication from happening.
-
I think (2) - the canonical tag - is a solid solution if just a few URLs are out of whack, but if you're using the mixed-case version internally, then you may need to change your structure as well. If you change your structure, then I'd probably look at a full-scale system of 301-redirects to preserve inbound link-juice.
It sounds like you're linking to mixed-case internally, so you may need to set up the redirects. Make sure that, depending on your platform, the case-specific redirects work properly (and don't create an endless loop). There is some risk to making the switch, so I'd probably only do it if you're seeing this happen a lot. Unfortunately, mixed-case URLs are often more trouble than they're worth.
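For the redirect route, here is a rough sketch of a blanket case-normalising rule, assuming the site runs on Apache with mod_rewrite (the platform is an assumption, and note that the RewriteMap directive has to live in the main server config or a virtual host rather than in .htaccess):
RewriteEngine On
# Internal Apache map that lowercases whatever string is passed to it
RewriteMap lowercase int:tolower
# Only fire when the requested path contains an uppercase letter, so
# already-lowercase URLs pass straight through and no redirect loop is possible
RewriteCond %{REQUEST_URI} [A-Z]
# Permanent redirect to the lowercased path; the query string is re-appended untouched
RewriteRule (.*) ${lowercase:$1} [R=301,L]
Whichever platform it actually is, update the internal links to lowercase at the same time, otherwise every click on an old link costs an extra redirect hop.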
-
Why would I not just do this?
http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=90397
-
I would stick to using the rel=canonical tag.
You could also check in Google Webmaster Tools and look at the URL parameter handling tool.
With these you will be able to:
1. Recognize duplicate content on your website.
2. Determine your preferred URLs.
3. Apply 301 permanent redirects where necessary and possible.
4. Implement the rel="canonical" link element on your pages where you can.
5. Use the URL parameter handling tool in Google Webmaster Tools where possible.
Further reading: http://googlewebmastercentral.blogspot.co.uk/2009/10/reunifying-duplicate-content-on-your.html
I hope this helps.
Ally
-
Option "2," using rel=canonical seems like the best course of action to me. You may also want to apply a 301 redirect.