Duplicate content and ways to deal with it.
-
Problem
I queried a year of data for the portal, and the attached images show that the SEO juice is being split between the uppercase and lowercase versions of the same URLs.
Solutions:
1) Quick fix: change the links on the pages above to lowercase
2) Use the canonical link tag: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
The tag is part of the HTML header on a web page, the same section where you'd find the title tag and meta description tag. In fact, this tag isn't new, but like nofollow, it simply uses a new rel parameter. For example:
<link rel="canonical" href="http://www.darden.virginia.edu/MBA" />
"This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL http://www.darden.virginia.edu/MBA and that all of the link & content metrics the engines apply should technically flow back to that URL."
3) See if there are any Google Analytics filters at the site level I can apply. I will check into this and get back to you.
What do you all think?
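To make option 2 concrete, here is a minimal sketch of building the canonical element in Python (the helper name is hypothetical; the output belongs in the page's head section):

```python
def canonical_tag(url):
    """Build a rel=canonical link element pointing at the preferred URL.

    Hypothetical helper for illustration: whichever casing you pick as
    canonical, every variant of the page should emit the same href.
    """
    return f'<link rel="canonical" href="{url}" />'

print(canonical_tag("http://www.darden.virginia.edu/MBA"))
```

Both the uppercase and lowercase versions of the page would carry this identical tag, so the engines consolidate metrics onto one URL.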
-
Because that just filters the data in your reports. It will not stop the duplication from happening.
-
I think (2) - the canonical tag - is a solid solution if just a few URLs are out of whack, but if you're using the mixed-case version internally, then you may need to change your structure as well. If you change your structure, then I'd probably look at a full-scale system of 301-redirects to preserve inbound link-juice.
It sounds like you're linking to mixed-case internally, so you may need to set up the redirects. Make sure that, depending on your platform, the case-specific redirects work properly (and don't create an endless loop). There is some risk to making the switch, so I'd probably only do it if you're seeing this happen a lot. Unfortunately, mixed-case URLs are often more trouble than they're worth.
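A minimal sketch of the case-normalizing 301 logic (framework-agnostic; the function name is an assumption), including the guard against the endless-loop risk mentioned above:

```python
def lowercase_redirect(path):
    """Return (301, lowercase_path) when a redirect is needed, else None.

    Redirecting only when the path actually changes is what prevents
    the endless loop a blanket rewrite rule can cause.
    """
    lower = path.lower()
    if lower == path:
        return None  # already canonical: serve the page, don't redirect
    return (301, lower)

# Mixed-case request gets one permanent redirect; lowercase is served as-is.
print(lowercase_redirect("/MBA"))
print(lowercase_redirect("/mba"))
```

The same check works as middleware on most platforms; just confirm your server treats paths case-sensitively before relying on it.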
-
Why would I not just do this?
http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=90397
-
I would stick to using the rel=canonical tag.
You could also check the URL parameter handling tool in Google Webmaster Tools. The general process is to:
1. Recognize duplicate content on your website.
2. Determine your preferred URLs.
3. Apply 301 permanent redirects where necessary and possible.
4. Implement the rel="canonical" link element on your pages where you can.
5. Use the URL parameter handling tool in Google Webmaster Tools where possible.
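For step 1, here is a quick sketch of how you might spot case-only duplicates in a crawl export (plain Python; no particular crawler is assumed):

```python
from collections import defaultdict
from urllib.parse import urlsplit

def case_duplicates(urls):
    """Group crawled URLs that differ only by letter case."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        key = (parts.scheme.lower(), parts.netloc.lower(),
               parts.path.lower(), parts.query)
        groups[key].append(url)
    # Keys with more than one spelling are the ones splitting link equity.
    return [sorted(v) for v in groups.values() if len(v) > 1]

crawl = ["http://example.com/MBA", "http://example.com/mba",
         "http://example.com/contact"]
print(case_duplicates(crawl))
```

Each group it returns is a candidate for a 301 redirect or a shared rel=canonical target.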
Further reading: http://googlewebmastercentral.blogspot.co.uk/2009/10/reunifying-duplicate-content-on-your.html
I hope this helps
Ally
-
Option 2, using rel=canonical, seems like the best course of action to me. You may also want to apply a 301 redirect.