I'm getting a Duplicate Content error in my Pro Dashboard for 2 versions of my Homepage. What is the best way to handle this issue?
-
Hi SEOMoz,
I am trying to fix the final issues in my site crawl. One that confuses me is this canonical homepage URL fix. It says I have duplicate content on the following pages:
http://www.accupos.com/
http://www.accupos.com/index.php
What would be the best way to fix this problem? (The first URL has a higher page authority by 10 points and 100+ more inbound links.)
Respectfully yours,
Derek M.
-
The 301 and the canonical tag can be used to solve similar issues, so it gets confusing. For home pages, I think the canonical is a good route, because it "sweeps up" other variants as well. For example, someone might hit your home page without the "www", or with an affiliate ID appended, etc. One canonical tag on the home page prevents all of that.
The "alerts" in our system can be a bit hyperactive. Usually the All-In-One canonicals are solid. We're probably just giving you a general warning, but it's tough to tell without a specific page.
-
Thanks for the reply. I read up a bit on rel=canonical today and as far as I understand, it's a tool to pass on link juice and page rank, correct?
If this is true, why would I be getting tons of "Alerts" about my blog (hosted on a subdomain) because the "All In One SEO Pack" plugin added them to about 90+ of our blog posts?
This just confuses me about how rel=canonical is properly used. Does it always have to supplement a 301 redirect, or can/should it be used on its own?
Sorry for the confusion. To me, this is a very easy thing to mess up!
Derek M
-
You're going to want to implement a 301 redirect of http://www.accupos.com/index.php to http://www.accupos.com/.
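If the site runs on Apache, a minimal .htaccess sketch of that redirect could look like the lines below; this is just an illustration under that assumption, so adapt it to whatever server actually serves your site. The extra condition keeps the rule from looping when the server internally maps "/" to index.php:

# .htaccess - send explicit requests for /index.php back to the root URL with a 301
RewriteEngine On
# Only fire when index.php appears in the original request line,
# so the internal mapping of "/" to index.php doesn't cause a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[\ ?]
RewriteRule ^index\.php$ http://www.accupos.com/ [R=301,L]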
You'd also want to add a rel="canonical" tag to your index.php file to supplement the redirect, which will look like this:
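Something along these lines, placed in the <head> of the page, with the href set to the version of the homepage you want treated as canonical:

<!-- canonical tag in the <head> of the homepage template (index.php) -->
<link rel="canonical" href="http://www.accupos.com/" />

That one tag also sweeps up the other variants mentioned above - the non-www host, URLs with affiliate or tracking parameters, and so on - by pointing them all back at the single canonical homepage URL.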
Other reading:
- Read more about redirects here http://www.seomoz.org/learn-seo/redirection
- More info on how redirects work: http://www.seomoz.org/blog/url-rewrites-and-301-redirects-how-does-it-all-work
- Visit this page and scroll down to the URL Structure section and start reading: http://www.seomoz.org/beginners-guide-to-seo/basics-of-search-engine-friendly-design-and-development
- Also worth reading up on, but be careful with this one, it's trickier to implement: http://www.seomoz.org/learn-seo/canonicalization
Related Questions
-
ViewState and Duplicate Content
Our site keeps getting duplicated content flagged as an issue... however, the pages being grouped together have very little in common on-page. One area which does seem to recur across them is the ViewState. There's a minimum of 150 lines across the ones we've investigated. Could this be causing the reports?
Technical SEO | RobLev
-
Is My Boilerplate Product Description Causing Duplicate Content Issues?
I have an e-commerce store with 20,000+ one-of-a-kind products. We only have one of each product, and once a product is sold we will never restock it. So I really have no intention of having these product pages show up in SERPs. Each product has a boilerplate description that the product's unique attributes (style, color, size) are plugged into. But a few sentences of the description are exactly the same across all products. Google Webmaster Tools doesn't report any duplicate content. My Moz Crawl Report shows 29 of these products as having duplicate content. But a Google search using the site operator and some text from the boilerplate description turns up 16,400 product pages from my site. Could this duplicate content be hurting my SERPs for other pages on the site that I am trying to rank? As I said, I'm not concerned about ranking for these product pages. Should I make them "rel=canonical" to their respective product categories? Or use "noindex, follow" on every product? Or should I not worry about it?
Technical SEO | znagle
-
Https Duplicate Content
My previous host was using shared SSL, and my site was also working over https, which I didn't notice previously. Now I have moved to a new server where I don't have any SSL, and my websites are not working with the https version. The problem is that I have found Google has indexed one of my blogs, http://www.codefear.com, with the https version too. My blog traffic is continuously dropping, I think due to this duplicate content. Now there are two results, one with the http version and another with the https version. I searched over the internet and found 3 possible solutions:
1. No-index the https version
2. Use rel=canonical
3. Redirect the https versions with a 301 redirect
Now I don't know which solution is best for me, as the https version is not working. One more thing: I don't know how to implement any of the solutions. My blog is running on WordPress. Please help me to overcome this problem, and after solving this duplicate issue, do I need to send a reconsideration request to Google? Thank you
Technical SEO | RaviAhuja
-
Duplicate Content Problems
Hi, I am new to the seomoz community - I have been browsing for a while now. I put my new website into the seomoz dashboard, and out of the 250 pages crawled I have 120 errors! The main problem is duplicate content. We are a website that finds free content sources for popular songs/artists. While SEO is not our main focus for driving traffic, I wanted to spend a little time making sure our site is up to standards. That said, when two songs by an artist are loaded - http://viromusic.com/song/125642 and http://viromusic.com/song/5433265 - seomoz says they are duplicate content, even though they are two completely different songs. I am not exactly sure what to do about this situation. We will be adding more content to our site, such as a blog, artist biographies, and commenting - maybe this will help? Although if someone was playing multiple Bob Marley songs, the biography that is loaded will also be the same for both songs. Also, when a larger playlist is loaded, such as http://viromusic.com/playlist/sldvjg, I'm getting an error for too many links on the page (some of the playlists have over 100 songs). Any suggestions? Thanks in advance - any tips or suggestions for my new site would be greatly appreciated!
Technical SEO | mikecrib1
-
Duplicate content with same URL?
SEOmoz is saying that I have duplicate content on:
http://www.XXXX.com/content.asp?ID=ID
http://www.XXXX.com/CONTENT.ASP?ID=ID
The only difference I see in the URL is that "content.asp" is capitalized in the second URL. Should I be worried about this, or is this an issue with the SEOmoz crawl? Thanks for any help. Mike
Technical SEO | Mike.Goracke
-
New Magento shop - how to best avoid duplicate content?
Hi all, My clients are about to have the latest version of the free Magento store set up. It will sell in at least two different languages, so this needs to be taken into account. Could any of you give some advice on what is the best way to avoid DC (if possible)? The shop is so far clean of DC, but from experience I know this will not continue... Thanks, Best regards, Christian
Technical SEO | sembseo
-
Why do I get duplicate content errors just for tags I place on blog entries?
In the SEOmoz crawl diagnostics for my site, www.heartspm.com, I am getting over 100 duplicate content errors on links built from tags on blog entries. I do have the original base blog entry in my sitemap, not referencing the tags. Similarly, I am getting almost 200 duplicate meta description errors in Google Webmaster Tools associated with links automatically generated from tags on my blog. I can better understand getting these errors from my forum, since the forum entries are not in the sitemap, but the blog entries are there in the sitemap. I thought the tags were only there to help people search by category. I don't understand why every tag becomes its own link. I can see how this falsely creates the impression of a lot of duplicate data. As seen in GWT, the pages with duplicate meta descriptions include "Customer concerns about the use of home water by pest control companies" (6 pages: /category/job-site-requirements, /tag/cost-of-water, /tag/irrigation-usage, /tag/save-water, /tag/standard-industry-practice, /tag/water-use) and "Pest control operator draws analogy between Children's Day and the state of the pest control industry" (/tag/children-in-modern-world, /tag/children, /tag/childrens-day, /tag/conservation-medicine, /tag/ecowise-certified, /tag/estonia, /tag/extermination-service, /tag/exterminator, /tag/green-thumb, /tag/hearts-pest-management, /tag/higher-certification, /tag/higher-education, /tag/tartu, /tag/united-states).
Technical SEO | GerryWeitz
-
Duplicate Content Issue within the Categories Area
We are in the process of building out a new website; it has been built in Drupal. Within the scan report from SEOMOZ Crawl Diagnostics, it looks like I have a duplicate content issue. Example: We sell Vinyl Banners, so we have many different templates one can use from within our Online Banner Builder Tool. We have broken them down via categories. Issue: Duplicate Page Content - /categories/activities has 9 other URLs associated with this issue. I have many others, but this one will work as an example. Within this category we have multiple templates attached to this page. Each of the templates does not need its own page; however, we use this to pull the templates into the one activities landing page. I am wondering if I need to nofollow, noindex each of those individual templates and just get the main top-level category name indexed. Or is there a better way to do this to minimize the impact of Panda?
Technical SEO | Ben-HPB