Duplicate content problem
-
Hi there,
I have a couple of related questions about the crawl report finding duplicate content:
We have a number of pages that feature mostly media - just a picture or a slideshow - with very little text. These pages are rarely viewed, and the crawl flags them as duplicate content even though each page is unique to the user. Since we don't have time right now to add enough text to make them unique to the bots, would we be better off just removing them?
The other question concerns a redirect we have for any 404 on our site matching the pattern immigroup.com/news/* - the redirect simply sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading these as duplicate content as well. I'm not sure why, but is there anything we can do about it? These pages don't exist; they come from someone typing the wrong URL or clicking a bad link. But we want the traffic - after all, those users land on a page that has a lot of content. Any help would be great!
Thanks very much!
George
-
Thanks Don.
The links are external. We do have a general 404 page, but for whatever reason, when the developer built this site, he pointed those specific 404s to that news page.
Thanks again.
-
The alternative he suggested was to delete the page.
Before removing a page, I would add a canonical tag to it. You can always undo the canonical once content becomes available.
-
I would only use the canonical tag if you don't want these pages to rank for anything in particular. When you set the authoritative page, you eliminate the ranking potential of the non-authoritative pages. Is it possible to get user-generated content for these pages? Like reviews?
-
Hi George,
To the first issue: have you considered using a REL=CANONICAL tag on these pages with thin to no content (pointing them to the parent category)? Barring that option, you could also just use NOINDEX / NOFOLLOW. No sense removing a page in case it has a few links out there.
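For example, either option is a one-line addition to the &lt;head&gt; of each thin page - something like the following sketch, where the category URL is just a placeholder for your real parent-category page:

```html
<!-- Option 1: point the thin media page at its parent category
     (href below is a placeholder, not your actual URL) -->
<link rel="canonical" href="https://www.example.com/parent-category/" />

<!-- Option 2: keep the page out of the index entirely -->
<meta name="robots" content="noindex, nofollow" />
```

You would use one or the other on a given page, not both.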
For the 404 page, the main reason I would suspect Moz is seeing these as duplicate content is URL parameters. It would make more sense to have a universal 404 page and let the server handle the error for you, redirecting where appropriate - e.g. a request like www.thisSite.com/catalog/ImGoingNoWhere.php should return a 404.
404 pages are usually pretty simple to set up, content-wise. Your second goal should be to remove any links on your site that return a 404.
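As a rough sketch, assuming the site runs Apache (the paths and filenames here are illustrative, not taken from your actual setup), the server-side handling might look like this in .htaccess:

```apache
# Serve a real 404 page with a proper 404 status code,
# instead of redirecting every miss to the news index
ErrorDocument 404 /404.php

# If you still want bad /news/ URLs to land on the news index,
# only redirect requests that don't map to a real file or directory,
# and use a 301 so crawlers don't treat it as a soft 404
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^news/.+$ /news [R=301,L]
```

The key point is that a genuinely missing page should answer with a 404 status, not a 200 or a blanket redirect.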
-
Hi there,
If you have a lot of HTML coding on pages with no content, the HTML is what is triggering the duplicate content flag. Because there is no content, the HTML is all the bots see, and if it is very similar from page to page, nothing tells them the pages are unique.
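You can get a rough feel for this with a quick text-to-markup ratio check. This is only an illustrative sketch (the sample pages and the 0.2 threshold are made up, and this is not how Moz actually scores pages), but it shows why a picture-only page looks "thin" to a bot:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the visible text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_ratio(html):
    """Fraction of the page's characters that are visible text, not markup."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / max(len(html), 1)

# A media-only page: almost everything is markup
thin = "<html><head><title>Slideshow</title></head><body><img src='a.jpg'></body></html>"

# A page with real article text
rich = "<html><body><p>" + "Unique article text. " * 50 + "</p></body></html>"

print(round(text_ratio(thin), 2))  # low ratio: bots see mostly boilerplate markup
print(round(text_ratio(rich), 2))  # high ratio: plenty of unique text to index
```

On pages like the thin example, two different slideshows can look nearly identical to a crawler, because the shared template markup dominates whatever is unique.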