Advice on Duplicate Page Content
-
We have many pages on our website, all built from the same template (we use a CMS), so at the code level they are about 90% identical. However, the page content, title, meta description, and image are different on every one of them.
For example -
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages.
Does Google look at them all as duplicate page content? If yes, how do we deal with this?
-
EGOL, Everett,
Thank you both for your very useful suggestions. Sounds like we should do something similar to our PDF documents to represent them as the actual/canonical content on the page. And we'll look at our CMS to see how we might implement the unlinked page name in the breadcrumb. We have done some work already in adding structured data with schemas (including aggregate ratings), so that is hopefully yielding some results already.
However, after an encouraging traffic spike that seemed to indicate we were on the right track, we saw a very worrisome dip last month, which led to a lot of hand-wringing about Panda.
So these suggestions are very helpful; thanks again, and we'll try them out!
-
Thank you, Everett,
Nice to see you posting in Q&A.
Look forward to seeing you regularly.
-
Hello Sudhir,
Those two pages would not be seen as duplicates. Google is very capable of separating the template from the content.
On a side note, you should look into getting the name of the page/game into the breadcrumb, though it doesn't have to be linked like the previous two pages in the path. For example:
You are here: Home --> Common --> Find Easter Eggs
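If your CMS lets you emit structured data, that breadcrumb trail can also be marked up with Schema.org's BreadcrumbList so Google can show the path in search results. A minimal JSON-LD sketch for the Easter Eggs page might look like the following (the `/common` crumb URL and the "Common" label are assumptions based on your URL structure; note the final crumb, the current page, has a `name` but no `item` link, matching the unlinked breadcrumb suggested above):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "http://www.jumpstart.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Common",
      "item": "http://www.jumpstart.com/common"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "Find Easter Eggs"
    }
  ]
}
```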
Allowing visitors to review and rate the games would provide useful, keyword-rich, natural content on an otherwise content-sparse page. Once reviews/ratings are implemented you could also use Schema.org markup to enhance your search engine results by showing star ratings next to each game.
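Once you have real ratings collected, the markup for the star snippet could be as simple as this JSON-LD sketch (the `Game` type and the rating numbers here are illustrative placeholders, not real data; you'd populate `ratingValue` and `ratingCount` from your actual review system):

```json
{
  "@context": "https://schema.org",
  "@type": "Game",
  "name": "Find Easter Eggs",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.3",
    "ratingCount": "27",
    "bestRating": "5"
  }
}
```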
Good luck!
-
Google knows how to separate the template of the site from the content. So you have nothing to worry about if most of the code on your pages is the same code that is used on every other page.
I looked at your two sample pages and saw a few things that would concern me...
This page had very little content. If you have lots of pages with such a tiny amount of content you could have Panda problems.
http://www.jumpstart.com/common/find-easter-eggs
You also have pages like this....
http://www.jumpstart.com/common/recognize-the-rs-view
These have very little content.
I have a site with lots of printable content, mainly images placed in PDF documents to control the print scale and the look of the printed page. The HTML pages used to present them to visitors and the PDF documents themselves were all thin content, and my site had a Panda problem. That caused the rankings of every page on the site to fall and really damaged my traffic. I solved it by noindexing the HTML pages and applying rel=canonical to the PDF files using .htaccess.
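For anyone wanting to try the same approach, here is one possible shape of that setup as an `.htaccess` fragment placed in the directory that holds the wrapper pages. It is a sketch, not the exact configuration described above: it assumes Apache with mod_headers enabled, and the file name and canonical URL shown are hypothetical examples.

```apache
# Keep the thin HTML wrapper pages out of the index
# (X-Robots-Tag works for any file type, unlike a meta robots tag)
<FilesMatch "\.html$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>

# Declare a canonical URL for a PDF via an HTTP Link header,
# since you can't put a <link rel="canonical"> tag inside a PDF
<Files "easter-egg-printable.pdf">
  Header add Link '<http://www.example.com/printables/easter-egg-printable.pdf>; rel="canonical"'
</Files>
```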
I can't say if this will happen to you but I would be uncomfortable if I had a site with such little content on its pages.