Would Google Call These Pages Duplicate Content?
-
Our Web store, http://www.audiobooksonline.com/index.html, has struggled with duplicate content issues for some time. One aspect of duplicate content is a page like this: http://www.audiobooksonline.com/out-of-publication-audio-books-book-audiobook-audiobooks.html.
When an audio book title goes out of publication (OOP), we keep a page at our store and display http://www.audiobooksonline.com/out-of-publication-audio-books-book-audiobook-audiobooks.html whenever a visitor attempts to visit a specific title that is OOP. There are several thousand OOP pages.
Would Google consider these OOP pages duplicate content?
-
I'm confused. When a book goes out of print, does the URL change to this long OOP HTML page? Or does that book's URL then redirect to this page? Or (shudders) do you re-title the OOP page to whatever the OOP book's page was?
If it were me, I'd go with the first scenario. It's essentially the same concept as a 404.
-
Yes, that is duplicate content; you should make these pages return a 404 instead, or leave the content in place with a "sold out" banner or something.
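As a sketch of the 404 suggestion: assuming an Apache server (the thread doesn't say what the store runs on), known OOP URLs could be made to return a "gone" status directly in .htaccess instead of redirecting to a shared page. The file names here are hypothetical placeholders:

```apache
# Hypothetical .htaccess sketch: return 410 Gone for out-of-publication titles
# (a 404 also works; 410 signals the page was removed deliberately)
RewriteEngine On
RewriteRule ^some-oop-title\.html$ - [G]
RewriteRule ^another-oop-title\.html$ - [G]
```

With several thousand OOP titles, maintaining individual rules gets unwieldy; a rewrite map or an application-level check against the product database would scale better.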
Something I don't like is the index.html on your home page. People who link to you are likely to link to http://www.audiobooksonline.com/, and they will then get a 301 redirect to http://www.audiobooksonline.com/index.html.
This will leak link juice, as all 301s leak link juice just the same as a link does (roughly 15% if we go by the originally published Google algorithm). Also, your internal pages link to http://www.audiobooksonline.com and are once again redirected to http://www.audiobooksonline.com/index.html.
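Assuming an Apache server (not stated in the thread), the cleaner fix is the reverse redirect: send requests for /index.html to the bare root URL, and update internal links to point at http://www.audiobooksonline.com/ so no redirect fires at all. A minimal .htaccess sketch:

```apache
# Hypothetical .htaccess sketch: canonicalize /index.html to the root URL.
# The RewriteCond matches only external requests, so the internal
# DirectoryIndex lookup for index.html does not loop.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.audiobooksonline.com/ [R=301,L]
```

This consolidates home-page links on a single URL instead of splitting them across two.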
-
Yes, Larry, that is fine. So long as it is a single URL with a single HTML file on it, there are no duplicate issues. If you want to verify, I would suggest (if you aren't an SEOmoz Pro member) using a sitemap generator to ensure it isn't crawling multiple pages. But if that page is only listed once (and from what you are saying here, that should be the case), then you have no duplicate content issues.
It's just the same as linking to one page from every page on your website. A redirect doesn't work much differently (although it does drop a small amount of link juice).
You might consider noindexing that OOP page anyway if you're still concerned. Not sure why you would need that one indexed in the first place.
Good luck to you!
-
We use only one URL for the OOP pages. Each unique OOP title's page is 301 redirected to it. Based on what you said, I understand that this is fine. Correct?
-
Hi Larry
A couple of questions: is that the only URL for the OOP pages, or are there other versions of the page and/or URL?
If there are multiple pages, then that is definitely duplicate content. However, that can quite easily be fixed. If you add this code to the head tag of all those OOP pages, it will prevent Google from indexing the pages (thus not seeing them as duplicate):
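The snippet referenced in the answer above appears to have been lost in the archive; given the description ("prevent Google from indexing the pages"), it was presumably the standard robots noindex meta tag, placed inside each OOP page's head element:

```html
<!-- Likely the snippet the answer refers to: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```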
That way you can keep the page for the user but not have to worry about duplicate content. I would do this anyway even if there is only one version of the page, as the page is thin on content as it is.
If you are displaying that image on other URLs that used to have products on them but have gone OOP, then those multiple URLs and pages would be duplicate. Again, adding the above code into the head removes the problem. You could also 301 redirect the URL of the product page to the OOP page. For example, if you had a page for a product at http://www.audiobooksonline.com/examplerecord.html that is now OOP, you could 301 redirect it to the http://www.audiobooksonline.com/out-of-publication-audio-books-book-audiobook-audiobooks.html page and it wouldn't be duplicate. You can learn more about redirection here.
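Assuming an Apache server (the thread doesn't specify the platform), that product-page redirect could be a one-line mod_alias rule in .htaccess; /examplerecord.html is the hypothetical product URL from the example above:

```apache
# Hypothetical .htaccess sketch: permanently redirect one OOP product page
Redirect 301 /examplerecord.html http://www.audiobooksonline.com/out-of-publication-audio-books-book-audiobook-audiobooks.html
```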
Hope this helps.