Am I doing enough to get rid of duplicate content?
-
I'm in the middle of a massive cleanup effort of old duplicate content on my site, but trying to make sure I'm doing enough.
My main concern now is a large group of landing pages. For example:
http://www.boxerproperty.com/lease-office-space/office-space/dallas
http://www.boxerproperty.com/lease-office-space/executive-suites/dallas
http://www.boxerproperty.com/lease-office-space/medical-space/dallas
And these are just the tip of the iceberg. For now, I've put canonical tags on each sub-page pointing to the main market page (the second two URLs above, for example, both point to the first, http://www.boxerproperty.com/lease-office-space/office-space/dallas). However, the same situation exists in many other cities, each of which has a main page like the first one above. For instance:
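For reference, the canonical tag described above goes in the `<head>` of each sub-page and points at the main market page (URLs shown are the examples from above):

```html
<!-- In the <head> of /lease-office-space/executive-suites/dallas -->
<link rel="canonical" href="http://www.boxerproperty.com/lease-office-space/office-space/dallas" />
```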
http://www.boxerproperty.com/lease-office-space/office-space/atlanta
http://www.boxerproperty.com/lease-office-space/office-space/chicago
http://www.boxerproperty.com/lease-office-space/office-space/houston
Obviously the previous SEO was pretty heavy-handed with all of these, but my question for now is: should I even bother with canonical tags pointing all of the sub-pages to the main pages (medical-space or executive-suites to office-space), or is the presence of all these pages problematic in itself? In other words, should http://www.boxerproperty.com/lease-office-space/office-space/chicago and http://www.boxerproperty.com/lease-office-space/office-space/houston and all the others have canonical tags pointing to just one page, or should a lot of these simply be deleted?
I'm continually finding more and more sub-pages built on the same template, so I'm just not sure of the best way to handle all of them. Looking back historically in Analytics, it appears many of these did drive significant organic traffic in the past, so I'm going to have a tough time justifying deleting a lot of them.
Any advice?
-
Heather,
I'm confused as to what the duplicate content is. The three Dallas pages you mentioned have different content. Sure, there's a decent amount that's the same from the site-wide elements (nav menus, etc.), but each has different text and information about the different locations that are available. How is it duplicate?
Kurt Steinbrueck
OurChurch.Com -
Heather,
First things first: 1. Are they still driving traffic? 2. rel=canonical is supposed to be used on identical pages, or on a page whose content is a subset of the canonical version.
Those pages have very thin content, and I certainly wouldn't leave them as they are. If they're still driving traffic, I'd keep them, but for fear of Panda I'd 302 them to the main pages while steadily putting real content on them, then remove the redirects as the content goes live.
If they're not still driving traffic, it seems to me that it wouldn't be very hard to justify their removal (or 301 redirection to their main pages). Panda is a tough penalty, and you don't want to get caught in that.
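To sketch what those redirects might look like on an Apache server using the mod_alias `Redirect` directive (paths are the example URLs from the question; adjust for your actual setup): a temporary 302 while real content is being written, or a permanent 301 for a page being retired.

```apache
# Temporary redirect while real content is written for the sub-page
Redirect 302 /lease-office-space/executive-suites/dallas http://www.boxerproperty.com/lease-office-space/office-space/dallas

# Permanent redirect for a sub-page being removed for good
Redirect 301 /lease-office-space/medical-space/dallas http://www.boxerproperty.com/lease-office-space/office-space/dallas
```

Other servers (nginx, IIS) have equivalent directives; the key is using 302 only while the change is meant to be temporary.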