Duplicate content advice
-
I'm looking for a little advice.
My website has always done rather well on the search engines, although it has never ranked well for my top keywords, as they are very competitive. It does, however, rank for lots of obscure keywords that contain my top keywords, or my top keywords + city/area. We have over 1,600 pages on the main site, most with unique content, which is what I attribute our obscure-keyword rankings to. Content also changes daily on several main pages.
Recently we made some updates to the usability of the site, which our users are liking (page views are up by 100%, time on site is up, and bounce rate is down by 50%!).
However, it looks like Google did not like the updates... and has started to send us fewer visitors (down by around 25% across several sites; the sites I did not update, which act as a kind of control, have been unaffected!). We went through the Panda and Penguin updates unaffected (visitors actually went up!).
So I have joined SEOmoz (and I'm loving it, just like McDonald's). I am now going through all my sites and making changes to hopefully improve things above and beyond what we used to do.
However, out of the 1,600 pages, 386 are being flagged as duplicate content (within my own site). Most of this comes down to the fact that we are a directory-type site split into all the major cities in the UK: city pages that have no listings, or that carry the same/similar listing (as our users provide services to several cities), are being flagged as duplicate content.
Some of the duplicate content is due to dynamic pages that I can correct (i.e. out.php?***** — I will noindex these pages, if that's the best way?). What I would like to know is: are these duplicate content flags going to cause me problems, keeping in mind that the Penguin update did not seem to affect us? If so, what advice would people here offer?
I cannot redirect the pages, as they are for individual cities (and are also dynamic: only one physical page, using URL rewriting). I can, however, remove links to cities with no listings, although Google already has these pages indexed, so I doubt removing the links from my pages and sitemap will change that. I am not sure if I can post my URLs here, as the sites do have adult content, although it is not porn (we are an escort guide/directory with some partial nudity).
I would love to hear opinions.
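Since redirecting the city pages is off the table, one option worth a look (not raised in the thread itself, so treat it as a hedged suggestion) is a rel="canonical" tag on city pages that share a listing, pointing at whichever city page is considered the primary version. A minimal sketch, with hypothetical URLs rather than the site's real structure:

```html
<!-- On a duplicate city page (e.g. the Leeds copy of a listing that also
     appears under Manchester), point search engines at the primary version.
     Both URLs here are hypothetical examples. -->
<link rel="canonical" href="http://www.example.com/manchester/listing-name" />
```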
-
OK, I will see what can be done about the duplicate content.
Will adding META NOINDEX, NOFOLLOW to the empty directory pages I have cure any problems that Google may have? (Will NOINDEX also stop these pages from showing as duplicate content in SEOmoz, or will they still be shown?)
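For reference, the tag being asked about looks like this in a page's `<head>` (a minimal sketch; whether `nofollow` is wanted alongside `noindex` is a judgment call, since a `noindex, follow` variant keeps the page's internal links crawlable):

```html
<!-- On an empty city page: keep it out of Google's index entirely -->
<meta name="robots" content="noindex, nofollow">

<!-- Alternative: stay out of the index but still let crawlers follow links -->
<meta name="robots" content="noindex, follow">
```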
I have also spotted that several of the duplicate pages are due to www and non-www versions of the same URLs.
So I have updated the settings in Google Webmaster Tools to prefer the www version, and also added an .htaccess 301 redirect so all visitors are directed to www.
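For reference, a minimal .htaccess rule for that redirect might look like this (a sketch assuming Apache with mod_rewrite enabled, with example.com standing in for the real domain):

```apache
RewriteEngine On
# If the request arrived without the www prefix...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send the visitor to the same path on www with a permanent (301) redirect
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```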
Is this the best thing to do? Thanks,
Jon -
Duplicate content is more likely to be affected by Panda updates, not Penguin updates. Even though you have not experienced any issues to date, I would definitely reduce the duplicate content to a minimum. Otherwise you may be penalized in the next Panda update.
Related Questions
-
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read up on the Moz blog posts and other helpful articles that were flooded with information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicated that we have thousands of duplicate content pages. I understand that it's a common problem among e-commerce sites, and that the way we create landing pages and apply dynamic search results pages conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't work around this, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?

Also, I tried to use self-referencing canonical tags, and canonical tags that point to the higher-authority version, on search results pages; even though it worked for some pages on the site, it didn't work for a lot of the other search results pages. Is there something we can do to these search results pages that will let Google understand that they are original pages?

There are a lot of factors that I can't change, and I'm kind of concerned that the three sites won't rank as well and will drive traffic that won't convert on the site. I understand that Google won't penalize your sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we won't ever run out of duplicate content -- is it worth moving on to other priorities in SEO like keyword research and on/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
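On the "redirects for expiring pages" point above, a minimal sketch of what a single-page 301 looks like in Apache's .htaccess (the paths here are hypothetical, not the poster's real URLs):

```apache
# Permanently redirect an expired landing page to its current replacement
Redirect 301 /offers/spring-wine-sale /offers/current-wine-sale
```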
Algorithm Updates | drewstorys -
Ranking For Synonyms Without Creating Duplicate Content.
We have 2 keywords that are synonyms that we really need to rank for, as they are pretty much interchangeable terms. We will refer to the terms as Synonym A and Synonym B. Our site ranks very well for Synonym A but not for Synonym B. Both of these terms carry the same meaning, but the search results are very different. We actively optimize for Synonym A because it has the higher search volume of the two terms. We had hoped that Synonym B would get similar rankings due to the fact that the terms are so similar, but that did not pan out for us. We have lots of content that uses Synonym A predominantly and some that uses Synonym B. We know that good content around Synonym B would help, but we fear it may be seen as duplicate if we create a piece that's "Top 10 Synonym B" because we already have that piece for Synonym A. We also don't want to make too many changes to our existing content, for fear that we may lose our great ranking for Synonym A. Has anyone run into this issue before, or does anyone have any ideas of things we can do to improve our position for Synonym B?
Algorithm Updates | Fuel -
Creating Content for Semantic search?
Need some good examples of semantic-search-friendly content. I have been doing a lot of reading on the subject, but have seen no real good examples of "this is one way to structure it". There is lots of reading on the topic from an overall satellite perspective, but no clear-cut examples I could find of "this is the way the pieces should be put together in a piece of content, and these are the most effective ways to accomplish it".

**What I know:**

- It needs to answer the question implied by the keyword being used
- It needs to (or should) be connected to authorship for someone in that topic's industry
- It should incorporate various social media sources as references on the topic
- It should link out to authoritative resources on the topic
- It should use some structured data markup

Here is a great resource on the important semantic search pieces: http://www.seoskeptic.com/semantic-seo-making-shift-strings-things/, but I want to move past the research into creating the content that will make the connections needed to get the content to rank. I know Storify is an excellent medium to accomplish this off-page, but it only gives nofollow attribution to the topic creator and the links therein. I am not a coder but a marketer, and creating the back-end markup will really take me out of my wheelhouse. I don't want to spend all of my time flailing with code when I should be creating compelling semantic content. Any helpful examples or resources welcome. Thanks in advance.
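On the structured data markup point in the question above, a minimal sketch of what such markup can look like, using JSON-LD with the schema.org vocabulary (all values here are hypothetical placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2013-06-01",
  "mainEntityOfPage": "https://www.example.com/example-article"
}
</script>
```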
Algorithm Updates | photoseo1 -
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow submission to Google have been done; however, after months and months of waiting, nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this, and not many people appear to agree on whether it will work.

Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, removing all legacy and all association with the spam-ridden .com. However, my main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link-related and NOT content-related, as I imagine people may first suspect.

This could cause duplicate content issues, given that the content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow tag, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content.

So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful! Thank you, Denver
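On the robots.txt step described above, a minimal sketch of a file that blocks all crawlers from the old .com (one hedge worth noting: pages blocked from crawling can still linger in the index, which is why the noindex tag and the Webmaster Tools removal request mentioned in the question are typically combined with it):

```txt
# robots.txt at the root of the old .com domain: disallow all crawlers, all paths
User-agent: *
Disallow: /
```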
Algorithm Updates | ProdoDigital -
Advice on breaking page 3 SERP?
I've worked extremely hard at re-ranking for the keyword "kayak fishing" after our site migration and Panda/Penguin left our site www.yakangler.com lost in the doldrums. I couldn't find us on any of the results pages, from page 1 through page 64. The good news is we have managed to make it to page 3 for the past month, but I feel like we have hit a wall. I can't seem to get any more momentum in the SERPs for "kayak fishing" on Google US. Does anyone have any recommendations on what I should do to move us up further? We have good content that is updated daily, and an active community and forum. Site: www.yakangler.com; keyword: kayak fishing
Algorithm Updates | mr_w -
Been Penalized, Starting from Scratch, Need Advice
Good morning, We've been penalized for unnatural links on the site: http://goo.gl/JgK1e. After attempting to contact many of the sites with links pointing at our site, and getting no response, we decided to start from scratch on a new domain: http://goo.gl/XUH3f. The first thing I did once the new domain was up was remove all of the unique content (text) from the original site and place it on the new site... I am still having a difficult time getting the new site to rank in the SERPs. Can you please provide pointers as to what steps should be taken to get the new domain, http://goo.gl/XUH3f, a high ranking in the SERPs? Thanks!!
Algorithm Updates | Prime85 -
Redesign, new content, new domain and 301 redirects (penalty?)
We merged our old webshops into one big project. After a few days we received our rankings back and traffic was coming in. Then suddenly we lost almost all rankings overnight. We did not use any bad SEO techniques and have unique content written by our own writers. Is this a penalty, or do we just have to wait longer?
Algorithm Updates | snorkel -
Rel="alternate" hreflang="x" or Unique Content?
Hi all, I have 3 sites: brand.com, brand.co.uk, and brand.ca. They all have the same content with very, very minor changes. What's best practice: to use rel="alternate" hreflang="x", or to have unique content written for all of them? Just wondering, after Panda, Penguin, and the rest of the zoo, what is the best way to run multinational sites and achieve top positions for all of them in their individual countries. If you think it would be better to have unique content for each of them, please let us know your reasons. Thanks!
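For reference, the hreflang markup being asked about would look something like this in the `<head>` of each page, using the three domains from the question (a minimal sketch; each regional variant carries the same set of tags, referencing itself and its siblings):

```html
<!-- Tell Google which regional variant to serve to which audience -->
<link rel="alternate" hreflang="en-us" href="http://www.brand.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.brand.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="http://www.brand.ca/" />
```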
Algorithm Updates | Tug-Agency