Duplicate content advice
-
I'm looking for a little advice.
My website has always done rather well on the search engines, although it has never ranked well for my top keywords on my main site, as they are very competitive. It does, however, rank for lots of obscure keywords that contain my top keywords, or my top keywords + city/area. We have over 1,600 pages on the main site, most with unique content, which is what I attribute our ranking for the obscure keywords to. Content also changes daily on several main pages.
Recently we have made some updates to the usability of the site, which our users are liking (page views are up by 100%, time on site is up, and the bounce rate is down by 50%!).
However, it looks like Google did not like the updates and has started to send us fewer visitors (down by around 25% across several sites; the sites I did not update, which act as a kind of control, have been unaffected!). We went through the Panda and Penguin updates unaffected (visitors actually went up!).
So I have joined SEOmoz (and I'm loving it, just like McDonald's). I am now going through all my sites and making changes to hopefully improve things above and beyond what we used to do.
However, out of the 1,600 pages, 386 are being flagged as duplicate content (within my own site). Most of this is down to the fact that we are a directory-type site split into all the major cities in the UK.
City pages with no listings, or cities that share the same/similar listings (as our users provide services to several cities), are being flagged as duplicate content.
Some of the duplicate content is due to dynamic pages that I can correct (i.e. out.php?*****; I will noindex these pages, if that's the best way?). What I would like to know is: are these duplicate content flags going to cause me problems, keeping in mind that the Penguin update did not seem to affect us? If so, what advice would people here offer?
I cannot redirect the pages, as they are for individual cities (and are also dynamic: there is only one physical page, using URL rewriting). I can, however, remove links to cities with no listings, although Google already has these pages indexed, so I doubt removing the links from my pages and sitemap will affect this. I am not sure if I can post my URLs here, as the sites do have adult content on them, although it is not porn (we are an escort guide/directory with some partial nudity).
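For the city pages that share the same or similar listings, one option often suggested alongside noindex is a rel="canonical" tag pointing the near-duplicates at a preferred version. A minimal sketch (the URLs here are hypothetical placeholders, not the actual site's):

```html
<!-- In the <head> of a city page whose listings duplicate another city's page. -->
<!-- Both URLs are hypothetical examples. -->
<link rel="canonical" href="http://www.example.com/listings/manchester/" />
```

A canonical tag consolidates ranking signals onto the preferred URL while keeping the page usable for visitors, though Google treats it as a hint rather than a directive.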
I would love to hear opinions.
-
OK, I will see what can be done about the duplicate content.
Will adding META NOINDEX, NOFOLLOW to the empty directory pages I have cure any problems that Google may have? (Will NOINDEX also stop these pages from showing as duplicate content in SEOmoz, or will they still be shown?)
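For reference, the tag would look something like this (a minimal sketch; it goes in the head of each empty city page):

```html
<head>
  <!-- Ask search engines not to index this page or follow its links. -->
  <meta name="robots" content="noindex, nofollow" />
</head>
```

Bear in mind that noindex is an instruction to search engines; whether SEOmoz's own crawler still reports those pages as duplicates will depend on how the tool handles the tag.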
I have also spotted that several of the duplicate pages are due to www and non-www versions of the same URLs.
So I have updated the settings in Google Webmaster Tools to only list www, and have also added a .htaccess 301 redirect so all users are directed to www.
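For anyone else doing the same, a typical .htaccess rule looks something like the following (a sketch with a placeholder domain; mod_rewrite must be enabled, and example.com should be replaced with your own hostname):

```apache
RewriteEngine On
# Redirect non-www requests to the www hostname with a permanent (301) redirect.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells Google the www version is the permanent one, which should consolidate the duplicate www/non-www listings over time.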
Is this the best thing to do? Thanks
Jon -
Duplicate content is more likely to be affected by Panda updates, not Penguin updates. Even though you have not experienced any issues to date, I would definitely reduce the duplicate content to a minimum. Otherwise you may be penalized in the next Panda update.
Related Questions
-
How to unrank your content by following expert advice [rant]
Hi, As you can probably see from the title, a massive rant is coming up. I must admit I no longer understand SEO, and I just wanted to see if you have any ideas what might be wrong.
So, I read this blog post on Moz https://moz.com/blog/influence-googles-ranking-factor, where the chap is improving the ranking of content that is already ranking reasonably well. I've got two bits of news for you. The good news is: yes, you can change your articles' ranking in an afternoon. The bad news: your articles drop out of the Top 100. I'll give you a bit more detail, hoping you can spot what's wrong. Disclaimer: I'm not calling out BS; I'm sure the blogger is a genuine person, and he has probably had success implementing this.
The site is in a narrow but popular ecommerce niche where the Top 20 results are taken by various retailers who have simply copy/pasted product descriptions from the manufacturers' websites. The link profile strength is varied, and I'm not making this up: the Top 20 sites range from DA:4 to DA:56. When I saw this, I said to myself that it should be fairly easy to rank, because surely the backlinks ranking factor is not weighted as heavily in this niche as it is in others. My site is DA:18, which is much better than DA:4. So, even if I make my pages a tiny tiny bit better than this DA:4 site, I should outrank it, right? Well, I managed to outrank it with really crap content. So, I got two high-traffic keywords ranking at #8 or #9 with very little effort.
And I wish I had stayed there, because what followed just completely ruined my rankings. I won't repeat what was written in the blog; if you're interested, go and read it. But I used it as a blueprint and, bingo, Google indeed changed my ranking in just a couple of hours. Wait, I lost more than 90 positions!!!! I'm now outside the Top 100. Now even irrelevant sites in Chinese and Russian are in front of me. They don't even sell the products. No, they're in different niches altogether, but they still outrank me.
I now know exactly what Alice in Wonderland felt like. I want out please!!!!
Algorithm Updates | GiantsCauseway -
Sub-domain with spammy content and links: Any impact on main website rankings?
Hi all, One of our subdomains is a forum. Our users discuss our product and many related things, but some of the users in the forum are adding a lot of spammy content every day. I just wonder whether this scenario is ruining the ranking efforts of our main website. Does a subdomain with spammy content really kill the ranking of the main website? Thanks
Algorithm Updates | vtmoz -
How important is fresh content?
Let's say the website you are working on has covered most of the important topics on your subject. How important is it that you continue to add content when there really may not be much left that is relevant to your users? Can a site continue to rank well if nothing new is added to it for a year but it continues to get good-quality links?
Algorithm Updates | DemiGR -
Duplicate Content?
My client is a manufacturers' representative for highly technical controls. The manufacturers do not sell their products directly, relying on manufacturers' reps to sell and service them. Most, but not all, of them publish their specs on their sites, usually as PDF only. As a service to our customers, and with the permission of the manufacturers, we publish the manufacturers' specs on our site in HTML with images and downloadable PDFs; this constitutes our catalogue. The pages are lengthy and technical, and are pretty much the opposite of thin content. The URLs for these (technical) queries rank well, so Google doesn't seem to mind. Does this constitute duplicate content, and can we be penalized for it?
Algorithm Updates | waynekolenchuk -
Advice on breaking page 3 SERP?
I've worked extremely hard at re-ranking for the keyword "kayak fishing". After our site migration and Panda/Penguin, our site www.yakangler.com was lost in the doldrums; I couldn't find us on any of the results pages from 1 through 64. The good news is we have managed to make it to page 3 for the past month, but I feel like we have hit a wall. I can't seem to get any more momentum in my SERP for "kayak fishing" on Google US. Does anyone have any recommendations on what I should do to move us up further? We have good content that is updated daily, and an active community and forum. site: www.yakangler.com keyword: kayak fishing
Algorithm Updates | mr_w -
How to retain those rankings gained from fresh content...
Something tells me I know the answer to this question already, but I'd always appreciate the advice of fellow professionals. So... fresh content is big now in Google, and I've seen some great examples of this. When launching a new product or unleashing (yes, unleashing) a new blog post, I see our content launch itself into the rankings for some fairly competitive terms. However, after 1-2 weeks these newly claimed rankings begin to fade from the limelight. So the question is, what do I need to do to retain these rankings? We're active on social media, tweeting, liking, sharing and +1ing our content, as well as working to create exciting and relevant content via external sources. So far all this seems to have done is slow the fall from grace. Perhaps this is natural. But I'd love to hear your thoughts, even if it is just "keep up the hard work".
Algorithm Updates | RobertChapman -
Shouldn't Google always rank a website for its own unique, exact 10+ word content, such as a whole sentence?
Hello fellow SEOs, I'm working with a new client who owns a property-related website in the UK.
Recently (May onwards) they have experienced significant drops in nearly all non-domain/brand-related rankings, from page 1 to page 5+ or worse. Please see the attached Webmaster Tools traffic graph.
The 13th of June seemed to have the biggest drop (UK Panda update???). When we copy and paste individual 20+ word sentences from within top-level content, Google does bring up exact results; the content is indexed, but the client's site nearly always appears at the bottom of the SERPs. Even very new or small 3-4 page domains that have clearly copied all of their content are outranking the original content on the client's site. As I'm sure you know, this is very annoying for the client! And this happens even when Google's cache date (which appears next to the results) for the client's content is clearly older than the other results! The only major activity was the client using Google's optimiser, which redirects traffic to various test pages. These tests finished in June.
Details about the client's website: The domain has been around for 4+ years. The website doesn't have a huge amount of content, around 40 pages; I would consider 50% original, 20% thin and 30% duplicate (working on fixing this). There haven't been any significant sitewide or page changes. Webmaster Tools shows nothing abnormal and no error messages (some duplicate meta/title tags are being fixed). All the pages of the site are indexed by Google. Domain/page authority is above average for the niche (around 45 for the domain in OSE). There are no ads of any kind on the site. There are no special scripts or anything fancy that could cause problems.
I can't seem to figure it out. I know the site can be improved, but such a severe drop, where even very weak domains are outranking us, suggests a penalty of some sort? Can anyone help me out here?
-
Is this the best way to get rid of low quality content?
Hi there, after getting hit by the Panda bear (a 30% loss in traffic), I've been researching ways to get rid of low-quality content. From what I could find, the best advice seemed to be a recommendation to use Google Analytics to find your worst-performing pages (go to Traffic Sources - Google organic - view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, using this report we identified over 3,000 low-quality pages, which I've begun exporting to Excel for further examination. However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages are showing up here. For example: /countries/Panama is showing up as zero views, but the correct version (with the end slash), /countries/Panama/, is showing up as having over 600 views. I'm not sure how Google even found the former version of the link, but I'm even less sure how to proceed now (the webmaster was going to put a noindex on any crap pages, but this is now making him nervous about the whole process). Some advice on how to proceed from here would be fantastico, and danke.
Algorithm Updates | BrianYork-AIM
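One way to avoid the trailing-slash trap described in that last question is to normalize URLs before totalling pageviews, so that /countries/Panama and /countries/Panama/ are counted as a single landing page. A minimal sketch in Python (the data here is made up for illustration, not the poster's actual analytics export):

```python
from collections import defaultdict

def combined_pageviews(rows):
    """Sum pageviews per landing page, treating /path and /path/ as one URL.

    rows: iterable of (url, pageviews) tuples, e.g. from an analytics export.
    """
    totals = defaultdict(int)
    for url, views in rows:
        key = url.rstrip("/") or "/"  # strip trailing slash; keep bare "/" intact
        totals[key] += views
    return dict(totals)

rows = [
    ("/countries/Panama", 0),
    ("/countries/Panama/", 600),
    ("/countries/Peru", 40),
]
print(combined_pageviews(rows))  # {'/countries/Panama': 600, '/countries/Peru': 40}
```

Pages that only look unpopular because their traffic is split across URL variants can then be removed from the deletion list before anything gets noindexed.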