Duplicate content advice
-
I'm looking for a little advice.
My website has always done rather well on the search engines, although it has never ranked well for my top keywords on my main site, as they are very competitive. It does, however, rank for lots of obscure keywords that contain my top keywords, or my top keywords + city/area. We have over 1,600 pages on the main site, most with unique content, which is what I attribute our ranking for the obscure keywords to. Content also changes daily on several main pages.
Recently we have made some updates to the usability of the site, which our users are liking (page views are up by 100%, time on site is up, bounce rate is down by 50%!).
However, it looks like Google did not like the updates and has started to send us fewer visitors (down by around 25% across several sites; the sites I did not update, kind of like my control, have been unaffected!). We went through the Panda and Penguin updates unaffected (visitors actually went up!).
So I have joined SEOmoz (and I'm loving it, just like McDonald's). I am now going through all my sites and making changes to hopefully improve things above and beyond what we used to do.
However, out of the 1,600 pages, 386 are being flagged as duplicate content (within my own site). Most of this comes down to the fact that we are a directory-type site split into all the major cities in the UK.
Cities that don't have any listings, or cities that carry the same/similar listings (as our users provide services to several cities), are being flagged as duplicate content.
Some of the duplicate content is due to dynamic pages that I can correct (i.e. out.php?***** pages; I will noindex these if that's the best way? A rough sketch of what I mean is below). What I would like to know is: are these duplicate content flags going to cause me problems, keeping in mind that the Penguin update did not seem to affect us? If so, what advice would people here offer?
I cannot redirect the pages, as they are for individual cities (and are also dynamic: there is only one physical page, served via URL rewriting). I can, however, remove links to cities with no listings, although Google already has these pages indexed, so I doubt removing the links from my pages and sitemap will change that. I am not sure if I can post my URLs here, as the sites do have adult content, although it is not porn (we are an escort guide/directory, with some partial nudity).
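To make the noindex idea concrete, this is a minimal sketch assuming the city pages are all served by one PHP template through URL rewriting. The function and variable names here are made up for illustration; it is not my real code.

<?php
// city.php - the single physical template that serves every city URL via URL rewriting.
// Assumption: the rewrite rule passes the city slug through as a query parameter.
$city = isset($_GET['city']) ? $_GET['city'] : '';

// get_listings_for_city() is a stand-in for however the listings are really looked up.
$listings = get_listings_for_city($city);

// Cities with no listings (the pages being flagged as duplicates) get a robots noindex
// tag so Google drops them from the index, while links on the page can still be followed.
$noindex = count($listings) === 0;
?>
<!DOCTYPE html>
<html>
<head>
  <title>Listings in <?php echo htmlspecialchars($city); ?></title>
<?php if ($noindex): ?>
  <meta name="robots" content="noindex, follow">
<?php endif; ?>
</head>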
I would love to hear opinions.
-
OK, I will see what can be done about the duplicate content.
Will adding META NOINDEX, NOFOLLOW to the empty directory pages I have cure any problems that Google may have? (Will NOINDEX also stop these pages from showing as duplicate content in SEOmoz, or will they still be shown?)
I have also spotted that several of the duplicate pages are down to www and non-www versions of the same URLs.
So I have updated the settings in Google Webmaster Tools to use only the www version, and I have also added a .htaccess 301 redirect so all visitors are sent to www.
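For reference, this is roughly the rule I added: a minimal sketch assuming Apache with mod_rewrite (plus mod_headers for the out.php part, which is one way I'm considering handling those pages at the server level), with example.com standing in for my real domain, which I can't post here.

# .htaccess - send non-www requests to the www hostname with a 301 redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Keep the out.php redirect script out of the index entirely (needs mod_headers)
<Files "out.php">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>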
Is this the best thing to do? Thanks
Jon -
Duplicate content is more likely to be affected by Panda updates, not Penguin updates. Even though you have not experienced any issues to date, I would definitely reduce the duplicate content to a minimum. Otherwise you may be penalized in the next Panda update.
Related Questions
-
If I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page?
With Google starting to crack down on mixed content, I was wondering: if I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page? E.g. in an old blog article, there are images that weren't updated when the blog migrated to https, but were just 301ed to new https images. Is it still considered a mixed content page?
Algorithm Updates | David-Stern
-
Duplicate website pages indexed: Ranking dropped. Does Google check the duplicate domain association?
Hi all, Our duplicate website, which is used for testing new optimisations, got indexed and we dropped in rankings. But I am not sure whether this is the exact reason, as it has happened before and I didn't see much of a drop in rankings then. Also, I got replies in the past that it will not really impact the original website, only the duplicate website; I think that rule applies to third-party websites. But if our own domain has exact duplicate content, will Google know that we own the duplicate website through any other way we are associated, like IP addresses and servers, etc.? I wonder how Google treats duplicate content from third-party domains versus our own domains. Thanks
Algorithm Updates | vtmoz
-
How important is fresh content?
Let's say the website you are working on has covered most of the important topics on your subject. How important is it that you continue to add content when there really may not be much left that is relevant to your users? Can a site continue to rank well if nothing new is added to it for a year but it continues to get good-quality links?
Algorithm Updates | DemiGR
-
Google indexing site content that I did not wish to be indexed
Hi, is it pretty standard for Google to index content that you have not specifically asked them to index, i.e. given them notification of a page's existence? I have just been alerted by 'Mention' about some new content that they have discovered. The page is on our site, yes, and maybe I should have set it to NOINDEX, but the page only went up a couple of days ago and I was making it live so that someone could look at it and see how the page was going to appear in its final iteration. Normally we go through the usual process of notifying Google via GWMT, adding it to our sitemap.xml file, publishing it via our G+ stream and so on. Reviewing our Analytics, it looks like there has been no traffic to this page yet, and I know for a fact there are no links to this page. I am surprised at the speed of the indexation; is it an example of a brand mention, where an actual link is no longer required? Cheers David
Algorithm Updates | David-E-Carey
-
Duplicate Domain Listings Gone?
I'm noticing in several of the SERPs I track this morning that the domains that formerly had multiple pages listed on pages 1-3 for the same keyword are now reduced to one listing per domain. I'm hoping that this is a permanent and widespread change, as it is a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...? An example of what I mean by "duplicate domain listings" (in case my wording is confusing here): search term "Product Item"; pages ranking: domain-one.com/product-item.html, domain-one.com/product-item-benefits.html, etc.
Algorithm Updates | jesse-landry
-
Google Webmaster Tools content keywords: Top URLs
In the GWT -> Optimization -> Content Keywords section, if we click on a keyword it shows the variants and the Top URLs containing those variants. My problem is that none of the important pages, like the product detail pages, homepage, or category pages, are present in that Top URLs list. All the news and guides section URLs are listed in the Top URLs section for the most important keyword, which is also present in my domain name. How do I make Google realize which pages are the important ones for that keyword?
Algorithm Updates | BipSum
-
How can I use Intuit without getting duplicate content issues?
All of my Intuit sites show duplicate content on the index pages. How can I avoid this?
Algorithm Updates | onestrohm
-
Could using the same content and just changing the keywords be seen as duplicate content?
I want to offer the same service or product in many different cities, so instead of creating new content for each city, I want to copy the content already created for the product or service in one city, change the name of the city, and create a new URL inside my website for each city. For example, let's say I sell handmade rings in the USA, but I want to target each major city, so I want a unique URL for each one: for Miami I would have www.mydomain.com/handmade-rings-miami and for LA the URL would be www.mydomain.com/handmade-rings-la. Can I have the same content talking about the handmade rings and just change the keywords and key phrases, or will this count as duplicate content? Content: TITLE: Miami Handmade Rings URL: www.mydomain.com/handmade-rings-miami Shop now for handmade rings in Miami in our online store and get a special discount on Miami purchases over $50, plus free shipping to a Miami local address... See what our Miami handmade rings clients say about our products.... TITLE: LA Handmade Rings URL: www.mydomain.com/handmade-rings-la Shop now for handmade rings in LA in our online store and get a special discount on LA purchases over $50, plus free shipping to an LA local address... See what our LA handmade rings clients say about our products.... There are more than 100 locations in the country where I want to do this, so that is why I want to copy, paste, and replace. Thanks in advance, David Orion
Algorithm Updates | sellonline123