How detrimental is duplicate page content?
-
We have a local site with multiple advanced search parameters based on the facilities available at a particular place. For instance, we list a set of fun places to take kids to in a city, and we have a page for this. We now have the ability to filter that list to fun places that have parking available or that are "outdoor". We use URL parameters to handle these additional search criteria. Would search engines treat these pages as duplicates, and if so, how detrimental would that be?
-
As others have answered, if the pages with parameters are just the result of a filter and don't actually add anything relevant (i.e., they are substantially duplicates of the non-parameterized URLs) or add nothing at all, then the best approach is to give those URLs a noindex meta robots tag.
This will ensure that those pages, if they have been crawled, disappear from the index.
But this is just a general rule, and there are many variations on it (we don't know how your site has actually been built).
For instance, if those pages can't physically be crawled because the filters sit behind a JavaScript selector (something you can verify by disabling JavaScript in the browser), then you shouldn't run into problems, and the meta robots "noindex" would be a precaution rather than an intervention to fix something that has already happened.
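For reference, the tag being recommended here is a single line in the <head> of each filtered page; a minimal sketch (exact placement depends on how the site's templates are built):
<meta name="robots" content="noindex">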
-
If you noindex a page, any link pointing to that page will waste its link juice.
If you must noindex, use "noindex,follow" so the link juice can still flow back out through the page's own links.
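That variant is the same tag with the follow directive added, along these lines:
<meta name="robots" content="noindex,follow">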
If your site is mainly duplicates then you have a problem, but if it is just a few pages, don't worry.
Google will give credit to one page and disregard the others.
-
I guess it depends on how much duplication there is. If the pages contain completely duplicate content with no unique content at all, then the best move would be to noindex or nofollow them. Otherwise rel=canonical is probably fine.
-
Does rel="canonical" only indicate to Google the preferred page, or does it also tell Google that the content on the current page is a duplicate? Would it be better if we actually removed these pages from the index by putting a "noindex" on them?
-
Duplicate content is detrimental, but the issue is relatively easy to solve. Just ensure you add rel="canonical" tags to the duplicate pages so Google can identify and rank the preferred page.
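In practice, that means each filtered/parameterized URL carries a link element in its <head> pointing at the preferred page; something like the following, where the URL is just a placeholder for your actual category page:
<link rel="canonical" href="https://example.com/fun-places-for-kids" />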
Related Questions
-
Multiple Local Domains and Location Pages Question
Hello everyone, we have a priority site (domain.com) but also a geo-specific site for another location of ours (domainNYC.com). Assuming both have completely unique content and different contact information, and it's justifiable to have a second domain (i.e. resources, brand/link equity, etc.), would it be recommended to also use the sub-folder approach on our primary (meaning domain.com/nyc)? And then potentially link to domainNYC.com (just the once, not overdoing it)? Or should we just play it safe and keep them separate? Our concern is that doing both the sub-folder and the separate domain might cannibalize local searches, resulting in us essentially competing with ourselves for those terms. The benefit would be leveraging the priority domain and driving visitors there. We could always 'noindex, follow' the sub-folder page so users still have access to the address on the primary domain, but we wanted to see if anyone had thoughts or suggestions, as well as how this pertains to linking (sparingly). We have found a lot of information on choosing one over the other, but not as much on whether both is recommended, so any extra insight would be very appreciated. Looking forward to hearing from all of you! Thank you in advance for the help! Best,
Local Listings | Ben-R
-
Found a Yelp review by the unknowing client of a web designer who made a one-page site. Bad SEO
One-page sites are fine and dandy, but if you are a local biz... just no! Here's my story, with a few questions. I did a search on Google for site:http://eatfullbellydeli.com/ and it resulted in four pages: Main; menu; hello world; category/uncategorized. I'm not a web designer... I do SEO. 1) How rude would it be for me to reach out to the designer to comment and give suggestions? 2) Or should I reach out to the owner? 3) Or should I just close my eyes and say "I hate people that take advantage of others"?
Local Listings | Ohmichael
-
Placement of products in URL structure for best category page rankings
Hi! I have some questions regarding the optimal placement of products in the URL hierarchy in a marketplace setting where the end goal is to attract traffic to category pages. Let me start off with some background; thanks in advance for the help.
TL;DR goal: increase category page rankings.
Alternative 1 - Products and category pages separated, flat product structure.
Category page: oursite.com/category/subcategory
Product / listing page: oursite.com/listing-1
Alternative 2 - Products and category pages separated, hierarchical product structure.
Category page: oursite.com/category/subcategory
Product / listing page: oursite.com/product/category/subcat/listing
Alternative 3 - Products placed directly under the category page.
Category page: oursite.com/category/subcategory
Product / listing page: oursite.com/category/subcategory/listing
I run a commercial real estate marketplace, which means that our potential search traffic is extremely geographic. For example, some common searches are (not originally in English): "Office space for lease {City X}", "Office space for lease {Neighborhood Y}", "Retail space {Neighborhood Z}", and so on. These terms are already quite competitive, and the top results are our competitors' geographic and type category pages. For example, competitor.com/type/city/neighborhood is a top result, where the user reaches a landing page that shows all the {type} spaces for lease in {neighborhood}. These users are out to find which spaces are available for lease in these geographical areas, not individual spaces; i.e., users do not search to the same extent for an individual product, in this case a specific empty space.
Our approach has been to place an extreme bias towards a heavy geographical hierarchy. This means that basically any search resulting in a category page on our site produces a well-structured URL like oursite.com/type/state/city/district/street; since we are using the Google Maps APIs, this is easy and relevant for the user. Our geographical categorization beats our competitors' on both extensiveness and usability, especially for long-tail search phrases that our competitors don't care to categorize but where we are seeing real search volumes. The hierarchy only extends as far down as the user has searched; for example, a lot of our searches just end up being oursite.com/type/state/city/district.
Now we are wondering how we should place our products, the empty spaces, in this URL structure. Our original hypothesis was that we should include the products in the original hierarchy, resulting in oursite.com/category/subcategory/product. Our thinking was that we would both be serving the user an understandable and relevant URL, and providing search bots with a logical structure for our site and, most importantly, content for our category pages. Our landing pages are very dynamic, relaying information graphically on a map instead of in an SEO-friendly manner. I would go as far as to say that these dynamic pages provide a ton of value for the user, much more so than our competitors', by describing relevant information about the neighborhood, kind of like Trulia, just not in a bot-readable manner. As a result, ranking them on their own merits is a challenge, whereas we were hoping we could create relevancy by placing products / listings, and maybe even blog posts on the topic, within the same URL hierarchy. As of right now, our structure is oursite.com/products/category/subcategory/product.
In other words, the products are categorized in the same geographical fashion but under a separate URL path. Our results so far are that we basically only rank for the product pages, and rank extremely poorly for our category pages, which is what we ultimately want to improve. This is why we developed the above hypothesis. However, what we learned when we did some initial research is that very few e-commerce stores place their products directly below their categories. Most of the major websites we studied, and we looked at quite a few, just go for alternative 1 from above. The crux is that most of them choose alternative 1 but simultaneously implement breadcrumbs that emulate alternative 3, just without the actual URLs. So, what I'm asking is: what are the actual benefits or downsides of the three alternatives? I feel as if I have a pretty firm grasp on how this could be done; I just need to better understand why most sites seem to flatten their products or listings in the alternative 1 fashion. Thanks, Viktor
Local Listings | Viktorsodd
-
Google Plus Pages are No Longer Available From SERPs?
We have noticed that the links to our clients' Google Plus location profiles have become inaccessible from search result pages. I am getting lots of questions about this from our clients and am just not sure what to say. Does anyone have any idea what is happening here? I notice that some large brands have "profiles" links below their information on the right side of SERPs that include Google Plus links: https://www.google.com/search?q=american+airlines and some do not: https://www.google.com/?gws_rd=ssl#safe=off&q=nike I can't imagine we are abandoning our business.google.com location profiles, but what do we know, if anything, about the plan here? Whatsthehaps?
Local Listings | Sans_Terra
-
SEO best practices for store locator and local pages - 301 or not?
I have been struggling to answer this on my own, so I'm now throwing it out to the Moz community for a lifeline. Our company has several locations across 6 states. We have local pages that we are working to improve with better content. We also have a store locator that lists the stores, but those pages are not the same. See the example below. I can't help but feel like I am splitting juice and traffic that should be combined into one page for each location. Any ideas or advice on how we can best combine/funnel the traffic to one optimized page? Here is an example:
State local page - http://www.jakesfireworks.com/michigan/
Locator page for state - http://www.jakesfireworks.com/locator/?state=MI
City local page - http://www.jakesfireworks.com/michigan/grand_rapids
City locator page - http://www.jakesfireworks.com/locator/?id=183&state=MI
Local Listings | devonkrusich
-
New website not ranking due to 2 duplicate websites. Please help.
Hello everyone, we have a client that has 2 of the same websites, with the same content but different phone numbers and different homepage content, plus a few differences in functionality between them. For the last 3 weeks, we have been trying to rank 1. www.websitetoronto.com organically, as this is the branding of the company, and it is for 18+ content. We also have 2. www.websitetor.com, which is PG-13 content for the Google Places for Business listing (aka Google+ for Business); that listing is still pending, and this is the older site, about 7 years old, which was ranked before. The reason for having both sites is that a Google Places for Business listing does not allow any adult content, and our main branding, 1. www.websitetoronto.com, is for 18+ content. Here are the specs for both sites:
1. www.websitetoronto.com
- Newly registered site, to be ranked organically (however, it is nowhere in the SERPs, as it was previously serving the older site)
- No Google Places for Business listing
- This is the actual branding of the company and is for 18+ content
- We have already started building links, organically and white-hat only; unfortunately it is not showing for our main keywords in the SERPs
2. www.websitetor.com
- 7 years old
- Google Places for Business listing (still pending)
- Not to be ranked organically; blocked from searches with robots.txt set to:
User-agent: *
Disallow: /
- Removed all the links in WMT
- We have changed all the links to point to www.websitetoronto.com
What is the best way to make this work with Google, so that we have both websites performing their different functions without losing the Google Places listing for 2. www.websitetor.com, and so we can rank 1. www.websitetoronto.com?
a) Do I add any tags? rel=?
b) Should we remove 2. www.websitetor.com from WMT?
c) Any other tips and suggestions to get www.websitetoronto.com to rank would be great!
Any help and advice would be greatly appreciated. Also, the owner wants this set up this way, so we cannot reverse it; he wants 1. www.websitetoronto.com as the branding.
Local Listings | EVERWORLD.ENTERTAIMENT
-
Can't Change My G+ Page's Address?
Our zip code is wrong on the Google Plus page for one of our offices. Exactly one month ago, I corrected it, but it immediately reverted to the wrong one. Then I think I read that the change can take 4 weeks... well, it still hasn't changed. Two weeks ago, someone from Google Places even called, and I told him to change it manually; he said he would... still not changed. What can I do to get this zip code corrected? Thanks, Ruben
Local Listings | KempRugeLawGroup
-
.CA and .COM (Ensuring no Duplicate Content)
Hello, I know that this has been answered before. I have a website that has the same content for both http://example.com and http://example.ca/.
1. To ensure I don't get penalized for duplicate content, is there anything else I have to do besides adding the hreflang annotations? Perhaps doing some stuff in WMT?
2. Where do I add the hreflang? In the header section of the homepage?
US site: <link rel="alternate" hreflang="en-us" href="http://example.com/" />
CA site: <link rel="alternate" hreflang="en-ca" href="http://example.ca/" />
Thanks for your help!
Local Listings | EVERWORLD.ENTERTAIMENT