Penalty for duplicate content on the same website?
-
Is it possible to get a penalty for duplicate content on the same website?
I have an old custom-built site with a large number of filter pages that are pre-generated for speed. The only real differences between them are the meta title and H1 tag, with a few text differences here and there.
Obviously I could nofollow all the filter links, but it would take an enormous amount of work.
The site is performing well in search.
I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
-
Your rankings may be decreased. It is better to use canonical URLs to avoid duplicate content on your website.
-
It wouldn't cause a literal penalty from Google; however, it can still hurt your site's rankings.
It sounds like those filters use URL parameters? If so, all of the parameterized URLs can be controlled, or canonical tags can be used to indicate the duplicate content.
What types of filters are these? E.g., what changes on the page when you select a filter? Is this like an ecommerce category experience where you're filtering product listings?
-
Hi seoman10,
there is no penalty for duplicated content, but rankings can be impacted negatively and your site may perform worse on Google.
This is because Google will waste time crawling all of your similar content to decide which page to show in the search results, wasting your crawl budget, and it will probably also split authority between several of the similar pages, diluting it in the process.
Also, Google may choose to show a filtered version of the page in the search results that is not your preferred one.
I recommend using canonicals to solve this problem, pointing all the filtered versions of a page to the unfiltered one.
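For reference, a canonical on a filtered page would look like this, placed in the page's head (a minimal sketch; the URLs here are hypothetical placeholders, not from the site in question):

```html
<!-- In the <head> of a filtered page, e.g. /shoes?color=red (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

Each filtered variation points at the one unfiltered page you want Google to index, so the variants consolidate rather than compete.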
Hope that helps
Good luck!
Related Questions
-
Do you think this would be a case of duplicate content, and what would be the consequences in such a case?
On the webpage https://authland.com/, which is a food & wine tours and activities booking platform, the primary content (service thumbnails containing information about the destination, title, and prices of the particular services) can be found on several sub-pages/URLs. For example, for the service https://authland.com/zadar/zadar-region-food-and-wine-tour/1/, the thumbnail/card through which the service is available can be found on multiple pages (Categories, Destinations, All services, Most recent services...). Is this considered duplicate content, since all of the thumbnails for services on the platform can be found on multiple pages? If it is, what would be the best way to avoid that content being perceived as such by Google's bots? Thank you very much!
-
Backup Server causing duplicate content flag?
Hi, Google is indexing pages from our backup server. Is this a duplicate content issue? There are essentially two versions of our entire domain indexed by Google. How do people typically handle this? Any thoughts are appreciated. Thanks, Yael
-
Website Redesign - Duplicate Content?
I hired a company to redesign our website. There are many pages like the example below where we are downsizing content by 80% (believe me, not my decision). Current page: https://servicechampions.com/air-conditioning/ New page (on test server): https://servicechampions.mymwpdesign.com/air-conditioning/ My question to you is: the 80% of content that I am losing in the redesign, can I republish it as a blog? I know that Google has it indexed. The old page has been live for 5 years, but now 80% of it will no longer be live. So can it become a blog post and keep (or gain new) SEO value? What should I do with the 80% of content I am losing?
-
Site architecture, inner link strategy and duplicate or thin content HELP :)
Ok, can I just say I love that Moz exists! I am still very new to this whole website thing. I've had a site for about 2 years that I have redesigned several times. It has been published this entire time as I made changes, but I am now ready to create amazing content for my niche. The trouble is my target audience is in a very focused niche and my site is really only about 1 topic: life insurance for military families. I'm a military spouse who happens to be an experienced life insurance agent offering plans to active duty service members and their spouses, as well as veterans and retirees. So really I have 3 niches within a niche. I'm REALLY struggling with how to set up my site architecture. My site is basically fresh, so it's a good time to get it hammered down as best as possible with my limited knowledge. Might I also add this is a very competitive space. My competitors are big, established brands who offer life insurance, along with unaffiliated informational sites like military.com or the VA benefits site. The people in my niche rarely actually search for life insurance because they think they are all set by the military. When they do search, the queries are very short, which is common, as this niche lives in a world of acronyms. I'm going to have to get real creative to see if there are any long-tail keywords I can use for supporting posts, but I think my best route is to attempt to rank for the short one-to-three-word phrases this niche uses while searching. Given my expertise on the subject, I am able to write long 1000-5000 word content on the matter that will also point out some considerations my competitors don't really cover. My challenge is I can't see how this topic can be broken into subtopics without having thin supporting content. It's my understanding that I should create these in order to interlink and have a shot at ranking. In thinking about my topic, I feel like the supporting posts can only be so long.
Furthermore, my three niches within my small overall niche search for short but different keywords. It seems I am struggling to put it all into words. Let me stop here with a question: is it bad to have only one category on a website? If not, I feel like this would solve my dilemma in making a good site map and content plan. It is possible to split my main topic into 3 categories, but I heard somewhere you shouldn't interlink posts from different categories. The problem is, if I don't, it's not ideal for the user experience, as the topics really aren't that different. For example, a military member might be researching his/her own life insurance and be curious about their spouse's coverage. In order to satisfy this user's experience and increase time on my site, I should link to where they can find more depth on their spouse's coverage, which would be in a different category. Is this still acceptable since it's really not a different subject?
-
Duplicate Page Content
We have different plans that you can sign up for. How can we rectify the duplicate page content and title issue here? Thanks. The affected URLs:
http://signup.directiq.com/?plan=100
http://signup.directiq.com/?plan=104
http://signup.directiq.com/?plan=116
http://signup.directiq.com/?plan=117
http://signup.directiq.com/?plan=102
http://signup.directiq.com/?plan=119
http://signup.directiq.com/?plan=101
http://signup.directiq.com/?plan=103
http://signup.directiq.com/?plan=5
-
Noindexing Duplicate (non-unique) Content
When "noindex" is added to a page, does this ensure Google does not count the page as part of its analysis of the unique vs. duplicate content ratio on a website? Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see those pages and think, "ah, these are duplicate MLS pages, we are going to let those pages drag down the value of the entire site and lower the ranking of even the unique pages"? I'd like to just use "noindex, follow" on those MLS pages, but would it be safer to add the pages to robots.txt as well, which should, in theory, increase the likelihood that Google will not see such MLS pages as duplicate content on my website? On another note: I had these MLS pages indexed, and 3-4 weeks ago I added "noindex, follow". However, they are all still indexed and there are no signs Google is noindexing them yet.
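For reference, the directive discussed above is a meta tag in the page's head (a minimal sketch). One caveat worth noting: if a page is also disallowed in robots.txt, crawlers can never fetch it and so never see the noindex tag, so combining the two works against the goal of getting pages deindexed:

```html
<!-- In the <head> of each MLS page -->
<meta name="robots" content="noindex, follow">
```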
-
Penalized for Similar, But Not Duplicate, Content?
I have multiple product landing pages that feature very similar, but not duplicate, content, and am wondering if this would affect my rankings in a negative way. The main reasons for the similar content are three-fold: continuity of site structure across different products; similar, or the same, product add-ons or support options (resulting in exactly the same additional tabs of content); and the product itself being very similar, with 3-4 key differences. Three examples of these similar pages follow, although I do have different meta data and keyword optimization throughout the pages. http://www.1099pro.com/prod1099pro.asp http://www.1099pro.com/prod1099proEnt.asp http://www.1099pro.com/prodW2pro.asp
-
About robots.txt for resolve Duplicate content
I have a trouble with Duplicate content and title, i try to many way to resolve them but because of the web code so i am still in problem. I decide to use robots.txt to block contents that are duplicate. The first Question: How do i use command in robots.txt to block all of URL like this: http://vietnamfoodtour.com/foodcourses/Cooking-School/
Intermediate & Advanced SEO | | magician
http://vietnamfoodtour.com/foodcourses/Cooking-Class/ ....... User-agent: * Disallow: /foodcourses ( Is that right? ) And the parameter URL: h
ttp://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4 User-agent: * Disallow: /?mod=vietnamfood ( Is that right? i have folder contain module, could i use: disallow:/module/*) The 2nd question is: Which is the priority " robots.txt" or " meta robot"? If i use robots.txt to block URL, but in that URL my meta robot is "index, follow"0