2 URLs pointing to exactly the same content
-
Hi guys,
As far as I know, if you have 2 websites with exactly the same (100%) content on 2 different URLs, neither pointing to the other, that should attract a penalty from Google, right?
Well, there is such a case. It has been online for a long time, yet the bad guys are at the top of organic search and it does not seem to bother Google at all!
I don't want to list them here. It is extremely annoying and frustrating: I worked hard to rank higher in search, and seeing this is extremely frustrating!
Any advice on this?
Thanks
-
There are in fact 3 of them.
-
Can you add the URLs of the two domain names you've mentioned?
Cheers.
-
Hi, thanks.
I should add that there are 2 different URLs but exactly the same website and **content**, so it is practically an identical website listed on Google through 2 URLs.
Both domains are well placed in Google search.
-
Hi,
There is no penalization for duplicate content (internal or external).
There are some filters in place: pages (not domains) will rank lower if their content is the same as that of other pages (again, internal or external).
Panda is also a filter, this time domain-wide, that can demote a whole domain if its content is not very good. But it is a dedicated filter that works on a quality basis per niche, not on a duplicate-content basis.
The only negative sides of duplicate content are:
-
If the duplicate content is within the same domain, you might rank with one of the pages that is not your main asset. (Google actually chooses between them not by PR, PA or anything else, but by the URL format. Strange but true.)
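The URL-format point can be sketched as a toy heuristic: among duplicate URLs, prefer the one with the cleanest shape (fewest query parameters, shortest path). This is only an illustration of the general idea, not Google's actual selection logic, and `pick_canonical` is a hypothetical helper:

```python
from urllib.parse import urlparse

def pick_canonical(urls):
    """Pick the URL with the simplest format: fewest query params, then shortest path."""
    def simplicity(url):
        parts = urlparse(url)
        num_params = len(parts.query.split("&")) if parts.query else 0
        return (num_params, len(parts.path), len(url))
    return min(urls, key=simplicity)

duplicates = [
    "https://example.com/widgets?ref=nav&sort=asc",
    "https://example.com/widgets",
    "https://example.com/products/widgets",
]
print(pick_canonical(duplicates))  # https://example.com/widgets
```

In practice you would not leave this to chance: a `rel="canonical"` tag on the duplicates tells the engine which URL you consider the main asset.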
-
If the duplicate content is domain vs domain, Google will choose the one that it thinks is most relevant (and not only based on content, but on page and domain authority, Google bounce rate (not Analytics bounce rate) and all the other major signals).
There are some cases where Google ranks a 100% duplicate page better, even one that links to the source, because Google decided the domain is better, more trustworthy, and so on.
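The page-level comparison such a filter performs can be sketched with word shingles and Jaccard overlap, a generic near-duplicate technique. This is an illustration of the concept only, not Google's implementation:

```python
# Split each text into overlapping word "shingles" and measure how many they share.

def shingles(text, k=3):
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity: 0.0 = nothing shared, 1.0 = identical shingle sets."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

page_a = "Buy the best blue widgets online with free shipping today"
page_b = "Buy the best blue widgets online with free shipping now"
print(round(jaccard(page_a, page_b), 2))  # 0.78 - near-duplicate
```

Two pages scoring close to 1.0 would be collapsed into one result; the point above is that *which* one survives is decided by other signals.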
> well, there is such a case and it was online for long time but the bad guys are in top of organic search and it does not seem to bother google at all!

You might be looking at this the wrong way. Maybe the "bad" guys are actually better.
If you post both domains, maybe people here can assess them and give some objective feedback.
Post the domains without mentioning who, in your opinion, is the bad guy, so people here won't take that into consideration when giving you feedback.
Hope it helps.
-
Related Questions
-
Updating Content - Make changes to current URL or create a new one?
I'm working with a content team on a job search guide for 2019. We already have a job search guide for 2018. Should we edit the content of the 2018 guide to make it current for 2019, which means the 2018 guide would no longer exist, or should we keep the 2018 guide and create a new web page for the 2019 guide so that both exist? We currently rank very well for the 2018 job search guide.
Content Development | | Olivia9541 -
Tool to identify duplicated content on other sites
Hi, does anyone know of a tool that could be used to identify whether a site is using our content without permission? Thanks
Content Development | | turismodevino10 -
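For the question above, a rough do-it-yourself sketch: once you have your text and a suspect page's text as strings (fetching the pages is left out here), Python's standard `difflib` can score their overlap. The 0.9 threshold is an arbitrary assumption:

```python
import difflib

def overlap_ratio(original, suspect):
    """Similarity ratio between 0.0 (nothing shared) and 1.0 (identical)."""
    return difflib.SequenceMatcher(None, original, suspect).ratio()

ours = "Our handcrafted widgets are made from recycled aluminium."
theirs = "Our handcrafted widgets are made from recycled aluminium!"
if overlap_ratio(ours, theirs) > 0.9:
    print("Likely copied - worth a manual check")
```

For finding the suspect pages in the first place, searching Google for an exact-match quoted sentence from your copy is the usual low-tech starting point.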
We want to move Content from one domain to another
We have a large amount of unique content on a domain we are no longer using or promoting. It has been sitting there for a couple of years; it is literally wasted content on an unused, unpromoted domain. We want to move it to a busy site of ours. Are there any best practices or pitfalls we should be aware of?
Content Development | | Simonws0010 -
Duplicate Content In Webmaster Tools
In WordPress, on some of our blogs, WordPress has shortened the URL when we went to publish them. In Google Webmaster Tools the original URL is coming up as a 404 error. This URL is not indexed in Google. Is this something to worry about, and can it be avoided? Thank you in advance.
Content Development | | Palmbourne0 -
Duplicate YouTube Script Content - Penalty?
I've been tasked with writing scripts for upward of 100 YouTube videos describing my company's products. In more than a few cases, the products are so similar as to be almost identical; unfortunately, they aren't and will require their own videos. If I create a "template" script, I would save hours and hours of tedium. For example: Video 1: (VOICEOVER) Buy the ABC widget today! Video 2: (VOICEOVER) Buy the XYZ widget today! So, my question is: Would I be looking at a duplicate content issue? Jeff McRichie's terrific Whiteboard Friday about YouTube Ranking Factors mentioned that YouTube has an auto-transcription feature that might expose my self-plagiarism, and I don't want to get dinged. BTW, this isn't a matter of my being too lazy to write individualized content; it's more that 1) the products are almost identical, and 2) I have just about a week to write, produce, and act(!) in all of them.
Content Development | | RScime250 -
Possible to recover from Thin/Duplicate content penalties?
Hi all, first post here so sorry if in the wrong section. I'm after a little advice, if I may, from more experienced SEOers than myself with regard to writing off domains or keeping them. To cut a long story short, I do a lot of affiliate marketing. Back in the day (until the past 6 months or so) you could just take a merchant's datafeed and, with some SEO, outrank them for individual products. However, since the Google Panda update this hasn't worked as well and is now much harder to do, which is better for the end user. The issue I have is that I got lazy and tried to see if I could still get some datafeeds to rank with only duplicate content. The sites ranked very well at first but died massively after a couple of weeks. They went from 0 to 300 hits a day in a matter of 24 hours, and back to 2 hits a day. The sites now don't rank for anything, which is obviously because they are duplicate content. The question I have is: are these domains dead, or can they be saved? I'm not talking about the duplicate content but about the domains themselves. I used about 10 domains to test things, ranging from DA 35 to DA 45; one of the tests was whether a domain with reasonable DA can rank with only duplicate content. Seeing as the test didn't work, I want to use the domains for proper sites with proper unique content. However, so far, although the new unique content is getting indexed, it is suffering from the same ranking penalties the duplicate (and now deleted) pages had. Is it worth trying to use these domains? Will Google finally remove the penalty when it notices that the bad content is no longer on the site, or are the domains very much dead? Many thanks
Content Development | | ccgale0 -
Crawl error vs RSS feed content
Hi, I have a BuddyPress Multisite with sites dedicated to specific RSS feeds. I did this to pull in content for my users, but they generate 1000s of SEOmoz errors/warnings for duplicate content, duplicate titles, missing meta tags, etc. So does keeping the content help my site's SEO more than the errors hurt it? The content is not under my control, so I have no way to relate it to my keywords. Any opinions? I can recreate these RSS sites, so I will probably delete them to see what happens. Larry
Content Development | | tishimself0 -
Best way to resolve duplicate content issue?
Not sure what to do about this: I have a client who has a ton of pages (around 1200) which are all city-specific pages for long-tail search. These are all written with paragraphs in a format such as: "Order to [City] today." So every page has essentially the same content. The site only has 1562 pages in total, so with 1200 of them being city-specific same-content pages, that can't be good. However, the problem is that these pages still rank very well (usually position 1 or 2) for the terms they're targeting, and bring in enough traffic and revenue to justify their purpose. We also have country-specific pages, and these all have unique content, rather than the scripted content on the city pages. So for example, for Italy we might have: Italy page (unique content), Rome (duplicate content), Milan (duplicate content), Venice (duplicate content), etc. For a low-traffic country (Austria), we tried 301ing the city pages to the country page, but that only resulted in a drop in search results for the city keywords, from (usually) position 1 to more like page 3 or 4, so quite a drop. So, without writing 1200 pages' worth of unique content, what would your advice be?
Content Development | | TME_Digital0