Are blogs published on blog platforms and on our own site considered duplicate content?
-
Hi, SEO wizards! My company has a company blog on Medium (https://blog.scratchmm.com/). Recently, we decided to move it to our own site to drive more traffic to our domain (https://scratchmm.com/blog/). We re-published all the Medium posts on our own website. If we keep the Medium blog posts up, will this be considered duplicate content, and will our website rankings be affected in any way? Thank you!
-
-
As Alick has pointed out, this is considered duplicate content.
Appropriate use of the canonical tag can help you overcome this. Take a look at these articles:
https://help.medium.com/hc/en-us/articles/217991468-Duplicate-Content-and-SEO
https://woorkup.com/medium-seo-canonical-tag/
https://brianli.com/how-to-republish-to-medium-with-rel-canonical-41d1821866e8#.x9hjo2zae
That should give you all you need to rectify this issue.
Good luck.
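In practice, the canonical relationship those articles describe is declared with a `<link rel="canonical">` tag in the page's `<head>`. As a sanity check after republishing, here is a minimal Python sketch (standard library only; the post URL is made up for illustration) that verifies a page actually exposes the canonical tag you expect:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

# A toy page standing in for one of the republished blog posts.
page_html = """
<html><head>
  <title>Example post</title>
  <link rel="canonical" href="https://scratchmm.com/blog/my-post/">
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(page_html)
print(finder.canonical)  # https://scratchmm.com/blog/my-post/
```

Medium's import tool sets this tag for you when you republish in the Medium-to-own-site direction is reversed, so it is worth confirming which URL each copy actually points at.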
-
Hi,
Yes, that will be treated as duplicate content, because duplicate content is content that appears on the Internet in more than one place (URL).
Hope this helps.
Thanks
Related Questions
-
I have plenty of backlinks but the site does not seem to come up on Google's first page.
My site has been jumping up and down in the rankings for many months now, but it never stays on Google's first page. I have plenty of backlinks and have shared content on social media, so what could I be doing wrong? Any help will be appreciated. The content is legit. I recently added some internal links; might this be the cause? Please help.
White Hat / Black Hat SEO | samafaq
-
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, We are considering using separate servers for when a Bot vs. a Human lands on our site to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same to both the Bot & Human, but on different servers. And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings? Thanks
White Hat / Black Hat SEO | Desiree-CP
-
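For what it's worth, that kind of split is usually implemented by routing on the User-Agent header at the load balancer. A minimal, hypothetical sketch (the backend hostnames and bot token list are made up for illustration); the point that keeps it out of cloaking territory is that both pools serve identical content:

```python
# Hypothetical backend pools -- the hostnames are invented for this example.
BOT_POOL = "bots.internal.example.com"
HUMAN_POOL = "web.internal.example.com"

# A tiny, illustrative list; real setups verify crawlers more carefully
# (e.g. reverse-DNS checks), since User-Agent strings can be spoofed.
KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "duckduckbot")

def pick_backend(user_agent: str) -> str:
    """Route crawlers and humans to separate servers serving identical content."""
    ua = user_agent.lower()
    if any(token in ua for token in KNOWN_BOT_TOKENS):
        return BOT_POOL
    return HUMAN_POOL

print(pick_backend("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
# bots.internal.example.com
print(pick_backend("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
# web.internal.example.com
```

The routing itself is invisible to Google; what matters is that a fetch from either pool returns the same bytes for the same URL.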
Does Google Consider a Follow Affiliate Link into my site a paid link?
Let's say I have a link coming into my domain like this: http://www.mydomain.com/l/freerol.aspx?AID=674&subid=Week+2+Freeroll&pid=120 Do you think Google recognizes this as a paid link? These links are follow links. I am working on a site that has tons of these, but it ranks fairly well. It did lose some ranking over the past month or so, and I am wondering if that might be related to a recent iteration of Penguin. These are very high PR inbound links from a number of good domains, so I would not want to make a mistake and have the client ask affiliates to nofollow these links if that is going to cause his rankings to drop further. Any thoughts would be appreciated.
White Hat / Black Hat SEO | Robertnweil1
-
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this:

http://www.eteach.com/Employer.aspx?EmpNo=26626
http://www.eteach.com/Employer.aspx?EmpNo=36986

Google is classing all of these pages as similar content, which may result in a bunch of them being de-indexed. Now, although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of these, it's clear that people who find these pages want more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages; I want to add content to them.

But my question is... if I were to make up, say, 5 templates of generic content with different fields replaced by the school's name, location, and headteacher's name so that they vary from other pages, will this be enough for Google to realise that they are not similar pages and no longer class them as duplicates? e.g. [School name] is a busy and dynamic school led by [headteachers name] who achieve excellence every year from ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, we encourage all of our pupils to "Aim Higher". We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards.

Something like that... Anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
White Hat / Black Hat SEO | Eteach_Marketing
-
Has my site been penalized by google
Hi all, I have noticed a sudden drop in rankings for most of my keywords on kerryblu.co.uk and was thinking the site may have been manually penalized by Google. I have not received any notification of this in Webmaster Tools, but I can't think of any other reason for the loss of rankings. I have searched the web for info on this but can't find a definite answer. Is there any way of knowing for sure? At the time of the crash, the only real change I made was adding Google AdSense to my blog. Could this be responsible? Thanks for looking.
White Hat / Black Hat SEO | Dill
-
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (and are in the process of acquiring more), for which the domain names are all localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately, with the goal of ranking for the specific localised terms related to each of the domains. One option would be to create microsites (hosted on individual C-class IPs etc.) with unique, location-specific content on each of the domains. Another suggestion would be to park the domains and have them pointing at the individual local landing pages on the main site, so the domains would just be a window through which to view the pages which have already been created. The client is aware of the recent EMD update, which could affect the above. Of course, we would wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice about how best to handle this? Many thanks in advance!
White Hat / Black Hat SEO | AndrewAkesson
-
Am I Being Penalized For Having My Whole Site In A Subfolder Named With A Keyword?
I inherited a client. For some reason, their previous webmaster set up the site so everything is in a subfolder /law/. It's an attorney website. All the URLs have the primary domain name, /law/, and then the assigned URL. I can't imagine this is helping, but could the site be penalized for this by Google or Bing? It's set up like this: www.attorneysite.com/law/therestoftheurl. /law/ is included in EVERY PAGE... even the homepage.
White Hat / Black Hat SEO | DeltonChilds
-
IP-Based Content on Homepage?
We're looking to redesign one of our niche business directory websites and we'd like to place local content on the homepage catered to the user based on IP. For instance, someone from Los Angeles would see local business recommendations in their area. Pretty much a majority of the page would be this kind of content. Is this considered cloaking or in any way a bad idea for SEO? Here are some examples of what we're thinking: http://www.yellowbook.com http://www.yellowpages.com/ I've seen some sites redirect to a local version of the page, but I'm a little worried Google will index us with localized content and the homepage would not rank for any worthwhile keywords. What's the best way to handle this? Thanks.
White Hat / Black Hat SEO | newriver
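A common approach to the question above is to resolve the visitor's IP to a region server-side and fall back to a generic national page for unknown ranges, which is also what crawlers would typically receive. A minimal sketch, with a hypothetical hard-coded lookup table where a real site would use a GeoIP database (the IP ranges below are reserved documentation ranges, not real user locations):

```python
import ipaddress

# Hypothetical mapping of network ranges to localized content regions.
REGIONS = {
    ipaddress.ip_network("203.0.113.0/24"): "Los Angeles",
    ipaddress.ip_network("198.51.100.0/24"): "New York",
}

DEFAULT_REGION = "National"  # shown to unknown visitors and crawlers

def region_for(ip: str) -> str:
    """Return the content region for a visitor IP, or the national default."""
    addr = ipaddress.ip_address(ip)
    for network, region in REGIONS.items():
        if addr in network:
            return region
    return DEFAULT_REGION

print(region_for("203.0.113.42"))  # Los Angeles
print(region_for("192.0.2.1"))     # National
```

Because the default branch is what a crawler sees, the homepage would be indexed with the national content, which matches the concern in the question: the localized versions would exist for users but would not themselves be what ranks.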