Separating syndicated content because of Google News
-
Dear MozPeople,
I am just working on rebuilding a structure of the "news" website. For some reasons, we need to keep syndicated content on the site. But at the same time, we would like to apply for google news again (we have been accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the Original content from Syndicated as requested by google. But I am not sure which one is better:
**A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page.** But in this case, I am not sure what will happen if we link to these articles from other parts of the website. We will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex tag. Is this OK for Google and Google News?
**B) NOINDEX meta tag on every page only.** Google will crawl these pages but will not show them in the results. We will still lose the link juice from links pointing to these pages, right?
So ... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important?
This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed.
Thank you!
-
Hi Lukas.
The main guideline to follow here is isolating your original content for Google News. This means having the non-syndicated content in its own directory, making sure it's the only content you're submitting in the XML sitemap for News, and when you are accepted into Google News, making sure you keep all the syndicated content out of that news subdirectory.
If you do that, it's fine to have all your other syndicated content in the /syndicated/ directory. I wouldn't worry about linking to these articles from other parts of your site. Google won't penalize duplicate content that's syndicated; it just attempts to determine the original creator of the content and filters the syndication partners out of the search results. There's no harm at all in having this content on your site or linking to it. As for using NOINDEX or a robots.txt disallow on the syndicated content, it's largely up to you. I know some SEOs who prefer to signal to Google to stay out of there and keep the content out of the index, and some who let the content be crawled and let Google make the call.
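One caveat worth spelling out if you go the "keep Google out" route: the two signals work differently and shouldn't be combined. A robots.txt disallow stops crawling, which means Googlebot never fetches the page and never sees a noindex meta tag on it, so a blocked URL can still show up in the index (URL-only) if other pages link to it. A rough sketch of the two options, assuming your syndicated content lives under /syndicated/:

```text
# Option A — robots.txt: block crawling entirely.
# Googlebot won't fetch these pages, so any noindex tag on them is never seen.
User-agent: *
Disallow: /syndicated/

# Option B — instead of blocking, allow crawling and put this meta tag
# in the <head> of each syndicated page so Google drops it from the index:
#   <meta name="robots" content="noindex, follow">
```

Pick one or the other: if reliably keeping the pages out of the index matters most, use the noindex meta tag and let the pages be crawled.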
The most important thing is to create a clean, news-only section of the site, submit only that section for Google News inclusion, and maintain a sitemap just for that section.
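For reference, the News sitemap for that section uses the standard sitemap format plus Google's news extension namespace. A minimal example, with hypothetical URL and publication values you'd replace with your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <!-- Only original (non-syndicated) articles belong in this sitemap -->
    <loc>https://www.example.com/news/original-article</loc>
    <news:news>
      <news:publication>
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-05-01T12:00:00+00:00</news:publication_date>
      <news:title>Original Article Title</news:title>
    </news:news>
  </url>
</urlset>
```

List only articles from the clean news directory here; the syndicated pieces stay out of this file entirely.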
Good luck!
Matthew Brown
Moz