Is pulling automated news feeds on my home page a bad thing?
-
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom.
After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds occupy only 20 percent of the content on my domain. The rest of the content is original.
-
So what do you suggest I do in this scenario, Brent? What's the right thing to do?
-
Hmm...
In this case, for sites that are crawled more frequently by Googlebot, can I say that they might have an unfair advantage?
In the sense that, if they were to scrape or syndicate other sites' content, then because Google crawls them more frequently and finds the content on their site first, Google will label them as the original, while the actual content creator gets labelled as the duplicate (if Google finds the content on their site afterwards).
-
"First indexed version" means:
1. When you publish an original article and Google crawls it for the first time, your copy becomes the "first indexed version". If another site picks up the content after it is already on your site, their copy is treated as duplicate content.
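One common way for the original publisher to protect that "first indexed" status is to ask syndication partners to point a cross-domain canonical tag back at the source article. A minimal sketch, with hypothetical URLs:

```html
<!-- On the syndicated copy, e.g. https://partner-portal.example/news/story-123 -->
<head>
  <!-- Tells Google which URL should be treated as the original source -->
  <link rel="canonical" href="https://news-agency.example/story-123" />
</head>
```

Whether a given news agency's syndication agreement allows this is a contractual question, not a technical one.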
-
Could you explain a little bit more about what "first indexed version" means?
-
Ideally you want to have unique content on your website.
That is going to work best all of the time.
With news websites it becomes more complex. If you run wire content or AAP content, Google will treat the first indexed version as being the most trustworthy version of the copy. Google may tolerate syndicated content in the sense that if it appears on only ten high-quality websites it is going to be OK, but at the end of the day Google is still going to favour original content, day in, day out. The only real benefit of syndicated content is that it lets businesses which may not have the time to produce content keep publishing.
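Where a cross-domain canonical is not an option, one commonly recommended alternative is for the site carrying the syndicated copy to keep it out of the index entirely with a robots meta tag. A minimal sketch:

```html
<!-- On each syndicated article page: still readable for visitors,
     but asks search engines not to index this duplicate copy.
     "follow" keeps the page's outbound links crawlable. -->
<meta name="robots" content="noindex, follow" />
```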
I hope this helps.
Kind Regards,
James.
Related Questions
-
New Flurry of thousands of bad links from 3 Spammy websites. Disavow?
I also discovered that the website www.prlog.ru put 32 links to my website. It is a Russian site with a 32% spam score. Is that high? I think I need to disavow. Another spammy website link has a spam score of 16% with several thousand links. I added one link to the site medexplorer.com 6 years ago and it was fine; now it has thousands of links. Should I disavow all three?
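For reference, Google's disavow tool accepts a plain-text file with one rule per line: either a full URL or a `domain:` entry, with `#` lines as comments. A sketch using the domains mentioned above (whether they actually warrant disavowing is a judgment call, especially for a site you once linked to voluntarily):

```text
# Disavow file sketch - uploaded via Google Search Console's disavow tool
# Thousands of links from a domain with a 32% spam score
domain:prlog.ru
# Other domains would be listed the same way, one per line
domain:medexplorer.com
```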
White Hat / Black Hat SEO | Boodreaux
-
Bad backlinks is it possible that Google is penalizing me?
Hi guys, since December I've been receiving thousands of bad backlinks from websites that copy my content and content from other websites. I also noticed a drop in organic visits each month. Is it possible that Google is penalizing me for those backlinks? I know that I can ask the webmasters to remove the links, but I don't believe they will. It looks like a robot does all this automatically. Should I use the Google disavow tool? Any other ideas?
Check the images below, please. Thanks!
White Hat / Black Hat SEO | Tiedemann_Anselm
-
Are CDNs good or bad for SEO? - Edmonton Web
Hello Moz folks, We just launched a new website: www.edmontonweb.ca It is now ranking on page 2 in our city. The website is built on WordPress and we have made every effort to make it load faster. We have enabled the right caching and we have reduced the file sizes. Still, some of our local competitors have lower load times and, more importantly, lower TTFBs. Is a CDN the right answer? I've read articles claiming that Cloudflare decreased a website's rankings. Is there a better CDN to use, or a proper way to implement Cloudflare? Thank you very much for your help! Anton,
LAUNCH Edmonton
White Hat / Black Hat SEO | Web3Marketing87
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area.' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease.' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area w/o getting dinged for spam or duplicate content? Often times I ask myself "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
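The engineer's randomized-template proposal in the question above can be sketched in a few lines of Python. The template strings come from the question; the function name and the idea of seeding the choice with the venue code are illustrative assumptions:

```python
# Hypothetical sketch of the standardized-paragraph proposal: a small
# pool of templates, filled in per venue, with the variant chosen
# deterministically so a given page always renders the same text.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(topic: str, city: str, state: str, seed: int) -> str:
    """Render the intro paragraph for one venue page.

    `seed` (e.g. a numeric venue code) picks the template, so the
    variation is stable across rebuilds rather than truly random.
    """
    template = TEMPLATES[seed % len(TEMPLATES)]
    return template.format(topic=topic, city=city, state=state)

# Different venues get different variants of otherwise standard copy
print(intro_paragraph("SharePoint", "New York", "NY", seed=1))
print(intro_paragraph("SharePoint", "Washington", "DC", seed=2))
```

Whether lightly varied boilerplate like this avoids a duplicate-content problem is exactly the judgment call the question asks about; the sketch only shows the mechanics.
-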
One Blog Comment Now on Many Pages of The Same Domain
My question is: I blog commented on this site http://blogirature.com/2012/07/01/half-of-200-signals-in-googles-ranking-algorithm-revealed/#comment-272 under the name "Peter Rota". For some reason the recent-comments list is a site-wide element, so basically the link to my website is now on pretty much every page of their site. I also noticed that the anchor text for each one of my links says "Peter Rota". This is my concern: will Google think it's spammy if I'm on a lot of pages of the same site from one blog comment, and will I be penalized for the exact same anchor text on each page? If this is the case, what could I do to try to get the links removed? Thanks
White Hat / Black Hat SEO | ilyaelbert
-
What do you think of our new category page?
Hey Mozzers! We have come up with a new layout design for a category page and would love to have your opinion on it, specifically from an SEO perspective. Here is our current page: http://www.builddirect.com/Laminate-Flooring.aspx Our new page (pending approval): http://www.builddirect.com/testing/laminate-flooring/index.html Just to brief you in on the key differences b/w the old and new layouts: the left text-link menu is removed in the new layout; the new layout looks funny with JS disabled - a long vertical line-up of products (perhaps important keywords/content in the new layout appears way down the page?); and a lot of 'clunk' has been removed (bits of text, links, images, etc). Thanks for checking this out.
White Hat / Black Hat SEO | Syed1
-
Campaign landing pages
Hi, At our company we decided we wanted to reach out to a more global audience, so we bought a bank of domains for different countries, e.g. ".asia". Some are our company name; others are things like "barcelonaprivatejets.com". We then put up single-page websites for each of these domains, which link to our main .com site. However, I don't know if this is good for our SEO or bad. I've seen so many different things written, but I cannot find a definitive answer. The text will be different on all the pages, but with each being only one page, and the "design" being the same, will we get penalized in some way or another? I've also added links to 2/3 of them in the footer of our main site, but now I'm reading that this is bad too - so should I remove these? If anyone has ideas of how we could better use these country-specific domains, I'm open to suggestions too! I am not an SEO person really, I'm a web developer, so this is all completely new to me. P.S. My name is Michael, not Andy.
White Hat / Black Hat SEO | JetBookMike
-
From page 3 to page 75 on Google. Is my site really so bad?
So, a couple of weeks ago I started my first CPA website, just as an experiment and to see how well I could do out of it. My rankings were getting better every day, and I've been producing constant unique content for the site to improve my rankings even more. Two days ago my rankings went straight to the last page of Google for the keyword "acne scar treatment", but Google has not banned me or given my domain a minus penalty. I'm still ranking number 1 for my domain, and they have not dropped the PR, as my keyword page is still in the main index. I'm not even sure what has happened. Am I not allowed to have a CPA website in the search results? The best information I could find on this is: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=76465 But I've been adding new pages with unique content. My site is www.acne-scar-treatment.co Any advice would be appreciated.
White Hat / Black Hat SEO | tommythecat1