Is pulling automated news feeds on my home page a bad thing?
-
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom.
After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds occupy only 20 percent of the content on my domain. The rest of the content is original.
-
So what do you suggest I do in this scenario, Brent? What's the right thing to do?
-
Hmm.
In this case, can I say that sites crawled more frequently by Googlebot might have an unfair advantage?
In the sense that, if they were to scrape or syndicate other sites' content, but Google crawls and finds the content on their site first (since they are crawled more frequently), Google will label them as the original while the actual content creator will be labelled as duplicate (if Google finds the content on their site afterwards)?
-
"First indexed version" means:
1. When you publish an original article and Google first crawls it, that becomes the "first indexed version" — meaning that if another site picks up the content after it is on your site, that copy is duplicate content.
-
Could you explain a little bit more about what "first indexed version" means?
-
Ideally you want to have unique content on your website.
That is going to work best all of the time.
With news websites it becomes more complex. If you have wire content or AAP content, Google will treat the first indexed version as being the most trustworthy version of the copy. Google may tolerate syndicated content if it only appears on, say, 10 high-quality websites, but at the end of the day it is still going to favour original content, day in, day out. The main benefit of syndicated content is that it can be used by businesses which may not have the time to produce content themselves.
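If you do use syndicated wire content, one mitigation worth discussing with your syndication partners (subject to Google's current guidelines, and assuming partners will cooperate) is a cross-domain canonical tag on each syndicated copy pointing back to the original. A minimal sketch (example.com is a placeholder):

```html
<!-- Placed in the <head> of the syndicated copy on the partner's site, -->
<!-- pointing back to the original article: -->
<link rel="canonical" href="https://example.com/original-article" />
```

This signals which version should be treated as the original, though Google treats canonical tags as a hint rather than a directive.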
I hope this helps.
Kind Regards,
James.
Related Questions
-
Targeted keywords in the top half of the page, or throughout the page?
I have created content and want to include target keywords, but where do I place them for maximum SEO benefit? I am asking because I have heard Google doesn't give much credit if the keywords are at the end of the page?
404 Errors For Pages That Never Existed
I'm seeing a lot of 404 errors with slugs related to cryptocurrency (not my website's industry at all). We've never created pages remotely similar, but I see a lot of 404 errors with keywords like "bitcoin" and "litecoin". Any recommendations on what to do about this? Another keyword is "yelz". It usually presents like .../yelz/-ripper-vs-steller/ or .../bitcoin-vs-litecoin/. I don't really even have the time to fix all the legitimate 404 errors, let alone these mysterious requests. Any advice is appreciated.
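Before deciding how to handle these, it can help to quantify which junk patterns dominate your 404 reports. A minimal sketch (assuming you can export the 404'd request paths, e.g. from Search Console or your server logs; the sample paths below are made up):

```python
from collections import Counter
from urllib.parse import urlparse

def top_404_patterns(paths, n=3):
    """Group 404'd request paths by their first path segment so
    junk patterns (e.g. /yelz/..., /bitcoin-.../) stand out."""
    counts = Counter()
    for p in paths:
        segments = [s for s in urlparse(p).path.split("/") if s]
        counts[segments[0] if segments else "/"] += 1
    return counts.most_common(n)

hits = ["/yelz/a", "/yelz/b", "/yelz/c", "/bitcoin-vs-litecoin/", "/real-page"]
print(top_404_patterns(hits))  # → [('yelz', 3), ('bitcoin-vs-litecoin', 1), ('real-page', 1)]
```

Once the junk patterns are isolated, you can decide whether they are worth blocking at the server level, and fix the legitimate 404s separately.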
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website starting from my homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567 Here's a sample category page: http://domain.com/city/area Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the property pages even though they are dynamic?
Bot or Virus Creating Bad Links?
Hey everyone, we are getting ready to engage a client for some potential marketing/SEO, so in preparing for this we have run the site through Open Site Explorer. The site is relatively new and there are only two links under the inbound links section. They are relevant and add value; no issues there. Here is where it gets strange: when I look under the 'Just Discovered' section there are many (hundreds of) new links going back about a month. Virtually all of them have the anchor text 'Louis Vuitton outlet'. Now, the client swears he has not engaged anyone for black-hat SEO, so I'm wondering who could possibly be creating these links. They do sell some Louis Vuitton items on the site, so I'm wondering if some spam bot has picked up the site and begun to spam the web with links to the client's site. So far today, 50 or so new links have been created with said anchor text and the client's root URL, all on very poor quality, some foreign, blog sites. We would like to find out why this is happening and put a stop to it for obvious reasons. Has anyone experienced something similar? Could this be a bot? Or maybe someone with an axe to grind against the client? Anyone could be doing this on their own, but it just seems strange for it to be happening to a new site that does not even rank highly at the moment. Any advice or info is greatly appreciated; thanks in advance.
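If it does turn out to be spam-bot activity, one way to triage a link export (e.g. from Open Site Explorer) is to flag anchor texts that dominate the link profile, which is a common signature of automated campaigns. A minimal sketch (the links list below is made up for illustration):

```python
from collections import Counter

def suspicious_anchors(links, threshold=0.25):
    """Flag anchor texts that account for more than `threshold` of all
    inbound links in a (anchor_text, source_url) export."""
    counts = Counter(anchor.lower() for anchor, _src in links)
    total = sum(counts.values())
    return [(a, c) for a, c in counts.most_common() if c / total > threshold]

links = [("Louis Vuitton outlet", "spamblog1.example"),
         ("Louis Vuitton outlet", "spamblog2.example"),
         ("Louis Vuitton outlet", "spamblog3.example"),
         ("brand name", "partner.example")]
print(suspicious_anchors(links))  # → [('louis vuitton outlet', 3)]
```

The flagged anchors and their source domains would then be candidates for a disavow file if the links keep accumulating.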
Thin Content Pages: Does adding more content really help?
Hello all, I have a website that was hit hard by Panda back in November 2012, and ever since, traffic has continued to decline week by week. The site doesn't have any major Moz errors (aside from too many on-page links). The site has about 2,700 articles and the text-to-HTML ratio is about 14.38%, so clearly we need more text in our articles and we need to ease up a little on the number of pictures/links we add. We have increased the text-to-HTML ratio for all of our new articles, but I was wondering how beneficial it would be to go back and add more text content to the 2,700 old articles that are just sitting there. Would this really be worth the time and investment? Could it help with the drastic decline in traffic and maybe even help it grow?
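As a rough way to track the text-to-HTML ratio while reworking old articles, here's a minimal standard-library sketch (note: SEO tools may compute this differently, e.g. by bytes or after stripping scripts, so treat the numbers as relative, not absolute):

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects the visible text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def text_to_html_ratio(html):
    """Rough ratio of visible text length to total markup length."""
    parser = _TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks)
    return len(text) / len(html) if html else 0.0

page = "<html><body><p>Hello world</p><img src='a.jpg'></body></html>"
print(round(text_to_html_ratio(page), 2))  # prints 0.18
```

Running this over the 2,700 old articles would let you prioritize the thinnest pages first rather than rewriting everything at once.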
How effective is a link from an internal page with PR 2 on a domain with PR 5?
My website got a link from an internal page with PageRank 2, but the domain has PageRank 5. For example: a domain www.example.com with PageRank 5 and an internal page www.example.com/extra/1 with PageRank 2. I got a link from the internal page; will I benefit from the main domain's PageRank of 5? Thanks, Sameer
Is there such thing as white hat cloaking?
We are near the end of a site redesign and have come to find out it's built in JavaScript and not search-engine friendly. Our IT team's fix for this is to show crawlable content to Googlebot and other crawlers based on the user agent. I told them this is cloaking and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same it is an acceptable way to cloak. About 90% of the content will be the same between the "regular user" version and the content served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on this? Thanks!
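One sanity check before accepting the "90% the same" claim is to actually measure how similar the two served versions are. A minimal sketch (fetching each version with the respective user agent is left out; the strings below are placeholders for the two responses):

```python
from difflib import SequenceMatcher

def content_similarity(user_html, bot_html):
    """Return a ratio in [0, 1] of how similar two served responses are.
    Comparing what regular users vs. Googlebot receive gives a rough
    check on whether the served content really matches."""
    return SequenceMatcher(None, user_html, bot_html).ratio()

user_version = "<p>Our product catalog: widgets, gadgets</p>"
bot_version = "<p>Our product catalog: widgets, gadgets</p>"
print(content_similarity(user_version, bot_version))  # identical -> 1.0
```

Keep in mind this only quantifies the difference; it doesn't make user-agent-based serving safe, so the underlying question of whether Google treats it as cloaking still stands.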
Can I just delete pages to get rid of bad back-links to those pages?
I just picked up a client who had built a large set of landing pages (1,000+) and a huge number of spammy links pointing to them (too many to even consider manually requesting removal from the respective webmasters). We now think Google may also be seeing the landing pages as doorway pages, as there are so many of them and they are all optimized for specific keywords and generally pretty low quality. The client also received an "unnatural links found" email from Google. I'm going to download the links discovered by Google around the date of that email and check whether any look specifically bad, but I'm sure it will just be some of the several thousand bad links they built. Anyway, they now want to clean up their act and are considering deleting the landing/doorway pages in the hope of a. ranking better for the other non-landing/doorway pages (i.e. categories and sub-categories), but more to the crux of my question, b. essentially getting rid of all the thousands of bad links that were built to those landing/doorway pages. Will this work? If we just remove those pages and serve 404 or 410 codes, will Google see any inbound (external) links to those pages as no longer being links to the site? Or is the TLD still likely to be penalized for all the bad links coming into no-longer-existing URLs on it? Also, any thoughts on whether a 404 or a 410 would be better are appreciated. Some info on that here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=64033 I guess another option is the disavow feature with Google, but Matt Cutts' video here: http://www.youtube.com/watch?v=393nmCYFRtA&feature=em- kind of makes it sound like this should just be used for a few links, not thousands... Thanks so much!
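On the disavow option: the file Google accepts is plain text, one entry per line, and the `domain:` prefix disavows an entire referring domain, so thousands of spammy links can often be covered in relatively few lines. A minimal sketch (the domains below are placeholders):

```text
# Spammy links pointing at the deleted landing pages
# Disavow an entire referring domain:
domain:spamblog.example.com
# Or disavow a single URL:
http://anotherspamsite.example.com/links/page.html
```

Lines starting with `#` are comments, which makes it easy to document why each domain was disavowed when you revisit the file later.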