Is pulling automated news feeds onto my home page a bad thing?
-
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom.
After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds make up only 20 percent of the content on my domain. The rest of the content is original.
-
So what do you suggest I do in this scenario, Brent? What's the right thing to do?
-
Hmm.
In this case, can I say that sites crawled more frequently by Googlebot might have an unfair advantage?
In the sense that, if they were to scrape or syndicate another site's content, then because Google crawls them more frequently and finds the content on their site first, Google will label them as the original, while the actual content creator will be labelled as the duplicate (if Google only finds the content on the creator's site afterwards)?
-
The first indexed version means:
When you publish an original article and Google crawls it for the first time, your copy is the "first indexed version". If another site picks up the content after it is already on your site, that later copy is treated as duplicate content.
-
Could you explain a little bit more about what "first indexed version" means?
-
Ideally you want to have unique content on your website.
That is going to work best all of the time.
With news websites it becomes more complex. If you run wire-service content or AAP content, Google will treat the first indexed version as the most trustworthy version of the copy. Google may tolerate "syndicated" content in the sense that if it only appears on ten or so high-quality websites it is going to be OK, but at the end of the day Google is still going to favour original content day in, day out. The only real benefit of syndicated content is that it can be used by businesses that may not have the time to produce content themselves.
I hope this helps.
Kind Regards,
James.
Related Questions
-
[SEO] Star Ratings -> Review -> Category Page
Hello there, Basically, if you put non-natural star ratings on a category page, like in the attached images, you will get a manual ban from Google, right? (I know it for sure, because I had clients in this situation.) The real question is: if I put a form on the category page that allows users to write real reviews of the category's products, will Google still ban the site? Any advice? Any examples? With respect, Andrei
White Hat / Black Hat SEO | Shanaki
-
Is a Google Penguin penalty automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and appear to be penalised according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool to check for a penalty, or are there other good tools available? Are these penalties the result of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised; the new Penguin update might have flagged their backlinks, causing the penalty. Even we have dropped over the last 2 weeks. What's the solution for this, and how effective is a link audit? Thanks, Satish
White Hat / Black Hat SEO | vtmoz
-
Redirecting from https to http - will it pass the whole link juice to the new http website pages?
Hi, if I make a permanent 301 redirect from https to http, will it pass the whole link juice to the new http website pages?
White Hat / Black Hat SEO | Aman_123
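As a side note on the redirect question above: one practical sanity check is to confirm that the redirect chain really returns 301 (permanent) rather than 302 (temporary) at every hop. The sketch below is only an illustration, assuming Python with the `requests` library and a placeholder URL, not anything taken from the original post.

```python
import requests
from urllib.parse import urljoin

# Placeholder URL -- substitute a real https page from the site being checked.
url = "https://example.com/some-page"

for _ in range(5):  # follow at most five hops to avoid redirect loops
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 307, 308):
        location = urljoin(url, resp.headers["Location"])
        print(f"{url} -> {resp.status_code} -> {location}")
        url = location
    else:
        print(f"{url} -> {resp.status_code} (final)")
        break
```

If any hop reports 302 instead of 301, that hop is signalling a temporary move rather than a permanent one.
-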
Should we remove our "index" pages (alphabetical link list to all of the products on the site)?
We run an e-commerce site with a large number of product families, each family containing a number of products. We have a set of 26 pages (one for each letter, A-Z) that are simply lists of links to the product family pages. We originally created these pages thinking they would aid the discoverability of those pages for search engines; of course, as time has gone on, techniques like this have fallen out of favor with Google, as they provide negligible value to the user. Should we consider removing these pages from the site altogether? Is it possible that they could be viewed by Panda as resembling a link farm? Thanks in advance!
White Hat / Black Hat SEO | ChrisRoberts-MTI
-
[linkbuilding] Link partner page on a web shop - does it work?
Hello Mozzers, I am wondering about the effect of link building by swapping links between websites and adding a link-partner page with hundreds of links to a web shop. A new competitor has come into the Google SERPs competing on the keywords I am targeting. The competitor has far more links than our web shop, and it has a page with hundreds of links to other web shops which, in turn, link back to it (not all of them link back, by the way). I always thought there was no use in sharing links this way, by creating a huge page with hundreds of links; it benefits neither website. Still, it does seem to work (?), and this strategy is used by a lot of web shops in the Netherlands. How do you guys look at this? Which of you are using a strategy like this? Should I pick up this strategy myself?
White Hat / Black Hat SEO | auke1810
-
Linking my pages
Hello everybody, I have a small dilemma and I am not sure what to do. I (my company) am the owner of 10 e-commerce websites. On every site I have a link to the other 9 sites, and I am using an exact keyword as the anchor (not the shop name). Since the web stores are big and have over 1,000 pages, this means that all my sites have a lot of inbound links (compared with my competition). I am worried that linking them all together could be bad from Google's point of view. Can this cause a problem for me, and should I change it? Regards, Marko
White Hat / Black Hat SEO | Spletnafuzija
-
All pages going through 302 redirect - bad?
So, our web development company did something I don't agree with and I need a second opinion. Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302 redirect to a geodetection script, and then back to the geotargeted version of the page. E.g.: www.example.com/category 302 redirects to www.example.com/geodetect.php?ip=ip_address, and that page then 302 redirects back to either www.example.com/category or www.example.com/geo/category for the geo-targeted version. So all of our pages - thousands of them - go through a double 302 redirect. It's fairly invisible to the user, and a 302 is more appropriate than a 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with it. Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Or is this perfectly fine?
White Hat / Black Hat SEO | dholowiski
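To make the pattern in the question above concrete, here is a minimal sketch of a double-302 geodetection flow. It is written with Flask purely for illustration; the route names, the `geo` cookie, and the country lookup are all assumptions of mine, not the poster's actual setup.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

def lookup_country(ip: str) -> str:
    """Hypothetical geo lookup; a real site would query a GeoIP database or service."""
    return "AU" if ip.startswith("1.") else "US"

@app.route("/geodetect")
def geodetect():
    # Hop 2: detect the visitor's country, remember it in a cookie,
    # then 302 back to the default or geo-targeted version of the page.
    page = request.args.get("return", "")
    country = lookup_country(request.remote_addr or "")
    target = f"/geo/{page}" if country == "AU" else f"/{page}"
    resp = redirect(target, code=302)
    resp.set_cookie("geo", country)  # so the first hop is skipped next time
    return resp

@app.route("/<path:page>")
def cached_page(page):
    # Hop 1: a statically cached page 302s to the geodetection script
    # whenever the visitor's location is not yet known.
    if "geo" not in request.cookies:
        return redirect(f"/geodetect?return={page}", code=302)
    # Otherwise serve the cached .html file (omitted in this sketch).
    return f"cached content for {page}"

if __name__ == "__main__":
    app.run()
```

Whether or not the double hop is actually harmful, an alternative worth weighing is making the geo decision in one place (for example, varying the response at the web server or CDN layer, or redirecting only once), so crawlers don't have to follow two redirects for every URL.
-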
Why did Google reject us from Google News?
I submitted our site, http://www.styleblueprint.com, to Google to potentially become a local news source in Nashville. I received the following note back: "We reviewed your site and are unable to include it in Google News at this time. We have certain guidelines in place regarding the quality of sites which are included in the Google News index. Please feel free to review these guidelines at the following link: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#3" Clicking the link, it anchors to the section that says: "These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit." etc... Now, we have never intentionally tried to do anything deceptive for our rankings. I am new to SEOmoz and new to SEO optimization in general. I am working through the errors report on our campaign site, but I cannot tell what they are dinging us for; whatever it is, we will be happy to fix it. All thoughts greatly appreciated. Thanks in advance, Jay
White Hat / Black Hat SEO | styleblueprint