Is pulling automated news feeds on my home page a bad thing?
-
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom.
After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds make up only 20 percent of the content on my domain; the rest is original.
-
So what do you suggest I do in this scenario, Brent? What's the right thing to do?
-
Hmm...
In this case, can I say that sites crawled more frequently by Googlebot might have an unfair advantage?
In the sense that if they scrape or syndicate other sites' content, but Google crawls and finds that content on their site first (since they are crawled more frequently), Google will label them as the original, while the actual content creator gets labelled as the duplicate (since Google only finds the content on the creator's site afterwards).
-
"First indexed version" means:
1. When you publish an original article and Google crawls it for the first time, that copy becomes the "first indexed version". If another site picks up the content after it is live on your site, their copy is treated as duplicate content.
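To make that concrete, here is a toy sketch in Python. Google's real duplicate-detection pipeline is not public, so the crawl log and the "earliest crawl wins" rule below are illustrative assumptions only:

```python
# Toy model only: the URLs, timestamps, and "earliest crawl wins" rule
# are invented to illustrate the idea of a "first indexed version".

from datetime import datetime

# Hypothetical crawl log: (url, first_crawl_time) for three pages that
# all carry the same article text.
crawl_log = [
    ("https://original-publisher.example/story", datetime(2013, 5, 1, 9, 0)),
    ("https://big-portal.example/story", datetime(2013, 5, 1, 8, 30)),
    ("https://small-blog.example/story", datetime(2013, 5, 1, 12, 0)),
]

# The copy crawled earliest becomes the "first indexed version";
# every later copy is flagged as duplicate content.
first_indexed_url, _ = min(crawl_log, key=lambda entry: entry[1])

for url, crawled_at in crawl_log:
    label = "original (first indexed)" if url == first_indexed_url else "duplicate"
    print(f"{crawled_at:%H:%M}  {url}: {label}")
```

Note that this toy model also captures the worry raised above: the frequently crawled portal is picked up at 08:30 and credited as the original, even though the actual publisher wrote the story.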
-
Could you explain a little bit more about what "first indexed version" means?
-
Ideally you want to have unique content on your website.
That is going to work best all of the time.
With news websites it becomes more complex. If you carry wire content or AAP content, Google will treat the first indexed version as being the most trustworthy version of the copy. Google may tolerate syndicated content in the sense that if it appears on only ten or so high-quality websites it is going to be OK, but at the end of the day it will still favour original content day in, day out. The main benefit of syndicated content is simply that it serves businesses which may not have the time to produce content of their own.
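If you do run syndicated feed pages and want to keep them from competing with your original content, one common approach is to serve them with a noindex robots header. Below is a minimal sketch only, a small Flask app assuming the feeds live under a /feeds/ path; that path and the render helper are hypothetical:

```python
from flask import Flask, make_response

app = Flask(__name__)

def render_syndicated_story(story):
    # Hypothetical stand-in for however the portal renders wire copy.
    return f"<html><body>Syndicated story: {story}</body></html>"

@app.route("/feeds/<path:story>")
def syndicated_story(story):
    resp = make_response(render_syndicated_story(story))
    # "noindex, follow" keeps the syndicated page out of the index while
    # still letting crawlers follow its links to the site's original pages.
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp
```

That way the syndicated 20 percent stays out of the index entirely, and only the original content competes in search.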
I hope this helps.
Kind Regards,
James.
Related Questions
-
Is inter-linking websites together good or bad for SEO?
I know of a website that inter-links a handful of websites together (e.g., coloring.ws interlinks with dltk-kids.com and others). Is this negative for SEO? I was thinking about creating a few related sites and inter-linking all of them, since they will all be relevant to each other. Any thoughts would be great!
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Competitors Building Bad Backlinks
Hi there, I recently checked the backlinks for my site using Open Site Explorer, and I noticed a huge number of bad backlinks which I believe a competitor might be building to help lower my ranking for a number of highly competitive keywords. Besides spending time disavowing these links, what else can be done? Has anyone else faced the same problem? Any help would be appreciated.
White Hat / Black Hat SEO | bamcreative
-
Cutting off the bad link juice
Hello, I have noticed that there are plenty of old low-quality links pointing to many of the landing pages. I would like to cut them off and start again. Would it be OK to do the following? 1. Create new URLs (the domain is quite strong, and the new pages are ranking well, better than the affected old landing pages) and add the old content there. 2. 302 redirect the old landing pages to the new ones. 3. Put a "noindex" tag on the old URLs (maybe even "noindex, nofollow"? Or wouldn't that work?). Thanks in advance
White Hat / Black Hat SEO | ThinkingJuice
-
All pages going through 302 redirect - bad?
So, our web development company did something I don't agree with and I need a second opinion. Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302 redirect to a geodetection script, and then back to the geotargeted version of the page. E.g.: www.example.com/category 302 redirects to www.example.com/geodetect.php?ip=ip_address. Then that page 302 redirects back to either www.example.com/category or www.example.com/geo/category for the geo-targeted version. So all of our pages, thousands of them, go through a double 302 redirect. It's fairly invisible to the user, and 302 is more appropriate than 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with it. Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Or is this perfectly fine?
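For anyone trying to picture the chain, here is roughly what the described setup does, as a minimal Flask sketch; the IP table, URLs, and parameter names are simplified stand-ins for whatever the developers actually built:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Toy IP-to-country table standing in for the real geodetection lookup.
GEO_COUNTRIES = {"203.0.113.7": "au"}

@app.route("/category")
def category():
    # First 302: the statically cached page sends every visitor to the
    # geodetection script with the client IP in the query string.
    return redirect(f"/geodetect?ip={request.remote_addr}&page=category", code=302)

@app.route("/geodetect")
def geodetect():
    country = GEO_COUNTRIES.get(request.args.get("ip", ""))
    page = request.args.get("page", "category")
    # Second 302: bounce back to either the geo-targeted or the default copy.
    # (The real setup must serve the cached file directly at this point;
    # redirecting back to /category here would loop forever.)
    return redirect(f"/geo/{page}" if country else f"/default/{page}", code=302)

@app.route("/geo/<page>")
@app.route("/default/<page>")
def serve_page(page):
    return f"Cached page: {page}"
```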
White Hat / Black Hat SEO | dholowiski
-
How effective is a link from a PR 2 internal page on a PR 5 domain?
My website got a link from an internal page with a PageRank of 2, but the domain itself has a PageRank of 5. For example: the domain www.example.com has PR 5, and the internal page www.example.com/extra/1 has PR 2. Since the link comes from the internal page, will I still benefit from the main domain's PR 5? Thanks, Sameer
White Hat / Black Hat SEO | KaylaKerr
-
Is there such a thing as white hat cloaking?
We are near the end of a site redesign, only to find out it's built in JavaScript and not search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers based on the user agent. I told them this is cloaking and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same it is an acceptable way to cloak. About 90% of the content will be the same between the regular user and what is served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on it? Thanks!
White Hat / Black Hat SEO | CHECOM
-
PageRank is 0
Hi. Can you please point me in the right direction concerning a site whose default page has a PR of 0? There do not appear to be any errors in the robots.txt file (that I can tell). When I ran a duplicate content check by searching the title tag and first sentence in quotes, it did not return more than 2 sites. When I ran a site: search, it reported 287,000 results. Does this mean that they purchased links and have now been penalized? Where should I go from here? Thank you for any feedback and assistance.
White Hat / Black Hat SEO | JulB
-
Sudden Ranking Drop from 1st Page
My client's website http://countryfeelingholidays.com has seen a huge drop in its rankings since Aug 1st. It was 2nd on the 1st page of google.lk for the keyword "Holidays Sri Lanka", but when I last checked it had fallen to the 20th page. I really cannot find a reason for this drop. The only thing that comes to mind is that we posted a comment on a blog, but it ended up appearing on all of the blog's pages because of a top-commentator plugin, causing a huge rise in backlinks in one day. From the next day we lost the ranking on google.lk, but on google.com it is still at the same position it used to be. What would be the reason for this? Could it be a penalty? What should we do now to get the ranking back?
White Hat / Black Hat SEO | Osanda