Publishing the same article content on Yahoo: worth it? Penalties? (Urgent)
-
Hey All,
I am currently working for a company that publishes exactly the same content on its own website and on Yahoo. On top of that, when I search for an article's title, the Yahoo copy outranks ours. Isn't this against Google's guidelines? I suspect Yahoo also gets more traffic than we do, since they hold the first position. Do you think the company should stop this practice? I need urgent responses to these questions, please.
Also, have a look at the attachment and at the snippets. Our snippet (meta description) is just the first paragraph, but Yahoo somehow scans the content and generates a meta description based on the search query. How do they do that?
-
Thank you very much for your advice. It really helped me out. I will message you later and tell you how it went, if you're interested. This week I'm putting together a presentation for the team, with the reports.
I think this should be addressed ASAP.
-
I'd definitely make that point you made in bold.
If you're a paid contributor, it's a question of whether the income outweighs the drawbacks. It's pretty hard to put a tangible figure on that, but there are definite upsides and downsides. Arguably it adds to Moneywise's branding to be seen on Yahoo, but you can't track that. What you can track are clicks through to the site.
And of course it all depends on what the goal of Yahoo inclusion is. If it's just a money-spinner, and a worthwhile one at that, then don't even put the same content on your site. It's not worth running the risk of duplicate-content penalties and/or link penalties, depending on how Google views it.
If it is being done to raise brand awareness then (personally) I think it cannibalises your online visibility more than it promotes it - while still presenting SEO problems.
Outside looking in here, but I hope it helps. I'm with you: it's quite a predicament and a delicate situation, so I hope it works out for you. At the very least, my SEO advice can be seen as impartial and without an agenda, which may be useful to bring to a discussion among people with the company's interests at heart, plus their teams'.
-
Thank you for your clear and detailed response; I really appreciate it. The hardest part here will be persuading the company that the costs outweigh the benefits. It seems we are being paid by Yahoo as contributors. I can outline the negative SEO impacts, and I will definitely use your points. I also need to think about the returns in terms of potential revenue. What do you think?
Or I guess I should just point out that we are losing overall visibility as a brand, and that content duplication may be one of the main reasons we are losing so many positions.
For now, I'll dig into the reports.
-
Hey there
I can't see any sense in doing this.
At the very least, it diverts clicks away from your site, since it promotes Yahoo over you. To a reader, it may also look as though Moneywise is taking content from Yahoo (rather than the other way round), which cheapens the brand.
The worst-case scenario is that your site is seen as duplicating/stealing content, especially given how poor Google is at identifying the original source of content. Google could also conclude that you're duplicating content for the sole purpose of acquiring links, which again could lead to penalties.
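For what it's worth, one classic way search engines flag duplicates is shingling: compare the overlapping word n-grams of two pages, and a high Jaccard overlap means near-duplicate content. A toy sketch below, purely an illustration of the general technique, not Google's actual algorithm; the sample texts are made up:

```python
def shingles(text, k=4):
    """Return the set of overlapping k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b):
    """Jaccard overlap of two shingle sets; near 1.0 means near-duplicate."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

article = "Savers can shelter up to the annual ISA allowance free of tax each year."
copy_on_yahoo = article  # a verbatim syndicated copy
rewrite = "Each tax year, savers may protect money from tax up to the ISA limit."

print(jaccard_similarity(article, copy_on_yahoo))  # 1.0 for identical text
print(jaccard_similarity(article, rewrite))        # near 0 for a genuine rewrite
```

The point being: a verbatim Yahoo copy is trivially detectable, so the only question left for Google is which site is the original, and it often gets that wrong.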
To me, this doesn't make sense. I'd be much more inclined to keep the content on your own site and get people to come directly to you. You're getting comments on the articles, so you clearly already have a solid user base.
If your colleagues argue that the Yahoo copies of the content bring new people to the site, pull up a Google Analytics report and look at how many people entered your site via Yahoo over the last three months. I can almost guarantee that hardly anyone is clicking those links in the article. Those links, by the way, look pretty manipulative/commercial in terms of anchor text, which could prompt yet another penalty.
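If it helps make the case, the check itself is trivial once you export the referral data from Analytics to CSV. The column names and sample rows here are hypothetical; adapt them to whatever your export actually contains:

```python
import csv
from io import StringIO
from urllib.parse import urlparse

def count_referrals(rows, referrer_domain):
    """Sum sessions whose referrer host is the given domain or a subdomain of it."""
    total = 0
    for row in rows:
        host = urlparse(row["referrer"]).netloc.lower()
        if host == referrer_domain or host.endswith("." + referrer_domain):
            total += int(row["sessions"])
    return total

# Stand-in for a real Analytics referral export (hypothetical data).
sample = StringIO(
    "referrer,sessions\n"
    "https://uk.finance.yahoo.com/news/some-article,12\n"
    "https://www.google.com/,340\n"
    "https://finance.yahoo.com/,5\n"
)
rows = list(csv.DictReader(sample))
print(count_referrals(rows, "yahoo.com"))  # 17
```

If that number is a rounding error next to your direct and organic traffic, the "Yahoo brings us readers" argument dies on the spot.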
And in SEO terms, even though the link comes from Yahoo, if no one is linking to or sharing that URL on Yahoo, I can tell you now the link won't carry much value.
As for your snippet question, it looks like Yahoo is simply pulling the title and content from the page and generating a fresh meta description from them. Probably a time-saving solution for a site of that size, but certainly not an ideal one. Your meta descriptions look much better.
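For the curious: "query-biased" snippets like that are straightforward to approximate. Score each sentence of the page by how many query terms it contains and surface the best one, truncated to snippet length. A minimal sketch, an assumed simplification of whatever Yahoo actually runs:

```python
import re

def query_biased_snippet(text, query, max_len=160):
    """Pick the sentence sharing the most terms with the query, trimmed to max_len."""
    terms = set(query.lower().split())
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    best = max(sentences, key=lambda s: len(terms & set(s.lower().split())))
    return best if len(best) <= max_len else best[: max_len - 1].rstrip() + "…"

# Hypothetical article text for illustration.
article = ("Moneywise explains how ISA allowances work. "
           "The annual ISA allowance lets you shelter savings from tax. "
           "Rates vary between providers.")
print(query_biased_snippet(article, "isa allowance tax"))
# → The annual ISA allowance lets you shelter savings from tax.
```

Which is why the snippet changes depending on what was searched for: it is generated per query, not stored on the page.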
Hope this helps.