Publishing the same article content on Yahoo? Worth It? Penalties? Urgent
-
Hey All,
I am currently working for a company that publishes exactly the same content on both its own website and Yahoo. On top of that, when I search for one of our article titles, the Yahoo copy outranks ours. Isn't this against Google's guidelines? I also suspect Yahoo gets more traffic from the articles than we do, since they hold the first position. Do you think the company should stop this practice? I need urgent answers to these questions, please.
Also, take a look at the attachment and at the snippets. Our snippet (description) is just the first paragraph, but Yahoo somehow scans the content and generates meta descriptions based on the search query. How do they do that?
-
Thank you very much for your advice. It really helped me out here. I will message you later and let you know how it went, if you're interested. This week I will put together a presentation for the team, with the reports.
I think this should be addressed ASAP
-
I'd definitely make that point you made in bold.
If you're a paid contributor, it's a matter of whether the income outweighs the drawbacks. It's pretty hard to put a tangible figure on that, but there are definite upsides and downsides. Arguably it adds to Moneywise's branding to be seen on Yahoo, but you can't track that. What you can track are clicks through to the site.
And of course it all depends on what the goal of Yahoo inclusion is. If it is just a money-spinner and a worthwhile one at that, don't even put the same content on your site. It's not worth running the risk of duplication penalties and/or link penalties, depending on how Google sees it.
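If the syndication continues regardless, one mitigation sometimes used for syndicated content is a cross-domain rel="canonical" on the syndicated copy pointing back at the original, so Google consolidates signals on your version. Whether Yahoo would actually add this for contributors is an open question, and the URL below is a placeholder, not a real article:

```html
<!-- In the <head> of the Yahoo copy, pointing back at the original article -->
<link rel="canonical" href="http://www.moneywise.co.uk/some-original-article" />
```

Failing that, even a plain attribution link on the copy ("this article originally appeared on...") helps Google and readers identify the original source.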
If it is being done to raise brand awareness then (personally) I think it cannibalises your online visibility more than it promotes it - while still presenting SEO problems.
Outside looking in here, but I hope it helps. I'm with you - it's quite a predicament and a delicate situation, so I hope it works out for you. At the very least, my SEO advice can be seen as impartial and without an agenda, which may be useful to bring to a discussion among people with the company's interests at heart, plus their teams'.
-
Thank you for your clear and detailed response. I really appreciate it. The hardest part of this case will be persuading the company that the costs outweigh the benefits; it seems we are being paid by Yahoo as contributors. I can outline the negative impacts on SEO and will definitely use your points, but I also need to come up with something on the returns in terms of potential revenue. What do you think?
Or I guess I should just point out that we are losing our overall position as a brand, and that content duplication may be one of the main reasons we are losing so many positions.
Right now I will look at the reports.
-
Hey there
I can't see any sense in doing this.
At the very least, it diverts clicks away from your site, as it promotes Yahoo over your site. It may also look to a reader as if Moneywise is taking content from Yahoo (rather than the other way round), which cheapens the brand.
The worst-case scenario is that your site is seen as duplicating/stealing content - especially given how poor Google is at identifying the original source of content. Google could also decide that you're duplicating content for the sole purpose of getting links, which again could lead to penalties.
To me, this doesn't make sense. I'd be much more inclined to keep the content on your own site - get people to come directly to you. You're getting comments on the articles so you already have a solid user base, clearly.
If your colleagues argue that the Yahoo copies of the content bring new people to the site, pull up a Google Analytics report and look at how many people entered your site via Yahoo over the last three months. I can almost guarantee that hardly anyone is clicking those links in the article. Those links, by the way, look pretty manipulative/commercial in terms of anchor text, which could prompt yet another penalty.
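That Analytics check is easy to make concrete. A minimal sketch, assuming you export the referral report to CSV with `source` and `sessions` columns (adjust the column names to match your actual export):

```python
import csv
from io import StringIO

def yahoo_sessions(csv_text):
    """Sum sessions for any referral source containing 'yahoo'."""
    total = 0
    for row in csv.DictReader(StringIO(csv_text)):
        if "yahoo" in row["source"].lower():
            total += int(row["sessions"])
    return total

# Tiny inline sample standing in for a real three-month export.
sample = "source,sessions\ngoogle / organic,1200\nuk.yahoo.com / referral,7\n(direct),300\n"
print(yahoo_sessions(sample))  # prints 7
```

If that number is as small as I suspect, it's the single most persuasive figure you can put in front of the company.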
And in SEO terms, despite the link coming from Yahoo, if no one is linking to or sharing that URL on Yahoo, I can tell you now that the link won't carry much value.
As for your snippet question, it looks like Yahoo is pulling the title and content from the page and generating a fresh meta description from there. Probably a time-saving solution for a website of that size, but certainly not an ideal one. Your meta descriptions look much better.
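For what it's worth, that kind of query-biased snippet can be approximated in a few lines. This is only a guess at the general technique (return the first sentence containing a query term, else fall back to the opening text), not Yahoo's actual code:

```python
import re

def query_snippet(text, query, length=160):
    """Return the first sentence containing a query term, else the opening text."""
    terms = [t.lower() for t in query.split()]
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if any(t in sentence.lower() for t in terms):
            return sentence[:length]
    return text[:length]

article = ("Moneywise helps you save. Compare savings accounts with our tables. "
           "Read more tips inside.")
print(query_snippet(article, "savings accounts"))  # prints the second sentence
```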
Hope this helps.
Related Questions
-
Penalty because of logo banner in sidebar of external website?
Dear mozzers, One of our pages is not ranking (well). I wrote another question about this here on Moz. I just discovered that there is an external website with a banner linking to our page in its sidebar. The banner appears on all 134+ pages of that site. It is in a banner slider and only shows for a few seconds every now and then. The link is not "nofollow". It seems that our page dropped in Google slightly after this banner was added; however, I am not completely sure about this. The link is in the banner carousel/slider in the sidebar here: http://www.wierszowisko.com/ My questions are: Could this banner link cause a penalty for our page? If so, what can we do to undo it? Ask the webmaster to remove the link? Disavow it in Google? How exactly does this work?
White Hat / Black Hat SEO | rdudkiewicz
-
Competitor ranking well with duplicate content—what are my options?
A competitor is ranking #1 and #3 for a search term (see attached) by publishing two separate sites with the same content. They've modified the page title and serve the content in a different design, but they are using their branded domain and a keyword-rich domain to gain multiple rankings. This has been going on for years, and I've always told myself that Google would eventually catch it with an algorithm update, but that doesn't seem to be happening. Does anyone know of other options? It doesn't seem like this falls under any of the categories that Google lists on their web spam report page—is there any other way to bring this up with the powers that be, or is it something I just have to live with and hope that Google figures out some day? Any advice would help. Thanks! how_to_become_a_home_inspector_-_Google_Search_2015-01-15_18-45-06.jpg
White Hat / Black Hat SEO | inxilpro
-
How to re-rank an established website with new content
I can't help but feel this is a somewhat untapped resource with a distinct lack of information.
White Hat / Black Hat SEO | ChimplyWebGroup
There is a massive amount of information around on how to rank a new website, or on techniques to increase SEO effectiveness, but ranking a whole new set of pages, or indeed 're-building' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources. To start, I'll describe my situation: SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic has dropped steadily since then, though not too aggressively. Then, to coincide with the newest Panda update (Panda 27, according to Moz) in late September this year, we decided it was time to re-assess our tactics to keep in line with Google's guidelines. We've slowly built a natural link profile over the last two years, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters to remove links (unfortunately there was some 'paid' link-building before I arrived)
- 'Disavowed' the rest of the unnatural links that we couldn't get removed manually
- Worked on page speed as per Google's guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest)
- Redesigned the entire site with speed, simplicity and accessibility in mind
- Used .htaccess to remove file extensions from 'fancy' URLs and simplify the link structure
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; thinking of 410'ing?
- Added new content and pages that follow Google's guidelines as far as I can tell, e.g. main category pages and sub-category pages
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered a reward system internally for this, so we've seen a fairly good turnout.
- Addressed many other possible ranking factors: added Schema data, optimised for mobile devices as best we can, added a blog and began publishing original content, expanded our social media reach, added custom 404 pages, removed duplicate content, utilised Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to stay within Google's guidelines. Unfortunately, some of the link-wheel pages mentioned above were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the changes (.htaccess) to the link structure and the creation of brand-new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed those pages that have been 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. Therefore, with the loss of the 'spammy' pages and the creation of brand new 'content-driven' pages, we've probably lost up to 75% of the old website, including those that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changes of URLs and the rest discussed above, it's likely the site looks very new and probably very updated in a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
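A redirect like the one described above can be sketched in .htaccess with mod_alias (illustrative only; the exact rule depends on how the profiles.php path info is actually routed on the server):

```apache
# 301 the old nested category URL to the new flattened one
Redirect 301 /profiles.php/bands-musicians/wedding-bands http://www.superted.com/profiles.php/wedding-bands
```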
We're naturally building links through our own customerbase, so they will likely be seen as quality, natural link-building.
Perhaps the sudden occurrence of a large amount of 404's and 'lost' pages are affecting us?
Perhaps we're yet to really be indexed properly, but it has been almost a month since most of the changes are made and we'd often be re-indexed 3 or 4 times a week previous to the changes.
Our events page is the only one without the new design left to update, could this be affecting us? It potentially may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued a manual penalty. Was I perhaps too hasty in following the rules? I'd appreciate some professional opinions, or input from anyone who has been through a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially one with age (the domain is 10+ years established) and relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
-
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, We are considering using separate servers depending on whether a bot or a human lands on our site, to prevent overloading our servers. Just wondering if this is considered cloaking when the content remains exactly the same for both the bot and the human, just served from different servers. And if this isn't considered cloaking, will it affect the way our site is crawled? Or hurt rankings? Thanks
White Hat / Black Hat SEO | Desiree-CP
-
Disavow tool for blocking 4 to 5 sites for Article Republishing
I'm finding some very low-authority sites that recently picked up our articles (written over a year ago) from Ezine and other article sites and pasted them onto their own sites. The number of copied articles is not 1 or 2, but more than 10-12 across these domains. This has also given us anchor-text backlinks from them (as part of each article). I have written to these sites asking them to remove my author profile and articles, but there has been no response from their webmasters. Is disavowing the right approach? There are 4 or 5 such sites in all!
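If disavowing is the route taken, the file Google's disavow tool expects is plain text, one entry per line, with # for comments; a domain: line covers every URL on that host. A minimal sketch with placeholder domains (not the actual sites):

```text
# Scraper sites republishing our articles; removal requests ignored
domain:scraper-example-one.com
domain:scraper-example-two.net
```

That said, with only 4 or 5 low-authority scrapers and no manual penalty in play, many would argue the disavow is unnecessary, as Google is usually good at ignoring such links.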
White Hat / Black Hat SEO | Modi
-
I'm worried my client is asking me to post duplicate content, am I just being paranoid?
Hi SEOMozzers, I'm building a website for a client that provides photo galleries for travel destinations. Right now, the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more text content, so he has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media). My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web, and I'm worried Google could penalize us for posting content that already exists. I know that conventionally there are ways around this - you can tell crawlers that this content shouldn't be crawled - but in my case we are specifically trying to produce crawlable content. Do you think I should advise my client to hire some bloggers to produce the content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins
Will cleaning up old PR articles help SERPs?
For a few years we published articles with anchor-text backlinks on about 10 different article submission sites. Each article was modified to create several similar variations; we have about 50 completely unique articles. This worked really well for our SERPs until the Google Panda and Penguin updates. I am looking for advice on whether I should do a major clean-up of the published articles and, if so, whether I should be deleting them or removing/renaming the anchor-text backlinks. Any advice on what strategy would work best would be appreciated, as I don't want to start deleting backlinks and make things worse. We used to enjoy position 1 but are now at 12-15, so we have lost most of our traffic.
White Hat / Black Hat SEO | devoted2vintage
-
Tricky Decision to make regarding duplicate content (that seems to be working!)
I have a really tricky decision to make concerning one of our clients. Their site to date was developed by someone else. They have a successful eCommerce website, and the strength of their search engine performance lies in their product category pages. In their case, a product category is an audience niche: the buyer's gender and age. In this hypothetical example my client sells lawnmowers:
http://www.example.com/lawnmowers/men/age-34
http://www.example.com/lawnmowers/men/age-33
http://www.example.com/lawnmowers/women/age-25
http://www.example.com/lawnmowers/women/age-3
For all searches pertaining to lawnmowers plus the gender and age of the buyer (of which there are a lot on the 'real' store), these pages come up number one for every combination they have a page for. The issue is the specific product pages, which take the following form:
http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower
The same product, with the same content (save a reference to the gender and age on the page), can also be found at the other gender/age combinations the product is targeted at. For instance:
http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower
http://www.example.com/lawnmowers/women/age-32/fancy-blue-lawnmower
So, duplicate content. As they are currently doing so well, I am agonising over this: I dislike seeing the same content on multiple URLs, and though it wasn't a malicious effort on the previous developer's part, I think it is a little dangerous in terms of SEO. On the other hand, if I change it I'll reduce the website's size and severely reduce the number of pages that are contextually relevant to the gender/age category pages. In short, I don't want to sabotage the performance of the category pages by cutting off all their on-site relevant content. My options as I see them are:
1. Stick with the duplicate content model, but add some unique content to each gender/age page. This will differentiate the product category page content a little.
2. Move products to single, distinct URLs. While this could boost individual product SEO performance, that isn't an objective, and it carries the risks I describe above.
What are your thoughts? Many thanks, Tom
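A third possibility, not listed in the question: keep the duplicate URLs for users but consolidate ranking signals with a rel="canonical" on each duplicate, pointing at a single preferred version. A sketch using the hypothetical URLs above:

```html
<!-- On the women/age-34, men/age-33, etc. copies of the product page -->
<link rel="canonical" href="http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower" />
```

The trade-off is that canonicalised copies generally drop out of the index, which would thin the gender/age categories' on-site relevance - the very concern raised above.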
White Hat / Black Hat SEO | SoundinTheory