Be brutally honest - What do you think of this old content?
-
My website reports on news relating to certain web hosting providers. Having been hit a little by the latest Phantom update, I am looking back rather more harshly at some of my content, with the older articles especially needing a lot of work (I know some areas of the site need work), and there are some articles I am just not sure about.
I have listed a few OLD articles below
https://www.besthostnews.com/the-power-of-shared-web-hosting-by-bluehost/
https://www.besthostnews.com/siteground-sponsors-wordcamp-london/
https://www.besthostnews.com/free-ebook-guide-to-starting-a-website-on-a-budget-a-small-orange/
These are three old articles that I previously updated several months ago, but are they good enough? Due to the nature of the site, I will sometimes report on certain news or features that a web host releases, and sometimes there is very little to write about. These are probably a good example of where I struggle to write enough, or where the quality is perhaps lower than it would be for something with more information to report on.
I would appreciate some harsh, critical viewpoints, and perhaps suggestions on how to improve my writing style for these kinds of generic posts.
-
I think you are right here... and yes, I think a change of strategy is required in light of the way Google is heading with this latest update and how other similar sites are suffering.
I am starting to formulate a plan, and your response kind of reinforced it.
-
The articles look fine to me.
I don't know much about the hosting industry or how many people are searching for this type of news.
I have a site that has been covering the news of an industry for about 10 years. When I started covering the news, I wrote articles like the ones you are writing. They were a lot of work, and the content was not evergreen.
After doing that for a year or so I decided that my time would be better spent by: 1) writing more evergreen articles covering basic information about the industry, and, 2) making brief posts (two sentences and an image) of news items and linking to the original source. That used my time for more profitable purposes (evergreen articles can pull traffic for years) and still satisfied my visitors' interest in news. In fact, the number of people subscribing to the news blog rose rapidly because I linked to six to ten stories per day instead of writing one or two. Lots of people began arriving at my site and clicking the news link. The number of people subscribing to my email and RSS feed quickly rose into five digit numbers.
A couple of years ago the site slipped in the rankings in one of the Panda updates, even though I was purging old news posts once or twice per year. I assume that Google didn't like lots of two-sentence blog posts, so I noindexed them and got out of Panda within a few weeks.
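For illustration only (the post doesn't say how the noindexing was applied): a minimal sketch, assuming a Python/Flask-served blog where the thin news posts live under a hypothetical /news/ path, could send an X-Robots-Tag header on those URLs:

# Hedged sketch, not the poster's actual setup: a Flask app with a hypothetical /news/ prefix.
from flask import Flask, request

app = Flask(__name__)

@app.route("/news/<slug>")
def news_post(slug):
    # Placeholder for a thin, two-sentence news post.
    return f"News item: {slug}"

@app.after_request
def noindex_thin_news(response):
    # Ask search engines not to index news posts, while still following their links.
    if request.path.startswith("/news/"):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response

A meta robots tag with the same "noindex, follow" value in the news post template would achieve the same effect for HTML pages.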
More recently, I decided to make one post per day that would simply contain a list of about eight links to news items on other sites. This saved more of my time and reduced the number of skimpy pages on the site. Lots of people still arrive at the homepage and click into the news. People are still subscribing and writing to me with thanks for offering this news as a free service.
I don't know if your audience wants this. I don't know how many competitors you have offering similar news for this industry. I don't know if you are getting lots of traffic from search into your news posts. Those are all items that I would consider. But, I would weigh the opportunity of writing evergreen content instead of ephemeral.
-
Nope, they had a quick spike but if you see my link, they plummeted shortly after. Basically they spoke too soon.
-
No Jonathan. SERoundTable is all safe. Check this post:
https://www.seroundtable.com/google-panda-weekend-20247.html
-
Yeah, I follow SERoundTable, but they also got hammered by this update... probably much more so than I did:
http://suite.searchmetrics.com/en/research/domains/organic?acc=154293&url=seroundtable.com&cc=US
-
Hey Jonathan,
I don't see any kind of problem except the length of the content, which is not a huge issue for me. If you follow Barry Schwartz's SERoundTable blog, he also posts news and release announcements, which are generally low in word count.
Are you sure this "Phantom" update caused the traffic drop? As far as I know, this update mainly targets "how-to" style low-quality posts.
-
Related Questions
-
Duplicate Content
Hi All, I am doing work for a rug company that acts as a third party. They have close to 4,000+ products. Each rug belongs to a collection. The collection has one main description that is the same throughout every rug in the collection. Ex. One Collection has 15 rugs, all with the same description. Should I take the time and change every single description? I think the answer is yes but I wanted another opinion. Thanks
-
Site Content Review Please!
I'm looking for someone who can review my site and let me know about the quality of the content on it. Can anyone suggest who I can talk to about this? Nick
-
Stolen Content and a Panda Penalty
Hey folks,
Question for those of you who have spent some time helping people with the recent penalties and the like. I have a client who has a clear Panda penalty: a huge drop in traffic on the initial Panda date and a further drop on the second date, with much smaller incremental drops on subsequent updates as well. From digging in, it seems fairly cut and dried: Copyscape shows another 250 or so sites with content from this site, and there are nearly 2,000 external URLs with duplicate content across those sites. We are talking complete, shameless copies of all of the text, sometimes the images as well. The client claims the content is all 100% unique, that it is his content, and that the other blogs must have stolen it, resulting in the penalty. Which, if true (and I have no reason to suspect otherwise), kind of sucks.
Now, many moons ago, way before Penguin or Panda (maybe around 2006), I had a client that had suddenly lost all traffic and their historical rankings. No funny business: it was a small company that had been online since around 2000, pretty much the first of their kind, and they always did very well from organic search. As it turned out, the content on the site had not really changed since it was set up, and as lots of companies had sprung up offering a similar service, their content had been copied wholesale across many sites all over the world. We attempted to contact many of these sites and got some results, but many were just old, abandoned copycat sites on advert-supported hosting that had ceased to trade, so we maybe got rid of about 20%. In the end we just decided to rewrite the content; sure enough, the site bounced back to its previous standing and has been pretty much there ever since.
That was kind of easy, though: the site had maybe 20 pages and needed a sprucing up. In this case the site has around 500 pages, so a rewrite is not going to be so easy. The problem is, I don't see removal requests being particularly successful either. So, I see the options and steps as being:
1) Contact all the sites and request the removal of the content
2) Use the Google content removal facility: https://www.google.com/webmasters/tools/removals
3) File a DMCA takedown for anything remaining
4) Report scraped pages to Google: https://docs.google.com/spreadsheet/viewform?formkey=dGM4TXhIOFd3c1hZR2NHUDN1NmllU0E6MQ&ndplr=1
5) Submit a spam report for all sites involved?
6) Submit a reconsideration request to let Google know what we have been doing (unlikely)
In a nutshell: do everything we can to get this content removed and then document it to Google in the hope we catch hold of someone who hears our plight. Interestingly enough, this is a sensitive one, so no URL, but I would welcome any thoughts or experiences any of you may have had with similar problems. There is a little extra info here from Matt Cutts and Barry Schwartz that kind of tallies with my approach above, but I would really like to hear any feedback: http://www.seroundtable.com/google-stolen-content-13243.html
Cheers all,
Marcus
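For illustration only: with nearly 2,000 external URLs flagged, a rough Python sketch like the one below could check which of them still serve a distinctive snippet of the copied text, so removal and DMCA requests can be prioritized. The snippet and URL list are placeholders, and it uses the third-party requests library:

# Hedged sketch: SNIPPET and scraper_urls are placeholders; in practice the URL
# list would come from the Copyscape export.
import requests

SNIPPET = "a distinctive sentence copied verbatim from the client's site"
scraper_urls = [
    "http://example-scraper-one.com/copied-page/",
    "http://example-scraper-two.com/copied-page/",
]

still_hosting = []
for url in scraper_urls:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # dead or abandoned site; nothing left to request removal from
    if SNIPPET.lower() in html.lower():
        still_hosting.append(url)

print(f"{len(still_hosting)} of {len(scraper_urls)} URLs still serve the copied text")
for url in still_hosting:
    print(url)

Sites that no longer resolve are simply skipped, since there is nothing left there to request removal from.
-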
Outsource Content Marketing
Hi all, I'm wondering whether anyone knows of any good, reputable Australian companies that specialise in SEO content marketing? Looking for a company that can manage & produce quality content at scale while being cost-effective. If you've had experience dealing with them, please list the pros & cons of your experience. Thanks.
-
Duplicate content
Hi, I keep getting errors for duplicate content and long URLs. When I look at these pages, it's all related to the news pages on my sites. How do I define each new news article?
-
Is it considered duplicate content?
Hello, I see a lot of errors in my Webmaster Tools because of this AJAX code on the question pages of my site (screenshot attached): www.dismoicomment.fr
The code:

// ADD ANSWER FORM
$("#answer-add-button").click(function () {
    $.ajax({
        type: 'POST',
        url: '/answers/quelle-assurance-choisir-pour-un-scooter/',
        data: $("form#answer-add").serialize(),
        dataType: 'html',
        success: function(data) {
            if (data == "answer") {
                $('.answer-add-message').show().empty();
                $(document).ready(function() {
                    // Message text: "You have already answered this question."
                    // (The wrapping markup appears to have been stripped when pasting; a simple element is assumed here.)
                    $('<p>Vous avez déjà répondu à cette question.</p>').appendTo('.answer-add-message');
                });
            }
        }
    });
});

I have added a line to my robots.txt (http://www.dismoicomment.fr/robots.txt) to remove all URLs with /answers/. These URLs with /answers/ aren't indexed in Google. Do you think that this is dangerous and could be considered duplicate content? (Attachment: 1129546035.png)
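For illustration only, a minimal way to confirm which URLs that robots.txt rule actually blocks, using Python's standard urllib.robotparser (the test URLs below are just examples based on the path in the AJAX call above):

# Hedged sketch: fetches the live robots.txt and tests example URLs against it.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.dismoicomment.fr/robots.txt")
rp.read()  # downloads and parses the robots.txt

test_urls = [
    "http://www.dismoicomment.fr/answers/quelle-assurance-choisir-pour-un-scooter/",
    "http://www.dismoicomment.fr/",
]
for url in test_urls:
    allowed = rp.can_fetch("*", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")

-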
Does content have a shelf life for link building efforts?
Do you think that content (that doesn't have a date attached) has a shelf life? Especially content that is effectively timeless such as a quiz? I've noticed in my link building efforts that most links are achieved within the first couple of weeks, and that there seems to be a point of diminishing returns. Why do you think that may be?
-
Displaying archive content articles in a writer's bio page
My site has writers, and each has their own profile page (accessible when you click their name inside an article). We set up the code so that the bios, in addition to the actual writer photo/bio, dynamically generate links to each article he/she produces. We figured that someone reading something by Bob Smith might want to read other stuff by him. Which was fine, initially. Fast forward, and some of these writers have 3, 4, even 15 pages of archives, as the archive system paginates every 10 articles (so www.example.com/bob-smith/archive-page3, etc.). My thinking is that this is a bad thing. The articles are likely already found elsewhere on the site (under the content landing page each was written for, for example), and I visualize spiders getting sucked into these archive black holes, never to return. I also assume that it is just more internal mass linking (yech) and probably doesn't help the overall TOS/bounce/exit metrics, etc. Thoughts?