Recovering an Almost Dead Blog?
-
Hello,
I have wanted to ask this for a long time, but I have finally gathered the energy to post this long question on Moz.
Like almost all newbies with little knowledge of SEO and Google, I started my first blog in 2009. Things were very different back then: the more I posted, the more traffic I got, and I built decent traffic despite poor content (I really didn't care about quality, since the organic traffic kept coming).
That changed completely with Google Panda after 11th April 2011. Since then, traffic has kept falling. I never built backlinks, so the Penguin updates never hit us, but because of poor, thin content the site sank lower and lower.
I took some steps, like increasing the word count of posts and removing some of them, but nothing has worked so far.
The blog has almost 1,200 articles, and, most importantly, it was my first blog, so I am a bit attached to it.
The blog is 6 years old and has received 2 million organic visits to date (organic traffic screenshot attached).
Now my question is: can something seriously be done for this blog, or should I just dispose of it and move on?
I will appreciate some genuine advice on that.
Thanks
-
Yep, it's a big, tedious task, but there are no shortcuts if you want to do it right.
-
I will try to do that; thanks for the tip about keeping the posts private.
With 1,200 posts, though, it's a big task.
Can anyone recall a similar case with a positive result?
-
Honestly, if you're using a CMS like WordPress, all you should need to do is unpublish the posts and let the search engines sort out the rest. If a post returns a 404, it will get dropped from the index naturally. I can't think of any reason why you'd need to do any more work than that.
Also, a tip: I prefer setting the posts I'm removing to "Privately Published" rather than deleting them entirely. I like to keep removed content as a sort of historical archive, and it returns the same 404 message on the front end.
-
Yes, almost 90% of the posts get no traffic. Some are event posts, so they get some traffic during the event and nothing before or after.
What's the best way to remove posts? Delete them and then ask Webmaster Tools to deindex them and remove the cached versions? Or something else?
-
Andy, you are asking questions and I am looking for answers.
I am OK with removing all the useless posts, which means clearing out almost the entire blog. I am also willing to contribute more to this site, but the question is: is it worth it? Will Google really start picking my blog up again?
What if I remove almost 90% of the posts and leave only the 10% with meaningful content?
Also, should I do some link building?
-
My guess is that most of your blog posts aren't getting any traffic or engagement, but there are probably a few that do. I would start with a content audit, looking at the organic traffic, social engagement, and backlinks to each page. You may not have built any links, but that doesn't mean your work hasn't earned them. Keep anything that draws consistent traffic, has been shared more than a few times, or has good-quality backlinks. Let the rest 404. You'll need to make the determination on a case-by-case basis.
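An audit like that can be sketched as a small script. This is only an illustration: the field names and thresholds below are made up, and you'd export the real per-URL numbers from your analytics and backlink tools (and still review borderline pages by hand).

```python
# Illustrative content-audit sketch. The metric names and thresholds
# are assumptions, not a standard; plug in real exported data.

def audit(pages, min_visits=10, min_shares=3, min_links=1):
    """Split pages into keep/remove lists based on simple thresholds."""
    keep, remove = [], []
    for page in pages:
        earned_its_place = (
            page["monthly_visits"] >= min_visits   # consistent traffic
            or page["shares"] >= min_shares        # shared more than a few times
            or page["backlinks"] >= min_links      # earned at least one link
        )
        (keep if earned_its_place else remove).append(page["url"])
    return keep, remove

pages = [
    {"url": "/good-post", "monthly_visits": 120, "shares": 8, "backlinks": 4},
    {"url": "/thin-news-item", "monthly_visits": 0, "shares": 0, "backlinks": 0},
]
keep, remove = audit(pages)
```

Anything in `remove` is a candidate to unpublish and let 404; anything in `keep` stays, but the final call is still case by case.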
-
Well, you could decimate most of the site and fix many issues, but would this be enough to pull it back for you?
Of those that would remain, would you consider them to be more authoritative posts? Would they stand up in the face of Panda without issue?
-Andy
-
Andy, the problem is that most of the articles are short news items, and I don't know what can be done with those. I may just deindex all of them, approximately 1,000 posts (almost 85% of the total).
Only a few posts drive traffic, and I have been posting very little lately: fewer than 10 posts in the last year.
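If I go the deindexing route, my understanding is that it means adding a robots meta tag to the head of each post I want dropped (a sketch; most WordPress SEO plugins can set this per post):

```html
<meta name="robots" content="noindex">
```

One caveat I've read: Google has to be able to crawl a page to see this tag, so those posts must not also be blocked in robots.txt.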
-
As I said before, I have not built backlinks for this site at all; all its links are natural. I was never hit by Penguin. It was Panda all along.
-
Hi Ankit,
All is not lost, but it all depends on the time you have to put in to correcting it.
Have you ever tried to fix the Panda issues? There is a wealth of information available out there - here are a couple of Google ones to read:
Remember that Panda focuses on thin and duplicate content, which translates to low quality, so if you think you have ways to correct this, there is no reason you can't pull the traffic back.
-Andy
-
That's a disaster. I would suggest you check each and every backlink. Try removing the spammy ones or disavowing them. Add new posts and link them to old posts. Keep the blog active!
Related Questions
-
Best way to handle Breadcrumbs for Blog Posts in multiple categories?
The site in question uses WordPress. They have a Resources section that is broken into two categories (A or B). Underneath each of these categories are 5 or 6 subcategories. The structure looks like this:
/p/main-category-a/subcategory/blog-post-name
/p/main-category-b/subcategory/blog-post-name
All posts have a main category, but posts often have multiple subcategories, and some posts fall into both main categories. What would be the easiest or most effective way to auto-populate the breadcrumb based on where the person reached the blog post from? For example, a way to set Home -> Main Category -> Subcategory 1 as the breadcrumb if they reach it from the Subcategory 1 landing page. Or is this not possible, and should we just set the breadcrumb manually based on where we feel it best lives? Thanks.
Technical SEO | Alces
-
Is having the same title tag on a blog listing page and blog date archives an SEO issue?
Hi there, can anyone answer whether having duplicate title tags on the blog listing page (e.g. https://blog.companyname.com/) and the blog date archive pages (e.g. https://blog.companyname.com/archive/2017/10) is an issue? If so, why is it an issue, and what are the best practices for dealing with it? Thanks! John
Technical SEO | SEOCT
-
Moving from www.domain.com/nameofblog to www.domain.com/blog
I have had my blog located at www.legacytravel.com/ramblings for a while. I now believe that, from an SEO perspective, it would be preferable to move it to www.legacytravel.com/blog. I want to avoid losing any links (few though they may be) with the move. I believe I would need to add a 301 redirect in the .htaccess file of www.legacytravel.com that tells anyone who comes knocking on the door of www.legacytravel.com/ramblings/blah blah blah that what they want is now at www.legacytravel.com/blog/blah blah blah. Is that correct? What would the entry look like in the .htaccess? Thank you in advance.
Technical SEO | cathibanks
-
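A minimal sketch of what that entry could look like, assuming Apache with mod_alias enabled (test on a copy of the site first):

```apache
# In the .htaccess at the web root: mod_alias maps the old URL prefix
# to the new one and carries the rest of the path along automatically.
Redirect 301 /ramblings /blog
```

With this in place, a request for /ramblings/anything is answered with a 301 to /blog/anything. If the site already relies on mod_rewrite rules, the equivalent would be `RewriteRule ^ramblings/(.*)$ /blog/$1 [R=301,L]`.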
Internal Wordpress blog ranked and not the main page
Hello, www.mysite.com/blog is ranked higher than www.mysite.com. I am trying to find out why the blog ranks higher, which is not my goal. The blog reached the second page, and the main domain is nowhere to be found. Is there anything in the WordPress setup that may cause this? Thanks.
Technical SEO | ciznerguy
-
Accessing a pool of blog review sites
Good afternoon from "yes, Andy Murray is into the semi-finals" Wetherby, UK 🙂 Is there an easier way to access a pool of blog writers who specialise in reviewing products, rather than the inevitable grunt work of searching through Google? Thanks in advance,
David
Technical SEO | Nightwing
-
Guest Blog Posts and PR
Let's say I do a guest blog post on a PR5 site. Let's also say the post is deep within the site, and the PageRank that flows to it is only maybe PR1 or 2. Let's also say it doesn't get many links, so it forever stays that way. Will Google realize that even though the PR of that page is low, it's on a PR5 site, thus giving it more link power than if the overall site were a PR2 site? Did that make any sense?
Technical SEO | UnderRugSwept
-
Leaving Comments on blogs when HTML is removed
I found the following blog. It is PageRank 5, dofollow: http://www.unssc.org/web1/programmes/rcs/cca_undaf_training_material/teamrcs/forumdetail.asp?ID=32 If you attempt to leave a comment with HTML, the HTML is removed. There is a button which allows you to leave a comment, but if you do, it gets redirected to the domain of the blog, not your site. However, there are still people leaving links with the URL of the intended site, as recently as today. Look at this comment:
Comment posted by : Alex on 09/09/2011 I love to se percorsi on this site very often
How is this done, if anyone knows? The important part being mce_real_href.
Technical SEO | mickey11
-
Very well established blog, new posts now being indexed very late
I have an established blog. We update it on a daily basis. In the past, when I published a new post, it would get indexed within a minute or so. But for the last month or so, it has been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads
User-agent: TechnoratiBot/8.1
Disallow:
# ia_archiver
User-agent: ia_archiver
Disallow: /
# disable duggmirror
User-agent: duggmirror
Disallow: /
# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*
# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*
Sitemap: http://www.domainname.com/sitemap.xml.gz
The site has tons of backlinks. I'm just wondering if something is wrong with the robots file or if it could be something else.
Technical SEO | rookie123
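One thing worth checking in a file like this: patterns such as `Disallow: /*?*`, `/*.js$`, and `/*.css$` are very broad, and blocking CSS and JS can stop Google from rendering pages properly. A quick sketch of how Google-style robots.txt wildcards match, so you can see which URLs a pattern catches (a simplified illustration, not Google's actual matcher; `*` matches any run of characters, and a trailing `$` anchors the pattern to the end of the URL):

```python
import re

def blocked(path, pattern):
    """Return True if `path` matches a Google-style Disallow pattern.

    '*' matches any run of characters; a trailing '$' anchors the
    pattern to the end of the URL. Patterns without wildcards act as
    simple prefixes, as in standard robots.txt matching.
    """
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"  # restore the end-of-URL anchor
    return re.match(regex, path) is not None

# The blog's own stylesheets are caught by the /*.css$ rule:
print(blocked("/wp-content/themes/foo/style.css", "/*.css$"))  # True
# Any URL with a query string is caught by /*?*:
print(blocked("/some-post?replytocom=5", "/*?*"))              # True
```

If rules like these are catching resources or URLs you actually want crawled, trimming them back is a reasonable first experiment before looking elsewhere for the indexing delay.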