Lonely lonely pages
-
On my site I have tons of blog posts that have never been visited. (Falls on the floor in tears.) I know why, of course: in most cases the content is mediocre, and when it was average to good, I didn't market it enough.
My question is: should I just scrub the unvisited pages, or spend the time making these pages better and working on making the content above average?
The competitors ranking above me don't have as many pages, and their rankings come purely (I have researched this to death) from links from high-authority sites they have developed.
-
Hi Kieran,
I feel the pain you, as a business owner, must be going through when you don't see all those efforts paying off.
I am sure you have done your research on best practices, but I would like to highlight the golden rule of online marketing success again: the secret is not volume, it's always quality and consistency.
I completely agree with Peter that you should not scrap the old pages with mediocre content unless they are not unique. I have often seen that mediocre content is the product of article-spinning practices.
There are many ways to take a leap from here, but since I am not fully aware of your situation and limitations, here are some ideas:
1. List all the articles on your website in a spreadsheet and record the following information for each (a scripted sketch of this audit appears after these steps):
a) Topic
b) Target Readers (Existing Customers / New Customers / Both)
c) Target Location of Readers
d) Objective of the article
e) Actions you expect the readers to take
f) Is the article unique and original? Please note this is about pure uniqueness, not quality or creativity.
g) Is the quality of the article poor, fair, or excellent? Judge this on the creativity, knowledge, and engagement potential of the article.
h) Keywords being targeted in this article.
2. Filter the spreadsheet on content quality and uniqueness, keeping only the unique articles of fair or excellent quality.
3. Now starts the more difficult task: marketing these articles all over again. This is going to be time-consuming and will need attention to detail. But hey, honestly, there are no shortcuts to success.
a) Before proceeding any further, verify that every article page you have selected has appropriate on-page meta information: make sure you have a proper title, description, and keywords for each of those pages (the second sketch after these steps shows one way to check this). Also try to spend 5 minutes on each article and see if you can build 3-4 internal links to each one of them.
b) For each article that you have filtered out, do a detailed keyword analysis using Moz and find out the links and ranking factors. I am assuming you know how to do this; if not, please contact me and I will help you through the steps.
c) Now write the following for each of those selected articles: a 200-300 word press release (teaser), a 30-40 word Facebook post (preferably with a picture), a Twitter post for the article, a 50-100 word article preview teaser (inspired by the press release), and a 50-100 word email teaser.
d) Now, for each article, please do the following, in no particular order (as per your convenience):
i) Submit the press release to sites like PRWeb.com if your budget allows, or choose SBWire.com or similar sites.
ii) Share the Facebook and Twitter posts.
iii) Using your Moz analysis for the targeted keywords, identify blogs and forums that are relevant to your article and submit the article teaser you created in the previous step to those sites.
iv) Send emails to all your existing customers for whom that article is relevant, and if possible invest in some email marketing too. Please do not buy ready-made lists; building your own through sources like Hoovers or LinkedIn is advisable.
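To make steps 1 and 2 concrete, here is a minimal sketch of how the audit and filter could be scripted in Python with pandas. The column names and the audit.csv file name are assumptions for illustration; a plain spreadsheet filter works just as well.

    import pandas as pd

    # Load the content audit built in step 1: one row per article, with the
    # columns from the checklist above (topic, uniqueness, quality, etc.).
    audit = pd.read_csv("audit.csv")  # hypothetical file name

    # Step 2: keep only articles that are unique and of fair or excellent quality.
    keepers = audit[
        audit["is_unique"] & audit["quality"].isin(["fair", "excellent"])
    ]

    print(f"{len(keepers)} of {len(audit)} articles are worth re-marketing")
    keepers.to_csv("articles_to_remarket.csv", index=False)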
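For the verification in step 3(a), a small script can flag selected pages with missing or weak titles and meta descriptions before you spend time promoting them. This is a rough sketch, assuming the filtered article URLs sit in a hypothetical url column of the file written above; the length thresholds are common rules of thumb, not hard limits.

    import requests
    import pandas as pd
    from bs4 import BeautifulSoup

    urls = pd.read_csv("articles_to_remarket.csv")["url"]  # assumes a 'url' column

    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        # Pull the <title> and <meta name="description"> from each page.
        title = soup.title.get_text(strip=True) if soup.title else ""
        tag = soup.find("meta", attrs={"name": "description"})
        desc = tag.get("content", "").strip() if tag else ""

        # Flag anything missing or outside typical length guidelines.
        if not 10 <= len(title) <= 65:
            print(f"{url}: check title ({len(title)} chars)")
        if not 50 <= len(desc) <= 160:
            print(f"{url}: check meta description ({len(desc)} chars)")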
I hope my suggestions help you drive higher volumes of relevant traffic to your website. Please feel free to send me any questions you may have.
Thanks & Regards,
Prateek Chandra
-
Hi Kieran,
I definitely think you shouldn't scrap the content, as I am sure a lot of work has been put into those posts over time.
If it hasn't worked, first look at how you can improve it in light of the knowledge you now have (including by doing some fresh keyword research), then get to work on it.
There may be some content that is beyond redemption; in those cases, retire the articles. But with some reworking of your existing blog content, combined with remarketing it through social channels, you should begin to see some return on your investment.
I hope that helps,
Peter
Related Questions
-
At what point to stop comments on a blog? Do too many comments hurt the page?
I have a page that's ranking pretty well and driving sales. That page is starting to get 10+ comments per day and is getting quite long. I was wondering if there is a point where I should disable the comments. My gut tells me that people interacting with the page, and Google seeing our responses to users, SHOULD be a good thing, not a bad one. But then I think that the majority of the content of the page is no longer the article, but the comments. All the comments are good, non-spammy, and directly related to the topic: people just asking questions, etc. Good engagement, I should be happy, right?
-
How to write a good case study page?
Hello all, What components go into making a good case study, please? Is there a specific structure that works well from an SEO perspective, or is it just about making something read well? Thanks, William
-
Does every keyword need its own landing page?
So we're doing a bunch of keyword research. We've identified the big-traffic, higher-competition keywords, and we've identified tons (thousands) of long-tail keywords that would be appropriate. What I'm wondering is: does every keyword need its own landing page (or content page)? Obviously, we'll be building content for all the primary keywords we're targeting; I'm less mystified about that. What I'm more confused about is what to do with the long-tail keywords. For there to be any measurable traffic increase, we need to rank well for thousands of long-tail keywords. But it's just not realistic to create thousands of quality content pieces targeting each of these long-tail keywords individually. So how do you go about ranking for large numbers of long-tail keywords? I saw somebody post about using an FAQ page to target multiple long-tail keywords, which makes sense, but even with that I'm not going to have a thousand questions. How does one go after large volumes of long-tail keywords? Thanks, --eric
-
One Page Website Blog Content Question
Hi guys, I'm new to the art of SEO and am learning every day from all the fantastic content here. I have a question that I can't find an answer to; hope it doesn't stump you like it has me... I have a one-page website (www.neilwilliamsvoiceover.com) that I need to put more content on for SEO purposes, but it needs to be kept as one page. I've set up a blog via Blogger and have it on the website, but it's in an iframe, which I've now discovered is ignored by search engines. So, my question is: is there a way to pull my blog feed into the website and have it recognised by search engines as content of the website? Would I use an RSS feed, or FeedBurner, or something else completely?! Thanks for your time and help in advance.
-
Similar product pages
I have 27,000 products on my website, each shown on its own webpage. Google indexes almost all of them (+- 25,000), but the SEOmoz report flags them as duplicate content. Indeed, most of each page is identical; only the description and price of the product change, which is no more than 2% of the total content of the page. At the bottom of each product page, the alternatives to that product are shown, mainly other colors. So, within the same family of products, which can have 50 products, the site creates 50 webpages showing the product and its family. That's why nearly everything on the page is identical within a family of products. My guess is that, since Google has indexed them all, I should not worry about duplicate content. Is my guess correct? Thanks for a quick answer. Rik
-
On-page content and PDF - dup?
Hi, We are writing a useful article which we want to put on our site, but we also want to add it as a PDF which people can download - will this be classed as duplicate content?
-
My website has two sections with overlapping, or redundant, articles on the same topics. Google is only listing one or the other article in search results. What should I do to have both pages (similar but unique content) listed?
Example: http://www.womenshealthcaretopics.com/pregnancy_week_12.htm and http://www.womenshealthcaretopics.com/pregnancy_12_weeks.html
-
Please help me stop Google indexing HTTPS pages on my WordPress site
I added SSL to my WordPress blog because that was the only way to get a dedicated IP address for my site at my host. Now I am noticing that Google has started indexing posts as both http and https. Can someone please help me force Google not to index the https versions, as I am sure it's like having duplicate content? All help is appreciated. So far I have added this to the top of my htaccess file:

    RewriteEngine on
    Options +FollowSymlinks
    RewriteCond %{SERVER_PORT} ^443$
    RewriteRule ^robots.txt$ robots_ssl.txt

And added robots_ssl.txt with the following:

    User-agent: Googlebot
    Disallow: /

    User-agent: *
    Disallow: /

But https pages are still being indexed. Please help.