What Is the Best Way to Scale RCS Content?
-
SEO has moved away from nitty-gritty analysis of backlinking factors, link wheels, and the like, and has shifted to a more holistic marketing approach. That approach is best described around Moz as "Real Company S#it" (RCS). RCS is a great way to think about what we really do, because it is so much more than just SEO or just social media.
However, our clients and business owners do want to see results, and they want them quantified in some way. The way most of our clients understand SEO is that by ranking high for specific terms or in specific online avenues, they have a better chance of generating traffic, sales, and revenue. They understand this through the lens of traditional marketing, where you pay for a TV ad and then measure how much revenue that ad generated.
In light of RCS and the need to target a large number of keywords for a given client, how do most pros handle this situation, where you have a large number of keywords to target but want to stay true to RCS?
Many I've asked tend to use the traditional approach of creating a single content piece geared toward a given target keyword. However, that approach can get daunting if you have, say, 25 keywords that a small business wants to target. In this case, isn't it really a matter of scaling down the client's expectations?
What if the client wants all of the keywords and has the budget? Do you just ramp up your RCS content creation efforts? It seems that you can overdo it and quickly run out of RCS content to produce.
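One way I've been thinking about keeping a list like that manageable is to cluster related keywords so a single RCS piece can cover several of them at once, rather than producing one piece per keyword. As a rough sketch only (the keyword list below is made up, and grouping by shared head term is a crude stand-in for real topic research):

```python
from collections import defaultdict

def cluster_keywords(keywords):
    """Group keywords by their shared head term so each group
    can be covered by a single content piece."""
    clusters = defaultdict(list)
    for kw in keywords:
        head = kw.split()[0]  # crude heuristic: first word as the topic head
        clusters[head].append(kw)
    return dict(clusters)

# Hypothetical keyword list for a small plumbing business
keywords = [
    "plumber denver", "plumber near me", "emergency plumber",
    "water heater repair", "water heater installation",
    "drain cleaning", "drain cleaning cost",
]

for topic, group in cluster_keywords(keywords).items():
    print(f"{topic}: {len(group)} keyword(s) -> one content piece")
```

Even with this naive grouping, 25 target keywords might collapse into a handful of content pieces, which changes the scaling question considerably.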
-
I agree with you: from an SEO standpoint it quickly seems like overkill. From a content marketing perspective, though, the amount of content you could produce seems endless. In my own research I found a very helpful blog post with some ideas: http://moz.com/blog/how-to-build-a-content-marketing-strategy
I think what I'm lacking is a more directed approach to the entire strategy.
-
**However, that approach can get daunting if you have, say, 25 keywords that a small business wants to target.**
Uhh... this sounds like a small project.
**What if the client wants all of the keywords and has the budget? Do you just ramp up your RCS content creation efforts?**
Some people have been writing RCS every day for the past several years. They are usually the ones who are doing well in the SERPs.
**It seems that you can overdo it and quickly run out of RCS content to produce.**
If you really know your RCS, the sky can be the limit in most niches. What looks like overkill and depletion to an SEO might be just the tip of a large iceberg to the content-area expert.