How Long Does It Take Content Strategy to Improve SEO?
-
After 6 months of effort with an SEO provider, the results of our campaign have been minimal. We are reevaluating our efforts to cut costs and improve ROI. Our site is for a commercial real estate brokerage in New York City.
Which of these options would have the best shot of producing results in the near term:
-Create a keyword matrix and optimize pages for specific terms. Maybe optimize 50 pages.
-Add content to "thin" pages. Rewrite 150-250 listing and building pages.
-Audit user interface and adjust the design of forms and pages to improve conversions.
-Link building campaign to improve the link profile of a site with few links (most of them low quality). I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh". In fact, I have been told the best bet is to improve the user interface since it is becoming increasingly difficult to improve rankings.
Any thoughts? Thanks, Alan
-
What questions do clients and potential clients have about offices? If they called your company, would you be able to tell them more than 400 words over the phone? Try putting some of that information on your page.
-
Hi Alan,
Can't say your SEO company is completely wrong, as most SEOs normally start by making sure the on-site/technical SEO is set up properly. Otherwise, any effort you put into creating off-site attention and trust is wasted.
-
Hi Jane:
Great observation regarding listing pages on real estate sites!! It is very difficult for me to add unique content for hundreds of listings. There are only so many ways to describe an office or loft space. Writing more than 150-200 words per listing is difficult since the product is generic. So what is the best way around this? My SEO provider suggested I "no-index" these pages. But I am concerned about no-indexing 300 of my product pages; Google might view 300 no-indexed pages on a 650-page site as suspect, as if I were trying to hide something.
My SEO provider believes that we were hit by Penguin 1.0 in April 2012. Traffic dropped about 60% at that point. It partially recovered in October, when there was a Google update. It has dropped again in the last two months, with the drop accelerating after an upgrade of the site launched in June.
I think my SEO firm is discouraging further SEO because I am maxed out financially after spending more than $25,000 on SEO, coding, and design in the last 8 months with nothing to show for it. They may think I hold them accountable (I do; I expect some results eventually), so they are not encouraging me to move forward. I find it very discouraging that after such a major effort there are no results. Maybe results can be achieved, but I would need to budget $50,000-$100,000 to get some momentum. That is out of the question, unfortunately.
-
Hi Martijn:
Thanks for the response!! A few questions:
-What do I do about real estate listing pages? It is difficult to create a 400-word description for an office. I guess I could beef up the content to maybe 150 words, but the language will be similar.
-We had many toxic links removed in the last 60 days, so maybe that is creating the drop. Which would be a faster way to create quality links: posting high-quality content on our blog or soliciting links? I am very surprised my SEO company did not choose to go down this road initially.
Thanks for your assistance. Alan
-
One of my first replies to you suggested getting a list of all of those pages and going through them one by one to find out where they are coming from.
Are they pages made by you or company staff? Were they spawned as tags, pagination, or something else by a content management system? Were they made by a hacker posting Ugg boots on your site?
If you want to know where these pages are coming from, you have to spend the time and do some manual work.
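If it helps, that manual work usually starts with a diff: which indexed URLs does your sitemap not account for? A minimal sketch in Python (all URLs below are invented placeholders, not the actual site's pages):

```python
# Hypothetical sketch: diff the pages Google has indexed against your
# sitemap to see which URLs are unaccounted for. Both lists here are
# invented sample data; in practice you'd load a crawl/index export
# and the URLs from your sitemap.xml.

def unexpected_pages(indexed_urls, sitemap_urls):
    """Return indexed URLs that the sitemap does not account for."""
    sitemap = set(sitemap_urls)
    return sorted(u for u in indexed_urls if u not in sitemap)

indexed = [
    "http://example.com/listings/loft-123",
    "http://example.com/tag/office-space",        # spawned by the CMS?
    "http://example.com/listings/search?page=7",  # pagination leak?
]
sitemap = ["http://example.com/listings/loft-123"]

for url in unexpected_pages(indexed, sitemap):
    print(url)
```

Each URL this prints is one you then inspect by hand: CMS tag page, pagination, hacked content, or a legitimate page that simply never made it into the sitemap.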
-
Hi EGOL:
There is certainly a verifiable, correct answer; the question is how to obtain it. I tend to agree with you that some technical issue is going on. My SEO company may have their own agenda, or may have overlooked this. The jump in the Google index from 675 pages to 851 in early June is really suspicious. The sitemap only contains 635 pages, so it seems technical. The question is what do I attempt to fix first, and at what cost.
In any case, thanks again!!! Alan
-
with all due respect, my SEO provider (MOZ approved) disagrees with you.
That's OK. Lots of SEO providers disagree with me.
You can believe whatever answer you like the best, or whichever one you think is right.
-
Hi Egol:
I understand. But I am trying to prioritize what needs to be done first. I have spent about $40,000 in the last 8 months on a series of SEO audits, coding, and wireframes without any improvement, implementing all the suggestions made in the SEO audits. I don't want to continue to waste resources without generating results.
Regarding technical issues, with all due respect, my SEO provider (Moz approved) disagrees with you. An excerpt from their message to me yesterday is below. So based on their suggestions it may make sense to focus on content, but I am not sure, which is why I have posted to this forum. Sorry if my questions overlap at times, but this is so technical that it is better to be safe than sorry. I see that you respond often to these posts and I really appreciate the time you take to respond in such a thorough manner. -
XXXX is not overly concerned that technical/indexation issues are at the heart of the SEO issues. We are seeing conflicting indexation numbers from Google (see screenshot) showing 442 results with an "omitted results" message. Additionally, GWT reports conflicting index numbers depending on which report you use: some at 843, others in the 539 range. Also, keep in mind that there has yet to be a Penguin update, and therefore Google may not be accounting for some of the positive things that were done in past months.
-
Your developers appear to be handling the no-index issues on listing pages correctly, but we did find the following indexed pages:
-
Subdomain listings.nyc-officespace-leader.com has 37 pages indexed.
-
There are other irrelevant pages indexed on the www subdomain (20-30+), such as:
-
The /listings/search? pages are out of Google's index (this is a good thing).
-
Per our recommendations from our last meeting, you should probably noindex,follow the /listings/ pages that have 'stock' (duplicate) content, do not drive much traffic, and are low quality. We outlined this strategy in the deck we provided at our last meeting.
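As a rough illustration of how such a rule could be applied in practice (the thresholds, URLs, and traffic numbers below are all invented examples, not figures from the actual site):

```python
# Hypothetical sketch of picking noindex,follow candidates among
# /listings/ pages, assuming you can export per-page word counts and
# monthly organic visits (e.g. from a crawler plus analytics).
# Both thresholds are illustrative, not recommendations.

THIN_WORDS = 150    # listings below this likely carry only "stock" copy
LOW_TRAFFIC = 5     # organic visits per month

def noindex_candidates(pages):
    """pages: list of (url, word_count, organic_visits_per_month)."""
    return [url for url, words, visits in pages
            if url.startswith("/listings/")
            and words < THIN_WORDS
            and visits < LOW_TRAFFIC]

sample = [
    ("/listings/loft-5th-ave", 90, 1),      # thin, no traffic -> flag
    ("/listings/midtown-office", 420, 40),  # unique copy, keep indexed
    ("/buildings/350-fifth", 60, 2),        # not a listing, out of scope
]
print(noindex_candidates(sample))  # -> ['/listings/loft-5th-ave']
```

Pages flagged this way would then get a `noindex,follow` robots meta tag, so link equity still flows through them while the thin copy stays out of the index.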
-
Most of your building pages have 80%+ bounce rates. This is obviously a content issue (e.g., no listings for people to view/click on the building page), and it is also an SEO issue (Google does take into account bounce rates from your pages back to the SERPs).
-
Traffic dropped by about 10-15% on 5/19 when Panda 4 came out (see attached screenshot showing drop in Google/Organic traffic on May 20 -- Panda update was May 19). The loss in traffic is probably from lower rankings on some extremely high converting keywords, which would explain the general drop in leads/inquiries/etc. Bottom line, you need to improve content. However, as we've said in the past, there is no guarantee with just a content marketing based approach and it could take months to recover. Thus, content improvement via conversion optimization is the best approach in order to generate more immediate business and still provide longer term organic traffic growth. We could supplement traffic with PPC to generate more immediate traffic, but this is not worth doing until the website is improved since it would likely be wasted money.
-
I am only going to say that you should read the feedback given to some of your previous questions before you go much farther. Your site has technical issues and content issues. It could have Penguin issues. I don't know.
If a site is performing at 50% efficiency then you are only going to get 50% benefit out of any content, SEO, marketing, etc. that you put into it.
The smart spend is to fix the issues or start over.
-
Hi there,
I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh".
This is only true if you are currently under a Penguin-related penalty. That penalty won't be lifted until the algorithm update is rolled out, but if you are not currently under this type of penalty, link development will help your site's authority and, most likely, rankings. Even back in 2007, you'd need to wait a few months to see improvements from any type of SEO work; results usually come in a little quicker nowadays. That said, when I worked for an agency, we used three months as the minimum time frame to judge progress and make adjustments. Still, you'd see changes sooner than that in most cases.
If you do not have a good idea of your target keywords and which pages should be ranking for those keywords, I would put that in place, i.e. the matrix. I would have really thought the SEO agency would have done this early on, to be honest, even just for reporting purposes.
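For what it's worth, the matrix doesn't need to be fancy. Even a simple page-to-keyword mapping like this (all URLs and terms below are invented examples) keeps each page focused on one primary term and makes reporting trivial:

```python
# Illustrative keyword matrix: one primary keyword per page plus a
# couple of close variants. All URLs and terms are made-up examples.
keyword_matrix = {
    "/office-space/midtown": {
        "primary": "midtown office space for lease",
        "secondary": ["midtown manhattan office rental"],
    },
    "/loft-space/soho": {
        "primary": "soho loft space for rent",
        "secondary": ["commercial loft soho nyc"],
    },
}

# Quick sanity check: no two pages should target the same primary
# keyword, or they will compete with each other in the SERPs.
primaries = [row["primary"] for row in keyword_matrix.values()]
assert len(primaries) == len(set(primaries)), "duplicate primary keyword"
print(f"{len(keyword_matrix)} pages mapped, no primary-keyword overlap")
```

In practice this lives in a spreadsheet, but the principle is the same: every target keyword is assigned to exactly one page before any on-page work starts.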
If you think you have thin pages, this can be an issue with Google's Panda algorithm, whose purpose is in part to devalue websites with too much thin, "useless" content. Unfortunately, it hit real estate websites particularly hard in some cases due to the nature of the industry. If you have a feed of listings, the content of that feed doesn't differ much property-to-property, and is duplicated extensively if those properties are also shared via a feed with aggregators or partner agents. As such, fleshing out these pages and sections can be very useful, so I would treat this as an important piece of on-page work, after you have identified your primary keywords and developed a content plan to optimise for them.
Make sure you're really creating good content for those primary keywords, however - not "doorway pages" or content with no purpose besides attracting search bots and clicks.
The user interface testing for conversions is absolutely essential but is not an SEO item per se. Conversion rate optimisation can increase sales by hundreds of percentage points very quickly and if you have good internal or external resource to do it, do it now! That said, if SEOs are telling you to do this because "it's too hard to improve rankings" and not because it's an important part of their overall service, I would take this as somewhat of a red flag. It's not too difficult or impossible to improve rankings unless you are penalised or can't afford to invest in SEO (meaning you'd be unlikely to afford good CRO too). I am a big fan of using good SEO and CRO together - the results in terms of improved revenue can be quite astounding.
Cheers,
Jane
-
Hi Alan,
Really hard to say, but I would look into adding content to the 'thin' pages first. Having 150-200 pages with thin content probably indicates there are more pages that need to be either excluded from Google or fixed at a later stage.
Besides that, I would focus on auditing the user interface well before you start doing anything else on SEO. Otherwise you could end up losing customers you could have had if your site's conversion rate were up.
Btw, the claim that links will not have any effect until the next Penguin update is just bullshit.
Hope this helps!
-
Rankings can be improved within a short while (2 months isn't a stretch), and good links are still king. There just isn't a better way at the moment for Google to figure out if one page is superior to another without relying on them as a huge signal. You can add all the content you want, but without links your site will stand still, or you might get the occasional long-tail visit from organic search.
I say do the basics before you try anything too difficult. Get your KWs figured out and apply them in the appropriate format. Ranking a page for more than two keywords and their slight variants has proven super difficult, and I'd almost go as far as saying that going beyond two per page is unrealistic. Once you know your KWs, apply on-site SEO best practices; use Moz's on-page grader if you have questions about best practices. Also don't forget about internal linking as you're optimizing pages.