How Long Does It Take Content Strategy to Improve SEO?
-
After 6 months of effort with an SEO provider, the results of our campaign have been minimal. We are in the process of reevaluating our efforts to cut costs and improve ROI. Our site is for a commercial real estate brokerage in New York City.
Which of these options would have the best shot of producing results in the near future:
-Create a keyword matrix and optimize pages for specific terms. Maybe optimize 50 pages.
-Add content to "thin" pages. Rewrite 150-250 listing and building pages.
-Audit user interface and adjust the design of forms and pages to improve conversions.
-Link building campaign to improve the link profile of a site with few links (most of them low quality).
I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh". In fact, I have been told the best bet is to improve the user interface, since it is becoming increasingly difficult to improve rankings.
Any thoughts? Thanks, Alan
-
What questions do clients and potential clients have about offices? If they called your company, would you be able to tell them more than 400 words' worth over the phone? Try putting some of that information on your page.
-
Hi Alan,
I can't say your SEO company is completely wrong, as most SEOs normally start by making sure the on-site/technical SEO is set up properly. Otherwise, any effort you put into creating off-site attention and trust is wasted.
-
Hi Jane:
Great observation regarding listing pages on real estate sites!! It is very difficult for me to add unique content for hundreds of listings. There are only so many ways to describe an office or loft space. Writing more than 150-200 words per listing is difficult since the product is generic. So what is the best way around this? My SEO provider suggested I noindex these pages. But I am concerned about noindexing 300 of my product pages; wouldn't Google view 300 noindexed pages on a 650-page site as suspect, as if I were trying to hide something?
My SEO provider believes that we were hit by Penguin 1.0 in April 2012. Traffic dropped about 60% at that point. It partially recovered in October, when there was a Google update. It has dropped in the last two months, with the drop accelerating after an upgrade of the site was launched in June.
I think my SEO firm is discouraging further SEO because I am maxed out financially after spending more than $25,000 on SEO, coding, and design in the last 8 months with nothing to show for it. They may think I hold them accountable (I do; I expect some results eventually), so they are not encouraging me to move forward. I find it very discouraging that after such a major effort there are no results. Maybe results can be achieved, but I would need to budget $50,000-$100,000 to get some momentum. That is out of the question, unfortunately.
-
Hi Martijn:
Thanks for the response!! A few questions:
-What do I do about real estate listing pages? It is difficult to create a 400-word description for an office. I guess I could beef up the content to maybe 150 words, but the language will be similar.
-We had many toxic links removed in the last 60 days, so maybe that is causing the drop. Which would be a faster way to create quality links: posting high-quality content on our blog or soliciting links? I am very surprised my SEO company did not choose to go down this road initially.
Thanks for your assistance. Alan
-
One of my first replies to you was to get a list of all of those pages and go through them one-by-one to find out where they are coming from.
Are they pages that were made by you or company staff, were they spawned as tags or pagination or something else by a content management system, or were they made by a hacker posting Ugg boots on your site?
If you want to know where these pages are coming from, you gotta spend the time and do some manual work.
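If you export the list of indexed URLs (e.g. from a `site:` search or Webmaster Tools) and have your sitemap URLs handy, a few lines of Python can surface the unexpected pages and show which sections of the site they cluster in, which narrows down the manual work. This is just a minimal sketch; the URLs below are hypothetical, and in practice you would load both lists from your exports:

```python
from urllib.parse import urlparse
from collections import Counter

def unexpected_pages(indexed_urls, sitemap_urls):
    """Return indexed URLs missing from the sitemap, plus a count of the
    top-level path section each falls under (tag, pagination, and hacked
    spam pages tend to cluster in a few sections)."""
    extras = sorted(set(indexed_urls) - set(sitemap_urls))
    sections = Counter(
        urlparse(u).path.split("/")[1] or "(root)" for u in extras
    )
    return extras, sections

# Hypothetical example: two indexed URLs are not in the sitemap.
indexed = [
    "http://example.com/listings/100-broadway",
    "http://example.com/tag/office-space",
    "http://example.com/listings/search?page=7",
]
sitemap = ["http://example.com/listings/100-broadway"]

extras, sections = unexpected_pages(indexed, sitemap)
print(extras)    # the pages to investigate one by one
print(sections)  # which site sections they come from
```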
-
Hi EGOL:
There is certainly a verifiable, correct answer. The question is how to obtain it. I tend to agree with you; I think some technical issue is going on. My SEO company may or may not have their own agenda, or may have overlooked this. The jump from 675 to 851 pages in the Google index in early June is really suspicious. The sitemap only contains 635 pages. So it seems technical. The question is what do I attempt to fix first, and at what cost?
In any case, thanks again!!! Alan
-
with all due respect, my SEO provider (MOZ approved) disagrees with you.
That's OK. Lots of SEO providers disagree with me.
You can believe whatever answer you like the best, or whichever one you think is right.
-
Hi Egol:
I understand. But I am trying to prioritize what needs to be done first. I have spent about $40,000 in the last 8 months on a series of SEO audits, coding, and wireframes without any improvement, implementing all the suggestions made in the SEO audits. I don't want to continue to waste resources without generating results.
Regarding technical issues, with all due respect, my SEO provider (Moz approved) disagrees with you. An excerpt from their message to me yesterday is below. Based on their suggestions it may make sense to focus on content, but I am not sure, which is why I have posted to this forum. Sorry if my questions overlap at times, but this is so technical that it is better to be safe than sorry. I see that you respond often to these posts and I really appreciate the time you take to respond in such a thorough manner. -
XXXX is not overly concerned that technical/indexation issues are at the heart of the SEO issues. We are seeing conflicting indexation numbers from Google (see screenshot), showing 442 results with an "omitted results" message from Google. Additionally, GWT reports conflicting index numbers depending on which report you use: some at 843, others in the 539 range. Also, keep in mind that there has yet to be a Penguin update, and therefore Google may not be accounting for some of the positive things that were done in past months.
-
Your developers appear to be handling the noindexing of listings pages correctly, but we did find the following indexed pages:
-
Subdomain listings.nyc-officespace-leader.com has 37 pages indexed.
-
There are other irrelevant pages indexed on the www subdomain (20-30+), such as:
-
The /listings/search? pages are out of Google's index (this is a good thing).
-
Per our recommendations from our last meeting, you should probably noindex,follow the /listings/ pages that have 'stock' (duplicate) content, do not drive much traffic, and are low quality. We outlined this strategy in the deck we provided at our last meeting.
-
Most of your building pages have 80%+ bounce rates. This is obviously a content issue (e.g. no listings for people to view/click on the building page), and it is also an SEO issue (Google does take into account bounce rates from your pages back to the SERPs).
-
Traffic dropped by about 10-15% on 5/19 when Panda 4 came out (see attached screenshot showing the drop in Google/organic traffic on May 20 -- the Panda update was May 19). The loss in traffic is probably from lower rankings on some extremely high-converting keywords, which would explain the general drop in leads/inquiries/etc. Bottom line: you need to improve content. However, as we've said in the past, there is no guarantee with a purely content-marketing-based approach, and it could take months to recover. Thus, content improvement via conversion optimization is the best approach: it generates more immediate business while still providing longer-term organic traffic growth. We could supplement traffic with PPC to generate more immediate traffic, but this is not worth doing until the website is improved, since it would likely be wasted money.
-
I am only going to say that you should read the feedback given to some of your previous questions before you go much farther. Your site has technical issues and content issues. It could have Penguin issues. I don't know.
If a site is performing at 50% efficiency, then you are only going to get 50% of the benefit out of any content, SEO, marketing, etc. that you put into it.
The smart spend is to fix the issues or start over.
-
Hi there,
I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh".
This is only true if you are currently under a Penguin-related penalty. That penalty won't be lifted until the algorithm update is rolled out, but if you are not currently under this type of penalty, link development will help your site's authority and, most likely, rankings. As far back as 2007, you'd need to wait a few months to see improvements from any type of SEO work you did, but results usually come in a little quicker nowadays. That said, when I worked for an agency, we used three months as the minimum time frame to judge progress and make adjustments. Still, you'd see changes sooner than that in most cases.
If you do not have a good idea of your target keywords and which pages should be ranking for those keywords, I would put that in place, i.e. the matrix. I would have really thought the SEO agency would have done this early on, to be honest, even just for reporting purposes.
If you think you have thin pages, this can be an issue with Google's Panda algorithm, whose purpose is in part to devalue websites with too much thin, "useless" content. Unfortunately, it hit real estate websites particularly hard in some cases due to the nature of the industry. If you have a feed of listings, the content of that feed doesn't differ much property-to-property, and is duplicated extensively if those properties are also shared via a feed with aggregators or partner agents. As such, fleshing out these pages and sections can be very useful, so I would make this an important part of the on-page work, after you have identified your primary keywords and developed a content plan to optimise for them.
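One rough way to act on this is to triage listing pages using a crawler export joined with analytics data: thin pages that still get traffic are candidates for fleshing out, while thin pages nobody visits are candidates for noindexing. A minimal Python sketch, with illustrative thresholds and made-up page data:

```python
def triage_thin_pages(pages, thin_threshold=200, min_monthly_visits=10):
    """Split pages into 'rewrite' (thin but getting traffic, worth
    fleshing out) and 'consider_noindex' (thin and effectively
    invisible). `pages` is a list of (url, word_count, monthly_visits)
    tuples, e.g. joined from a crawler export and analytics."""
    rewrite, consider_noindex = [], []
    for url, words, visits in pages:
        if words >= thin_threshold:
            continue  # not thin; leave it alone
        if visits >= min_monthly_visits:
            rewrite.append(url)
        else:
            consider_noindex.append(url)
    return rewrite, consider_noindex

# Hypothetical crawl/analytics data:
pages = [
    ("/listings/100-broadway", 120, 45),    # thin but visited: rewrite
    ("/listings/old-loft", 90, 0),          # thin, no traffic: noindex?
    ("/buildings/empire-state", 600, 300),  # substantial: leave as-is
]
rewrite, noindex = triage_thin_pages(pages)
```

The thresholds (200 words, 10 visits/month) are assumptions for the sketch; you would pick numbers that fit your own traffic profile.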
Make sure you're really creating good content for those primary keywords, however - not "doorway pages" or content with no purpose besides attracting search bots and clicks.
The user interface testing for conversions is absolutely essential, but it is not an SEO item per se. Conversion rate optimisation can increase sales by hundreds of percentage points very quickly, and if you have good internal or external resources to do it, do it now! That said, if SEOs are telling you to do this because "it's too hard to improve rankings" and not because it's an important part of their overall service, I would take that as somewhat of a red flag. It's not too difficult or impossible to improve rankings unless you are penalised or can't afford to invest in SEO (in which case you'd be unlikely to afford good CRO either). I am a big fan of using good SEO and CRO together - the results in terms of improved revenue can be quite astounding.
Cheers,
Jane
-
Hi Alan,
Really hard to say, but I would look into two things. First, adding content to the 'thin' pages: having 150-200 pages with thin content probably indicates there are more pages that need to be either excluded from Google or fixed at a later stage.
Besides that, I would focus on auditing the user interface well before you start doing anything else on SEO. Otherwise you could end up losing customers you could have had if your site's conversion rate were up.
By the way, the claim that links will not have any effect until the next Penguin update is just bullshit.
Hope this helps!
-
Rankings can be improved within a short while (2 months isn't a stretch), and good links are still king. There just isn't a better way at the moment for Google to figure out whether one page is superior to another without relying on them as a huge signal. You can add all the content you want, but without links your site will stand still, or you might get the occasional long-tail organic visit.
I say do the basics before you try anything too difficult. Get your keywords figured out and apply them in the appropriate format. Ranking a page for more than two keywords and their slight variants has proven super difficult, and I'd almost go as far as saying that going beyond two per page is unrealistic. Once you know your keywords, apply on-site SEO best practices; use Moz's on-page grader if you have questions about best practices. Also, don't forget about internal linking as you're optimizing pages.
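A keyword matrix doesn't need to be fancy - it's just a mapping of each page to its target terms. A minimal Python sketch (the URLs and keywords are made up for illustration) that flags pages targeting more than two primary keywords, plus terms assigned to multiple pages (which risks the pages competing with each other):

```python
from collections import defaultdict

# Hypothetical keyword matrix: page path -> primary keywords it targets.
keyword_matrix = {
    "/office-space/midtown": ["midtown office space", "midtown office rental"],
    "/office-space/soho": ["soho office space", "soho loft space", "soho office rental"],
    "/": ["nyc office space"],
}

def audit_matrix(matrix, max_per_page=2):
    """Flag pages with too many primary keywords, and keywords assigned
    to more than one page (cannibalization)."""
    overloaded = [url for url, kws in matrix.items() if len(kws) > max_per_page]
    owners = defaultdict(list)
    for url, kws in matrix.items():
        for kw in kws:
            owners[kw].append(url)
    cannibalized = {kw: urls for kw, urls in owners.items() if len(urls) > 1}
    return overloaded, cannibalized

overloaded, cannibalized = audit_matrix(keyword_matrix)
```

Here the /soho page would be flagged for carrying three primary terms; per the two-per-page rule of thumb above, the third term should get its own page or be dropped.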