Billing for results, not by the day. Thoughts?
-
Hi,
We are searching for a new SEO provider for www.compoundsecurity.co.uk, and I've noticed that some SEO providers are now billing against results rather than days spent doing the work.
Considering the high prices and lack of work done for those fees by our current provider, this is of interest to me.
Does anyone have experience of working this way and/or have any advice, please?
Thank you
-
Keri:
You are such an instigator! Sounds like you are angling for a joint blog post from me and EGOL.
In your evil and nefarious way.
NYAH-HAH-HAH.
<<evil grin>>
-
This subject often comes up in Q&A, both from people wanting to hire using this method and from people wanting to sell their services this way. All of your arguments here would make for a good YouMoz post if someone was interested in giving it a comprehensive treatment.
-
**"Just because previous providers haven't delivered..."**
I know a few people would say the problem is a lack of vetting.
-
I agree with EGOL. I would decline a "pay for performance" model because too much is out of my control: client cooperation, algo updates, new competitors.
Performance and accountability are important. Who could argue with that?
But just because previous providers haven't delivered, it doesn't necessarily follow that shifting to a pay-for-performance model is the way to go. This often degenerates into the futile pursuit of phoney metrics, e.g. rankings for non-competitive terms, social media shares, etc.
You need to find a provider you trust with a track record of delivering results. Limiting yourself to those who will accept pay for performance compensation may limit your search -- and your bottom line results.
-
Are you willing to turn over your entire site to the "SEO provider"?
That is a good idea. If I am going to do SEO on the basis of performance I will start my own website and sell the leads or dump the shopping cart to the highest bidder. Then I get paid for everything that I kill and can move the business to Company B if Company A does not perform. I would also then have complete view of the activity on the site and the transactions that occur there.
Just like being an affiliate or having a drop shipper - which I currently do.
-
Some SEOs have been offering this type of billing, on results only, for quite a long time now.
I can see the attraction, although I would never offer it myself, especially since the consequences of a good contemporary SEO program extend far beyond ranking results. For example, an SEO's efforts sorting out all the social media profiles, plus advice or work on ongoing social profile management, would likely result in more reach, engagement, and hence traffic, and hopefully sales, increased brand awareness, and reputation. So the client would likely be receiving high-value results from social immediately, but not paying anything for them. That's why I wouldn't be happy working like that.
I would ask: what defines a result that justifies billing?
Is it simply a ranking result for keywords they choose (in which case be very wary, since those keywords may not convert)? Or keywords you choose based on research? Or conversions from organic search to your website? Or an actual sale tracked back to organic search (and arguably social too, if they are offering a holistic 'inbound' package)?
If it's the latter, and the CPA (cost per acquisition) they propose or you negotiate leaves you with a profit, then it's worth considering.
Interested to hear what others think?
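Whether a proposed CPA "leaves you with a profit" comes down to one subtraction per sale. A minimal sketch, with entirely made-up numbers for illustration (the function name and figures are assumptions, not from any real proposal):

```python
def margin_after_cpa(revenue_per_sale, cost_of_sale, cpa):
    """What's left from one sale after your own costs and the
    agency's per-acquisition fee are both paid."""
    return revenue_per_sale - cost_of_sale - cpa

# Hypothetical: a 120 sale with 60 in costs, and the agency proposes a 40 CPA.
# That leaves a margin of 20 per sale, so the deal could be worth considering.
margin = margin_after_cpa(120, 60, 40)
deal_worth_considering = margin > 0
```

If the same agency asked for a 70 CPA instead, the margin would go negative, and every "result" they bill for would lose you money.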
-
I do have some experience in this area. If you operate a highly measurable marketing program, some web marketing agencies will agree to a "pay per performance" model of compensation, but you will have to work with them for it to be clearly defined, and they will still want flat-rate compensation for their hours spent. At the end of the day, agencies want to get paid, period. And they should be. You may end up paying more for their services going this route, so if saving money is your concern, I wouldn't recommend it. If your goal is ensuring that your agency can deliver and that they have some "skin in the game" to keep them honest, then this could be a great direction.
A typical setup I've seen is that the agency will give you their hours at "cost", or a very low rate, as a baseline to cover their expenses and time. Then, if you have good historical performance reporting set up, and they are comfortable that they can do what they say they can, you define a payout based on "results", such as website conversions from organic search sources. Comparing year over year, say you got 100 conversions in October 2012 from organic search; you could agree that for every conversion above 100 in October 2013, the agency gets 25% of the revenue, or something like that.
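The year-over-year payout described above is simple arithmetic. A rough sketch (the function name, the 25% share, and all the numbers are illustrative assumptions, not terms from a real contract):

```python
def performance_payout(conversions, baseline, revenue_per_conversion, share=0.25):
    """Agency payout: a share of revenue from conversions above last
    year's baseline for the same month.

    Conversions at or below the baseline earn nothing; only the
    incremental ones trigger a payout.
    """
    incremental = max(conversions - baseline, 0)
    return incremental * revenue_per_conversion * share

# 130 conversions in October 2013 vs. a baseline of 100 in October 2012,
# at 80 in revenue per conversion: 30 incremental * 80 * 0.25 goes to the agency.
payout = performance_payout(130, 100, 80.0)
```

The `max(..., 0)` matters: in a down month the agency earns nothing extra, but under this structure it doesn't owe anything back either, which is part of why the flat baseline rate stays in place.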
Also keep in mind that, in my opinion, this model is somewhat in free fall right now due to the increase in "not provided" keyword data. In the past, you would write a contract like the one I outlined above specifying that branded keywords don't count. The last thing you want is to run a magazine ad that increases searches for your brand 2,000% and then have to pay the agency for the influx of organic search conversions you would have gotten anyway! With all the organic search data lumped into one bucket now, I personally don't see how that will work anymore.
-
If someone asked me to work on the basis of results I would decline. Why? Because I don't have any control over new companies entering your business niche. That is market risk that belongs to the business owner, not a service provider.
Even if you offered me a percentage of sales, I would not take the deal, because sales are determined by factors that you control, such as retail price level, shipping charges, the quality of staff serving the customer, and more.
SEOs have a baseline value on their time that is determined by how much they can earn doing other things. If you want the time, you've got to pay the price.
Perhaps SEOs who are new to the market or those who will do "anything required" to get your site ranked and collect the fee will be interested. But they might not be able to hold those results once Google figures out that they have spammed.
-
What are "results"?
Are you willing to turn over your entire site to the "SEO provider"? If not, it's truly difficult to pay for results.
It's a two-way street; your SEO firm can only be effective if you're doing your part. The days of paying a company to "go out and do some SEO" are long gone.