Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
-
Hi,
We are working as a middle man between our client (website A) and another website (website B), where website B is going to host a section around website A's products etc.
The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them.
As the middle man we are in charge of monitoring the number of unique visitors sent through, and are going to do this by monitoring Website A's analytics account and checking the number of unique visitors sent.
The deal is worth quite a lot of money, and as the middle man we are responsible for making sure that no funny business goes on (i.e. false visitors etc). So to make sure we have things covered, what I would like to know is:
1/. Is it actually possible to fool Analytics into reporting falsely high unique visitors sent to website A from website B (and if so, how could they do it)?
2/. What could we do to spot any potential abuse (i.e. is there an easy way to spot that these are spoofed visitors)?
Many thanks in advance
-
You might be better off with a server-side tracker like http://awstats.sourceforge.net/
The answer from Mat probably has the best logic, but the only problem is: are you legally responsible for mitigating the possibility of fraud?
I would make sure you add this to the contract, as I am not sure you are going to be able to defeat a proxy or spoofer, just in case the referrer gets smart and decides to work the system.
An anti-fraud system can be put into place, but I am not sure you will have access to the multi-million-dollar fraud monitoring tools that Google does: tools that are constantly updated, that monitor algorithmically and systematically, and that are backed by auditors who manually do random checks...
-
Hi - Well we are really just acting on behalf of the client - that's what they want.
Also it's only visitors from that specific website (very close niche) - not just any site.
-
Google Analytics doesn't report IP addresses though - which is another reason to take a different route. Not knocking GA, I love it. However it isn't the right tool for this.
I suspect that the fiverr gigs use ping or something to create the mass of "unique visits". Very easy to spot. Unless you have some fairly sophisticated tools to hand, I'd imagine that any method that can deliver 5000 for $5 is going to be pretty easy to spot.
Might try it now though. I love fiverr for testing stuff
-
If you must use Analytics, I would drill down to the source of referral within analytics. This will give you the URL, page, or whatever. I think you can also drill down to the referring IP etc...
You need to log where they come from. Export your results every month and look for a pattern.
If you get 500 referrals from website B's IP or URL, then it's a sure way of knowing they are throwing people at you.
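Purely as an illustration, a minimal sketch of that monthly export check in Python with pandas; the file name and the "source"/"visits" column names are made up, so change them to match whatever your Analytics export actually contains:

import pandas as pd

# Rough sketch: flag referring sources that sent suspiciously many visits
# in an exported month of referral data.
def flag_suspicious_sources(csv_path="analytics_export.csv", threshold=500):
    df = pd.read_csv(csv_path)
    totals = df.groupby("source")["visits"].sum().sort_values(ascending=False)
    return totals[totals >= threshold]

# e.g. print(flag_suspicious_sources()) once you have an export saved

Anything that shows up here isn't proof by itself, but it tells you which sources to look at more closely.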
But Mat's answer is best - it will give you times, not just dates, and will also give you more detailed info.
-
My question is: are unique visitors really the right metric you should be measuring? On Fiverr.com I can get 2,000 to 10,000 unique visitors for $5. http://fiverr.com/gigs/search?query=unique+visitors&x=0&y=0
Can you tie your metrics to something else that might have more value for you, such as purchases, newsletter signups (still easy to fake, but at least takes a little more time), etc?
-
Google Analytics isn't designed to pull the data in the way you really want to for something like this. It can be done I suppose, but it'd be hard work.
There are only so many metrics you can measure, and all are pretty easy to fake. However, having the data in an easy-to-access form means that you can spot patterns and behaviour, which are much harder to fake.
Probably a starting point would be to measure the distribution of the various metrics on the referred traffic vs the general trend. If one particular C class block (or user agent, or resolution, or operating system, or whatever) appeared at a different frequency in the paid traffic, that would be a good place to look deeper.
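To make that concrete (just a sketch, not any particular tool), assuming you can pull the raw values of one dimension out of your logs for both the referred traffic and the site as a whole, the comparison could look something like this in Python:

from collections import Counter

# Rough sketch: flag categories (user agent, resolution, OS, C class block...)
# whose share of the referred traffic differs noticeably from the baseline.
def compare_distributions(referred_values, baseline_values, min_gap=0.10):
    referred, baseline = Counter(referred_values), Counter(baseline_values)
    ref_total, base_total = sum(referred.values()), sum(baseline.values())
    flagged = {}
    for key in set(referred) | set(baseline):
        gap = referred[key] / ref_total - baseline[key] / base_total
        if abs(gap) >= min_gap:  # 10% is an arbitrary threshold for this sketch
            flagged[key] = round(gap, 3)
    return flagged

# Example with made-up user agents: "UA-1" is over-represented in the referred traffic.
print(compare_distributions(["UA-1", "UA-1", "UA-1", "UA-2"],
                            ["UA-1", "UA-2", "UA-2", "UA-3", "UA-3", "UA-2"]))

A flagged value isn't proof of anything on its own, it just tells you where to dig deeper.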
Thinking less technically for a moment though, I bet you could just implement one of the many anti-click-fraud systems to do most of this for you. Same idea, but someone else has already done the coding. Googling for click fraud brings up a stack of ads (tempting to click them loads and set off their alarms!!).
-
Hi Mat,
A very informative answer.
If someone is going to try and spoof analytics, wouldn't they equally be able to try and fool the script?
If someone were to try this, do you know how they would likely go about it? Essentially, if I know what is likely to be tried, then I can work out something that could counteract it. Are there certain things that can't be fooled, or are very difficult to fool - e.g. things like browser resolution, location etc - or are these just as easy to spoof as anything else?
many thanks
-
It isn't hard to fake this at all I am afraid. Spotting it will depend on how sophisticated the person doing it is.
My personal preference would be not to use Analytics as the means of counting it. Doing that, you are going to be slightly limited in the metrics you have available and will always be "correcting" data and looking for problems, rather than measuring more correctly and having problems spotted.
I'd have a script on the page that checks for a referrer and, if it matches the pattern for website B, creates a log record.
You then have the ability to set your own rules. For instance, if you get 2 referrals from the same IP a second apart, would you count them? What about 10 per hour, 24 hours a day? You can also log the exact timestamp with whatever variables you want to collect, so each click from the referring site might be recorded as (rough sketches of the logging and the counting follow after this list):
- Time stamp
- Exact referring URL
- User agent
- IP
- Last visit (based on cookie)
- Total visits (based on cookie)
- # pages viewed (updating the cookie on subsequent page views)
- and so on
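Purely as an illustration, something like this minimal sketch would do the logging (Python with Flask; the referrer pattern, cookie names, log file and field layout are all made-up placeholders):

import csv
import time
from flask import Flask, request

app = Flask(__name__)
REFERRER_PATTERN = "website-b.example.com"   # hypothetical domain for website B
LOG_FILE = "referrals.csv"                   # hypothetical log location

@app.after_request
def log_referral(response):
    referrer = request.referrer or ""
    if REFERRER_PATTERN not in referrer:
        return response                      # not a visit sent by website B
    last_visit = request.cookies.get("b_last_visit", "")
    total_visits = int(request.cookies.get("b_visits", "0")) + 1
    now = time.strftime("%Y-%m-%d %H:%M:%S")
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([
            now,                                    # timestamp
            referrer,                               # exact referring URL
            request.headers.get("User-Agent", ""),  # user agent
            request.remote_addr,                    # IP
            last_visit,                             # last visit (from cookie)
            total_visits,                           # total visits (from cookie)
        ])
    response.set_cookie("b_last_visit", now)
    response.set_cookie("b_visits", str(total_visits))
    return response

@app.route("/")
def index():
    return "page content"

A real version would also track pages viewed and sanity-check obviously forged headers, but the point is that you own the raw rows and can query them however you like.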
Analytics doesn't give you access to the data in quite the same way. I'd definitely want to be logging it myself if the money involved is reasonable.
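And once you are logging it yourself, the counting rules above (e.g. ignoring repeat hits from the same IP a second apart) are only a few lines on top of that log. Another rough sketch, assuming the CSV layout from the logging sketch and a one-second rule picked purely as an example:

import csv
from datetime import datetime, timedelta

# Rough sketch: count billable referred visits from the log written above,
# ignoring repeat hits from the same IP that arrive within min_gap.
def count_billable_visits(log_path="referrals.csv", min_gap=timedelta(seconds=1)):
    last_seen = {}
    billable = 0
    with open(log_path, newline="") as f:
        for timestamp, _url, _agent, ip, *_rest in csv.reader(f):
            when = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S")
            if ip not in last_seen or when - last_seen[ip] >= min_gap:
                billable += 1
            last_seen[ip] = when
    return billable

# e.g. print(count_billable_visits()) at the end of each billing period

Because you keep the raw rows, you can change the rule later and recount the whole period, which you can't really do once Analytics has already aggregated the data.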