Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
-
Hi,
We are working as a middle man between our client (website A) and another website (website B) where, website B is going to host a section around websites A products etc.
The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them.
As the middleman, we are in charge of monitoring the number of unique visitors sent through, and we plan to do this by watching Website A's analytics account and checking the number of unique visitors sent.
The deal is worth quite a lot of money, and as the middleman we are responsible for making sure that no funny business goes on (i.e. false visitors, etc.). So, to make sure we have things covered, what I would like to know is:
1/. Is it actually possible to fool Analytics into reporting falsely high unique visitors sent from Website B to Website A? (And if so, how could they do it?)
2/. What could we do to spot any potential abuse? (i.e. is there an easy way to tell that these are spoofed visitors?)
Many thanks in advance
-
You might be better with a server side tracker like http://awstats.sourceforge.net/
The answer from Mat probably has the best logic, but the question it raises is: are you legally responsible for mitigating the possibility of fraud?
I would make sure you add this to the contract, as I am not sure you are going to be able to defeat a proxy or spoofer, just in case the referrer gets smart and decides to work the system.
An anti-fraud system can be put into place, but I am not sure you will have access to the multi-million-dollar fraud-monitoring tools that Google does: constantly updated, monitoring algorithmically and systematically, with auditors who perform random manual checks...
-
Hi - Well, we are really just acting on behalf of the client - that's what they want.
Also, it's only visitors from that specific website (a very close niche) - not just any site.
-
Google Analytics doesn't report IP addresses though, which is another reason to take a different route. Not knocking GA, I love it. However, it isn't the right tool for this.
I suspect that the Fiverr gigs use ping or something similar to create the mass of "unique visits". Very easy to spot. Unless the seller has some fairly sophisticated tools to hand, I'd imagine that any method that can deliver 5,000 visits for $5 is going to be pretty easy to spot.
Might try it now though. I love Fiverr for testing stuff.
-
If you must use Analytics, I would drill down to the source of the referral within Analytics. This will give you the URL, page, or whatever. I think you can also drill down to the referring IP, etc.
You need to log where they come from. Export your results every month and look for patterns.
If you get 500 referrals from Website B's IP or URL, then it's a sure sign they are throwing people at you.
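As a rough illustration of that kind of monthly check, here is a short Python sketch that counts referral hits per IP from an exported log. The column names ("referrer", "ip") and the referrer pattern are assumptions for this example, not anything Analytics exports by default:

```python
import csv
from collections import Counter

# Count referral hits per IP from a monthly export.  The column
# names and the referrer pattern are illustrative assumptions --
# match them to whatever your own export actually contains.
def referrals_per_ip(path, referrer_pattern="websiteb"):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if referrer_pattern in row["referrer"]:
                counts[row["ip"]] += 1
    return counts.most_common(10)
```

A single IP dominating the top of that list month after month is exactly the "throwing people at you" pattern described above.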
But Mat's answer is best; it will give you times, not just dates, and will also give you more detailed info.
-
My question is: are unique visitors the right metric that you should be measuring? On Fiverr.com I can get 2,000 to 10,000 unique visitors for $5. http://fiverr.com/gigs/search?query=unique+visitors&x=0&y=0
Can you tie your metrics to something else that might have more value for you, such as purchases or newsletter signups (still easy to fake, but it at least takes a little more time)?
-
Google Analytics isn't designed to pull the data in the way you really need for something like this. It can be done, I suppose, but it'd be hard work.
There are only so many metrics you can measure, and all are pretty easy to fake. However, having the data in an easily accessible form means that you can spot patterns and behaviour, which are much harder to fake.
Probably a starting point would be to measure the distribution of the various metrics in the referred traffic vs. the general trend. If one particular C-class block (or user agent, or resolution, or operating system, or whatever) appeared at a different frequency in the paid traffic, that would be a good place to look deeper.
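That distribution comparison could be sketched as follows; the ratio threshold and the smoothing constant are arbitrary assumptions for illustration, and an over-represented value is a flag worth investigating, not proof of fraud on its own:

```python
from collections import Counter

# Compare how often each value (user agent, resolution, OS, ...)
# appears in the paid-traffic segment vs. the site-wide baseline,
# and flag values heavily over-represented in the paid segment.
def overrepresented(paid_values, baseline_values, min_ratio=3.0):
    paid, base = Counter(paid_values), Counter(baseline_values)
    flags = {}
    for value, count in paid.items():
        paid_share = count / len(paid_values)
        # 0.5 is a small smoothing count to avoid division by zero
        # for values that never appear in the baseline at all.
        base_share = base.get(value, 0.5) / len(baseline_values)
        ratio = paid_share / base_share
        if ratio >= min_ratio:
            flags[value] = round(ratio, 1)
    return flags
```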
Thinking less technically for a moment though, I bet you could just implement one of the many anti-click-fraud systems to do most of this for you. Same idea, but someone else has already done the coding. Googling for click fraud brings up a stack of ads (tempting to click them loads and set off their alarms!).
-
Hi Mat,
A very informative answer.
If someone is going to try and spoof Analytics, would they not also be able to fool the script just as easily?
If someone were to try this, do you know how they would likely go about it? Essentially, if I know what is likely to be tried, then I can work out something to counteract it. Are there certain things that can't be fooled, or are very difficult to fool (e.g. browser resolution, location, etc.), or are these just as easy to spoof as anything else?
Many thanks
-
It isn't hard to fake this at all, I'm afraid. Spotting it will depend on how sophisticated the person doing it is.
My personal preference would be not to use Analytics as the means of counting. Doing that, you are going to be slightly limited in the metrics you have available, and you will always be "correcting" data and looking for problems rather than measuring correctly in the first place and having problems flagged.
I'd have a script on the page that checks for a referrer and, if it matches the pattern for Website B, creates a log record.
You then have the ability to set your own rules. For instance, if you get 2 referrals from the same IP a second apart, would you count them? What about 10 per hour, 24 hours a day? You can also log the exact timestamp along with whatever variables you want to collect, so each click from the referring site might be recorded as:
- Time stamp
- Exact referring URL
- User agent
- IP
- Last visit (based on cookie)
- Total visits (based on cookie)
- #pages viewed (updating the cookie on subsequent page views)
- and so on
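A minimal sketch of that kind of logging script might look like the following. The referrer pattern, cookie field names, and CSV log format are all illustrative assumptions, not a specific framework's API; in practice you'd wire this into whatever server stack the page runs on:

```python
import csv
import re
import time

# Hypothetical pattern for Website B's referrer (assumption for
# this sketch -- substitute the real domain).
REFERRER_PATTERN = re.compile(r"https?://(www\.)?websiteb\.example", re.I)

def log_referral(referrer, user_agent, ip, cookie, logfile="referrals.csv"):
    """Append a log record when the referrer matches Website B.

    Returns True if a record was written, False otherwise.
    `cookie` is a plain dict standing in for the visitor's cookie.
    """
    if not referrer or not REFERRER_PATTERN.search(referrer):
        return False
    last_visit = cookie.get("last_visit", "")
    total_visits = int(cookie.get("total_visits", 0)) + 1
    with open(logfile, "a", newline="") as f:
        csv.writer(f).writerow(
            [time.time(), referrer, user_agent, ip, last_visit, total_visits]
        )
    return True
```

With the raw records in your own file or database, rules like "ignore repeat hits from one IP within a second" become a simple query rather than a fight with Analytics reports.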
Analytics doesn't give you access to the data in quite the same way. I'd definitely want to be logging it myself if the money involved is reasonable.