As businesses become increasingly reliant on digital, connected technologies, the decisions driven by analytics measurement are only as good as the data feeding them. Boardroom decisions, mergers, acquisitions, and startup investments increasingly rely on website performance data. Analytics solutions like Google Analytics, Adobe Analytics (previously Omniture SiteCatalyst), IBM Coremetrics, and Webtrends all help power the reports and dashboards investors depend on. As these solutions and the data fed into them become more important for measuring return on investment, the accuracy of that data has become pivotal.
Anu Hariharan, Frank Chen, and Jeff Jordan of Andreessen Horowitz share 16 key startup metrics that help investors gauge the health of a business before investing in it. Baremetrics highlights monthly active users (MAU) as one of the numbers that can provide deeper insight into the overall health of a business.
Click Fraud - Bots & Spiders
Bots, spiders, and crawlers are automated applications that serve a variety of purposes, such as feeding Google's search functionality. How else do you think Google is able to provide visibility into all those websites? Bots can also be used to harm your business via ad fraud, content theft, hacking attempts, or other nefarious actions. According to HostingFacts, as of December 2018, 51.8 percent of global internet traffic came from bots. That works out to roughly one automated hit for every human-generated hit to your website. As that percentage continues to grow, so does the importance of controlling how bot traffic is reflected in your analytics data.
Improving Accuracy for Return on Investment
There is currently no 100% accurate way to remove all bots and spiders from analysis. Many analytics vendors strip out some bots and spiders by default, but analytics teams face a constant game of cat and mouse to keep reports as clean and accurate as possible.
To get accurate reporting on true "human traffic," there are a few tactics you can use:
- Leverage your vendor
- Bring outside expertise via consultancies
- Do it yourself
- A few tricks: The visitor's IP address and user agent are a good place to start. In PHP, these are available via the $_SERVER['HTTP_USER_AGENT'] and $_SERVER['REMOTE_ADDR'] variables. You can then strip hits matching the IP addresses or user agents of known bots/spiders from reporting. Lists of known bots and spiders are available from various sources; here is one: http://www.user-agents.org/index.shtml. This method of cleansing will never be 100% accurate because new bots and spiders pop up every day.
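As a minimal sketch of the do-it-yourself approach, the check below compares the visitor's user agent and IP address against a known-bot list before recording a hit. The signature substrings and IP address here are illustrative placeholders, not a real or complete bot list; in practice you would load a maintained list such as the one linked above.

```php
<?php
// Illustrative substrings found in common bot/spider user agents.
// A real deployment would load a maintained list (e.g. user-agents.org).
$botSignatures = array('googlebot', 'bingbot', 'crawler', 'spider');

// Hypothetical IP addresses of known bots to exclude from reporting.
$botIps = array('192.0.2.10');

// Returns true when the hit looks automated, by exact IP match or by
// a case-insensitive substring match on the user agent.
function isLikelyBot($userAgent, $ip, $botSignatures, $botIps) {
    if (in_array($ip, $botIps, true)) {
        return true;
    }
    $ua = strtolower($userAgent);
    foreach ($botSignatures as $signature) {
        if (strpos($ua, $signature) !== false) {
            return true;
        }
    }
    return false;
}

// Only record analytics data for traffic that does not look automated.
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$ip = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';

if (!isLikelyBot($userAgent, $ip, $botSignatures, $botIps)) {
    // Fire the analytics tag / log the hit here.
}
```

The same filtering can instead be applied after collection, by segmenting or excluding the known user agents and IPs inside your reporting tool, which avoids touching the page-serving code at all.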
These nuances must be taken into account for accurate analysis of return on investment and other calculations important to your business and investments.