Web analytics is not quantum physics

In online marketing, we often see people oversimplify how it works, regardless of the information available to them. We have a large amount of data at our disposal, but we tend not to analyse it in depth, basing our decisions mainly on intuition or the recommendations of others.

Although web analytics is not quantum physics, I feel that people don’t use its full potential and, in addition, they keep repeating the same mistakes. Web analytics is not primarily a tool for tracking traffic on your website. It allows you to get to know your visitors better and improve your results – financial or non-financial. It gives you real data, instead of just guesswork.

You don’t have to rely on intuition alone, as it often leads you to make mistakes. Daniel Kahneman, a Nobel Prize winner, deals with this topic in his book “Thinking, Fast and Slow”.

He explains that people are pattern seekers who tend to find patterns of behavior even where there are none. If you follow only your intuition, you will make mistakes more often than not, because you will wrongly classify random events as systematic.

Fortunately, we can eliminate this problem. Here are the most frequent mistakes I encounter in web analytics.
 
We lack information
“We ran A/B testing on our website and found out that the shorter form was filled in by seven people more than the original one, so we changed it on the whole website.”

This lack of information typically occurs in A/B testing. We usually evaluate a test by comparing the number of conversions each variant achieved and declare a winner accordingly. Even when the number of conversions is low, we settle for it nevertheless and base our decision on it.

This is not the right approach. If, for example, you run a small e-shop that sells 30 products a month, you need to make sure that the data you gather is statistically significant. It is not enough that one test variant gets three more conversions than the original; that may be a coincidence that will not be repeated.
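To make this concrete, here is a minimal sketch of such a significance check in Python, using a chi-squared test on the two variants. The visitor counts are assumed illustration values, not figures from the article:

```python
# A chi-squared test estimates how likely it is that the observed
# difference in conversions is just random noise.
# Visitor counts below are hypothetical illustration values.
from scipy.stats import chi2_contingency

visitors_a, conversions_a = 1000, 30   # original form
visitors_b, conversions_b = 1000, 33   # shorter form: three conversions more

# 2x2 contingency table: converted vs. did not convert, per variant
table = [
    [conversions_a, visitors_a - conversions_a],
    [conversions_b, visitors_b - conversions_b],
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"p-value: {p_value:.2f}")  # well above 0.05 for counts this small

if p_value < 0.05:
    print("The difference is statistically significant.")
else:
    print("The difference could easily be a coincidence - keep testing.")
```

With numbers this small, the test reports no significant difference, which is exactly the point: the three extra conversions alone tell us nothing.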
 
We seek correlations where there are none
“I added more keywords to the texts on my website last week and the traffic on the whole website has increased.”

We sometimes tend to link certain events together and see a correlation between them that, in fact, is not there. If we make changes to our website and after a time it ranks higher in search engine results, this could be interpreted as a result of the changes we made.

However, the situation may be much more complicated. Page rankings in search engine results are influenced by many different factors. We cannot know for sure whether we caused the change in position or whether it was caused, for example, by an update of the search algorithm. We cannot even be sure that the change we made had a positive effect. Everything we cannot measure exactly should be evaluated very carefully, so that we don’t waste time on activities we think are helpful when, in fact, they have no effect at all.

Spurious Correlations is a website devoted to finding correlations between pairs of completely unrelated events. One of its charts shows a correlation between the decreasing per capita consumption of margarine and the decreasing divorce rate in Maine, USA. In web analytics, we often compare similarly unrelated, though less amusing, events.
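A tiny Python sketch shows how easily two unrelated series can correlate almost perfectly. The numbers are invented to mimic the shape of the margarine-and-divorce chart, not the site’s actual data:

```python
# Two made-up, steadily declining series: strong correlation, no causation.
import numpy as np

margarine_per_capita = [8.2, 7.0, 6.5, 5.3, 5.2, 4.0, 4.6, 4.5, 4.2, 3.7]
maine_divorce_rate   = [5.0, 4.7, 4.6, 4.4, 4.3, 4.1, 4.2, 4.2, 4.2, 4.1]

r = np.corrcoef(margarine_per_capita, maine_divorce_rate)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # about 0.99, yet no causal link
```

Any two metrics that merely trend in the same direction over time will score like this, which is why a correlation alone proves nothing.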
 
We evaluate only part of the information
“Most people who come to our website from AdWords leave right away. We’re wasting our money on it. It makes no sense for us.”

Why do visitors from AdWords campaigns leave your e-shop without buying anything? Is it because AdWords is not right for you? Or did someone incompetent design your campaigns? Many people settle for simple reasoning like this, which may seem logical at first sight, without looking for the real cause.

A failure doesn’t have to have a single cause; there could be many. Visitors might have been discouraged by not being able to navigate the site. It might have been slow to load and they might not have been patient enough to wait. It might have displayed incorrectly on their smartphones, or they might not have been willing to pay by bank transfer. We shouldn’t settle for the first number that Google Analytics throws at us. We should analyse as many of the factors that might have caused the problem as possible and attempt to identify the most probable ones.
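One way to do that is to break the aggregate bounce rate into segments before drawing any conclusion. Here is a minimal Python sketch, assuming a hypothetical CSV export of sessions with columns source, device, page_load_seconds and bounced (0/1):

```python
# Segmenting one aggregate number to look for the real cause.
# "sessions.csv" and its column names are hypothetical; adapt them to
# whatever export your analytics tool actually provides.
import pandas as pd

sessions = pd.read_csv("sessions.csv")
adwords = sessions[sessions["source"] == "adwords"].copy()

# Bounce rate by device: a mobile-only problem points to a layout issue,
# not to AdWords itself.
print(adwords.groupby("device")["bounced"].mean())

# Bounce rate by load-time bucket: slow pages may be the real culprit.
adwords["load_bucket"] = pd.cut(adwords["page_load_seconds"], bins=[0, 2, 5, 60])
print(adwords.groupby("load_bucket", observed=True)["bounced"].mean())
```

If one segment’s bounce rate stands out, that segment, not the traffic source, is the first thing to investigate.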
 
What works for one person might not work for another
“I read somewhere recently that it pays off the most to share articles at five o’clock on Wednesday afternoon. We’re starting to do it that way too.”

The problem with case studies is that, instead of treating them as inspiration, people often read them as a universal guide that will work the same way on their own website, ignoring the fact that the article was written by a company doing business in a different country, selling different products, to a totally different target group.

I don’t mean to say that case studies aren’t useful. They should primarily serve as inspiration, and you should only apply information from them that you have verified in practice. Outputs from case studies need to be adapted to your conditions and, above all, tested against real data from your own website and its visitors.
 
Conclusion
The advantage of online marketing is that we can measure almost everything. That’s why it would be a pity not to evaluate data comprehensively and in a broader context. It is not rocket science and it can be learned; evaluating data correctly just requires practice and patience.
 


Marek Šulik, Performance Marketing Director, VISIBILITY s.r.o.