Analyse This: We Know Who You Are


While retailers are already predicting their customers’ shopping needs, a future of eavesdropping set-top boxes and insurance-company predictor models is taking destiny out of your hands.

In the not-too-distant future, Amazon will be sending you items before you’ve even ordered them. Their knowledge of you, as a consumer, will be so sophisticated they’ll be able to predict what you want to buy before you’ve realised you want it. Does that terrify you or excite you?

Your Privacy and The Predictor Model

Either way, you'd better get used to it, because predictive analytics is already changing the way we live, love, work and play. Four years ago, the US company Verizon filed a patent for a set-top box that would watch you watching TV; as well as collecting data on what you watch, for how long and how often you fast-forward through the ad breaks, it would listen to your conversations so it could stream relevant real-time advertising. Don't row with your partner in front of the TV – Relate might advertise their services.

Human-resources departments now analyse social-media profiles to predict a job candidate's level of intelligence and emotional stability. Some even claim they can predict how long a new employee will stay in the job before they've started work.

One well-known company, which operates call centres, found that employees with the most Facebook connections underperformed compared with those with the fewest. Collecting that data to gain insight into your staff is one thing; using it as a recruitment tool is quite another.

The US retailer Target built a predictor model suggesting that anyone buying a specific combination of 25 products was very likely to be pregnant.
A father in Minneapolis went into his local store furious that his “innocent” 15-year-old daughter had been sent coupons for discounted maternity products. A few days later he apologised to the store manager; he’d had a long talk with his daughter. In this case, analytics meant that a supermarket chain knew a high-school girl was pregnant before her own father did.

In fact, the algorithm created by Target's in-house team was only 87 per cent accurate. It identified correlations, and in the world of big data such correlations are increasingly treated as if they make causation irrelevant. Imagine an insurance company found a correlation between the time taken to complete an online claim form and the likelihood of fraud. It wouldn't need to know why completion time indicated fraud; it would just need to know that it did. Don't stop filling in your form to answer the door to a Jehovah's Witness – you'll end up in court.
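To see how little a correlation-only predictor needs to "understand", here is a minimal sketch of the hypothetical insurance example. Every number, threshold and claim record below is invented purely for illustration; a real insurer would use far richer data and a proper statistical model.

```python
# Toy illustration of correlation-only prediction: flag claims purely on
# how long the online form took, with no theory of *why* slow filing
# correlates with fraud. All data is invented for illustration.

from statistics import mean

# Historical claims: (minutes taken to complete the form, was it fraud?)
history = [
    (4, False), (5, False), (6, False), (7, False), (8, False),
    (35, True), (40, True), (42, False), (50, True), (55, True),
]

def fraud_rate(claims):
    """Fraction of claims in the sample that turned out to be fraudulent."""
    return mean(1 if fraud else 0 for _, fraud in claims)

# Split on the median completion time: pure observed association.
times = sorted(t for t, _ in history)
threshold = times[len(times) // 2]

slow = [c for c in history if c[0] >= threshold]
fast = [c for c in history if c[0] < threshold]

print(f"threshold: {threshold} min")
print(f"fraud rate, fast filers: {fraud_rate(fast):.0%}")
print(f"fraud rate, slow filers: {fraud_rate(slow):.0%}")

def flag_for_review(minutes_taken):
    """Flag a new claim on the correlation alone, causality unknown."""
    return minutes_taken >= threshold

# Pausing mid-form to answer the door is enough to cross the threshold.
print(flag_for_review(6))   # quick filer: not flagged
print(flag_for_review(45))  # slow filer: flagged
```

The point of the sketch is that the model never asks why slow filers defraud more often; the observed gap in fraud rates is the entire justification for the flag.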

“What happens if your insurance premium is increased based on your probability to claim in the future even though that future hasn’t arrived?” asks Bernard Marr in his new book Big Data. “What happens when someone is refused a mortgage because some algorithm identifies that person as a high risk even though he’s never actually defaulted on a mortgage before?” At what point does privacy give way to probability?

Target has its pregnancy predictor model; imagine it was possible to identify a number of factors, ranging from the contents of a shopping basket to TV habits, which indicated a person was a future mass murderer. Would it be right to take action before a crime had been committed? Just because someone is likely to behave a certain way doesn’t mean they will. The data revolution is here to stay, but its ramifications have not been fully understood. ML