Humans + Tech - Issue #2
Is your period tracker app sharing your personal information? Are mindfulness apps increasing your stress? Do you trust your calculator? Don't get fooled or conned again!
'Period tracker app spied on me - and told advertisers it thought I was pregnant'
I came across this Twitter thread by @TaliaShadwell and it’s both creepy and infuriating how much apps on our phones share information about us.

The tweet went viral, and Talia wrote about this in the Daily Mirror.
Last week I noticed something curious.
My Facebook feed was suddenly becoming cluttered with mummy and baby advertisements.
Out of nowhere, I was being targeted with sponsored posts for everything from pregnancy health vitamins to baby clothing and children’s books.
It was bizarre because I have no children — and not that it’s anyone’s business — but I don’t plan on it anytime soon.
Like many women, I use a period tracking app to chart my monthly cycle.
Yesterday, I opened the app to make an update, only to find an alert flashing at me.
It was informing me that my period was very, very ‘late’.
In fact, it wasn’t late at all. I had simply forgotten to log last month’s cycle properly, and, because I have notifications for that app turned off, I hadn’t noticed when I didn’t complete the entry.
But the app had certainly noticed.
I corrected my cycle, and almost instantly the baby ads just stopped.
So I pulled up the app’s privacy settings and noticed that while it promised not to share details I ‘entered manually’ with its third-party partners, the terms and conditions’ language cleverly avoided ruling out the sharing of aggregated data – like patterns or trends.
I have chosen not to name the app here – it is not a well-known one, and appears to have been made by a small developer.
First, how many of us read the terms and conditions of every app or service we use? It’s simply not practical to do this. And second, even if we did, would most of us laypeople understand the implications of the legalese in these documents?
In many ways, algorithms have made my life better, more convenient, and more efficient.
But algorithms are rather like the body’s bacteria – you might not be able to see or understand them – but you can be certain they are not always working for you.
Icky stories have emerged in recent years revealing some companies were secretly using period tracking calendars in an attempt to track the moods of their female employees.
But most upsettingly, an overwhelming number of responses I received from women who had experienced a similar phenomenon on their social media revealed it was happening following pregnancy trauma.
Multiple women revealed they were still being slammed with ‘happy baby’ advertisements on their social media feeds after miscarriage or stillbirth.
Chillingly, several women who have had miscarriages told me they received ads for infertility treatment immediately after their pregnancy loss.
The more I think about this, the more I see this as an ethical issue. This is also dependent on the situation. If you’re advertising a luxury item to high net worth individuals and the algorithm gets the targeting wrong, it will probably not cause any mental distress to the wrongly targeted customer. However, advertising baby items to someone who has just miscarried is an entirely different scenario.
Advertising platforms are probably happy if their targeting is right, say, 95% of the time. But the additional anguish and pain they cause the remaining 5% who are wrongly targeted for monetary gain is unethical, in my opinion.
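That 95% figure sounds reassuring until you do the arithmetic on the absolute numbers. A back-of-the-envelope sketch (the audience size and accuracy here are hypothetical, not real platform data):

```python
# Even high targeting accuracy leaves many people wrongly targeted.
# These figures are illustrative assumptions, not real platform data.
audience = 1_000_000   # hypothetical people shown a "new baby" campaign
accuracy = 0.95        # hypothetical share targeted correctly

wrongly_targeted = int(audience * (1 - accuracy))
print(wrongly_targeted)  # -> 50000
```

Fifty thousand people mis-targeted per million reached: at that scale, even a small error rate means a very large number of potentially painful misfires.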
Should app developers stop sharing personal data, and should advertising companies stop their use of personal data for monetization in such scenarios? Or at minimum, will they refine their algorithms to prevent false positives like marketing ‘happy baby’ advertisements to women who’ve just had a miscarriage?
It’s no simple task to design empathy into an algorithm, but should such algorithms be deployed in these cases before they are more intelligent? If the main motivation is money or profitability, companies will answer “Yes.” If the motivation is what is ethical and what benefits humans, the answer is “No.”
Buddhism scholars: Meditation apps are fueling tech addiction, not easing stress
Mindfulness practices, as pursued by the Buddhist apps, involve guided meditation, breathing exercises, and other forms of relaxation. Clinical tests show that mindfulness relieves stress, anxiety, pain, depression, insomnia, and hypertension. However, there have been few studies of mindfulness apps.
The current popular understanding of mindfulness is derived from the Buddhist concept of sati, which describes being aware of one’s body, feelings, and other mental states.
Here, mindfulness enables one to appreciate impermanence, not become attached to material things, and strive to attain greater awareness so that one can ultimately become enlightened.
Mindfulness apps, on the other hand, encourage people to cope with and accommodate to society. They overlook the surrounding causes and conditions of suffering and stress, which may be political, social, or economic.
This is an interesting perspective from two Buddhism scholars, Gregory Grieve and Beverly McGuire, who argue that these apps only mask the symptoms of stress and don’t address the underlying causes. Instead, they make us more addicted to our phones and lead to more stress.
I haven’t tried any of these apps myself, but if you do use these apps, then please let me know in the comments below if you agree with their point of view or not.
Bizarre Calculator Experiment Shows How Reliant People Are on Technology Being 'Honest'
In a study published this week, Texas Tech University researchers tested how university students reacted when unknowingly given incorrect calculator outputs.
Some students were presented with an onscreen calculator that was programmed to give the wrong answers, whereas a second group was given a properly functioning calculator.
Participants could also opt not to use the calculator, but most chose to use it – even if they had good numeracy skills. Researchers found most participants raised few or no suspicions when presented with wrong answers, until the answers were quite wrong.
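The study’s setup is easy to picture in code. A minimal sketch of a “rigged” calculator that sometimes nudges its answer by a small amount (the error rate and offset size are my assumptions, not the researchers’ actual design):

```python
import random

def rigged_calculator(a, b, error_rate=0.5, max_offset=3):
    """Add two numbers, but sometimes return a subtly wrong answer.

    Mimics the miscalibrated onscreen calculator in the experiment:
    errors are small enough to be plausible, so they slip past most
    users. Parameters are illustrative assumptions.
    """
    result = a + b
    if random.random() < error_rate:
        # Nudge the result by a small nonzero amount.
        offsets = [n for n in range(-max_offset, max_offset + 1) if n != 0]
        result += random.choice(offsets)
    return result
```

The key design point the researchers exploited is the size of the offset: small errors go unchallenged, and suspicion only kicks in once answers become obviously wrong.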
This is an understandable scenario: through years of use, we have learned to trust that calculators don’t make mistakes. They’ve been tested thoroughly. The author acknowledges that:
Perhaps if they were completing their income tax forms, or applying for a loan, they may have been more thorough in checking their results. More importantly, there’s no reason an individual ought to feel suspicious about a calculator, so the participants were acting in accord with what we might expect.
But how much trust do we put in our other devices, apps, and services?
However, new technologies pose new challenges. Is the person you’re talking to online a real person or a bot? Are you developing a real romantic relationship on your dating app, or being conned in a romance scam?
To what extent do people blindly accept their technologies are safe, and that everyone online is who they claim to be?
Similarly, as autonomous vehicles become the norm, they too pose ethical concerns. Not only do we need to be worried about the programmed moral choices on whom to harm if an accident becomes inevitable, but also whether criminals can hack into these vehicles and alter programmed decisions.
Also, there have been reports of benign-looking USB cables being rigged with small WiFi-enabled implants which, when plugged into a computer, let a nearby hacker run commands. We even need to think about the safety of health devices, such as pacemakers, which can now be hacked.
Awareness is key here. Thinking about what data our devices are communicating to us and analyzing this information critically is vital to identify when something doesn’t look or seem right.
Don’t get fooled or conned again — here are the 5 tactics to look out for
I came across this via Josh Spector at For The Interested. It’s a great newsletter, and I highly recommend subscribing to it.
People and businesses routinely use five techniques to get us to do what they want, says presenter and broadcaster Alexis Conran.
“Magic and sales and scams and political beliefs all happen in the mind of the spectator,” Conran points out in a TEDxBerlin talk.
Because the process of being fooled takes place inside our minds, it’s up to us to realize when we’re being taken.
Watch the video, where Conran explains the five tactics used by people to get you to buy into their stories: misdirection, time pressure, opportunity, social compliance, and social proof.
Use this knowledge to understand how news headlines, political speeches, media, and people are using these techniques to manipulate you.
As Black Friday and Cyber Monday are on the horizon, be aware of these tactics used by companies and salespeople to get you to buy things that you don’t need.
As Conran says in his talk:
For too long, we have been ignorant of these things. Of how we function. And for too long, politicians, ad men, hustlers, magicians, have been scrupulously studying these conditions and taking advantage of them. So it’s time we woke up to these techniques.
Man pleads guilty to remotely controlling his girlfriend's car with a computer
An Australian man pled guilty this week in the Magistrates Court to stalking his ex-girlfriend, largely through the use of an app that tracked and controlled her car.
During the couple’s six-month relationship, the man allegedly helped his then-girlfriend buy a Land Rover. He allegedly obtained the car’s VIN and used it to set up an account on an app that allowed him to turn the car on and off and adjust the windows. The service also sent him email notifications that showed the location of her car.
It shouldn’t be this easy to gain control of a car via an app. VINs are not that difficult to obtain.
ABC reports that the victim told the court that one night she woke up and saw the man standing at the edge of her bed. She reportedly said he stood quietly for what “seemed like an eternity,” then said in a low voice “you’re lucky it’s just me and not a robber or a bad person to do you harm.”
What! This is a scene right out of a horror movie. I can only imagine what she must have gone through.
Car companies need to be more stringent with their customers’ privacy and have better approval processes in place for controlling cars via these apps.
Privacy protection is not moving at the same pace as technology advancement. I believe companies need to make privacy a priority and part of the decision-making process when developing new technologies.
I hope you enjoyed this issue and learned something new. If you find something interesting that relates humans and technology, send me a link at hello@humansplustech.com. I may post it here.
I hope you have a brilliant Sunday. Spend some time in nature if possible - it’s the best digital detox there is (only if you don’t take your devices with you 😄).
Thanks,
Neeraj