🐓 The chicken came first. 🥚 The egg followed. / Humans + Tech - #21
+ Robots help humans communicate better in teams + AI is nudging people to change their behaviour + Tech in the age of coronavirus
We are back to regularly scheduled programming this week after focussing entirely on the coronavirus in last week’s newsletter. I’ll maintain a coronavirus + tech section each week to discuss all the latest tech-related coronavirus news. I hope for everyone’s sake this is a short-lived segment 🤞.
Okay … onto the topic of the subject line which is probably why you opened this email.
🐓 The chicken came first. 🥚 The egg followed.
I came across this interview with Jonathan Rossiter [OpenMind], a robotics professor at the Bristol Robotics Laboratory. He talks about how his team is working on soft robots made of compliant biodegradable material that can feed on pollution and convert it into electricity to power themselves. They are also working on edible soft robots that can go into our bodies to treat issues like indigestion, deliver chemicals, and even kill cancer cells.
Later in the interview, he is asked whether robots will be able to reproduce and die. He replies:
Robots are really interesting because at the moment they can’t reproduce. They are born when we make them in a factory and they die when they stop working and then we throw them away. But when we throw them away it takes a lot of energy to recycle them and to reuse those components. Nature, on the other hand, does that really efficiently: it creates organisms through the process of birth and these organisms live in the environment through a process of homeostasis. So, they keep going, they keep living on a day to day basis. Then, when they’ve run out of energy, they die. But, of course, throughout their life they reproduce… So yes, we are working on robots that can reproduce, live longer and die. Reproduction is coming.
That reply suddenly gave me my answer to the question, “Which came first? The chicken, or the egg?”
When we create something, we first make a prototype. If the prototype is useful, we make more of them. Eventually, we want to take ourselves out of the equation so that they can build more of themselves without our input. That’s when we begin to think about the reproduction system.
I assume the higher intelligence that created us — I like the term higher intelligence as it encompasses evolution, gods, goddesses, universal energy, nature, source energy, mother earth, or whatever you consider our creator to be — used a similar train of thought while creating all the species on this earth. That intelligence probably first created the chicken and then, considering it worthy of reproduction, designed a way for it to accomplish that on its own. So the egg followed.
If you don’t agree with my conclusion, let me know why by replying to this email :)
🤖 Sorry, I made a mistake. I’m only a robot after all.
A study led by Yale University [Science Daily] showed that when humans were teamed up with a robot that expressed vulnerabilities such as admitting mistakes, the humans on the team communicated with each other more and also reported having a more positive group experience. In groups where a robot on the team merely recited the game’s score or voiced other neutral statements, the humans on the team were not as communicative with each other.
"We know that robots can influence the behavior of humans they interact with directly, but how robots affect the way humans engage with each other is less well understood," said Margaret L. Traeger, a Ph.D. candidate in sociology at the Yale Institute for Network Science (YINS) and the study's lead author. "Our study shows that robots can affect human-to-human interactions."
👉 Ay, I think you should consider doing this
AI trained on historical data picks up human biases because those biases are reflected in the data. Amazon once tried to build an AI hiring tool, hoping to eliminate human bias from the selection process. After training the algorithm on ten years of Amazon hires, the tool exhibited gender bias: it had learned from the data set that Amazon preferred men over women. Amazon eventually scrapped the project.
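To see how a model inherits bias without anyone programming it in, here is a toy sketch with entirely made-up numbers (the penalty, the token, and the population are hypothetical, not Amazon’s actual data). The “historical” hiring decisions apply a stricter bar to résumés containing a gendered token, and a naive model that scores candidates by historical hire rates simply memorises that bias:

```python
import random

random.seed(0)

# Synthetic "historical hires": the past decisions apply a higher skill
# bar to résumés containing a gendered token (e.g. "women's chess club").
def make_applicant():
    has_womens_token = random.random() < 0.5
    skill = random.random()
    hired = skill > (0.7 if has_womens_token else 0.5)  # biased decision
    return has_womens_token, hired

history = [make_applicant() for _ in range(10_000)]

# A naive "model": score a candidate by the historical hire rate of
# applicants sharing their token value — it reproduces the bias exactly.
def hire_rate(token_value):
    group = [hired for tok, hired in history if tok == token_value]
    return sum(group) / len(group)

print(f"hire rate without token: {hire_rate(False):.2f}")
print(f"hire rate with token:    {hire_rate(True):.2f}")
```

The candidates were equally skilled by construction; the gap in the printed rates comes entirely from the biased historical labels, which is exactly what retraining on past hires hands to a model.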
Humu Inc., a California startup, is betting on AI nudging humans to help address identified problems [The New York Times]. Their system delivers nudges that suggest subtle alternatives to influence human behaviour.
With Humu, if data shows that employees aren’t satisfied with an organization’s inclusivity, for example, the engine might prompt a manager to solicit the input of a quieter colleague, while nudging a lower-level employee to speak up during a meeting. The emails are tailored to their recipients, but are coordinated so that the entire organization is gently guided toward the same goal.
Unlike Amazon’s hiring algorithm, the nudge engine isn’t supposed to replace human decision-making. It just suggests alternatives, often so subtly that employees don’t even realize they’re changing their behavior.
The part that troubles me here is “employees don’t even realize they’re changing their behaviour.” If these nudges rest on psychological manipulation, they could also be used in nefarious ways. And if people don’t realize their behaviour is being changed by that manipulation, that is dangerous.
+ Artificial Intelligence Will Do What We Ask. That’s a Problem. [Quanta Magazine] - Stuart Russell, a computer scientist at Berkeley is working with a team to teach robots how to learn the preferences of humans who never articulated them and perhaps aren’t even sure what they want. This is a really good read and brings up important questions in understanding humans.
📱 Tech in the age of coronavirus
The most worrying development this week on the technology front of the coronavirus battle is the invasion of privacy by various governments worldwide in the name of combating the virus. Regular readers of this newsletter know that I’m a big advocate of privacy, and this news worries me.
📍 Your privacy is up for grabs
🇮🇱 Israel’s prime minister Benjamin Netanyahu said the government would approve emergency regulations allowing the use of a treasure trove of cellphone location data for 30 days [The New York Times], with the attorney general’s permission. This data has been gathered over many years to combat terrorism.
The idea is to sift through geolocation data routinely collected from Israeli cellphone providers about millions of their customers in Israel and the West Bank, find people who came into close contact with known virus carriers, and send them text messages directing them to isolate themselves immediately.
🇩🇪 Germany’s main public health body suggested that smartphone location data could be used to track people as a tool for curbing the spread of the coronavirus there. What about the GDPR, you say? Germany recently inserted wording into its national data protection law enabling the processing of personal data in the event of an epidemic or natural and man-made catastrophes. Convenient.
🇬🇧 In the UK, telecom provider O2 is in discussions with the government [The Telegraph] to use cellular data to generate anonymous heat maps that help authorities understand the movement of people in certain areas. The data will be used to build models to curb the spread of the virus. O2 says all data provided will be anonymised and will comply with the GDPR and privacy regulations.
🇺🇸 The U.S. government is also in discussions with the tech industry, including Facebook and Google, as well as health experts, on how to use location data to combat the coronavirus [The Washington Post]. They are actively discussing how aggregated, anonymised location data could be used to gain insights into coronavirus hotspots, where the virus is likely to spread, and the impact of social distancing.
The China and South Korea links below were shared in Issue #19 of this newsletter, but I’m re-posting them here as they are on the same topic.
🇨🇳 China is using mass surveillance to try and contain the coronavirus [The New York Times] - This doesn’t bode well for the future. The history of other monitoring tools launched by China during major events like the 2008 Beijing Olympics and the 2010 Shanghai World Expo shows that these tools remain in use long after the events are over.
🇰🇷 South Korea is watching quarantined citizens with a smartphone app [MIT Technology Review] - The South Korean government is using GPS via the app to ensure those who have been ordered to stay at home do not break quarantine. South Korea is also sending “safety guidance texts” throughout the day, reminding people to wash their hands, etc. When a new person is discovered to be infected, the texts provide a link to the locations that person visited before being diagnosed. This is fuelling rumours of extra-marital affairs, and other personal and private matters are being exposed [The Guardian].
❗️ The dangers of taking privacy for granted
I can see the reasoning behind using location data to help prevent the spread of the virus. We are in unprecedented times and many lives are at stake. If it is solely used for the purpose of curbing the virus and stops immediately after, I would fully support the use of location data. My concern is that this invasion of privacy will outlive the virus and remain entrenched as a way of life.
Look at the air travel security regulations still in place long after 9/11. Not allowing liquids over 3.4 ounces on a flight made absolutely no sense, and the restriction was suddenly lifted in the blink of an eye [Slate] so that people can carry a 12-ounce bottle of hand sanitizer to combat the coronavirus.
The TSA can declare this rule change because the limit was always arbitrary, just one of the countless rituals of security theater to which air passengers are subjected every day.
History shows that surveillance measures put in place to address a particular need are rarely rolled back once that need is gone.
👤 Anonymised location data is almost meaningless
There are only so many people who stay at your home for more than 8 hours a day, and even fewer who travel between your home and your workplace regularly. Figuring out exactly who that “anonymous” dot is from even a small amount of location data is trivial. If you’re still not convinced, read this article - Twelve Million Phones, One Dataset, Zero Privacy [The New York Times].
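A toy sketch makes the point concrete. All the numbers below are hypothetical (a made-up city of 100,000 people on a coarse location grid): even with names stripped, the home/work pair alone almost always narrows a trace down to a single person.

```python
import random

random.seed(42)

# Hypothetical city: 100,000 residents, each with a home cell and a work
# cell on a 200x200 grid (standing in for cell-tower or GPS bins).
GRID = 200
population = [
    (pid,
     (random.randrange(GRID), random.randrange(GRID)),   # home cell
     (random.randrange(GRID), random.randrange(GRID)))   # work cell
    for pid in range(100_000)
]

def reidentify(home, work):
    """Return everyone whose home/work pair matches an 'anonymous' trace."""
    return [pid for pid, h, w in population if h == home and w == work]

# Take one person's "anonymised" trace (no name, just two locations)
# and see how many residents it could possibly belong to.
target = random.choice(population)
matches = reidentify(target[1], target[2])
print(f"candidates matching the trace: {len(matches)}")
```

With 40,000 possible cells, there are 1.6 billion home/work combinations for only 100,000 people, so collisions are vanishingly rare: stripping the name does almost nothing once two habitual locations are known.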
Anonymised + aggregated data is better, but neither Facebook, nor Google, nor telecom companies, nor governments have the best track record when it comes to protecting or respecting your privacy. I fear that once this line is breached, it will be the new normal, long after the virus is eradicated.
Albert Gidari, director of privacy at Stanford Law School’s Center for Internet and Society, sums this up perfectly in a thread on Twitter.
🗞 More articles on Coronavirus + Tech:
+ The Coronavirus Simulator [The Washington Post] - Some very cool simulations to understand how social distancing helps to “flatten the curve.”
+ Doctors Have Now Mapped How Our Immune System Fights COVID-19 [Science Alert] - This is useful for two reasons. First, the information helps us develop vaccines that imitate the body’s natural immune response. Second, in future disease outbreaks, predictions about which patients are likely to have mild symptoms and which are at risk of dying will be more accurate.
+ Coronavirus Roils 2020 Race, Forcing a Virtual Campaign Season [Bloomberg] - The US presidential race may be fought almost entirely online.
+ Facebook Is Clamping Down On Coronavirus Misinformation In English, But Hoaxes Are Going Viral In Other Languages [Buzzfeed] - An endless game of cat and mouse.
+ US Just Started The First Human Trial of a Vaccine For The New Coronavirus [Science Alert] - Fingers crossed this works.
+ Safaricom waives Sh1,000 M-Pesa transaction fees [Business Daily Africa] - The most popular mobile money service in the world is cutting transaction fees to help their customers go cashless and help curb the spread of coronavirus.
+ How the CDC Botched Basic Science in Its Attempt to Make a Coronavirus Test [Vice] - “It actually boggles the mind,” one expert wrote on Twitter, describing the magnitude of the CDC’s mistake.
Quote of the week
“The way things are supposed to work is that we're supposed to know virtually everything about what they [the government] do: that's why they're called public servants. They're supposed to know virtually nothing about what we do: that's why we're called private individuals.”
― Glenn Greenwald
I wish you a brilliant day ahead :)