I Called Off My Wedding. The Internet Will Never Forget / Humans + Tech - #75
+ In an AI world we need to teach students how to work with robot writers + Encryption Has Never Been More Essential—or Threatened + Other interesting articles
Hi,
I don’t ask often, but if you enjoy Humans + Tech, please share it with those who you think will enjoy it too.
I’m only interested in engaged readers, not mass subscribers, so please send it only to people you think will genuinely enjoy these topics.
Thank you in advance, and a special thanks to those of you who have already been sharing previous issues.
Onto this week’s articles.
I Called Off My Wedding. The Internet Will Never Forget
Lauren Goode relates how, more than a year after she and her ex called off their wedding, technology still constantly reminds her of it. Pinterest, Google Photos, Apple Photos, Facebook, and the many apps she used for wedding planning all keep resurfacing memories from her past that she would rather forget [Lauren Goode, WIRED].
Omar Seyal, who runs Pinterest’s core product, explained it to her in this way:
“We call this the miscarriage problem,” Seyal said, almost as soon as I sat down and cracked open my laptop. I may have flinched. Seyal’s role at Pinterest doesn’t encompass ads, but he attempted to explain why the internet kept showing me wedding content. “I view this as a version of the bias-of-the-majority problem. Most people who start wedding planning are buying expensive things, so there are a lot of expensive ad bids coming in for them. And most people who start wedding planning finish it,” he said. Similarly, most Pinterest users who use the app to search for nursery decor end up using the nursery. When you have a negative experience, you’re part of the minority, Seyal said.
When engineers build ad retargeting platforms, they build something that will continually funnel more content for the things you’ve indicated you’re interested in. On average, that’s the correct thing to do, Seyal said. But these systems don’t factor in when life has been interrupted. Pinterest doesn’t know when the wedding never happens, or when the baby isn’t born. It doesn’t know you no longer need the nursery. Pinterest doesn’t even know if the vacation you created a collage for has ended. It’s not interested in your temporal experience.
Lauren’s story is captivating, and the article is extremely well written. Anyone who carries painful memories can empathize with the torment these constant reminders cause her.
It’s also a stark reminder that most internet services are geared to serve advertisers and marketers rather than their users. This realisation from reading Lauren’s story saddened me.
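To make Seyal’s point concrete, here is a toy sketch, entirely my own illustration and not any real ad platform’s code, of why a naive retargeting loop never lets go: it only ever accumulates positive signals, and offers no way to say a chapter of your life has ended.

```python
# Toy illustration (not Pinterest's or anyone's actual system) of why
# naive retargeting never "lets go": interest scores only ever go up.
from collections import defaultdict

class NaiveRetargeter:
    def __init__(self):
        self.interest = defaultdict(float)  # topic -> accumulated signal

    def record_engagement(self, topic, weight=1.0):
        # Every click, pin, or search only ever *increases* a topic's score.
        self.interest[topic] += weight

    def recommend(self, n=3):
        # Top-scoring topics keep winning; there is no "this chapter of
        # my life is over" signal a user can send.
        return sorted(self.interest, key=self.interest.get, reverse=True)[:n]

ads = NaiveRetargeter()
for query in ["wedding venues", "wedding dresses", "florists", "wedding cakes"]:
    ads.record_engagement(query)

# Months after the wedding is called off, the scores are untouched:
print(ads.recommend())
```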
In an AI world we need to teach students how to work with robot writers
Natural Language Processing systems like GPT-3 from OpenAI can produce text that is almost as good as, and sometimes on par with, human-written text. These systems are now available to the public. Lucinda McKnight argues that instead of penalising students for using them, the education system should embrace them and teach students how to write with machines [Lucinda McKnight, The Conversation].
Tools such as Turnitin — originally developed for detecting plagiarism — are already using more sophisticated means of determining who wrote a text by recognising a human author’s unique “fingerprint”. Part of this involves electronically checking a submitted piece of work against a student’s previous work.
Many student writers are already using AI writing tools. Perhaps, rather than banning or seeking to expose machine collaboration, it should be welcomed as “co-creativity”. Learning to write with machines is an important aspect of the workplace “writing” students will be doing in the future.
She also brings up the inequalities this may cause. Unless schools provide all students access to the same systems, well-off students will have access to more sophisticated AI writers that sound more natural. There is also the question of who gets credit for, and who is legally liable for, machine-assisted content. Definitely interesting points to ponder.
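For a sense of how the “fingerprint” approach mentioned above can work, here is a minimal sketch of one common stylometric technique, character n-gram similarity. This is my own simplified illustration with made-up example texts; Turnitin’s actual method is proprietary.

```python
# A minimal sketch of one common authorship-verification technique:
# compare character n-gram profiles of a new submission against a
# student's previous work. (Illustrative only; Turnitin's actual
# method is proprietary.)
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

previous_work = [
    "In my last essay I argued that renewable energy adoption is rising.",
    "This report examines how renewable incentives shaped local policy.",
]
new_submission = "Herein we expound upon the multifarious ramifications thereof."

# Character 3- to 5-grams capture habits like punctuation, spelling,
# and function-word use that tend to persist across an author's texts.
vec = TfidfVectorizer(analyzer="char", ngram_range=(3, 5))
matrix = vec.fit_transform(previous_work + [new_submission])

# Similarity of the submission to each prior piece; an unusually low
# score is a weak, fallible hint that the "fingerprint" has changed.
scores = cosine_similarity(matrix[-1], matrix[:-1])
print(scores.round(2))
```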
Encryption Has Never Been More Essential—or Threatened
Over the last two decades, as communications have increasingly moved to digital, governments worldwide, including democracies, have been seeking backdoors into devices and apps. The problem is that once a backdoor is created, the bad guys will also find it and exploit it [Will Cathcart, WIRED].
Some governments are honestly trying to fight crime and looking to the dramatic increase in technology in our lives as a potential source of new evidence. Their criticism is that end-to-end encryption makes it harder for law enforcement to find evidence of a crime, and harder for companies to monitor people’s calls and messages to refer to law enforcement. But this is looking at a problem in isolation. It was never possible or easy to access most people’s private conversations when they were happening physically instead of digitally. We should not assume that just because technology makes something easier to do, we should do it.
We intuitively understand this when we think of physical spaces. Some of the most tragic crimes happen in the privacy of people’s homes. That doesn’t mean we would let the government put a surveillance camera in every house with a remote-controlled on/off switch. For the same reason, we should not build a means to silently monitor billions of private conversations just because we could.
But the greatest danger Cathcart points out is what it would do to our behaviour as individuals if we were constantly being watched and no conversation we had, written or spoken, were private.
In the last century, Hannah Arendt helped us understand totalitarianism as the elimination of privacy by the state. I fear that if we abandon or weaken the tools that preserve our privacy and security, censorship will come not from above, but from within.
Imagine if your government, or a foreign one, could see every transaction you made, or if your boss could see every text message you wrote or photo you sent. What if your friends could see every question you asked your doctor?
That’s the greatest risk of all: No matter how well-meaning the motivation, surrendering our privacy would paralyze us.
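For readers who want to see what end-to-end encryption actually means in code, here is a minimal sketch using the PyNaCl library. This is my own illustration, not WhatsApp’s protocol (which is the far more elaborate Signal protocol); the point is that the private keys exist only on the two endpoints, so there is no middle point where a monitoring hook could read the message.

```python
# A minimal sketch of end-to-end encryption with PyNaCl (illustrative
# only; real messengers use the more elaborate Signal protocol).
# The key point: private keys live only on the two endpoints.
from nacl.public import PrivateKey, Box

alice_secret = PrivateKey.generate()   # never leaves Alice's device
bob_secret = PrivateKey.generate()     # never leaves Bob's device

# Alice encrypts with her secret key and Bob's public key.
ciphertext = Box(alice_secret, bob_secret.public_key).encrypt(b"meet at 6?")

# Any server relaying `ciphertext` sees only random-looking bytes.
# Only Bob's secret key can open it:
print(Box(bob_secret, alice_secret.public_key).decrypt(ciphertext))

# A "backdoor" would mean a third key that can also decrypt, and any
# party who obtains that key, lawful or not, gains the same power.
```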
+ The UK Is Trying to Stop Facebook's End-to-End Encryption [Gian Volpicelli, WIRED UK]
Case in point: the UK Home Office is attempting to dissuade Facebook from rolling out end-to-end encryption across all its messaging apps.
Other interesting articles from around the web
😳 Scientists create online games to show risks of AI emotion recognition [The Guardian]
Emotion recognition is finding its way into many products, but it is dangerous because it is both inaccurate and racially biased. To raise awareness of these flaws, researchers have built an online game that you can try for yourself.
A team of researchers have created a website – emojify.info – where the public can try out emotion recognition systems through their own computer cameras. One game focuses on pulling faces to trick the technology, while another explores how such systems can struggle to read facial expressions in context.
Their hope, the researchers say, is to raise awareness of the technology and promote conversations about its use.
The site notes that “no personal data is collected and all images are stored on your device”.
🤘 There’s a new Nirvana song out, and it was written by Google’s AI [Vanessa Bates Ramirez, Singularity Hub]
If I had heard this song before reading the headline, I wouldn’t have been able to tell that this wasn’t really Nirvana. The words, the music, the melodies are just like Nirvana’s songs. Listen to the song at the link.
Here’s how a computer was able to write a song in the unique style of a deceased musician. Twenty to 30 tracks of music were fed into Magenta’s neural network in the form of MIDI files. MIDI stands for Musical Instrument Digital Interface, and the format contains the details of a song written in code that represents musical parameters like pitch and tempo. Components of each song, like vocal melody or rhythm guitar, were fed in one at a time.
The neural network found patterns in these different components, and got enough of a handle on them that when given a few notes to start from, it could use those patterns to predict what would come next; in this case, chords and melodies that sound like they could’ve been written by Kurt Cobain.
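As a rough illustration of the pipeline described above, here is a toy version in Python: read pitches out of a MIDI file with the pretty_midi library, then learn which note tends to follow which. A simple Markov chain stands in here for Magenta’s far more capable neural network, and “track.mid” is a placeholder file name.

```python
# A toy version of the pipeline described above: extract note pitches
# from a MIDI file, then learn which pitch tends to follow which.
# (A Markov chain stands in for Magenta's far more capable neural
# network; "track.mid" is a placeholder file name.)
import random
from collections import defaultdict
import pretty_midi

pm = pretty_midi.PrettyMIDI("track.mid")
melody = pm.instruments[0]  # e.g. the vocal-melody track, fed in alone
pitches = [n.pitch for n in sorted(melody.notes, key=lambda n: n.start)]

# Count how often each pitch follows each other pitch.
transitions = defaultdict(list)
for current, nxt in zip(pitches, pitches[1:]):
    transitions[current].append(nxt)

def continue_melody(seed, length=16):
    # Given a few seed notes, predict what "sounds like" it comes next.
    out = list(seed)
    for _ in range(length):
        choices = transitions.get(out[-1]) or pitches  # fall back if unseen
        out.append(random.choice(choices))
    return out

print(continue_melody(pitches[:4]))
```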
This is all part of a project by Over the Bridge, a Toronto-based organisation focused on mental health in the music industry. The song appears on an album called The Lost Tapes of the 27 Club.
Cobain isn’t the only musician the Lost Tapes project tried to emulate; songs in the styles of Jimi Hendrix, Jim Morrison, and Amy Winehouse were also included. What all these artists have in common is that they died at the age of 27, the grim coincidence the “27 Club” is named after.
Quote of the week
The internet is clever, but it’s not always smart. It’s personalized, but not personal. It lures you in with a timeline, then fucks with your concept of time. It doesn’t know or care whether you actually had a miscarriage, got married, moved out, or bought the sneakers. It takes those sneakers and runs with whatever signals you’ve given it, and good luck catching up.
—Lauren Goode, from the article, “I Called Off My Wedding. The Internet Will Never Forget” [WIRED]
I wish you a brilliant day ahead :)
Neeraj