Facebook learnt how to make its services addictive from Big Tobacco / Humans + Tech - #47
+ AI system detects loneliness in natural speech patterns + Russian scientist still plotting to create more CRISPR babies + Zoom’s and Twitter’s algorithms are biased against black faces
I was unable to send a newsletter last week due to some other commitments. Here are the interesting articles I found over the last couple of weeks.
Former Facebook manager: “We took a page from Big Tobacco’s playbook”
Tim Kendall, Director of Monetisation for Facebook from 2006 through 2010, speaking to the US Congress [Ars Technica]:
"The social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity," Kendall said in his opening testimony (PDF). "At the very least, we have eroded our collective understanding—at worst, I fear we are pushing ourselves to the brink of a civil war."
As director of monetization, he added, "We sought to mine as much attention as humanly possible... We took a page from Big Tobacco's playbook, working to make our offering addictive at the outset."
"There's no incentive to stop [toxic content] and there's incredible incentive to keep going and get better," Kendall said. "I just don't believe that's going to change unless there are financial, civil, or criminal penalties associated with the harm that they create. Without enforcement, they're just going to continue to be embarrassed by the mistakes, and they'll talk about empty platitudes... but I don't believe anything systemic will change... the incentives to keep the status quo are just too lucrative at the moment."
Several Facebook employees have also recently quit the company publicly, citing the company’s indifference to inflammatory, racist, and violent posts from Donald Trump, and its willingness to profit off hate.
AI system detects loneliness in natural speech patterns
Researchers at the University of California San Diego School of Medicine have developed a method to detect loneliness with 94% accuracy in older adults through Natural Language Processing (NLP), an AI technique [New Atlas].
Natural language processing (NLP) is an umbrella term encompassing a variety of techniques that process or analyze large volumes of unstructured natural speech and text. As artificial intelligence and machine learning systems have advanced, a number of fascinating preliminary studies have begun to suggest conditions such as psychosis, PTSD, bipolar disorder and depression may all be detected just by analyzing a person’s natural speech.
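The article doesn’t detail the researchers’ actual models, but to give a flavour of how this kind of NLP classification works in general, here is a toy sketch: bag-of-words features from speech transcripts fed into a small naive Bayes classifier. The transcripts, labels, and function names below are all invented for illustration; they are not the study’s method or data.

```python
# Toy sketch of NLP-based classification of speech transcripts
# (invented example, not the UC San Diego study's actual pipeline).
import math
from collections import Counter

def train(docs, labels):
    """Count word frequencies per class for a naive Bayes model."""
    counts = {0: Counter(), 1: Counter()}
    priors = Counter(labels)
    for doc, label in zip(docs, labels):
        counts[label].update(doc.lower().split())
    return counts, priors

def predict(doc, counts, priors):
    """Return the class with the higher log-posterior score."""
    vocab = set(counts[0]) | set(counts[1])
    scores = {}
    for label in (0, 1):
        total = sum(counts[label].values())
        score = math.log(priors[label] / sum(priors.values()))
        for word in doc.lower().split():
            # Laplace smoothing so unseen words don't zero out the score
            score += math.log((counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Invented toy transcripts; 1 = lonely-sounding, 0 = not
transcripts = [
    "i feel so alone lately nobody visits anymore",
    "spent a lovely afternoon with friends and family",
    "no one calls me and the house feels empty",
    "we had dinner together and laughed all evening",
]
labels = [1, 0, 1, 0]

counts, priors = train(transcripts, labels)
print(predict("i feel alone and nobody calls", counts, priors))  # 1
```

Real systems like the one in the study would use far richer linguistic features and validated clinical labels, but the basic shape — turn speech into features, then classify — is the same.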
Now, a team of researchers is investigating whether these NLP tools can detect loneliness, a growing health concern that has been described as a bigger risk factor for premature mortality than obesity. Ellen Lee, senior author on the new research, suggests loneliness is a particularly difficult psychiatric condition to measure, and because doctors generally struggle to quantify it in patients, there is a pressing need for some kind of objective measure.
Another interesting finding from the study is that men and women use different verbal cues to express loneliness. The researchers are working on further studies to personalize these tools so they can help people in real time.
“Eventually, complex AI systems could intervene in real-time to help individuals to reduce their loneliness by adopting positive cognitions, managing social anxiety, and engaging in meaningful social activities,” the researchers boldly conclude in the new study.
Russian scientist still plotting to create more CRISPR babies
There was a worldwide public outcry in 2018 when Chinese researcher He Jiankui announced he had created gene-edited twins using CRISPR. Now, Russian biologist Denis Rebrikov of Pirogov Medical University in Moscow says he still plans to gene-hack human embryos in an attempt to prevent congenital deafness [Futurism]. And unlike most other countries, Russia doesn’t ban the practice outright, so he may even get away with it.
“We are still planning to correct the inherited hearing loss mutation in [the gene] GJB2, so that a hearing baby is born to a deaf couple,” Rebrikov told the magazine.
The International Commission on the Clinical Use of Human Germline Genome Editing was created by a team of doctors after He Jiankui’s announcement. It recently concluded that human gene editing is still unsafe, especially when bringing an edited embryo to term, and suggested it should only be used for life-saving purposes.
Zoom’s and Twitter’s algorithms are biased against black faces
Colin Madland was trying to figure out why Zoom kept removing his black colleague’s head whenever he used a virtual background, and in the process stumbled upon a similar bias in Twitter’s algorithms.
When a large picture is posted on Twitter, Twitter’s algorithms crop the image to show only a small portion that fits in the tweet. The full image can be viewed by clicking on the cropped image.
After his discovery, many Twitter users tried this with different types of pictures, including cartoon characters. The algorithm consistently favoured white faces in the cropped previews.
Twitter is looking into this and has promised to improve its algorithms. I encourage you to read Colin’s thread, as he highlights many articles on the systemic, algorithmic racism that black people continue to be subjected to, which has even led to wrongful convictions.
Quote of the week
Instead, the social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding—at worst, I fear we are pushing ourselves to the brink of a civil war.
—Tim Kendall, Director of Monetisation for Facebook (2006 - 2010), in testimony to the US Congress, 24 Sep 2020
I wish you a brilliant day ahead - this is much more likely if you stay off social media :)