Controversies

Surprising conclusions from Twitter on the recent controversy about their image-cropping algorithm’s bias towards white people and women:

We considered the tradeoffs between the speed and consistency of automated cropping with the potential risks we saw in this research. One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.

Via The Register.

In a press release, NVIDIA announced that it is starting to change its GPUs to reduce their Ethereum hash rate, in an attempt to prevent them from being used for crypto mining.

The Financial Times notes how the technology has been appropriated for crypto mining, causing shortages for video-gamers:

Nvidia’s graphics processing units, or GPUs, were first designed to handle the demands of rendering video images in real time, a challenge given the large amount of data that needs to be processed simultaneously. The same technology has since been adapted to become the workhorse of artificial intelligence, one of the most data-intensive computing tasks, as well as crypto mining.

As the FT further points out, NVIDIA created a new line of chips called Cryptocurrency Mining Processors.

An article from The Economist describes how drone technology has migrated from the battlefields of Iraq to American cities:

The notion of putting cameras on orbiting drones to catch malefactors was born on the battlefields of Iraq, where American armed forces wanted to nab people leaving bombs on roadsides. Ross McNutt, a former air force engineer, founded Persistent Surveillance Systems (PSS) to offer the same service to American cities (and others, such as Juárez) struggling with high murder rates. PSS drones flew over parts of Baltimore, most recently in May-October 2020. St Louis, among America’s most violent cities, also considered but is poised to reject PSS’s services, which raise difficult questions about how much surveillance Americans are willing to tolerate in exchange for the promise of safer streets.

This Bloomberg article deals with an interesting controversy surrounding the introduction of a feature called “App Tracking Transparency,” which aims to give users the option to opt out of tracking in apps. The article reports on the full-page newspaper ads which Facebook published in response. Mozilla, meanwhile, has joined the debate by publicly applauding Apple for the feature.

The Markup reviewed different examples of controversial uses of machine learning algorithms in 2020:

Every year there are myriad new examples of algorithms that were either created for a cynical purpose, functioned to reinforce racism, or spectacularly failed to fix the problems they were built to solve. We know about most of them because whistleblowers, journalists, advocates, and academics took the time to dig into a black box of computational decision-making and found some dark materials.

One of the nice things about podcasts is that you can usually still subscribe to them via RSS without being tracked. Advertisers, however, are trying to find new ways to track your listening habits and target users with ads:

Advertisers are projected to spend more than $800 million on podcasts in 2020, and companies are devising ways to provide them with data that will persuade them to spend more. The most common tactics include using IP addresses to identify users, adding tracking URLs to ads, and abandoning RSS in favor of proprietary platforms that already track their users.

Read the full article from The Markup here.
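The “tracking URL” tactic mentioned in the quote can be illustrated with a small sketch. The tracker domain and redirect scheme here are hypothetical (not any specific vendor’s API); the idea is that the audio enclosure URL in the RSS feed gets wrapped with a tracker’s redirect prefix, so every download request hits the tracker before being forwarded to the real file:

```python
# Hypothetical sketch of the "tracking URL" tactic: the enclosure
# URL in the RSS feed is wrapped with a measurement service's
# redirect prefix, so each download request reaches the tracker
# (which can log the listener's IP address and user agent) before
# being redirected to the actual audio file.
from urllib.parse import quote

TRACKER_PREFIX = "https://tracker.example.com/r/"  # hypothetical service

def wrap_enclosure(audio_url: str) -> str:
    """Return the tracker-wrapped URL that would appear in the feed."""
    return TRACKER_PREFIX + quote(audio_url, safe="")

original = "https://cdn.example.org/episodes/42.mp3"
wrapped = wrap_enclosure(original)
print(wrapped)
```

Because the podcast app follows the redirect transparently, listeners never notice the extra hop, which is exactly why the tactic works with plain RSS.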

The Register reports on a controversy surrounding Twitter’s automatic image-cropping functionality:

When previewing pictures on the social media platform, Twitter automatically crops and resizes the image to match your screen size, be it a smartphone display, PC monitor, etc. Twitter uses computer-vision software to decide which part of the pic to focus on, and it tends to home in on women’s chests or those with lighter skin. There are times where it will pick someone with darker skin over a lighter-skinned person, though generally, it seems to prefer women’s chests and lighter skin.

It seems Twitter has not come up with a technical fix. Read the full article here.
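The behaviour described in the quote is characteristic of saliency-based cropping. As a rough illustration (this is not Twitter’s actual model), a crop can be chosen by sliding a window over a saliency map and keeping the position with the highest total score; any bias in the saliency scores then carries straight into which part of the photo survives the crop:

```python
# Minimal sketch of saliency-based cropping (NOT Twitter's actual
# algorithm): given a saliency map scoring how "interesting" each
# pixel is, slide a crop window over it and keep the position with
# the highest summed saliency.
def saliency_crop(saliency, crop_h, crop_w):
    """Return (top, left) of the crop window maximising summed saliency."""
    H, W = len(saliency), len(saliency[0])
    best, best_pos = float("-inf"), (0, 0)
    for top in range(H - crop_h + 1):
        for left in range(W - crop_w + 1):
            score = sum(saliency[r][c]
                        for r in range(top, top + crop_h)
                        for c in range(left, left + crop_w))
            if score > best:
                best, best_pos = score, (top, left)
    return best_pos

# Toy example: a 6x6 map with a single bright spot near the bottom right.
sal = [[0.0] * 6 for _ in range(6)]
sal[4][4] = 1.0
print(saliency_crop(sal, 3, 3))  # → (2, 2), the first window covering the spot
```

If the saliency model systematically scores lighter skin or certain body parts higher, this maximisation step is what turns that scoring bias into a cropping bias.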

Another example of a controversy around surveillance practices: Adam Molina at SoundGuys writes that “Headphones are collecting too much personal data”. He weighs some of the conveniences that surveillance-capitalist apps bring, but he is dismayed that he cannot see how his headphones tracking his music could benefit him:

On the flip side, I don’t know what I get in return for letting my headphones know what I’m listening to. Furthermore, I can’t think of a single reason why a pair of workout earbuds need access to someone’s menstrual history. We should just call it what it is because, at that point, it doesn’t feel like a transaction anymore. It’s just spying.

Caught this post on Hacker News by chance on how, “With questionable copyright claim, Jay-Z orders deepfake audio parodies off YouTube”. The article discusses a controversy regarding the copyright of deepfake audio created by a YouTube channel. The videos themselves are absolutely fascinating: a technological showcase mixed with humour and creativity.

All videos can also be seen here.

Politico Europe reports on a controversy surrounding the use of food labels in Europe. A method developed in France that uses a “traffic light scheme” is being questioned by the Italian government. This “Nutri-Score” uses an algorithm to produce a score based on the types of nutrients a food contains. The scheme is gaining traction in several Member States, but according to the Italian government it gives some Italian gourmet foods a bad score, and Italy is therefore proposing its own alternative, which “resembles a charging light-blue battery”, to classify food. A great example of standards in the making and ontological politics.

Read the full Politico.eu articles here and here.
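For a sense of how such a scoring algorithm works: Nutri-Score assigns points for “unfavourable” nutrients (energy, sugars, saturated fat, sodium) and subtracts points for “favourable” ones (fruit/vegetables, fibre, protein), then maps the net total onto the A–E letters. The sketch below is a simplification; the band boundaries are approximate and the real scheme has category-specific rules:

```python
# Rough sketch of the Nutri-Score idea (simplified; the real
# thresholds and edge cases differ per food category):
# "unfavourable" nutrients add points, "favourable" ones subtract
# them, and the net total is mapped onto the A-E letters.
def nutri_score_letter(negative_points: int, positive_points: int) -> str:
    """negative_points: energy/sugar/saturated-fat/sodium points,
    positive_points: fruit-veg/fibre/protein points."""
    score = negative_points - positive_points
    # Approximate band boundaries for general foods.
    if score <= -1:
        return "A"
    elif score <= 2:
        return "B"
    elif score <= 10:
        return "C"
    elif score <= 18:
        return "D"
    return "E"

print(nutri_score_letter(negative_points=20, positive_points=3))  # → "D"
```

The politics sit precisely in these thresholds: a fatty, salty gourmet cheese racks up negative points that its protein content cannot offset, which is why Italy objects to the bands rather than to the arithmetic.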