Controversies

An article on ChatGPT that is worth reading, though likely flawed. The fact that Noam Chomsky contributed and included his own theory of language is noteworthy.

In short, ChatGPT and its brethren are constitutionally unable to balance creativity with constraint. They either overgenerate (producing both truths and falsehoods, endorsing ethical and unethical decisions alike) or undergenerate (exhibiting noncommitment to any decisions and indifference to consequences). Given the amorality, faux science and linguistic incompetence of these systems, we can only laugh or cry at their popularity.

The following video provides a fascinating look into the gradual demise and transformation of the electronics giant Philips. Since I’ve seen the brand so often throughout my life, the video had a different impact on me, even though the story is probably similar to that of many other older tech companies:

Interestingly, recent events have drawn global political attention to ASML, a major player in the microchip industry and one of the companies connected to Philips. See this article on politico.eu for information on the Dutch government’s announcement of new export restrictions on ASML’s products to China.

This article from The New York Times discusses a controversy over the Indonesian government’s claim for the restitution of prehistoric objects held by the Dutch museum Naturalis.

While art museums have been grappling since the 1990s with claims that they hold or display looted Nazi art, and ethnographic museums have faced repatriation claims from African nations and Indigenous people worldwide, the Java Man case pushes restitution into the realm of the natural history museum — where it hasn’t been much of an issue until now.

Two interesting pieces on the dangers and uselessness of blockchains:

  • Bruce Schneier responding to comments made by Matthew Green on a letter written by academics and professionals to the US Congress, urging it to approach cryptocurrencies critically.
  • A rather polemical lecture from a computer security course by Nicholas Weaver on blockchain and cryptocurrencies: [http://web.archive.org/web/20220625191347/https://www.youtube.com/watch?v=J9nv0Ol-R5Q].

Clive Thompson makes some interesting remarks about the story of the Google engineer Blake Lemoine, who became convinced that Google’s conversation technology LaMDA is sentient. In a blog post he references the work of Sherry Turkle, who showed how humans perceive robots as more real when robots seem needy:

This is something I’ve learned from the work of Sherry Turkle, the famous MIT scientist who studies the relationship between humans and machines. Turkle has studied a ton of robot-human interactions, and talked to a lot of users (and designers) of robots that are designed for human companionship— i.e. toy-robot babies, or toy-robot animals.

One thing she noticed? The more that a robot seems needy, the more real it seems to us.

Facebook/Meta is shutting down its facial recognition system. They explain their choice in this blog post.

But the many specific instances where facial recognition can be helpful need to be weighed against growing concerns about the use of this technology as a whole. There are many concerns about the place of facial recognition technology in society, and regulators are still in the process of providing a clear set of rules governing its use. Amid this ongoing uncertainty, we believe that limiting the use of facial recognition to a narrow set of use cases is appropriate.

This article from The Economist is worth a read to understand their position on brain research and the use of primates in studies:

And were laboratories in China and Japan to come up with treatments for neurological diseases such as Alzheimer’s as a result of their studies of monkeys’ brains, it would be near impossible for Western countries to refuse to buy them to treat their citizens. Leaving others to do the dirty work of generating knowledge using means you consider to be unethical, while at the same time encouraging it by adding to demand is not taking the moral high ground. It is hypocrisy. Better for Western countries to carry out the necessary but troubling research themselves, working to the standards they deem necessary.

Surprising conclusions from Twitter on a recent controversy about their image-cropping algorithm’s bias towards white people and women.

We considered the tradeoffs between the speed and consistency of automated cropping with the potential risks we saw in this research. One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.

Via The Register.

In a press release, NVIDIA announced that it is apparently starting to change its GPUs to reduce their Ethereum hash rate and hence try to prevent them from being used for crypto mining.

The Financial Times notes how the technology has been appropriated for crypto mining, causing shortages for video-gamers:

Nvidia’s graphics processing units, or GPUs, were first designed to handle the demands of rendering video images in real time, a challenge given the large amount of data that needs to be processed simultaneously. The same technology has since been adapted to become the workhorse of artificial intelligence, one of the most data-intensive computing tasks, as well as crypto mining.
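
To make the FT’s point about parallelism concrete, here is a minimal CUDA sketch of my own (not NVIDIA code): the same trivial operation is applied to a million pixels at once, one GPU thread per element. The scale_pixels kernel is a hypothetical example; replace its body with a hash computation and the same structure is roughly what makes GPUs attractive for crypto mining.

    // Minimal CUDA sketch of GPU data parallelism (illustrative only).
    // Each thread processes one element; a modern GPU runs thousands of
    // such threads concurrently, which is why the same hardware suits
    // rendering, AI and crypto mining alike.
    #include <cuda_runtime.h>
    #include <stdio.h>

    // Hypothetical kernel: brighten every pixel by a constant gain.
    __global__ void scale_pixels(float *pixels, float gain, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
        if (i < n)
            pixels[i] *= gain;  // one thread, one pixel
    }

    int main(void) {
        const int n = 1 << 20;  // ~1 million pixels
        float *d_pixels;
        cudaMalloc(&d_pixels, n * sizeof(float));
        cudaMemset(d_pixels, 0, n * sizeof(float));

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        scale_pixels<<<blocks, threads>>>(d_pixels, 1.5f, n);
        cudaDeviceSynchronize();

        printf("processed %d pixels in parallel\n", n);
        cudaFree(d_pixels);
        return 0;
    }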

As the FT further points out, NVIDIA created a new line of chips called Cryptocurrency Mining Processors.

An article from The Economist describes how drone technology has made its way from the battlefields of Iraq to American cities:

The notion of putting cameras on orbiting drones to catch malefactors was born on the battlefields of Iraq, where American armed forces wanted to nab people leaving bombs on roadsides. Ross McNutt, a former air-force engineer, founded Persistent Surveillance Systems (PSS) to offer the same service to American cities (and others, such as Juárez) struggling with high murder rates. PSS drones flew over parts of Baltimore, most recently in May-October 2020. St Louis, among America’s most violent cities, also considered but is poised to reject PSS’s services, which raise difficult questions about how much surveillance Americans are willing to tolerate in exchange for the promise of safer streets.