against the term "critical ai" (not the general idea though)

The term "AI" is one of those words so well-worn by corporate marketing departments that all the meaning has been worn out of it. It has been worn until it has frayed and torn apart and ended up as little more than a term to be picked up and tacked onto things by corporate marketing departments to sell whatever technology they have thought of next.

It's not exactly a coincidence that this has happened. I don't have numbers to hand, but I'm certain that a product labelled an "AI-powered car saving lives" sells better than a "statistical model in a box on wheels responsible for not killing you;" an "AI-enhanced camera" is probably more popular than an "over-exposed camera that will mash up your photos in otherwise interesting ways;" and, in more sinister fashion, "AI-powered border safekeeping technology" is more palatable to most than "a bunch of cameras and barbed wire intended to keep refugees out."

"AI" has become an anodyne term that does more to obscure than it does to illuminate and explain. This should probably be the first observation that any "critical AI" scholar makes – at least some consideration of what the term means, where it came from, and what it is used for. Blindly accepting the term "AI" as given, and using it as the basis for analysis of modern data processing and storage systems, benefits those who use the term to obscure what they are doing, or (worse) as euphemism for the horrible consequences that their software systems can bear.

As such, the notion of the "critical AI" scholar feels like an oxymoron. "AI" carries several problematic connotations: the idea that if machine systems are "intelligent", their output need not be subjected to the same level of scrutiny as other industrial systems (planes, chemical refineries, nuclear reactors, and so on); the idea that it is something new – that we are on the frontier of some brave new world that technology is cutting through – which is usually not true (even if the current techniques are new, what they do is in effect what computer systems have always been intended to do: process large volumes of information to derive usable output); and the conflation with what the term used to mean, a sense which has since been renamed "AGI" (artificial general intelligence).

The term "AI" covers up and elides meaning. It carries unspoken assumptions – assumptions that the people using it want to implicitly perpetuate. Surely the purpose of scholarship should be to make things clearer, to improve our understanding of them. Accepting the term "AI" as given does not achieve this.

