
Black boxes, knowledge gaps, and mystic abysses

Felix Hunter Green

22/5/2023

Future Reference

‘Black Boxes’ serve a unique role in the contemporary imagination. From theatre design to aviation and AI platforms, the appearance of the language of black boxes tends to signify that a knowledge or understanding gap has either emerged or been engineered. This article uses both physical and digital examples to explore what the various faces of this fluid metaphor can teach designers about expectations of control and accountability in emerging digital contexts.

Kazimir Malevich, Black Square (1915). Public domain, via Wikimedia Commons

The public release of OpenAI’s artificial intelligence (AI) chatbot ChatGPT has recently brought AI to the forefront of the public imagination. Alongside mass fascination with its capabilities and potential uses, its rollout has been accompanied by ardent discussions around the legibility, trustworthiness, accountability, and even agency of AI programmes. For specialists, these issues are far from new, and the design-inflected question of AI explainability has been a pressing concern for programmers and user-interface experts for some time [1].

These recent debates have seen a resurfacing of the language of ‘black boxes’ in a broad public forum. In this context, the phrase is often used critically to conceptualise an understanding gap between a system and its users. It refers to an unknowable space that emerges when a system cannot easily ‘show its working’ to either its users or its designers. For many, the accusation that a platform is, or incorporates, a black box relates to the impossibility of full control or oversight over it, which typically arises from a lack of comprehension of the inner workings of that system. Prompts go into a black-box algorithm, and information comes out, but the connection between the two cannot be fully understood, even by its programmers [2].

In other words, the computational metaphor of a black box is not associated with colour or form, but with the notion that a system’s output cannot necessarily be deciphered by analysing its inputs. It operates as an unknowable function in the passage of information. The sense of it performing like a ‘box’ has little to do with storage; rather, it relates to an intractable containment of hidden knowledge that creates ethically significant problems of causality (cause and effect) and accountability. For similar, largely symbolic, reasons, the terminology of black boxes finds another well-known (mis)use in the field of aviation. Again, the persistent metaphor is associated with the containment of something, in this case the rarefied information about events that transpired in the final minutes of an ill-fated aircraft. In both, the black box metaphor appears at a moment of uncertainty between causes and effects.

The design of theatre auditoriums can help to conceptualise some of the consequences of living with black boxes at a human scale and in a spatial sense. In his influential book Suspensions of Perception: Attention, Spectacle and Modern Culture, cultural theorist Jonathan Crary points to the adaptations that Richard Wagner made to the design of the Festspielhaus in Bayreuth as a turning point in the dramatist’s ability to dominate audience attention [3]. This purpose-built festival hall, opened in 1876, saw Wagner make now-famous infrastructural interventions that would, he hoped, encourage his audiences to engage with the fictional worlds presented onstage in a more absorbed, even hypnotic way. Removing the sideways-facing booths from the seating, visually shielding his orchestra from the audience, and dimming the lights in the auditorium are perhaps the most frequently cited examples of the type of adaptations he demanded.

Festspielhaus Bayreuth. User: 4077 at wikivoyage shared, CC BY-SA 1.0, via Wikimedia Commons

Crary, however, emphasises the significance of a less well-known innovation, an optical effect that would come to be known as Wagner’s ‘mystic abyss’, in achieving the desired totalising engrossment of his audience in the presented scene [4]. This effect – the ‘mystic abyss’ – refers to the intentional insertion of unknowable distance between the stage space and the auditorium, achieved by separating the two with a series of receding, perspective-distorting proscenium arches. This intervention disrupted all continuous sight lines between the stage space and the auditorium, perceptually and epistemologically severing the visual bonds between real space and fiction. In so doing, the mystic abyss demanded that audience members undergo a more fully realised abandonment within the presented scene. They were encouraged to ‘pick a side’ between fiction and reality in a perceptual sense.

Contemporary black box theatres, arguably and ironically, represent a move away from these hallucinatory priorities. On the one hand, some elements carry an inheritance from Wagner and early modern scenographers: their blank flexibility, typically low house-lighting, and matt-black surfaces that visually privilege the fictional space on stage. On the other hand, their frequent ‘in the round’ layout means that their audiences tend to be more self-aware and often have the impression of sharing the event space with the performers. Again, the metaphorical name ‘black box’ does not refer to their colour or shape, but rather to a more generalised aesthetic of containment of a space of fiction in a self-consistent interiority (box), supported by a humility of the playing space that bends to meet the various fictions that inhabit it (black). Unlike Wagner’s passive, hypnotised audiences, stripped of autonomy – if we are to follow Crary on this – these groups inhabit the same forum as the performers [5]. In this case, the ‘suspension of disbelief’ tends to be requested rather than insisted upon, as the border of the theatrical universe is situated close to the entrance to the auditorium rather than between proscenium arches.

In a theatrical black box – unlike an AI-powered chatbot or a flight recorder – the human element is ‘on the inside’, sharing a space and collaborating somewhat in the event that is live theatre. It might be hard to convey the full essence of what happens within a temporary theatrical universe to someone who never saw the show, but each event is always a joint venture.

The question of explainability in AI is not a settled issue in computer science, with some developers believing that too much potential is lost in the process of making an algorithm fully explainable to humans. In the context of these decisions being made away from the public forum, it is important for the rest of us to consider what costs must be paid in terms of accountability and autonomy in exchange for the enchantment and wonder earned across a mystic abyss.

AI-generated image created using OpenAI’s DALL-E platform. Prompt: ‘people inside a black box on a German hillside’.

Future Reference is a time capsule. It features opinion pieces covering current developments, debates, and trends in the built environment. Each article assesses its subject through a particular lens to offer a different perspective. For all enquiries and potential contributions, please contact cormac.murray@type.ie.

Future Reference is supported by the Arts Council through the Architecture Project Award Round 2 2022.

References

  1. AI Explainability, or Explainable AI (XAI), is the principle and practice of developing AI systems whose decision-making and reasoning can be accessed and interpreted by human users.
  2. A very thorough outline of this discussion can be found at: https://jolt.law.harvard.edu/assets/articlePDFs/v31/The-Artificial-Intelligence-Black-Box-and-the-Failure-of-Intent-and-Causation-Yavar-Bathaee.pdf.
  3. Wagner made adaptations to architect Gottfried Semper’s unrealised designs for a similar theatre, reportedly without Semper’s permission. See: https://en.wikipedia.org/wiki/Bayreuth_Festspielhaus.
  4. J. Crary, Suspensions of Perception, Cambridge MA, MIT Press, 2001, p. 252.
  5. J. Crary, 2001, p. 253.

Contributors

Felix Hunter Green

Felix Hunter Green works for the Irish Architecture Foundation in Dublin. Prior to this, he completed a PhD in Cultural Studies at the Edinburgh School of Architecture and Landscape Architecture. His research focused on immersive technologies, immersive environments, and related design practices. He taught modules in Architectural Theory (MA) and Design Informatics (MA). Earlier still, he worked in theatre design and production.
