Open for Interpretation: a Black Box Gallery
ONGOING
Public gallery at SEAS

Against unassuming black facades, the spirit of experimental theater thrives in the black box, defying tradition, forging the avant-garde in the performing arts, and thrashing against fourth walls until they break. Here, abstraction and intimacy beget a creative, conceptual freedom; the space as-is serves as the best prop for expression.
Yet the abstractions of computational fields present a far more obfuscated, confused reality: through complex algorithms and convoluted processing, black-box models classify and predict without our needing to understand their internals. Given a growing awareness of the biases and risks this black-box approach carries, however, we are no longer satisfied with the compromise. Interpretability and explainability – two subfields of machine learning dedicated to developing models whose calculations and outputs can be understood in explicit language – have recently emerged as ways of disambiguating the translation of human input into machine output.
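To make the distinction concrete, here is a minimal sketch (assuming Python with scikit-learn; the dataset and models are purely illustrative, not part of any exhibited work) contrasting a black-box classifier with an interpretable one:

```python
# Illustrative sketch: a black-box model versus an interpretable one.
# Both classify the same data, but only the shallow decision tree can
# narrate its reasoning in explicit, human-readable language.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
feature_names = load_iris().feature_names

# Black box: hundreds of trees vote; the aggregate decision resists
# any concise explanation of "why".
black_box = RandomForestClassifier(n_estimators=300).fit(X, y)
print(black_box.predict(X[:1]))  # an answer, but no account of its reasoning

# Interpretable: a depth-2 tree whose every prediction is a short
# chain of explicit threshold rules.
glass_box = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(glass_box, feature_names=feature_names))
```

The printed rules of the shallow tree are exactly the kind of "explicit language" interpretability research pursues; the forest, by contrast, answers correctly while keeping its reasoning opaque.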
With the opening of our student-directed gallery, Conflux asks visitors, artists, and technologists alike to think outside the white cube and inside these black boxes. We transform the museum into a participatory environment where visitors not only view artworks but also interact with them, changing each work’s initial conditions for the next viewer and fostering a more dynamic relationship between the artist’s intention, the viewer’s interpretation, and the methods of co-creation. We also hope the space gives students a venue for interrogating whether open-sourcing the artistic process yields the same community benefit in creative fields as it does in research: by showcasing extensive documentation of process, we aim to encourage more reflective approaches to data/code art, build a repository that helps students start making their own data/code art, and spark campus discourse on algorithmic interpretability and explainability.
The principles for operation are:
- All methods (e.g., code, construction) must be documented and made open source to visitors (access may be restricted to members of the Harvard community at the artist’s discretion)
- No gallery labels may be used
- Artworks should place an emphasis on interactivity and participatory viewing
- All art shown must be open to interpretation
Team
(alphabetized)

Project Lead & Concept
Sofia Chen ‘26
Hannah Nguyen ‘27
Technology Lead
John Vizhco Leon ‘26
Aida Baradari ‘25
Ida Chen ‘27
Peggy Yin ‘25
Advised by Prof. Martin Wattenberg