Talking With the Curators of Local Exhibition ‘Against the Machine’

In a minimal white space at 1809 Chapel Hill Road, works by ten multidisciplinary local artists line the perimeter of a room. A large branch sculpture snakes across the floor; at the other end of the room, a stack of zines titled iQuit sits on a shelf, offering guidance on “kicking your iPhone addiction.”

One striking piece, by the artist Hiva Kadivar, incorporates ink and natural fibers on paper, demonstrating the “influence of indigenous weaving traditions on early computer science,” an allusion to the A.I. industry’s learning models, which offer “computational depth without social and historical depth.”

This is Against the Machine: art in the age of A.I., fascism, and climate disaster, an animating exhibit at the People’s Solidarity Hub campus curated by artists Cassandra Rowe and charla rios, the center’s curators-in-residence. The exhibit’s summer run coincided with Amazon’s early-June announcement that it is investing $10 million into a Richmond County data center, as part of an effort to expand its A.I. infrastructure. 

“Generative A.I. is driving increased demand for advanced cloud infrastructure and computer power,” a company blog post reads, above the smiling image of a young woman wearing braces and safety goggles. “This deployment of cutting-edge cloud computing infrastructure will strengthen North Carolina’s position as an innovation hub.”

The announcement gives shape to a technology that has been slowly seeping into the corners of our lives for years, adding sector-specific jobs and replacing scores of others across industries, all while shifting fundamental premises about what it means to learn, determine, and create. 

Open since May, the exhibit space for Against the Machine has had intermittent drop-in hours on Tuesday afternoons, offering an opportunity to still the noise and take thirty minutes to browse art on a topic that becomes more timely—and ubiquitous—by the day. The exhibit runs through August 22.

A piece by Rowe, “the wayback machine / you can’t take my memories,” has stuck with me since I visited. An acrylic painting of two girls in soccer jerseys is overlaid with strips of transparency film constructed from banned media; accompanying text details that the painting depicts Rowe with Lauren, her childhood best friend. Though she had a falling out with Lauren in her twenties, the text reads, the friendship was “changed—but not severed.” 

At its core, A.I. is designed to acquiesce—to respond to what it believes a user wants and needs. Rowe’s painting poses a pretty deep question: How could A.I. meaningfully engage with a friendship that didn’t survive in its original form, yet remains formative? Rowe isn’t simply questioning the ethical implications of supplementing or optimizing memories—she’s questioning the fundamental ability of the technology to engage with the nuance of such memories.

Ahead of an artist talk on the exhibition at 6:30 on July 16, the INDY spoke with rios and Rowe about censorship, the environmental impacts of A.I., and how artists can protect themselves from having their work stolen by it.

Visitors at the opening reception for Against the Machine standing in front of Derrick Beasley’s sculpture, “Conduit.” Photo by Chris Charles.

INDY: I’d love to hear how the exhibition came to be and what kind of things you had in mind while curating it. 

ROWE: We got the approval [to be curators-in-residence] in late 2024, beginning of 2025. At that time, we were making a lot of connections between A.I., fascism, and climate disaster. Several months before, we’d had Hurricane Helene, which devastated our state, and at the beginning of this year, the wildfires in LA were horrifying. I remember learning more about the energy usage of A.I. data centers and how it is contributing to climate change. I had always felt uneasy about generative A.I. and seeing how many people were using it without necessarily being very imaginative about its impacts, both in creative work as well as in the climate. 

Making those connections about the water usage and energy usage and seeing these real-time climate disasters happening—that connection started to feel really strong as Trump was coming back for his second term in office, and seeing his relationships with all of these tech giants who are all very involved in the A.I. industry. In addition to the climate, I was thinking about our roles as artists—part of the insidiousness of A.I. is that it learns from human beings and what we create, including artwork. 

rios: There are a total of ten of us as part of the exhibit who are exploring different elements of A.I. and generative A.I. in our work—you saw a lot of the pieces yesterday and got a feel of what is happening. But also, from this exhibit, we’ve expanded into doing some educational efforts, because A.I. is so nebulous and huge, but also ubiquitous, as we’ve just talked about. We had the chance to host an in-person talk that folks attended, related to A.I. in the age of creativity, and had Dr. Emily Wenger from Duke share a little bit there. We are also hosting a film night in August.

I wanted to ask about the talk with Emily Wenger—what did y’all learn about the ways that artists can protect their work from A.I.?

rios: One thing we learned was about the app that Emily co-created with others, called Glaze. Cassie had one of her works glazed, using the app—it’s essentially a free app that artists who create visual art can use to distort the image [and] protect it from being scraped by A.I., if they place it online. 

The art itself appears essentially the same to the human eye, but A.I. bots can’t read it, so they can’t learn from it and copy your style. Which is what has happened to many artists: A.I. bots are scraping their work, learning from their style, and replicating [it].

With “the wayback machine / you can’t take my memories”—you have excerpts from banned books. I’d be curious about your process for making the piece and how you conceived of the relationship between censorship, banned books, and generative A.I.

ROWE: One question that I asked myself as I was going about my artwork for this exhibit was, “What can A.I. not take from us?” I tried to answer that in different ways, with both of my pieces, but with “the wayback machine / you can’t take my memories,” I started with my memories. Even that is kind of a difficult one, because we do feed the internet so much information about our lives, including photos and memories. 

There is an essence of relationships that really cannot be summarized easily by humans, let alone robots. That friendship is one that I think of—like you said, it was very formative for me. I still dream about this person. I still think of a lot of things we laughed about when we were in fifth grade together. And I will always, always love her, even though our friendship will never be what it once was. 

I decided to add the overlay of the strips of transparency film because I also work at a domestic violence organization, and we receive federal funding. I do programming around housing for survivors of domestic violence, and some of that work involves research, and—since this second Trump administration—so much research that I might use in my daily work, and that many people who are in similar fields [might use], has been removed from the internet. It’s been removed both from federal websites, because it contains language or topics being attacked or banned by our federal government, and from nonprofits’ websites, including the one that I work for, because we don’t want to become targeted by the federal government.

There’s now a dearth of important information on the internet, which once really democratized information and knowledge. So much of that is being removed, and what we’re getting, if we do a search on Google, is an A.I. overview of the research and information that is still left on the internet. Now the A.I. overview is [drawing] from more and more limited information, because so much of this information is being taken down.

Some of the strips have information from webpages that have been removed from the internet but are still available on the Wayback Machine. Some are from banned books, because I think there’s very little distance between banned books and information being removed from the internet. I think it’s all about fascism and the monopolization of information and knowledge and trying to, you know, dumb down the public, in addition to targeting efforts that are related to violence and racism and equity.

I think the message that I’m trying to convey is: We must archive our knowledge, our memories, our information, and keep them and protect them. 

charla, you have pieces that touch on surveillance and A.I. Can you tell me a little bit about the process of making them and what you learned? 

rios: There are images in history that forever stick in my mind and probably in society’s mind, as well, and one of those is, from the inauguration, that image of tech billionaires in the front row—Sam Altman and others among them. There’s an image within the collage of these people in suits with eyes for faces that are kind of dictating all that’s happening on the other elements of the canvas.

So much of what’s happening in our society is in the hands of a few, and it’s happening to so many. I think we would be kind of dismissive of what’s actually happening to say like, “Oh, no part of it has touched my life in some way”—so many people have lost jobs, been detained, been laid off, can’t afford this thing, can’t afford that thing, all in the span of seven months. 

Through the artwork and through the exhibit overall, [I wanted to] bring an awareness to A.I. and the ways that it’s being used. In some ways, on its surface, [it does seem] very harmless. The cloudy feature of that piece [reflects] my naivete, admittedly; I was like, “Oh, things are stored in the cloud. It’s just in the ether somewhere.” The cloud is a data center, and the data center takes up acres of land and uses a lot of water. It’s not just this beautiful ether somewhere with my photos and memories just floating around. I think a lot of people aren’t aware, necessarily, of what it is.

You mentioned that the first exhibit at the People’s Solidarity Hub was the Mothers for Ceasefire exhibit. There are three or four pieces that mention Palestine in this exhibit. I’d love to hear about the inclusion of those pieces and the connections that you made with them and A.I.

ROWE: Our friend Jac M. included three digital collages that they created for the exhibit. And she’s also a part of Mothers for Ceasefire, and has also raised thousands of dollars in mutual aid for Palestinians, whom she’s formed relationships with over the past year. I think helping folks in Palestine has become the cause of their life. So, you know, I think it was important for her to speak to the genocide that is happening.

There are a lot of tech companies that help fund the genocide, including Amazon and Microsoft Azure. There’s a lot of military funds that are supporting Israel, which include A.I. surveillance and the targeting of different civilians through A.I. I think those pieces are trying to speak to that connection there. There’s another piece that we included, which was on one of the shelves, and it spoke to how A.I. surveillance is being used in food distribution centers. I think now we’re all aware that food distribution centers have become a killing field of Palestinians. 

Civilians who are starving are being forced, without their consent, to be scanned through A.I. surveillance to access a parcel of food. We included a write-up about that and a camera placed in the kitchen where, during the opening, we had food and drink available for folks. We wanted people to see the image of people getting food and eating, enjoying food, and next to this, information about how Israel is using surveillance and controlling access to food. 

Follow Culture Editor Sarah Edwards on Bluesky or email [email protected].
