Autre Magazine


An Interview with OpenAI’s First Artist in Residence, Alexander Reben

Installation view of Disruptive Reflexivity in the Flux of Becoming (2024) in the “Write a convoluted exhibition title for Alexander Reben’s show in the basement of the Charlie James Gallery” exhibition. Courtesy of the artist and Charlie James Gallery, Los Angeles. Photo © 2024 Yubo Dong; photo credit @ofphotostudio Yubo Dong.



interview by Mia Milosevic



MIA MILOSEVIC: Can you talk a little bit about your timeline as an artist and as a scientist, from attending MIT and studying social robotics and applied math to becoming an artist?

ALEXANDER REBEN: I'm not sure there's a point where one becomes an artist, or if it’s just always happening. Certainly, even in research I was doing creative things, and my thesis work in social robotics also looked at filmmaking and documentaries and how people open up to and respond to technology in different ways. Even as an undergrad I had a couple exhibitions. I'd say it has always been in parallel. All my education was more on the science, engineering, and math side of things, but I’ve always been interested in creativity.

MILOSEVIC: Can you speak on your creative process for your current show at Charlie James Gallery? 

REBEN: The process is quite different for every work. I'm almost as much a process- or conceptual-based artist as I am a technology-based artist. It doesn't really fit into any of those camps. I mean, if it was very conceptual, then the object wouldn't matter, it's really just the idea. But to me, the object does still matter. A lot of what I'm talking about is process, because some of what I'm talking about are issues and ideas around automation, which in itself is about how objects are made. Where are the human and the machine coming together? In this show in particular there's quite a wide variety of works from various years.

I think the oldest piece in there is probably Deeply Artificial Trees, the “Deep Dream” video from back in the day. The newest work is the large metal sculpture I made with the big robots and Machina Labs, Disruptive Reflexivity in the Flux of Becoming, as well as the speaking dental phantom, Artificial Musings of the Null Mind. Some works come about because there's an interesting new modality for working with technology. Some of the works come just from random thoughts that I think are interesting, or I think it's something the public should experience in some way because it could be an upcoming thing that might change how folks work with technology.

MILOSEVIC: Did you collaborate with ElevenLabs for Artificial Musings of the Null Mind?

REBEN: I wouldn't call it a collaboration, but they helped me with credits. It was my voice that was trained on ElevenLabs. I had AI generate kind of idle, empty thoughts and musings that the work continuously spurts out. Some of them are quite hilarious and funny. Some of them are poignant and meaningful. Some of them are kind of ridiculous and wrong. (laughs) It's a conglomeration of a bunch of technology; the actual physicality of it is an antique from the 1940s and '50s, when they would use these aluminum and steel phantoms to practice dentistry. The ones they have today are plastic and silicone. It speaks to an artificial human simulacrum for scientific use which is being repurposed here.

MILOSEVIC: In your artist bio for Charlie James Gallery, it says you “spent over a decade creating work that probes the inherently human nature of the artificial.” How can we demarcate the difference between the real and artificial? 

REBEN: Part of what I mean in the bio is that technology is inherently human, right? It's very much what we make. It's not like it spurts randomly out of nature; it's the way we interface with and modify the world, and we wouldn't be who we are today if we didn't have technology. We probably wouldn't have evolved the way we have without inventing things. Even taking a bone back in the day and using it as a tool could be considered a technology, or a kind of artificial use of something. It led to being able to hunt better and get more protein, which led to things like inventing science and philosophy and language.

We're fundamentally who we are because of the things that we invent and come up with. I think technology is often seen as a separate thing from us for some reason. We feel like it's a different thing, but to me, it's the physical manifestation of humanity. If you look at it through that lens, I think you can analyze it and appreciate it in different ways and look at how it affects you personally. It’s also something that means very specific things to different folks, and everyone uses technology in different ways. 

Alexander Reben
Artificial Musings of the Null Mind
Antique dental phantom, microphone, amplified speaker, truss, electronics
Dimensions variable
2024
Courtesy of the artist and Charlie James Gallery, Los Angeles. Photo © 2024 Yubo Dong; photo credit @ofphotostudio Yubo Dong.

MILOSEVIC: Can you speak on your experience as being the first artist in residence at OpenAI and what still makes you excited about some of the things you worked on there?

REBEN: I have been working with OpenAI and folks internally since about 2019. I got access to GPT Beta back then before it was public, even before ChatGPT was a thing. That's where I made the plungers piece, A Short History of Plungers and Other Things That Go Plunge in the Night. I was getting GPT to write these ridiculous but fun wall labels. It was kind of just a natural shift for my relationship with them. It was just more like, Hey, maybe we should allow Alex to come in and produce some physical work. I think that for OpenAI it was also kind of a trial to have an artist come in and be hands-on like that.

While I was there, I really focused on tool building, because then I could use those tools later on after the residency. So there were three main things I worked on, the first being a way to produce these massive, high-resolution AI images using outpainting. They're super huge works which I print out at like 1200 DPI, so the details are finer than the eye can see. I thought it'd be interesting to create something with AI that was super complex, super detailed, really high-resolution, sort of getting away from the single image, but also doing something that would be near impossible to do by hand just because of the sheer amount of detail in that image.
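
For readers curious about the mechanics, here is a minimal sketch of the tile-and-stitch outpainting idea Reben describes. The actual image-model call is left as a placeholder (any inpainting or outpainting model could sit there), and the tile size, overlap, and function names are illustrative assumptions rather than his tooling.

```python
# Sketch: grow one very large image by repeatedly outpainting to the right.
# `generate_fill` is a placeholder for a real outpainting model call
# (e.g. an image-edit API with a mask); here it returns the tile unchanged.
from PIL import Image

TILE = 1024      # width of each newly generated tile
OVERLAP = 256    # pixels of existing image given to the model as context

def generate_fill(tile: Image.Image, prompt: str) -> Image.Image:
    """Placeholder: a real model would fill the blank right side of `tile`."""
    return tile

def extend_right(canvas: Image.Image, prompt: str, steps: int) -> Image.Image:
    for _ in range(steps):
        w, h = canvas.size
        # The new tile starts with the right edge of the current canvas as context.
        tile = Image.new("RGB", (TILE, h))
        tile.paste(canvas.crop((w - OVERLAP, 0, w, h)), (0, 0))
        filled = generate_fill(tile, prompt)
        # Stitch the filled tile onto a wider canvas, reusing the overlap region.
        grown = Image.new("RGB", (w + TILE - OVERLAP, h))
        grown.paste(canvas, (0, 0))
        grown.paste(filled, (w - OVERLAP, 0))
        canvas = grown
    return canvas

if __name__ == "__main__":
    seed = Image.open("seed.png")                      # starting image
    mural = extend_right(seed, "an impossibly detailed landscape", steps=20)
    mural.save("mural.png", dpi=(1200, 1200))          # note the print-resolution intent
```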

That was the theme I wanted to continue with the other tools, using AI as a tool to go past what I might be able to do or others might be able to do on their own. The second thing I had worked on and am still working on is this idea of a conceptual camera, so using photography as an interface versus language. I built a little app for myself that has multiple modes and in one mode you can take a picture of a group of objects and it will come up with a wall label to justify that group of objects as an artwork. It'll print out a wall label with all the info you would need to call that thing an artwork.

There's another mode where you can take a picture of something and it will reinterpret that thing as an absurd situation, and then print out a Polaroid of that. In another mode you can make a sketch or a drawing and take a picture of it and it will reinterpret that sketch or drawing as a scene. The reason I called it a conceptual camera is because whatever you take a picture of, it translates into language as it tries to describe that image.

Once you're in that language space, you can change settings of that image with concepts. So you can be like, given this description, make it more absurd. That's something that a camera usually can't do. You can think of it like a physical knob, like you'd have for exposure. Instead it’s a serious-to-absurd scale that you could tweak, which to me was very interesting because it became a camera that doesn't really do what usual cameras do. I'm still playing around with all the different ways to use that, but I think that just kind of speaks to the ways I think AI is gonna be used in the future. It's gonna plug into a lot more of the natural and creative interfaces folks can use beyond just writing text.
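
As a rough illustration of that "language space" idea, here is a minimal sketch of the conceptual-camera pipeline, assuming three hypothetical model wrappers rather than any specific vendor API: the photo is described in words, the description is pushed along a serious-to-absurd dial, and a new image is rendered from the altered text.

```python
# Sketch of the conceptual-camera pipeline: photo -> description -> dialed-up
# description -> new image. All three helpers are stand-ins for real vision,
# language, and image models; their outputs here are hard-coded placeholders.

def describe_image(photo_path: str) -> str:
    """Stand-in for a vision model that captions the photo in plain language."""
    return "a coffee mug on a cluttered desk"

def rewrite_text(description: str, absurdity: float) -> str:
    """Stand-in for a language model nudging the description along a
    serious-to-absurd dial (0.0 = faithful, 1.0 = fully absurd)."""
    return f"{description}, reimagined at {int(absurdity * 100)}% absurdity"

def text_to_image(prompt: str) -> bytes:
    """Stand-in for an image model rendering the rewritten description."""
    return prompt.encode()

def conceptual_shot(photo_path: str, absurdity: float = 0.7) -> bytes:
    """The 'shutter': translate the scene into language, turn the conceptual
    knob, then generate the picture the camera 'saw'."""
    description = describe_image(photo_path)
    twisted = rewrite_text(description, absurdity)
    return text_to_image(twisted)

print(conceptual_shot("desk.jpg"))
```

The design choice worth noticing is that the "exposure knob" lives in the text stage, not the optics: once the scene exists as language, any conceptual adjustment becomes a parameter you can dial.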

The last thing I worked on was using Sora video to create clips of sculptures that would rotate around their center. If you make things that rotate around their center, you can use computational photography, specifically things like NeRF, which NVIDIA has a fast implementation of, to extract a 3D model from those viewpoints. The interesting thing I found about Sora was that it preserved 3D spatial relationships in its outputs, so you actually could pull a 3D model out of the video. I did that for a few sculptures and did a few 3D prints of those.

This process still needs a human with knowledge of 3D editing to go in and turn that into a usable, high-resolution model. That sculpture was given to Monumental Labs, which does robotic marble carving, and it was turned into a large-scale marble piece. We're not too far from text-to-object, which I investigated with those big robots and the sheet metal, now on view at Charlie James Gallery.
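
The video-to-3D step Reben describes can be sketched roughly as follows: dump still frames from the orbiting clip, then hand them to a NeRF or photogrammetry toolchain. The ffmpeg step is standard; the reconstruction commands in the comments are assumptions about one possible toolchain (nerfstudio), not his actual pipeline.

```python
# Sketch: extract frames from a generated orbit video so a NeRF/photogrammetry
# pipeline can estimate camera poses and reconstruct the sculpture's geometry.
import subprocess
from pathlib import Path

def extract_frames(video: str, out_dir: str, fps: int = 4) -> None:
    """Pull still frames out of the orbiting-sculpture clip with ffmpeg."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-i", video, "-vf", f"fps={fps}", f"{out_dir}/frame_%04d.png"],
        check=True,
    )

if __name__ == "__main__":
    extract_frames("sora_orbit.mp4", "frames")
    # From here, a reconstruction toolchain takes over. With nerfstudio this is
    # roughly (commands are an assumption, not a fixed recipe):
    #   ns-process-data images --data frames --output-dir scene
    #   ns-train nerfacto --data scene
    # As Reben notes, the result still needs hands-on 3D cleanup before it can
    # be printed or carved.
```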

Alexander Reben
A Short History of Plungers and Other Things That Go Plunge in the Night
Plungers, cotton pigment print, aluminum label holder
Dimensions variable
Edition of 5 and 2 APs
2020
Courtesy of the artist and Charlie James Gallery, Los Angeles. Photo © 2024 Yubo Dong; photo credit @ofphotostudio Yubo Dong.

MILOSEVIC: I know Sora is expected to be released relatively soon. How do you expect it to be integrated into the global artistic landscape?

REBEN: I know everyone in Hollywood is keeping a strong eye on this. There's still a lot of work to be done in that space in order for it to be used for cinematic, full-length work. But my guess is it's just a matter of time before the tools get good enough for those sorts of things as well.

MILOSEVIC: What would you say AI creates space for more of?

REBEN: There's a lot of resources being put towards this technology. Not everything is gonna make it into the future, but a lot of it probably will. And like the web, it's gonna influence society in a huge way. Similar to the Industrial Revolution, it’s about this automation of thinking. The Industrial Revolution was really about automation of the physical.

The more interesting things revolve around how to expand your own creative practice and your own knowledge. My hope is that it allows people with a creative practice to be more creative, to maybe speed up their process, and allow them to do more of what they want to do. I also think, on the flip side, folks who don't have artistic backgrounds who might wanna express themselves can use it as a tool to do that. The sketch-to-image mode of the conceptual camera really blows a lot of people's minds because it doesn't just take the exact sketch you make; it tries to get the idea of what you're trying to express from your sketch and then make an image of that. It's a way for those folks to come up with ways to communicate with others where it might have been hard for them before.

MILOSEVIC: I wanted to go back for a second to A Short History of Plungers and Other Things That Go Plunge in the Night, which I know has received a lot of media coverage. The piece is accompanied by the philosophy of “Plungism,” which holds that “the mind of an artist is in a state of flux and able to be influenced by all things, even plungers.” I feel like it speaks to a lot of people's fears about the application of AI to art, where maybe artists become too easily susceptible to the mind of some foreign entity.

REBEN: Yeah, that label's funny because that was like GPT-2 Beta before it was out there. And funny enough, the reason the plunger is repeated multiple times is a bug they had in the model. So even the little mistakes or dead ends in these models create fun outputs: less useful if you're trying to write a resume, more useful if you're trying to do creative writing.

At the end of the day, these systems are like pattern machines. They learn from the internet, right? The question I would pose is: Is the interpretation an AI makes of an artwork any more or less valid than the interpretation a curator or writer makes of a work? And if not, where's the distinction? 

MILOSEVIC: Our most recent issue, which has just come out, is called Citizen. It's all about citizenship and what it means to be a citizen right now in the current climate. Could you talk about how AI makes you not only a better artist, but maybe even a better citizen, or what that might look like for people?

REBEN: Because I'm an artist who's always worked with technology, my work is about technology. AI is making me a better artist insofar as it's giving me a new, very interesting thing to dig into and work with. It's more that it's just an extremely fruitful thing to look at and research and think about. I think there's a lot of hype out there right now, so we're still coming to terms with how it's making me a better citizen. If it makes people more inquisitive, gets 'em to ask more questions, or allows them to learn or research things better or just become more educated, I feel like that's really a lot of what makes a better citizen.

MILOSEVIC: I think your work has drawn a positive connection between art and these innovations in technology that people have generally found frightening.

REBEN: I try to stick to the neutral to slightly positive route. I do have work that questions, Do we actually want this thing? Do we want this to be this way? How do you want this to go? My work doesn't look to answer questions specifically, because how you experience technology is a very different thing from how I experience it and what it means in your life is different from what it means in mine. It's a highly personal question. At the end of the day, what do you want from technology?