The AI debate is fracturing the creator community

Daniel Kwan (of The Daniels) and I talk about the Creators Coalition on AI

JOE: Ready? Then are we recording? All right. Hey, it's Joe. This is Dan Kwan, filmmaker extraordinaire, Dan of the Daniels, who made Everything Everywhere All at Once, amongst other films, and Swiss Army Man. We haven't talked about Swiss.

DANIEL: Oh my god, you're bringing up Swiss Army Man? You know this year is the 10-year anniversary of its release. No, it's Sundance next week. Yeah, it's kind of crazy. I'm going to Sundance.

JOE: It's going to be the 20th anniversary of Mysterious Skin, a 10-year-old, an 8-year-old, a 3-year-old. All right, so it was probably like six months ago, seven, eight months ago, you started talking to me about this idea for what has become the Creators Coalition on AI. Yeah, and we're having this conversation right now because we made the announcement. What was it, late December?

DANIEL: Late December, yeah.

JOE: Yeah, we were gonna launch a little later.

DANIEL: Yeah, and now we're flying the plane as we build it. I think that's the only way to do anything these days.

JOE: It merited a response. So we announced just towards the end of last year, and we wanted to have a follow-up. And, you know, there's been a lot of really lovely positive encouragement and support for the Coalition. A lot of people have signed their names, and then there's been a lot of questions. Yeah, so we thought we would just record a conversation where we answer a bunch of these questions.

DANIEL: Yeah, it's really important to us. I think transparency is one of the hardest things to be fighting for right now when it comes to conversations around the tech industry. And it only feels right that if we're gonna be fighting for transparency, we should also be modeling it. And so we at the CCAI—I'd like to believe we should be having a lot of these conversations. Like, right now the problem in Hollywood is that a lot of conversations are happening in secret and behind closed doors. Yeah, and everyone is kind of having these narrow conversations in isolation. And I believe we're not going to be able to move fast enough if that's the way we do this, because obviously this is very scary. There are a lot of potential risks. There's a lot at stake. And I wanted to create the CCAI with a bunch of other like-minded people, specifically to create a space where we could have these hard conversations between a spectrum of different voices and a spectrum of different experiences. Because if we're only having these conversations in two places—behind closed doors or online—then even with the best intentions to have a nuanced conversation online, it's going to get swirled up by the algorithm and then chewed up and polarized. And I knew that this conversation was too important to risk that kind of—

JOE: That kind of outcome. So yeah, and frankly, that kind of polarization only plays to the advantage of the predators. Yeah, frankly, because I think there's a divide-and-conquer strategy. They want us to not be talking, or to be fighting the whole time—fighting amongst each other.

DANIEL: Yeah, fighting each other instead of fighting the people who actually have the power and the control over how this technology is being deployed.
Which—you know, if anyone's been following my journey into the AI world—I believe that the way this technology is being deployed is completely wrong, and we really need to be pushing back on it. And I believe there is a better way. We just need to work together to make that happen.

JOE: Yeah, and I agree with you. And at the same time, I'd like to maintain some kind of optimism, because I think the technology itself has the potential to be great. Yes. But the way it's being deployed—especially the way it's being turned into businesses nowadays—is leading us down some potentially really dark paths. And that's why I think we have the time now. Now is the time. If we can have these conversations, we can hopefully do some course correction and be like, let's take this technology that could be incredible but is currently being leveraged in an ultimately damaging and power-concentrating way. Yes. But if we can course-correct, maybe it could be something that genuinely is good for everybody.

DANIEL: Yeah. Yeah. Yeah. I would love to talk more about that kind of dichotomy. Like, you know, we spend so much time talking about the risks, it's really hard to talk about the benefits. And I think that's for good reason, because—this is a bit of a crude analogy, but I've been using it recently and it's been helpful in conversation. When you're in a relationship with someone and the other person suddenly wants to invite a third in, like a throuple, you know, a threesome, right?

JOE: Polyamory.

DANIEL: Polyamory, or just a one-night thing, who knows. If your core relationship is not one that has foundational trust and ideas around consent, ideas around a shared understanding of what you guys are stepping into, bringing in a third is incredibly—it's chaotic and dangerous. And no one—

JOE: I was so tempted to ask if you're speaking from experience, but I'll skip that.

DANIEL: But you don't want to talk about the fun stuff, right? Unless you feel—unless you have some trust. And our industry is trying to talk about the fun stuff. We're trying to talk about the positive, we're trying to talk about all the benefits this could have. But we haven't even established basic rules of trust and consent. And until we have that safe conversation, I don't blame people for their knee-jerk reaction being, hell no, I don't even want to hear about the positives. So I just want to acknowledge that I do believe there are positives, but there's so much work that we collectively need to do together to make each other feel safe before I feel like, all right, let's do that.

JOE: Yeah. This is me, personally—which is probably why, most of the time, when I'm raising my hand or making a video about AI, it is more about the concerns. Yeah. But it is important not to get completely pessimistic. Yes. Ultimately, if you want to head in a positive direction, you've got to have your eye on that too. But do you want to talk a bit about what we've done so far, what the announcement said, just for those that maybe hadn't seen the announcement a month ago?

DANIEL: Yeah, amazing. So we've spent the last, you know, six or seven months—a group of us, filmmakers from all different areas of the industry. We have actors, producers, writers, but we also have VFX artists, we have voice actors, we have people from tech-adjacent spaces who understand the technology but are very critical of how big tech is implementing it.
JOE: And I'd also add that there are quite a few people who have shown a lot of enthusiasm who are not in the Hollywood film and TV world, because I think this is just as important, if not more so, on YouTube, for example, or in the podcasting space, and so on.

DANIEL: Exactly. Because of how decentralized this problem is and how widely distributed this technology is, to only have a conversation about Hollywood would be foolish. And so we've also been intent on inviting in and having conversations with online creators and things like that. But the initial impulse was to bring everyone together on the same page, because one of my fears is that we continue on the default path.

JOE: Yeah.

DANIEL: If we don't coordinate, the path of least resistance looks something like this, because we've seen it happen and play out in other industries. We've seen it happen in our own industry. It's one in which the tech industry comes out with a new shiny toy, and there are a lot of really exciting, interesting things about it. They deploy it onto an industry, and it disrupts things, right? It's the "move fast, break things" model. And at first, the relationship's really wonderful and exciting. I remember when Uber first came out, I was like, oh my gosh, so affordable, so convenient. This is amazing. But what ends up happening is that the disruption of the model really breaks some fundamental things, like protections and the ethical concerns around any of these technologies, in a way that allows the tech industry to consolidate a lot of power and a lot of control. They hold all the cards. Once they capture a large enough market share, no one else can compete. And then once that happens, they dictate the rules.

JOE: They set the terms. And Uber is now high-priced, whereas when they first started, they had these low prices to get everybody hooked.

DANIEL: Exactly. It's high prices for the user, low wages for the driver. And on top of that, the drivers—if you look at the taxi industry, that used to be an incredibly strong, stable job. The labor protections around it were really powerful, and the requirements to get into that field were really high. You had really incredible drivers who knew how to do their job. Now—and this is no offense to any Uber drivers—but some of you are terrible at driving. Some of you, like, it's stop-and-start, it's stop-and-start. And beyond that, I feel like drivers are really poorly treated—

JOE: Exactly. I was in an Uber the other day, driving from the airport, and it cost, I think, something like $90 to get to my house. And the Uber driver said to me, "What is it charging you?" I said, "It's $90." He said, "You know, I'm getting $30." Exactly. That's crazy. That's crazy. Doing that whole drive, and he's only getting a third of the money.

DANIEL: So they're being mistreated. We're getting a worse service, and we're paying more, and a lot of it's getting siphoned up to the tech companies. This has happened with Spotify and musicians. You can see what happened with Airbnb and housing. Even when you look at streaming—obviously it's a very complex thing that happened to us—but when we chased after what the rest of the tech companies were doing within our industry, we accidentally created a streaming bubble that devalued our product, our stories, in a way that changed the relationship our audiences had with the theatrical experience, which suddenly made some of our business model no longer make sense.
And so now we're struggling after that pop, in a way where fewer productions are happening in the U.S. Budgets have gone up to a point where it's really hard now for people to make movies and for audiences to come out and actually support them. And so—this is a long-winded way to say that the default path is one in which the tech industry sets the terms for our industry, and suddenly the creators are no longer at the table and no longer have any power or any agency within our careers and within our industry.

JOE: But the truth is that these tech companies need the creativity of humans. Their products don't work. Yeah. Their generative AI services don't generate anything at all without all the content and data.

DANIEL: Exactly. Exactly. And what's really interesting about this situation is