The future is sometimes best understood from the rearview mirror. Current speculation about the coming AI-bot takeover of all things notwithstanding, we misread AI’s novelty and overlook our own role in AI’s prehistory. The bot threat to end paying careers for artists, designers, and writers underscores the high stakes. We rightly target one culprit: the wholesale data-scraping of all the online images, ideas, music, and video that fed the bots in the first place. Platform data-miners unfairly vacuumed up this ocean of free IP in an act of preemptive cultural plagiarism. Without consent, right?
Not entirely. For two decades we’ve rehearsed endless ways to give our best ideas and images away for free on social media. Voluntarily. The tacit deal: we share content at scale to improve the long odds that someone will discover and monetize us later. This deferred-payoff behavior required great trust. It now feels far less enabling. Many artists and writers seeking careers feel jilted by their platform’s cloaked duplicity. Faith in uploading morphs into IP regret. Or litigation. AI-bots grind down the win-win deal many creators once imagined.
The new AI culture-bots may also blindside us because they herald graduation from our utopian training in AI-boot-camp. There we learned to create and work for free, while social media taught us how to justify creating for free. We need to unpack this prehistory of IP-gifting, especially the volunteerism and personal consent implied by content-gifting, which undercut our self-righteous protests against replacement AI-creators.
Culture made long-odds creator payoff a habit long before AI’s creator-mining. Investor economies leveraged the artworld’s resilient suffer-until-discovered pose, with its lottery promise and its disregard for creative labor as collective. Art’s lone-wolf posturing shape-shifted into a corporate culture skill-set for visionary market innovators. Financial speculators loved that self-financing and debt fueled many “disruptors.” In effect, an MBA gambit–corporate entrepreneurialism–hijacked the personal expression ethos that once defined vanguard art schools. CEOs engineered personal vision into brands. VC-wooing startups postured as culture-gifting’s avant-garde.
Management also stokes the endlessly deferred payoff pose. Firms often employ a paying-your-dues rationale to justify prolonged career precarity for personal assistants and company underlings. Other rising players master fake-it-until-you-make-it posturing to leverage self-reflection as personal branding. Finally, social media’s likes and ranking economies make wooing attention, rather than producing durable content to distribute, the only goal. All these frameworks provide symbolic “pay” to compensate for free creator gifting. We now accept that creative work and pay operate in two different time zones. We create effusively now. Yet wait endlessly for a lottery payoff down the road.
Culture-gifting’s Ur-prototype, “spec-work,” churned long before AI-boot-camp. It comprises unpaid creative work, by aspirants and rising professionals alike, undertaken in hopes of landing actual production work and pay. Once, “on spec” meant writing the “spec-script” as a calling card to land writing jobs. Yet writers now hear that even screenplays are not enough; that they need to shoot sample scenes just to get meetings. Veterans tell young designers to publish “look books” and pre-project style swag to snag jobs. Mentors teach rising filmmakers to print 4-color “leave-behinds” if they hope to get a call back from producers after pitches. Hopefuls stage public “table-reads” on their own dime. Career-aspiring writers, producers, and composers self-finance their IP on parallel platforms (novels, plays, musical theater) to attract agents and eventual funding. The end prize: re-versioning spec-IP for big-screen career payoff. Hollywood’s sea of shadow spec-production is vast, time-consuming, and costly.
Below this world of underutilized creative professionals, YouTube, TikTok, and Instagram place acute pressures on creators to give content away freely. There, aspirants must first build out personal fanbases before they can be discovered and monetized. Costly workshops teach creators analytics to bootstrap their channels, to master speed production, to avoid costly production values (like lighting and sound), and to upload content daily, in quantity. And when all that fails, to trick the YouTube algorithm with automated content hacks and counter-algorithm triggers. Huge amounts of soft family and social capital have long pre-subsidized this content flood gifted by aspirants. Yet their giant partners never admit that these cloaked aspirant pre-subsidies feed the platform’s stock valuation. Creator doublespeak rules AI-boot-camp.
Boot-camp training–in both old media and new–has softened us up, normalizing and greenlighting AI-bots that generate culture for us. Why not halt this culture-extraction scheme? Platform beneficiaries have long over-paid creators in symbolic cultural distinctions (soft capital, competitions, ranking schemes) precisely because they vastly underpay aspirant and rising creators in actual money (hard capital) for screen content. Platform pedagogy in AI-boot-camp was also bidirectional. For two decades, while we learned free culture-gifting in the data-mine, the data-mine learned to ape human creators as we disclosed oceans of free (or at least hijackable) content. We greased the skids of our own preemption, complicit with a partner that now generates culture without us.
The long-odds human predicament faced by anyone wanting a career creating screen media has long troubled me as an educator. I began researching production to discern why companies gave their overworked 25-to-35-year-old VFX workers midcareer “sabbaticals” in the 1990s to physically survive their workstation “masters” and 24/7 “digital sweatshops.” Years later, I wrote Specworld after viewing scores of online down-in-flames quitting-my-channel videos by angry teen and post-adolescent makers who raged that their perverse giant platform hosts had cruelly demonetized them, making their ascent toward actual production careers impossible.
Asking how career-threatening pressures trapped 20-and-30-something creative pros in the 1990s was one thing. Discerning how and why aspiring 21st-century adolescent social media creators would rail against similar career-ending stresses proved more alarming.
How should those of us in the arts and humanities respond? Litigation against cloaked AI culture-mining offers one recourse. Wholesale IP culture-vacuuming on platforms was indeed the original sin. However, unpacking the masquerade of quasi-consent in our culture-gifting deal-with-the-devil may suggest more lasting regulations for the online creator swamp, including revisions of IP, surveillance, media regulation, and ownership laws. Requiring “watermarks” for AI content and IDs for any entities scraping or monetizing online content would at least make unintended culture-gifting more transparent. Yet the most needed (but uninvited) form of oversight in the world of teen creators is child labor law. Way too much money is being made there by the culture-miners. Recent news exposed the lie that the social media creator world entails neither labor nor predation. Under the hood of digital platforms, reporters glimpse a celebrity fighting pit of young wannabes fueled by child abuse and shadowy monetizing corporations.
For decades, we’ve normalized free-culture gifting for creators on social media, shoring up our reluctance to regulate. If we are honest, our deal with the devil doesn’t just entail undisclosed artist culture-mining by giant platforms. Our own habitual culture-gifting as social media creators helps fuel and sustain a shroud cloaking our shared AI-boot-camp. After graduating, we can only snark at the unwanted creator-avatars we have unwittingly trained. Semper Fi AI.
About the Author
John T. Caldwell is Distinguished Research Professor in the Department of Film, Television and Digital Media, UCLA. This article has been adapted from his recently published book Specworld: Folds, Faults, and Fractures in Embedded Creator Industries (University of California Press, 2023). That work, and this essay, come from 40 years of experience in film and media, and as an arts and humanities professor.