AI in the Creative Process: Friend, Foe, or Co-Author?
Creativity at a Crossroads
Creativity—the means by which we capture emotions, tell stories, and push boundaries—has always been a distinctly human trait. Until now, that is. AI has entered the conversation, writing scripts, composing music, generating art, and even winning creative competitions.
So what does this mean? Is AI a collaborator, a threat, or something in between? Does it enhance human ingenuity or dilute originality?
The answer isn’t as straightforward as some might believe. AI isn’t a villain bent on stealing creativity, but it’s also not a magic wand that automatically improves everything. Like any tool, its value depends on what goes into it and how we use it.
AI as the Ultimate Brainstorming Partner
For many creatives, AI is a game changer. It’s like having an assistant ready to bounce around ideas, generate concepts, and help break through creative blocks—at lightning speed.
Imagine staring at a blank page, waiting for inspiration to strike. Instead of agonizing over the first sentence, you provide an AI tool with a prompt, and suddenly, you have three different drafts to riff off. You’re still in control—you’re just working faster.
This phenomenon is occurring across industries. Writers are using AI to generate outlines and dialogue, designers are experimenting with AI-assisted concept art, and musicians are training AI models on their own sounds to push creative boundaries. AI is less about replacing creativity and more about remixing, expanding, and accelerating the process.
Consider Coca-Cola’s AI-powered ad campaign: it paired human storytelling and a nod to the brand’s 1995 holiday campaign with AI-generated visuals. The campaign didn’t land the way the brand expected, and some viewers even described it as “creepy.” That reaction underscores how new these tools are; even major brands are still experimenting and learning what works and what doesn’t. (NBC News)
But here’s where it gets tricky. Just because AI can generate content, does that mean it should?
Created with DALL-E. A human artist with an AI assistant and scales of justice. The human figure is distorted with its torso backward, representing the fallibility of this generative tool.
The Ethical Wild West of AI Creativity
The biggest concern with AI-driven creativity isn’t that it exists—it’s how we regulate it. Right now, we’re in uncharted territory.
Consider originality. AI doesn’t create in the same way we do; it learns from existing material and remixes it. So if AI draws from an ocean of past content, are we truly getting something new, or just endless variations of what already exists? Furthermore, if AI is trained on copyrighted material, does that make its output plagiarism?
This is already a legal and ethical debate unfolding in real time. An AI-generated artwork has already won an art competition, and courts are still trying to determine who owns AI-created work: the user, the AI company, or no one at all. (The Verge)
Then there’s bias. AI only knows what it’s trained on, meaning it absorbs human biases. If the data it learns from is flawed—whether culturally, racially, or historically—then AI will reflect and reinforce those flaws. It’s like baking a cake: if the ingredients are bad, the final product will be too.
That’s why clean data is critical. But who decides what’s “clean”? Who ensures AI isn’t recycling harmful narratives, reinforcing stereotypes, or even creating misinformation?
What about identity? AI isn’t just creating stories; it’s replicating people.
AI Cloning and the Commodification of Identity
This is where things get even murkier. AI is being used to digitally clone voices, faces, and personalities—sometimes with consent and sometimes without.
Consider William Shatner’s collaboration with StoryFile, where his likeness and voice were preserved so future generations could interact with an AI version of him. It’s an incredible concept for legacy preservation, but it also raises serious ethical questions.
If an AI version of someone continues to speak long after they are gone, who decides what that AI says? And if someone’s voice or likeness can be replicated without their control, how can we prevent AI from being used to manipulate, deceive, or exploit?
We’ve already seen this debate play out with deepfake videos, AI-generated musicians, and even posthumous movie appearances, such as Carrie Fisher’s digitally recreated presence in Star Wars.
At what point does AI stop being a tool for creativity and start becoming a means of eroding a person’s agency over their own identity?
The Future of Creativity Is Hybrid—But It Needs Boundaries
AI isn’t going anywhere. It’s already woven into film, advertising, music, journalism, and digital art. The question isn’t whether AI belongs in creativity—it’s how we use it responsibly.
For AI to truly be a creative co-author rather than a creative thief, we need clear ethical boundaries:
Transparency – AI-generated work should be clearly labeled so audiences know when they’re engaging with machine-created content.
Ownership Clarity – Laws need to define who owns AI-generated works to protect both creators and the integrity of art.
Bias Checks – AI requires more diverse, accurate, and ethically sourced training data to prevent misinformation and bias.
Consent and Rights – People’s voices, faces, and likenesses shouldn’t be used by AI without permission.
The best creatives of the future won’t be replaced by AI—they’ll be the ones who know how to collaborate with AI.
Where Do You Stand?
The way we approach AI creativity today will define its role in the future. If we let AI take over, we risk losing the emotional depth and originality that make storytelling powerful. But if we completely reject AI, we might miss out on a revolutionary tool that could push creative industries to new heights.
So, what do you think? Is AI a friend, foe, or co-author in your creative process?
Let’s talk. Drop a comment!