
AI Meets Storytelling: Why Creative Pros Are the New AI Developers | Tech Innovation Series

[Image: Firefly fantasy setting, modern cartoon style of a storyteller manipulating data to create new stories]
Discover why creative professionals are at the forefront of AI development. In this first episode of our groundbreaking series, explore how storytelling shapes AGI and why the fusion of creative and technical skills will define the future of digital media. Essential listening for creators, developers, and innovation leaders.


Transcript

This show is the first of five episodes analyzing why creative professionals are AI developers just as much as technologists.

 

Circa 2024 we have simultaneous revolutions happening in AI and the creative industries. While many people view writers, actors, singers, graphic designers and many other creative professionals as part of a self-contained ecosystem, the interaction of advanced technology with creative work often serves as a leading indicator for how technology changes sweep across the rest of the economy. When the Internet first launched, most people heard about it through the lens of how Internet technology affected how we created media. Soon enough, Internet outgrew the media industry to become a general phenomenon across multiple industries.

Generative AI is following a similar path, only faster and deeper. When ChatGPT launched publicly in November 2022, it created a web browser moment in which the masses gained access to incredibly powerful AI technology through an incredibly intuitive natural language interface.

 

ChatGPT and other Generative AI services turn natural human language into computer instructions, enabling creators to make any kind of digital media. As Generative AI matures, the two communities must deal with each other in new and unfamiliar ways. Because if we don't find a better working relationship between content creators and technologists, the future will be dire for storytellers.

 

AI will suck too.

Over the next five episodes, I will unpack some of the moving parts behind this transformation. This first show will focus on Artificial General Intelligence or AGI and storytelling. Future shows will explore content creation and AI training data, how storytellers must engage new audiences that aren’t human, which sacred cows in media and tech are under threat, and how storytellers and technologists can drive to a preferred future rather than react to a disruptive one.

So let's explore AGI. Sam Altman states baldly that Artificial General Intelligence is the goal for OpenAI:

OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity.

These views are echoed to a greater or lesser degree by nearly every leader of a major AI technology company. The quest for AGI goes beyond AI systems that are better at chess than human beings, or specialist AIs that fold proteins to discover new therapies. AGI means creating machines that understand, learn and think about the world like a human would.

Throughout AI's history, biological and brain-centric metaphors like "genetic algorithms" or "neural networks" have nudged research and development toward AGI. The newest and most powerful versions of AI, transformer architectures based on deep learning, mimic the neural pathways we associate with biological brains. The results speak for themselves: humanlike prose, striking images, realistic video and audio.

But some argue that the transformer architectures behind Generative AI aren't the only route, or even the best route, to AGI. And that's assuming there's a consensus on what AGI means practically and philosophically.

 

Geoffrey Hinton (often referred to as one of the godfathers of AI) called AGI a "serious though ill-defined concept." There are multiple ongoing debates in tech companies, think tanks, and governments over the term. Regardless of how these communities describe what AGI is, it's clear they're all motivated by the idea that AGI is BIG. It means more than making tools. The prize is an actual thinking machine.

Just Add Brains?

But if biological mimicry is the preferred route to AGI, it raises the question: is this how human beings and other species really understand, learn and think about the world?

 

If all that mattered for developing intelligent autonomous systems was to reproduce the functional architecture of biological brains, whither socialization, culture, aesthetics, art? Are these universal human behaviors byproducts of our intelligence or are they intrinsic to our intelligence?

 

Building a general intelligence that understands the physical world and can navigate the human experience will require more than algorithmic understanding. It will require contextual understanding.

Context rather than content will drive AGI, and it's important to understand the concept. When we try to figure out what one thing means in relation to another thing, we're trying to understand its context. When humans interact with other humans, we apply situational information, or context, to the ideas we exchange and the reactions we expect to result. From my family's jokes around the dinner table all the way to the most complex professional jargon, context allows human beings to distinguish what's explicitly stated from what's implicitly understood. The future of product and service design hinges on AGI's ability to understand a person's contextual situation, suggest the next best step for that person to take, and then coordinate and deliver any and every service to help them reach their goal.

This is the real test of practical AGI: learn about me as an individual and tune my world accordingly. I don't believe we can code our way to AGI the way we coded IT systems or evolved today's online world from a set of Internet protocols. If we are to take seriously the AGI challenge of creating systems that understand the meaning of data rather than merely its manipulation, we should reach back to the oldest tool in Homo sapiens' kit bag for encoding knowledge, making it useful, and perpetuating it over time.

 

Storytelling.

The rub is that the stories currently being used to train AI about the world and humans' role in it come from what exists on today's digital networks. In other words, AI has learned what the Internet has to teach.

 

And if that doesn’t scare the living shit out of you, you’re not paying attention.

 

Part 2 of the Storytellers Are Developers series will explore AI training data and media. Not only has feeding the beast become a lot more expensive, the digital diet has changed.