What Creatives Actually Need to Know About AI

<aside> 🎤 LaSalle College Vancouver Keynote | January 14, 2026 | Kris Krug

</aside>

I didn't ask for this.

None of us did. One day you're a photographer, designer, writer, filmmaker—building a career on skills you spent years developing. The next day, someone in Silicon Valley has scraped your work into a dataset, trained a model on it, and is selling the ability to generate something "like yours" to anyone with twenty bucks a month.

My work is in there. I released thousands of photos on Flickr under Creative Commons over the past two decades, and when I checked a dataset search tool, I found roughly 1,800 of my images in a major training corpus. That doesn't mean every AI model used them—training pipelines vary, and the tool shows evidence, not certainty—but it does mean the consent practices around AI training are exactly as casual as critics claim.

I'm sympathetic to the people who want to resist. I understand the impulse to refuse, to opt out, to fight back. But I've also spent the last two years working with hundreds of creative professionals who are trying to figure out what to actually do in this moment. And what I've learned is that neither pure resistance nor uncritical adoption gets you anywhere useful.

So I'm asking you to walk forward with both hands full.


The Non-Consensual Moment

Let's name what's real.

The consent problem is real. AI systems were trained on the creative output of humanity—photographs, writing, code, music, art—largely without permission. Some of us released work under licenses that technically allowed it. Many didn't. The fact that companies are now settling billion-dollar lawsuits over training data tells you everything you need to know about how legitimate those practices were.

The junior pipeline problem is real. The entry-level work that used to train senior creatives is disappearing fast. You'd get hired at a studio, spend six months converting approved assets into different formats, and in the process learn the systems, the clients, the back-and-forth. That's how juniors became seniors. When AI handles that work, we lose the pathway—and we don't yet know what replaces it.

The dependency question is real. If you don't use a muscle, you lose it. What happens when we stop doing the first stages of creative work ourselves? When we outsource idea generation, research synthesis, early drafts? We're all participants in an experiment we didn't sign up for, and the results aren't in.

These aren't hypothetical concerns. They're happening now. Anyone who tells you they've got it all figured out is either selling something or not paying attention.

And yet.

I've spent two years rebuilding my creative practice around these tools, and I've never felt more capable. Not because the tools are magic—they fail constantly, in ways I'll get to—but because learning to work with them has forced me to get clearer about what I actually bring to the work. The parts that are mine.

Both things are true. The system that created these tools violated consent on a massive scale, and working with them has made me a better creative professional. I'm asking you to hold that contradiction and keep walking.


Agency Via Workflows

Here's where most AI advice goes wrong: it's all about tools.