Exploring Generative AI in Animation: Q&A with Jeremy Higgins, Creator of Migration


Jeremy Higgins recently released Migration, an animated short film capturing the wonder of exploration and the circle of life. What sets Migration apart is its use of generative AI to create animated backgrounds. The project is the first release from Runway’s Hundred Film Fund. I sat down with Higgins to discuss his creative process, working with Runway’s AI tools, and what it might mean for the future of animation.

A composite image with AI and 2D animation.

Noah Kadner: What inspired you to create Migration?

Jeremy Higgins: I love to experiment. Back in college, I was pushed toward more experimental work. I had a teacher who would bring in bananas and ask us to make a book out of them. Doing projects like that pushed me to think differently. When AI came around, it was just another thing to play with and see how to mix it with what I was already doing. And that’s how Migration came about.

Noah Kadner: What was your approach to incorporating AI into the pipeline for this film?

Jeremy Higgins: I already had this character, The Explorer. When AI tools first became available, I began generating images and putting characters into those environments, experimenting with animating over AI-generated imagery. By the time I started Migration, I already knew how I wanted to incorporate AI. I used Runway’s Gen 3, their latest update, to animate all the backgrounds. It was about finding a way to blend this character into an AI-generated world.

The Explorer character.

Noah Kadner: How does using AI-generated backgrounds compare to a traditional animation workflow?

Jeremy Higgins: Honestly, it’s pretty similar. We used AI-generated imagery for the backgrounds because the story takes place on an AI planet, and our explorer is from our world, diving into this universe. Using AI for the backgrounds was like doing matte paintings or prop design. We used AI to add subtle movement. I tried to do something similar to traditional animation, where you have a relatively static background and do camera work within that environment. We went with a more 2D route for this: the workflow goes from an AI-generated background to a composite with hand-drawn animation.

Noah Kadner: What were some challenges you faced in the process?

Jeremy Higgins: The most challenging part was making the character feel like he was in this world. With traditional animation, you have foreground objects to layer in front of the character. With these AI backgrounds, it’s more like a static background plate. You have to learn to blend the character in with 2D compositing tricks: lighting, masking, and so on. I didn’t make any compromises, and AI makes certain things more straightforward, like asking for a camera movement, but you still have to do the heavy lifting yourself.

Noah Kadner: What was your process like when using AI tools to generate these backgrounds?

Jeremy Higgins: Evan Johnson co-wrote the film with me and did the AI generation. After he finished writing, there wasn’t much for him to do during animation, so he took over writing prompts for the AI. We did a lot of versions—hundreds of prompts—to get the right image. It’s all about iterations. You’re not going to get the best result on the first try. Even if it looks good initially, it could always be better after the 200th iteration. We also used image references instead of relying entirely on text prompts. Since this is an AI world, I wanted everything to be AI-generated without using any specific references to an artist or style.

Noah Kadner: Do you have any advice for animators who want to experiment with AI?

Jeremy Higgins: Be open to the weirdness that comes with AI. It has its own visual language. Sometimes it almost looks real; other times, it’s strange. But when something unexpected happens, sometimes you have to embrace it and go with the flow. When you want to be more traditional, you can be, but you have to find a way to work with the randomness. You just have to keep trying things until you start to feel good about it.

Storyboards from Migration.

Noah Kadner: What’s been the audience reaction to Migration?

Jeremy Higgins: The reaction has been positive. I love when people watch something and don’t know how it was made. I think that curiosity adds to the experience. I wanted to create something that would put a smile on people’s faces, and even though it’s an AI-generated world, it still feels relatable. The ending has this twist, where the creature eats what he had a beautiful moment with. It’s confusing, funny, and just the circle of life. I hoped for that kind of response: a mix of surprise and enjoyment.

Noah Kadner: Have any other AI-driven projects inspired you recently?

Jeremy Higgins: A few caught my attention, especially those that blend live-action with AI elements. I saw one where they composited live-action people shot on a green screen over AI-generated backgrounds and 3D simulations. It was beautiful and showed how AI can be integrated into different mediums. I haven’t seen a lot of projects like mine with 2D animation. It feels like animators are still figuring out how to use AI effectively, but I think there’s a lot of potential there. That’s one of the reasons I made Migration: to show that AI has a place in 2D animation, and that it can be fun and strange simultaneously.

Noah Kadner: Where do you see AI in animation going in the next few years?

Jeremy Higgins: In the next few years, I’d love to see higher resolutions, like going from 720p to 1080p, and longer generations. I’m curious what a one-minute generation would look like. We will see AI working its way more into the film industry, in visual effects and other areas. It’s just another tool to add to your arsenal. The human element is still essential: we’re the ones in control, and we need to make it good.

Rough animation test over Gen AI background.

Noah Kadner: What’s next for you?

Jeremy Higgins: For me, I’m always in mixed media land. I can appreciate where purely prompting is going, but I’ve always combined different mediums in my projects. AI is just one more layer I can incorporate. I’m currently working on something that blends live-action footage with generative AI elements. I want to push that combination—make something that feels familiar and dreamlike.


  • Directed by Jeremy Higgins
  • Written by Evan Johnson/Jeremy Higgins
  • Art Directors: Jeremy Higgins/Britton Korbel
  • Animators: Suejee Lee, Haolun Liu, Britton Korbel
  • AI Animation: Evan Johnson
  • Type: Min Kim

For more information about Migration, please check out this blog post from Runway: https://runwayml.com/customers/behind-the-scenes-of-migration-with-director-jeremy-higgins 
