The fear that AI marks the end of human creativity is a failure of imagination. I have never been able to be as creative as I want to be. Client restrictions, logistics, access, cost… AI changed all that.

This year I got an amazing opportunity: a fellowship at The AI Exchange. This short film is my fellowship project. It's about my great-great-great-great-grandfather, Rufus. My objective was to show what level of storytelling I think is possible with AI tech today. I also wanted to try to connect personally with some ancestry research my father had done.

My dad's stack of family research included the following information: Rufus Archer lived from 1812 to 1909 in Salem, MA. He was a cooper (barrel maker) and a volunteer firefighter. There were two photos of him and one of his gravestone, plus one of those family org charts.

First, I built a chronological timeline full of historical context in Perplexity. Then I built out Rufus' persona with ChatGPT. I took that information and built a custom GPT that assumed the role of Rufus. I then interviewed RufusGPT and used his responses to build the story, just like I've done a hundred times producing videos. I asked him things like "why is your life hard" and learned that he was worried about keeping his kids safe in a scary world, and about the trade he loved becoming obsolete because of new technology. I could relate.

His words informed the prompts that became the images. I built a Midjourney Director of Photography GPT influenced by the images of Dorothea Lange and Vivian Maier. We broke down outstanding opening sequences in cinema, from classics like Lawrence of Arabia and Blade Runner, which helped me build my shot list.

The images were brought to life with some incredible new tools, as were Rufus' voice, the soundscape, and the accompanying narration. This project would not have been possible pre-AI. How could it have been? None of the images are real except one.
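For anyone curious what the persona step looks like under the hood, a custom GPT is essentially a system prompt wrapped around a chat model. Here's a minimal sketch of how the "RufusGPT" interview could be approximated with a plain API call; the persona text, helper name, and model name are my illustrative assumptions, not the actual prompts used in the project.

```python
def build_persona_messages(facts, question):
    """Assemble a chat transcript that puts the model in character.

    `facts` is a list of plain-English biographical details; `question`
    is the interview question you want the persona to answer.
    """
    system_prompt = (
        "You are Rufus Archer (1812-1909), a cooper and volunteer "
        "firefighter from Salem, Massachusetts. Answer every question "
        "in the first person, grounded only in these facts:\n- "
        + "\n- ".join(facts)
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]


messages = build_persona_messages(
    facts=[
        "Born 1812, died 1909",
        "Worked as a cooper (barrel maker)",
        "Served as a volunteer firefighter",
    ],
    question="Why is your life hard?",
)

# Sending the transcript would look roughly like this (requires an
# OpenAI API key; model name is an assumption):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(model="gpt-4o", messages=messages)
# print(reply.choices[0].message.content)
```

The custom GPT interface does the same thing with a nicer UI: your instructions become the system message, and each interview question is a user turn.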
Ken Burns said that we have to find creative equivalencies when we don't have the material we need. I wonder what he'd think of this project.

If my goal was to get to know my ancestor, I definitely feel like I did that. That part was really cool. It was really fun to share it with my family, especially my dad, who had done the initial research. He's a total non-techie, so when I showed it to him it blew his mind. It meant a lot to him, which made me feel great.

I'm incredibly grateful to Kayla and the AIX community, to Kristin my cohort partner, and to Rachel for the mentorship on the project. This experience was a gift. I'm gonna come right out and ask YOU to let me know what you think, love it or hate it.

RUFUS
Written by: ChatGPT
Images by: Midjourney
AI Tools: Runway, Leiapix, Photoshop, My Heritage, Eleven Labs, My Edit
Editing: Adobe Premiere
Music: Pond5
Budget: About $200 and 20 hours at my laptop
Color Tests - I knew I had to meet the tech where it is. Here is a color test that proved I could not use color yet. This is the zoom-out feature in Midjourney, so this is its attempt to color the same street view twice from slightly different perspectives. Not good enough.
AI SFX worked well enough. They wouldn't hold up in a movie theatre, but for this purpose they were fine.