Edited by Elizabeth Kealoha. Pod Digest and Images by Ant Neely.
Your weekly skim of what the Machine Cinema crew debated, discovered, and dropped into the chat.
Too busy to read? Have a listen instead.
If you’d like to become a sponsor, drop into our DMs!
Overheard in “Basecamp”…
“A Sora 2 Takeover”
OpenAI’s newest model didn’t just drop — it dominated. From looping-shot microcinema to invite-code black markets, last week’s Basecamp felt like a single shared film set: everyone experimenting, speculating, or troubleshooting Sora 2.
Full disclosure: we had our robot friend help pull all this together, and they're prone to the occasional harmless mistake.
🎬 Sora 2: The Ten-Second Revolution
Sora 2 officially launched October 1 — faster, social-first, and capped at 10 seconds. The community immediately got to work, treating each clip like a miniature movie.
Key Takeaways
UGC before auteurs. The 10-second limit pushes virality and short-form behavior before “real” filmmaking tools arrive.
Pro mode = slower, sharper. Early testers found it up to 10× slower but noticeably more coherent, with better lighting.
Install hack: add Sora to your iPhone via Safari → Add to Home Screen for a smoother workflow.
Watermark reality. It’s burned into every frame; reframing or cropping works better than AI removal tools (see the quick crop sketch below).
Invite economy. Codes went parabolic — a “take one, leave one” exchange kept the queue humane.
Opt-Out vs Opt-In Debate. Members flagged how OpenAI’s new opt-out default for using creator data to train Sora 2 differs sharply from earlier opt-in models.
“Consumers first. Creatives later.”
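Since cropping keeps coming up as the cleanest watermark workaround, here is a minimal sketch of an aspect-preserving reframe with ffmpeg called from Python. The filenames, clip resolution, and crop offsets are assumptions; check where the mark actually lands in your footage before reusing any numbers.

```python
import subprocess

# Minimal reframe sketch: crop a 720x1280 vertical clip to a slightly tighter
# 9:16 window that excludes the watermark area, then scale back up. Filenames,
# resolution, and crop offsets are placeholder assumptions; adjust to where
# the mark actually sits in your footage.
IN_FILE = "sora_clip.mp4"
OUT_FILE = "sora_clip_reframed.mp4"

subprocess.run(
    [
        "ffmpeg", "-i", IN_FILE,
        # crop=w:h:x:y keeps a 675x1200 window (still 9:16), then rescales
        # to the original 720x1280 so the delivery format doesn't change.
        "-vf", "crop=675:1200:22:0,scale=720:1280",
        "-c:a", "copy",  # pass the audio through untouched
        OUT_FILE,
    ],
    check=True,
)
```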
Creative Experiments
Looping-shot storytelling. One member’s six-shot “fetch” loop became a community template: define every shot, then let Sora interpret.
Storyboard integration. DIY boards and Krea-to-Sora workflows kept short-form continuity tight.
Audio polish. Quick fixes included DaVinci Resolve’s Dialogue Isolate, Adobe’s cleanup panel, and AudioShake for stem separation (see the sketch below).
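For the stem-separation step specifically, one open-source route is Demucs; the snippet below is a minimal sketch that shells out to its CLI as a stand-in for AudioShake. The filename is a placeholder.

```python
import subprocess

# Minimal stem-separation sketch using the open-source Demucs CLI as a
# stand-in for AudioShake. "interview_audio.wav" is a placeholder filename.
# --two-stems=vocals splits the track into "vocals" and "no_vocals", which
# is usually enough to pull dialogue away from music and ambience.
subprocess.run(
    ["demucs", "--two-stems=vocals", "interview_audio.wav"],
    check=True,
)
# Separated stems land under ./separated/<model name>/interview_audio/
```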
Speculation & Signals
DevDay (Oct 6) is expected to unveil longer clips and finer controls.
Google’s countermove. NotebookLM and the rumored “Sparkify” were flagged as the likely response.
Social virality vs. pro pipelines. Sora 2 might be YouTube’s next incubator for AI auteurs — or its short-form content lab.
Sora 2 Video generated by Jay Judah
🧰 Tools Spotlight
Audio: AudioShake, DaVinci Resolve Dialogue Isolate, Adobe Podcast AI
Visual: Krea Real-Time, Midjourney Gen Fill, MagicEraser (video watermark removal)
Productivity: NotebookLM (knowledge sync), Safari PWA trick for Sora 2
💬 Community Perks
Infinity Festival LA partnership confirmed — discounted passes for Machine Cinema members are live. 20% off admission with code IF2520. Big Shout Out to Erik Weaver for making this possible. Please make sure to check out Erik’s panel, AI Matters.
Workshop alert: Vibe Coding Session 2 (Free) — Oct 16 with a quick refresher Oct 7. Great for non-coders building AI-assisted prototypes.
MoMI LAB: “Propaganda in the Age of AI”, an intimate conversation with Ari Kuschnir, hands-on demos, and a reception. This is the first event leading up to the opening of MoMI LAB, a new space for emerging moving image technology at the Museum of the Moving Image.
🔗 Link Drop
Real Creative “Pick of the Week”
Each week the Machine Cinema members + the Real Creative team obsess over social feeds in search of the world's best AI creative video, gaming, and music projects.
About the project: Tiffani Lee Joseph is an award-winning AI Storyteller based in New York City. She created this week’s Pick of the Week, “Eat the Rich,” an animation that satirizes capitalism. Tiffani developed this short between March and June, using tools such as Midjourney, Runway, Kling, Hedra and Eleven Labs. “Eat the Rich” is a studio-commissioned short slated for release later this year. She is currently collaborating with XXXXXX on AI-driven creative solutions shaping the future of their programming.
Check out “Eat the Rich” here, along with some of her other works including the multi-award-winning short “Pieces,” which explores human truths with haunting intimacy.
You can find this project, along with over 300 AI filmmakers and their work, over at realcreative.ai, which features some familiar faces from the Machine Cinema community.
Overheard in “We Love Robots”: Agents, Tariffs & Infinite Remix
This week the conversation swung wide — from geopolitics to open frameworks to mythopoetic cinema. Between U.S. tariffs on foreign films, a new Agentic Commerce Protocol, and psychedelic storytelling experiments, We Love Robots felt like a dispatch from the near future of creative AI.
🧩 Thought of the Week
AI conversations are evolving from tool talk to world-building. From agent protocols to infinite films, the frontier now feels alive — not just coded.
“The next platform isn’t a screen. It’s a collective imagination.”
🎮 EA Goes Private (Again)
Electronic Arts is reportedly going private in a massive deal with Silver Lake and Saudi Arabia’s PIF, drawing instant flashbacks to the Activision buyout era.
Members hope it signals a creative reset after years of “slop.”
“Maybe they’ll finally make great games again.”
🎬 U.S. Tariff Shock: 100% on Foreign-Made Movies
Reuters dropped a bombshell: the U.S. will impose a 100% tariff on films produced outside the country.
What defines a foreign movie?
Do Hollywood productions shot abroad count?
🧠 Tech Companies & the Efficiency Debate
A lively thread weighed tech firms’ relentless efficiency against their monopoly safety nets.
“Their real strength is producing truly new things.”
“Marketing isn’t why they win — ferocious iteration is.”
The consensus: efficiency, yes — but powered by market dominance and obsessive product fit, not just genius.
🧩 The Agentic Commerce Protocol
Stripe + OpenAI released an open standard for agents to buy, sell, and coordinate autonomously — a foundational step for Agentic AI commerce.
“Agents that can buy, sell, and coordinate — this changes business workflows entirely.”
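We haven't gone through the published spec line by line, so treat the snippet below as a shape-of-the-idea sketch rather than the real ACP schema: the endpoint, field names, and token format are all invented for illustration. The point is simply that a scoped payment credential plus explicit spending constraints is what lets an agent transact on a human's behalf.

```python
import requests

# Hypothetical illustration of an agent-initiated purchase. The endpoint,
# field names, and token format below are invented for this sketch and are
# NOT the published Agentic Commerce Protocol schema.
checkout = {
    "buyer_agent": "machine-cinema-assistant",           # the agent acting for the user
    "items": [{"sku": "sora-credits-100", "quantity": 1}],
    "payment_token": "spt_example_delegated_token",       # scope-limited, delegated credential
    "constraints": {"max_total_usd": 25.00},               # spending cap the human approved
}

resp = requests.post(
    "https://merchant.example.com/agentic/checkout",       # placeholder merchant endpoint
    json=checkout,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. an order id and status for the agent to report back
```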
🧬 Microsoft’s Multi-Agent Push
Microsoft’s ecosystem keeps sprawling: Semantic Kernel, Agent Framework, and Magentic One are finally converging into an enterprise pathway.
“It’s easier than ever to adopt Agentic AI in the enterprise.”
💎 IBM Granite 4 and the Open-Source Momentum
IBM announced Granite 4.0 — open-source, ISO-certified, cryptographically signed.
Hybrid + hyper-efficient = verifiable AI stack.
Hugging Face’s CTO called it “a cheaper Claude code.”
🌍 Culture & Psychedelic Storytelling
A member unveiled Solipsis — an agentically-fueled, infinite-loop film blending psychedelics, myth, and collective remix.
“An endlessly remixable story arc across five cultural realities… human shepherds guiding the algorithmic infinite.”
It landed as a new vision for AI cinema: interactive myth-making as collaboration.
🧰 Tool Talk
⭐ Manus + Claude 4.5 → standout combo for writing proposals and creative briefs.
😩 Canvas fatigue → ChatGPT’s Canvas “keeps breaking” on long docs.
🧪 Alternatives → Claude + Gamma, Perplexity Projects, z.ai.
“If I really want magic, I’ll pour money into Lovable or Replit Agent Max mode.”
💡 Other Sparks
Face swaps on FaceTime?? Prepare for catfishing.
GenTalks Community Call October 1, 2025
Our GenDojo Community Call is a weekly digital get-together where we compare notes on how we’re all making our way through this new era for the creative industries, AI, and other emergent technologies. Each week we invite artists, builders, and thought leaders to share their knowledge, works in progress, and ideas about this emerging space.
If you’d like to be added to the recurring invite, please DM us.
Machine Cinema Linktree: https://linktr.ee/machinecinema
Guests:
Nik Kleverov (Native Foreign / “Critterz”) + D. Ryan Reeb (Obsidian Studio) w/ cameo from Wes Walker
The Conversation
This session unfolded the morning after Sora 2 dropped—so the first 10 minutes were pure cinematic-AI buzz. The community dissected how OpenAI’s short-form app merges publishing + communication, not unlike the early printing press moment for video.
Then Fred introduced Nik Kleverov, whose studio Native Foreign is behind “Critterz” (the first widely-recognized AI-assisted feature now in production with Vertigo Films).
Later, Ryan Reeb and Wes Walker from Obsidian Studio showcased how hybrid pipelines—AI + CG + live action—are rewriting commercial and feature workflows.
Highlights
Sora 2 and the New Creative Medium
Kenny Miller & John Gauntt framed Sora 2 not as a tool but as a communication shift—“people publishing themselves.”
Fred Graver: “Forget the competition—your rival is Sora and the amount of time people spend making and watching short AI videos.”
Glory House: Sora’s “Cameo” and “Remix” modes make virality inevitable.
Fred Grinstein: “It feels more like the printing press moment than Web3 ever did.”
Nik Kleverov — From Main Titles to ‘Critterz’
Traced his path from Digital Kitchen main-title work (Dexter, True Blood) → co-founding Native Foreign → embracing AI.
“Critterz” (OpenAI-backed, with Paddington writers, aiming for summer 2026) became the first AI film to solve the copyright/authorship puzzle.
Legal principle: human authorship + documented provenance.
“Behind-the-scenes is now mandatory proof of authorship.”
Studio runs dual tracks: commercial brands (Delta, Coke, Tools for Humanity) and original IPs.
Announced tongue-in-cheek partnership: “Elizabeth Holmes is leading our next fundraising round.”
On AI authorship: “Editors are the new directors. We’re directing from keyboards.”
On storytelling democratization: “These are ideas that would never have been made otherwise.”
Ryan Reeb + Wes Walker — The Obsidian Model
Ryan Reeb (VFX vet: Transformers, Marvel) now EPs AI-driven commercials and features for Obsidian Studio.
Emphasis on hybrid pipelines and artist-first royalties: training LoRAs on individual storyboard/painter styles, audited for provenance (a rough LoRA setup sketch follows this list).
Wes Walker: “Great VFX is invisible VFX — you shouldn’t be able to tell how it was made.”
Using AI for LED backdrops, real-time lighting, Unreal previs, and commercials for Chanel, Crayola, Hyundai, Cadillac.
“We’re not an AI studio. We’re a creative studio that uses AI first.”
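For anyone wondering what “training a LoRA on an individual artist’s style” looks like in practice, here is a rough setup sketch using Hugging Face diffusers and peft on a Stable Diffusion checkpoint. Obsidian’s actual pipeline isn’t public; the model, rank, and target modules below are assumptions, not their method.

```python
import torch
from diffusers import StableDiffusionPipeline
from peft import LoraConfig, get_peft_model

# Rough sketch: attach LoRA adapters to a diffusion UNet so only a small set
# of low-rank weights learns the artist's style. Model choice, rank, and
# target modules are assumptions, not Obsidian's actual pipeline.
pipe = StableDiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",  # any SD checkpoint works here
    torch_dtype=torch.float32,
)

lora_config = LoraConfig(
    r=8,              # low-rank dimension: small, cheap, style-focused
    lora_alpha=16,
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],  # UNet attention projections
)
pipe.unet = get_peft_model(pipe.unet, lora_config)
pipe.unet.print_trainable_parameters()  # only the LoRA weights are trainable

# Training then follows the usual denoising objective over the artist's scanned
# boards: add noise to image latents, predict it with the UNet, and backprop an
# MSE loss through the LoRA weights only. The resulting adapter file can be
# hashed and logged, which is where the provenance audit trail comes in.
```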
Themes & Advice
Document everything. Proof of human intent is key to copyright compliance.
Small teams = more at-bats. Lower cost means more creative experiments.
Be in the room. Both Nik and Ryan stressed in-person collaboration for big AI films.
Fear vs creativity. Reeb: “Fear and creativity aren’t in the same game. Lean in.”
Education moat. Fred: those who’ve tracked AI since its start now hold a critical contextual advantage.
Community Spotlight
Adriana Vecchioli announced her live-action short Mermaid premiering at Scream Fest (Oct 13, TCL Chinese Theater).
John Fox, veteran storyboard artist, joined mid-session—quickly courted by both Nik and Ryan for future projects.
Fred closed with the first-ever Machine Cinema Freestyle Rap, performed by Nik Kleverov (!).
Key Quotes
“The editor is the new director.” — Nik Kleverov
“AI gives us a jetpack; people fighting it are staging a sit-in at the starting line.” — Ryan Reeb (quoting Andy Cochran)
“Forget competition — your competition is eyeballs and Sora.” — Fred Graver
“It’s not about AI films vs films. It’s just films.” — Nik Kleverov
“Fear and creativity are not in the same game.” — Ryan Reeb
Takeaways for Machine Cinema Creators
Build your own provenance trail (BTS, storyboards, human decisions).
Develop hybrid skills across VFX, editing, prompting, and production.
Start small but publish often — shorts train the market.
Use AI to prototype ideas that “would never be made otherwise.”
Collaborate in-person when stakes are high; jam together when possible.