📺 I just got early access to @runwayml Aleph, and here's my honest take: some outputs look janky, while others feel stunningly realistic. The difference often comes down to your source footage and how much you push the prompt (so yes, you need to play with it). There's definitely an art to the wording. But even with its flaws, Aleph proves the point that matters most here: VFX has officially entered the chat.

It's still a bit hit-or-miss, like any other AI tool. Textures sometimes feel like a cutscene from a 2010 video game. And yes, outputs are currently capped at 5 seconds, but that's surely just a matter of time. Not exactly blockbuster-ready most of the time, but that's OK. If this is the worst we're gonna get, I'm in!

But here's the wild part: I changed scenes, lighting, camera angles, and even deleted elements from a video just by typing. No keyframes, no rotoscoping, no weeks of post. Just... prompts. That alone is a revolution, and even the haters have to admit it.

Sure, purists might roll their eyes: "That's not real VFX!" But here's the thing: it doesn't need to be. Not for everyone. This is about access. Millions of people who were locked out of post-production workflows just got a key to the door. Think about small studios. Think about indie creators like us.

What I love most is that this isn't some research lab toy. Aleph was built from real-world creative needs: changing light, removing objects, altering shots, all baked in. You can feel that Runway listened. Their Creative Partners Program is clearly steering development with user experience front and center.

So no, Aleph won't win you an Oscar just yet, but it might win over the next generation of filmmakers. That's the trade-off: not perfection, but possibility.

Public access is coming soon inside Runway Chat Mode. When it drops: try it. Play with it. Break it.

💬 Then tell me: are we dumbing down VFX, or finally letting it evolve? What's your take on text-based AI VFX?

#runway #aleph #VFX #AI