The model was developed by HappyHorse AI, an independent AI research collective led by Zhang Di (former Kuaishou VP and Kling AI architect) and staffed largely by a team formerly of Alibaba's Taotian Group Future Life Lab (ATH-AI). Multiple reports link it to Alibaba Group, which is said to have released it anonymously amid its Token Hub AI consolidation, though the team presents it as independent, with full weights and code on GitHub and Hugging Face.[1][3][4][6][8][13] Competitors include closed-source models such as ByteDance's Seedance 2.0, SkyReels V4, and Kling 3.0.
This follows Alibaba's AI push after consolidating operations into Token Hub for streamlined development and revenue growth, building on prior work such as Kling AI. The anonymous drop on April 8 propelled the model to #1 in blind user preference tests by April 9, highlighting a shift away from closed-source dominance.[1][4][5] No API exists yet, which limits production use despite the open-source release.[2][7]
It is newsworthy for proving that open-source models can outperform proprietary giants in blind real-user tests (Elo-based rankings of quality, motion, and faithfulness), for signaling an intensifying China-U.S. AI video rivalry, for potential enterprise deployment via Alibaba Cloud, and for accessible innovation with commercial licensing and support for seven languages.[1][2][4][5][10]