
    Trillion-parameter AI model: Ant Group’s Ling-1T launch

    October 17, 2025

    Ant Group has entered the trillion-parameter AI model arena with Ling-1T, a newly open-sourced language model that the Chinese fintech giant positions as a breakthrough in balancing computational efficiency with advanced reasoning capabilities.

    The October 9 announcement marks a significant milestone for the Alipay operator, which has been rapidly building out its artificial intelligence infrastructure across multiple model architectures. 

    The trillion-parameter AI model demonstrates competitive performance on complex mathematical reasoning tasks, achieving 70.42% accuracy on the 2025 American Invitational Mathematics Examination (AIME) benchmark—a standard used to evaluate AI systems’ problem-solving abilities.

    According to Ant Group’s technical specifications, Ling-1T maintains this performance level while consuming an average of over 4,000 output tokens per problem, placing it alongside what the company describes as “best-in-class AI models” in terms of result quality.
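    Those two figures, answer accuracy and output tokens spent per problem, are straightforward to collect for any open-weights model served behind a standard API. Below is a minimal sketch of that measurement against an OpenAI-compatible endpoint; the endpoint URL, the "ling-1t" model identifier, and the placeholder problem entry are illustrative assumptions, not details published by Ant Group.

```python
# Minimal sketch: score AIME-style problems and track output-token usage
# against an OpenAI-compatible endpoint. The base_url, model name, and the
# placeholder problem below are hypothetical, not values from Ant Group.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

problems = [
    # (problem statement, integer answer in 0-999) -- placeholder entry
    ("Placeholder AIME-style problem goes here.", 0),
]

correct, total_output_tokens = 0, 0
for statement, answer in problems:
    resp = client.chat.completions.create(
        model="ling-1t",  # placeholder model identifier
        messages=[{"role": "user",
                   "content": statement + " Give only the final integer."}],
    )
    reply = resp.choices[0].message.content.strip()
    total_output_tokens += resp.usage.completion_tokens  # output tokens spent
    correct += int(reply.split()[-1].rstrip(".") == str(answer))

print(f"accuracy: {correct / len(problems):.2%}")
print(f"avg output tokens per problem: {total_output_tokens / len(problems):.0f}")
```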

    Dual-pronged approach to AI advancement

    The trillion-parameter AI model release coincides with Ant Group’s launch of dInfer, a specialised inference framework engineered for diffusion language models. This parallel release strategy reflects the company’s bet on multiple technological approaches rather than a single architectural paradigm.

    Diffusion language models represent a departure from the autoregressive systems that underpin widely used chatbots like ChatGPT. Unlike sequential text generation, diffusion models produce outputs in parallel—an approach already prevalent in image and video generation tools but less common in language processing.

    Ant Group’s performance metrics for dInfer suggest substantial efficiency gains. Testing on the company’s LLaDA-MoE diffusion model yielded 1,011 tokens per second on the HumanEval coding benchmark, versus 91 tokens per second for Nvidia’s Fast-dLLM framework and 294 for Alibaba’s Qwen-2.5-3B model running on vLLM infrastructure.
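    The tokens-per-second figures above amount to generated tokens divided by wall-clock decoding time. The snippet below sketches that measurement with Hugging Face transformers and ordinary autoregressive decoding; it is not the dInfer framework, and the checkpoint name is just an example of a small model one might time this way.

```python
# Illustrative throughput measurement (generated tokens / wall-clock seconds).
# Uses standard autoregressive decoding via Hugging Face transformers;
# this is not dInfer, and the checkpoint name is an example stand-in.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-3B-Instruct"  # example checkpoint, swap as needed
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

start = time.perf_counter()
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
elapsed = time.perf_counter() - start

new_tokens = output.shape[1] - inputs["input_ids"].shape[1]
print(f"{new_tokens / elapsed:.1f} tokens/sec")
```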

    “We believe that dInfer provides both a practical toolkit and a standardised platform to accelerate research and development in the rapidly growing field of dLLMs,” researchers at Ant Group noted in accompanying technical documentation.

    Ecosystem expansion beyond language models

    The Ling-1T trillion-parameter AI model sits within a broader family of AI systems that Ant Group has assembled over recent months. 

    The company’s portfolio now spans three primary series: the Ling non-thinking models for standard language tasks, Ring thinking models designed for complex reasoning (including the previously released Ring-1T-preview), and Ming multimodal models capable of processing images, text, audio, and video.

    This diversified approach extends to an experimental model designated LLaDA-MoE, which employs Mixture-of-Experts (MoE) architecture—a technique that activates only relevant portions of a large model for specific tasks, theoretically improving efficiency.
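    In a Mixture-of-Experts layer, a lightweight gating network routes each token to a small number of expert sub-networks and only those experts execute, which is how a very large parameter count can coexist with modest per-token compute. The PyTorch sketch below shows the generic top-k routing pattern; it is a textbook illustration, not Ant Group's LLaDA-MoE implementation, and the layer sizes are arbitrary.

```python
# Generic top-k Mixture-of-Experts routing in PyTorch -- an illustration of
# the technique, not Ant Group's LLaDA-MoE implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)         # routing probabilities
        weights, idx = scores.topk(self.top_k, dim=-1)   # top-k experts per token
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for k in range(self.top_k):                      # only selected experts run
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * self.experts[e](x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(4, 512)
print(layer(tokens).shape)  # torch.Size([4, 512])
```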

    He Zhengyu, chief technology officer at Ant Group, articulated the company’s positioning around these releases. “At Ant Group, we believe Artificial General Intelligence (AGI) should be a public good—a shared milestone for humanity’s intelligent future,” He stated, adding that the open-source releases of both the trillion-parameter AI model and Ring-1T-preview represent steps toward “open and collaborative advancement.”

    Competitive dynamics in a constrained environment

    The timing and nature of Ant Group’s releases illuminate strategic calculations within China’s AI sector. With access to cutting-edge semiconductor technology limited by export restrictions, Chinese technology firms have increasingly emphasised algorithmic innovation and software optimisation as competitive differentiators.

    ByteDance, parent company of TikTok, similarly introduced a diffusion language model called Seed Diffusion Preview in July, claiming five-fold speed improvements over comparable autoregressive architectures. These parallel efforts suggest industry-wide interest in alternative model paradigms that might offer efficiency advantages.

    However, the practical adoption trajectory for diffusion language models remains uncertain. Autoregressive systems continue dominating commercial deployments due to proven performance in natural language understanding and generation—the core requirements for customer-facing applications.

    Open-source strategy as market positioning

    By making the trillion-parameter AI model publicly available alongside the dInfer framework, Ant Group is pursuing a collaborative development model that contrasts with the closed approaches of some competitors. 

    This strategy potentially accelerates innovation while positioning Ant’s technologies as foundational infrastructure for the broader AI community.

    The company is simultaneously developing AWorld, a framework intended to support continual learning in autonomous AI agents—systems designed to complete tasks independently on behalf of users.

    Whether these combined efforts can establish Ant Group as a significant force in global AI development depends partly on real-world validation of the performance claims and partly on adoption rates among developers seeking alternatives to established platforms. 

    The trillion-parameter AI model’s open-source nature may facilitate this validation process while building a community of users invested in the technology’s success.

    For now, the releases demonstrate that major Chinese technology firms view the current AI landscape as fluid enough to accommodate new entrants willing to innovate across multiple dimensions simultaneously.

    See also: Ant Group uses domestic chips to train AI models and cut costs



