
China Automotive Multimodal Interaction Development Research Report 2025: Closed-Loop Evolution of Multimodal Interaction – Progressive Evolution of L1~L4 Intelligent Cockpits – ResearchAndMarkets.com


DUBLIN–(BUSINESS WIRE)–The “China Automotive Multimodal Interaction Development Research Report, 2025” report has been added to ResearchAndMarkets.com’s offering.

Research on Automotive Multimodal Interaction: The Interaction Evolution of L1~L4 Cockpits.

This report comprehensively reviews the adoption of interaction modalities in automotive cockpits, multimodal interaction patents, mainstream cockpit interaction modes, the application of those modes in key vehicle models launched in 2025, the cockpit interaction solutions of automakers and suppliers, and the integration trends of multimodal interaction.

I. Closed-Loop Evolution of Multimodal Interaction: Progressive Evolution of L1~L4 Intelligent Cockpits

According to the “White Paper on Automotive Intelligent Cockpit Levels and Comprehensive Evaluation” released by the China Society of Automotive Engineers (China-SAE), intelligent cockpits are classified into five levels, L0 through L4.

As a key driver of cockpit intelligence, multimodal interaction relies on the collaboration of AI large models and multiple hardware sensors to fuse multi-source interaction data. On this basis, it accurately understands the intentions of drivers and passengers and provides scenario-based feedback, ultimately delivering natural, safe, and personalized human-machine interaction. The automotive intelligent cockpit industry is currently, on the whole, at the L2 stage, with some leading manufacturers exploring the move toward L3.

The core feature of L2 intelligent cockpits is “strong perception, weak cognition”. At the L2 stage, cockpit multimodal interaction achieves signal-level fusion: built on multimodal large model technology, the system can understand users’ ambiguous intentions and process multiple commands simultaneously, executing users’ immediate, explicit commands. Most mass-produced intelligent cockpits already support this.
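To make “simultaneously process multiple commands” concrete, below is a minimal Python sketch of how a single compound utterance can be split into independent intents and executed in order. The intent labels, slot names, and the rule-based parser are illustrative assumptions; a production L2 cockpit would delegate this parsing to a multimodal large model.

# Minimal sketch: handling several commands in one utterance (L2 stage).
# Intent names and the rule-based parser are hypothetical stand-ins for a
# multimodal large model's intent-parsing output.
from dataclasses import dataclass, field

@dataclass
class Intent:
    action: str                          # e.g. "open_window"
    slots: dict = field(default_factory=dict)

def parse_compound_utterance(text: str) -> list[Intent]:
    """Naive splitter: one clause per 'and', one intent per clause."""
    intents: list[Intent] = []
    for clause in text.lower().split(" and "):
        if "window" in clause:
            side = "left" if "left" in clause else "right"
            intents.append(Intent("open_window", {"side": side}))
        elif "ac" in clause.split() or "temperature" in clause:
            digits = [int(tok) for tok in clause.split() if tok.isdigit()]
            intents.append(Intent("set_temperature",
                                  {"celsius": digits[0] if digits else 22}))
    return intents

for intent in parse_compound_utterance(
        "Open the left window and turn the AC down to 22"):
    print(intent)      # both commands are executed from one utterance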

Take Li Auto’s Li i6 as an example. It is equipped with MindGPT-4o, Li Auto’s latest multimodal model, which offers understanding and response with ultra-long memory and ultra-low latency as well as more natural language generation. It supports multimodal “see and speak” (voice + vision fusion search: a child who cannot yet read can pick the cartoon they want by describing what is on the video cover) and multimodal referential interaction (voice + gesture). For objects, the user points while speaking: extending the index finger to the left, for instance, targets the window and completes the vehicle control. For people, passengers in the same row can direct a command at a specific person by coordinating gesture and voice, e.g., pointing right and saying “Turn on the seat heating for him”.
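The referential pattern above can be sketched as a small fusion step: the speech channel supplies the action and an unresolved pronoun, while the vision channel supplies a pointing direction that resolves it. The seat layout, gesture labels, and command vocabulary below are invented for illustration only.

# Hedged sketch: resolving “him” via voice + gesture fusion.
# Seat geometry and gesture labels are assumptions, not Li Auto's API.
SEAT_BY_DIRECTION = {
    ("rear_left", "right"): "rear_right",     # same-row neighbour
    ("rear_right", "left"): "rear_left",
    ("front_passenger", "left"): "driver",
}

def resolve_referent(speaker_seat: str, pointing: str) -> str | None:
    """Map a pronoun plus a pointing direction to a concrete seat."""
    return SEAT_BY_DIRECTION.get((speaker_seat, pointing))

def handle(speaker_seat: str, pointing: str, utterance: str) -> str:
    target = resolve_referent(speaker_seat, pointing)
    if target and "seat heating" in utterance.lower():
        return f"seat_heating_on -> {target}"
    return "no_action"

# Rear-left passenger points right: “Turn on the seat heating for him.”
print(handle("rear_left", "right", "Turn on the seat heating for him"))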

The core feature of L3 intelligent cockpits is “strong perception, strong cognition”. At the L3 stage, cockpit multimodal interaction achieves cognitive-level fusion: relying on large model capabilities, the cockpit system comprehensively understands the full current scenario and proactively initiates reasonable services or suggestions without the user issuing an explicit command.

The core feature of L4 intelligent cockpits is “full-domain cognition and autonomous evolution”, creating a “full-domain intelligent manager” for users. At the L4 stage, the intelligent cockpit moves far beyond a tool role to become a “digital twin partner” that can anticipate users’ unspoken needs, share memories with them, and orchestrate all resources on their behalf. Its core experience: before the user clearly perceives or expresses a need, the system has already predicted it, planned for it, and begun executing.

II. Multimodal AI Agent: Understand What You Need and Predict What You Think

The AI Agent can be regarded as the core execution unit and key technical architecture through which functions are implemented as intelligent cockpits evolve from L2 to L4. By integrating voice, vision, touch, and situational information, an AI Agent can not only “understand” commands but also “see” the environment and “perceive” the user’s state, weaving previously discrete cockpit functions into a coherent, proactive, and personalized service flow.

Agent applications at L2 can be regarded as “enhanced command execution”, the ultimate extension of L2 cockpit interaction capabilities: based on large model technology, the cockpit system decomposes a user’s complex command into multiple steps and then calls different Agent tools to execute them.
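As a concrete illustration of this decomposition, the toy planner below breaks one complex request into steps and dispatches each to a registered tool. The tool names and the keyword-based planner are hypothetical; in a real system the plan itself would come from the large model.

# Illustrative sketch of “enhanced command execution” (L2 Agent use).
# Tool names and the keyword planner are hypothetical stand-ins.
from typing import Callable

TOOLS: dict[str, Callable[[str], str]] = {
    "navigation": lambda arg: f"route set to {arg}",
    "climate":    lambda arg: f"cabin set to {arg}",
    "media":      lambda arg: f"now playing {arg}",
}

def plan(command: str) -> list[tuple[str, str]]:
    """Toy planner; production systems get this plan from the LLM."""
    command = command.lower()
    steps = []
    if "charger" in command or "navigate" in command:
        steps.append(("navigation", "nearest charging station"))
    if "cool" in command:
        steps.append(("climate", "22°C"))
    if "music" in command:
        steps.append(("media", "a relaxing playlist"))
    return steps

def execute(command: str) -> None:
    for tool, arg in plan(command):      # one Agent tool call per step
        print(TOOLS[tool](arg))

execute("Navigate to a charger, cool the cabin, and put on some music")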

At the next level of intelligent cockpits, Agent applications shift from “you say, I do” to “I watch, I guess, I suggest, and we do it together”. The user issues no explicit command at all: they merely sigh and rub their temples, and from the camera (tired micro-expressions), biosensors (heart rate changes), navigation data (two hours of continuous driving), and the time of day (3 pm, the afternoon sleepiness dip), the large model infers that the user is fatigued from long-distance driving and needs to rest and refresh.

Based on this, the system proactively initiates the interaction: “You seem to need a rest. There is a service area * kilometers ahead with your favorite ** coffee. Shall I start navigation? I can also play some refreshing music for you.” Once the user agrees, the system calls navigation, entertainment, and other Agent tools.
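This whole proactive loop can be summarized in a short sketch: several weak signals are fused into a fatigue score, and above a threshold the system initiates a suggestion instead of waiting for a command. The weights, threshold, and signal names below are invented for illustration.

# Hedged sketch of L3-style proactive interaction: fuse weak signals,
# then suggest rather than wait for a command. All numbers are invented.
from dataclasses import dataclass

@dataclass
class CabinSignals:
    tired_expression: bool    # camera / vision model output (assumed)
    heart_rate_drop: bool     # biosensor trend (assumed)
    driving_minutes: int      # from navigation data
    hour_of_day: int          # 0-23

def fatigue_score(s: CabinSignals) -> float:
    score = 0.35 * s.tired_expression + 0.25 * s.heart_rate_drop
    score += 0.25 if s.driving_minutes >= 120 else 0.0
    score += 0.15 if 13 <= s.hour_of_day <= 16 else 0.0   # afternoon dip
    return score

def maybe_suggest(s: CabinSignals) -> str | None:
    if fatigue_score(s) >= 0.6:        # arbitrary illustrative threshold
        return ("You seem to need a rest. Shall I navigate to the next "
                "service area and play some refreshing music?")
    return None                        # stay silent below the threshold

print(maybe_suggest(CabinSignals(True, True, 120, 15)))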

Key Topics Covered:

1 Overview of Multimodal Interaction in Automotive Cockpits
1.1 Development Stages of Intelligent Cockpits

1.2 Definition of Multimodal Interaction

1.3 Development System of Multimodal Interaction

1.4 Introduction to Core Interaction Modality Technologies: Haptic Interaction

1.5 Application Scenarios of Large Models in Intelligent Cockpits

1.6 Vehicle-Human Interaction Functions Based on Multimodal AI Large Models

1.7 Industry Chain of Multimodal Interaction

1.8 Industry Chain of Multimodal AI Large Models

1.9 Policy Environment for Multimodal Interaction

1.10 Installation of Interaction Modalities in Cockpits

2 Summary of Patents Related to Automotive Multimodal Interaction
2.1 Summary of Patents Related to Haptic Interaction

2.2 Summary of Patents Related to Auditory Interaction

2.3 Summary of Patents Related to Visual Interaction

2.4 Summary of Patents Related to Olfactory Interaction

2.5 Summary of Patents Related to Other Featured Interaction Modalities

3 Multimodal Interaction Cockpit Solutions of OEMs
3.1 BYD

3.2 SAIC IM Motors

3.3 FAW Hongqi

3.4 Geely

3.5 Great Wall Motor

3.6 Chery

3.7 Changan

3.8 Voyah

3.9 Li Auto

3.10 NIO

3.11 Leapmotor

3.12 Xpeng

3.13 Xiaomi

3.14 BMW

4 Multimodal Cockpit Solutions of Suppliers
4.1 Desay SV

4.2 Joyson Electronics

4.3 SenseTime

4.4 iFLYTEK

4.5 Thundersoft

4.6 Huawei

4.7 Baidu

4.8 Banma Zhixing

5 Application Cases of Multimodal Interaction Solutions for Typical Vehicle Models
5.1 Summary of Application Cases of Multimodal Interaction Solutions for Typical Vehicle Models

5.2 All-New IM L6

5.2.1 Panoramic Summary of Multimodal Interaction Functions

5.2.2 Analysis of Featured Modal Interaction Capabilities

5.3 Fangchengbao Bao 8

5.4 Hongqi Jinkuihua Guoya

5.5 Denza N9

5.6 Zeekr 9X

5.7 Geely Galaxy A7

5.8 Leapmotor B10

5.9 Li i6

5.10 Xpeng G7

5.11 Xiaomi YU7

5.12 MAEXTRO S800

5.13 2025 AITO M9

5.14 All-New BMW X3 M50

5.15 2026 Audi E5 Sportback

5.16 All-New Mercedes-Benz Electric CLA

6 Summary and Development Trends of Multimodal Interaction
6.1 Summary of Large Model Configuration Parameters of OEMs

6.2 Trend 1: Evolution of Multimodal Interaction Based on AI Large Models

6.3 Trend 2: Cockpit Scenario Application Cases

6.4 Trend 3: Voice Interaction

6.5 Trend 4: Visual Interaction

For more information about this report visit https://www.researchandmarkets.com/r/layqmj

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world’s leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Contacts

ResearchAndMarkets.com

Laura Wood, Senior Press Manager

press@researchandmarkets.com

For E.S.T. Office Hours Call 1-917-300-0470

For U.S./ CAN Toll Free Call 1-800-526-8630

For GMT Office Hours Call +353-1-416-8900

