Mercedes is pushing boundaries with its next-gen MBUX Virtual Assistant, set to debut on the CLA in 2025. This advanced system aims to integrate seamlessly into the vehicle's digital ecosystem, offering more than just basic voice commands. By leveraging generative AI and biometric data from cabin sensors, Mercedes hopes to personalize the driving experience based on individual preferences and routines. However, the extent to which drivers are willing to share their digital lives with MB.OS will determine the assistant's effectiveness and the speed of its improvements.

Your experience of Mercedes' upcoming next-generation virtual assistant — and the luxury cars that it's a part of — may not be the same as your neighbor's, and a lot of that will come down to trust and just how willing you are to trade access to your digital life in return for a more contextually aware vehicle. At CES 2024, Mercedes unveiled arguably the centerpiece of its future cabin experience, the MBUX Virtual Assistant. Promising to go beyond the canned responses and simplistic interactions of the current voice technology, it'll be as much a copilot as it is a hands-free way to control multimedia and navigation.

“There is a combination of a belief, but also feedback, that the ultimate way we communicate is voice,” Magnus Östberg, Chief Software Officer at Mercedes-Benz AG, said during a roundtable discussion at CES 2024, where the automaker hosted SlashGear. “Technology has limited us doing that with the car.”

Tapping generative AI, among other services, the new "Hey Mercedes" assistant will be based on MB.OS, the next-gen digital architecture for vehicles that will debut on the CLA (previewed by the Concept CLA sedan) in 2025. It'll weave a digital web between all the systems in the car, from electric powertrains to Mercedes' and third-party providers' apps running on multiple screens throughout the cabin. But, Östberg suggests, if Mercedes has its way — and if drivers trust their car enough to share the minutiae of their digital life with it — you might not even need to glance at those screens to get the best of the car.

Only speaking up if you're receptive to feedback

"You're in a moving object, that's just a fact, you're in a moving object," Östberg points out, "so you're going somewhere, or you are somewhere. And putting that context into place — either if that is time of day, is this part of your regular routine, where are you going, are you off your regular routine — and then make suggestions in that context."

Whether the MBUX Virtual Assistant will speak up and make those suggestions, however, could depend on just how receptive MB.OS thinks you'll be. With a selection of cabin sensors to choose from, including multiple cameras in recent models like the new E-Class, Mercedes has a wealth of biometric data from which to figure out whether you're looking for a chatty copilot or not.

“And then, depending on the mood of the driver — we can detect the mood either from the voice, or from facial recognition if the customer has enabled that — to either make suggestions or not to make suggestions,” Östberg explains.

Your car may not be as smart as mine

Just how rapidly your new Mercedes will improve and get more accurate in its interactions, suggestions, and contextual awareness will depend on how willing you are to open your own life to MB.OS. Certainly, there'll be no obligation to link your calendar, inform the MBUX Virtual Assistant of your favorite places, or generally unlock your digital world for your car to sift through.

Without that, however — or with the judicious application of more cautious settings — your experience of the technologies that Mercedes is working on may feel a little less dramatic or transformational. Older customers, Östberg suggests, might be wary of allowing MB.OS free rein to read their calendar (assuming they even have their schedule in the cloud). For them, the car might be a little more old-school for longer.

The degree of sharing will, therefore, dictate just how fast the assistant improves. That will also affect how useful the large language model (LLM) based voice interactions are, though Mercedes isn't counting on that technology alone. While LLMs can help deliver more fluent dialog that sounds more natural, Östberg says, the accuracy isn't always there. For that reason, Mercedes is integrating natural language models and rule-based systems.

Beta testers love it, but success is far from guaranteed

The proof of the pudding will, of course, be just how much Mercedes' new technologies are actually used. That measure will be based on both existing research — including Mercedes' beta testers, of which around 900,000 vehicle owners are currently eligible — and on feedback once those systems are out in the wild.

LLMs, for instance, are going to take some tuning. "It's a brand new tool," Östberg says, "it's a tool we didn't have before. But we have to learn how to use that tool."

From the beta, there's been a 50% uptick in "Hey Mercedes" usage versus the traditional voice control system. Östberg also plans to use anonymized usage data from vehicles in the global fleet to see geographic differences, which features are being used frequently, and which might not have hit the mark at all. There'll also be more detailed feedback when the MBUX Virtual Assistant can't help, whether that's for reasons of lackluster connectivity or anything else. In short, if you want your car to be good at conversation, it might require some sharing — and some patience — in order to unlock the most useful gains.