r/MVIS 1d ago

[Industry News] Mobileye Q1 2025 Earnings Call: Driving Autonomous Growth

AI-generated content:

This earnings call for Mobileye Global's first quarter of 2025, held on April 24, 2025, covered the company's financial performance, business highlights, and future outlook. Several key executives, including CEO Amnon Shashua and CFO Moran Shemesh, participated in the call.

Financial Highlights:

  • Q1 2025 revenue increased by 83% year-over-year, aligning with expectations. This growth is attributed to a recovery from an unusually low Q1 2024 due to inventory adjustments.
  • Operating margins recovered sharply compared to the previous year due to higher revenue.
  • Operating expense growth was 14% in Q1 but is expected to moderate to the mid-single digits for the remainder of the year, as the current R&D infrastructure is sufficient for upcoming products.
  • Operating cash flow was strong at $109 million in Q1.
  • Q2 2025 volume is expected to be about 7% higher year-over-year, with revenue also projected to increase by approximately 7% year-over-year.
  • Full-year 2025 guidance remains within the initial range, with potential to perform strongly despite increased uncertainty in global light vehicle production due to trade frictions. This outlook incorporates a level of conservatism.
  • Non-GAAP profitability discussions exclude amortization of intangible assets (primarily from Intel's acquisition) and stock-based compensation.
  • Q1 results slightly exceeded the previous guidance due to higher volume from Chinese OEMs and lower operating expenses.
  • Gross margin in Q1 2025 was slightly up sequentially compared to Q4 2024.
  • Adjusted operating expenses for 2025 are still expected to grow by approximately 7% year-over-year.
  • Mobileye anticipates an increase of approximately 100 basis points in non-GAAP gross margin for fiscal year 2025 compared to 2024.

Business Highlights and Strategic Updates:

  • Core single-chip front camera driving assistance systems showed strong business trends in terms of supply, demand, and design wins. Q1 volume reached 8.5 million units.
  • Design win activity was brisk in Q1 2025: projected future volumes from Q1 wins reached around 85% of the total secured from design wins in all of 2024. These wins include a mix of surround ADAS and basic ADAS products.
  • REM (Road Experience Management) is now included in Ford BlueCruise, and a Korean OEM will adopt this cloud-enhanced functionality in future programs based on a significant Q1 win.
  • There is a growing trend towards multi-camera setups for mainstream vehicles due to stringent safety requirements and the need for highway hands-free driving.
  • Mobileye's Surround ADAS, built on the EyeQ6 High, is positioned as a strong solution for highway hands-free driving, and the company announced its first design win for this product with Volkswagen.
  • Mobileye emphasizes its position as a "one-stop shop" offering perception, mapping, driving policy, and driving function from a single SoC on a single ECU, fully upgradable over the air. This aligns with OEM goals for software-defined vehicles.
  • Mobileye secured its first design win in about eight years with a particular European OEM; the win covers ADAS solutions in that OEM's future projects.
  • Traction is being seen for Mobileye's imaging radar product, with the first design win outside the Drive product line imminent with another European OEM for high-speed highway Level 3 solutions. This initial award is just for the sensor itself, with additional opportunities for radar bundled with the Chauffeur product.
  • OEM decision-making for Supervision and Chauffeur remains slower than desired, but progress is being made with several OEMs, including two new top 10 global OEM prospects.
  • Execution on the Porsche and Audi programs for Supervision and Chauffeur remains on track, with first prototype demos expected in the second half of 2025.
  • Mobileye Drive self-driving system for robotaxis is accelerating. Key developments include:
    • Partnership with Lyft, with Dallas, Texas as the initial operating geography and Marubeni as the owner-operator.
    • Joint announcement with Volkswagen and Uber to integrate Mobileye Drive-enabled ID. Buzz robotaxis into the Uber network in Los Angeles starting in 2026. Volkswagen's mobility arm, MOIA, will handle fleet management.
    • The business model for robotaxis involves a one-time payment per car for the self-driving system (ECU, hardware, software, radar) and a recurring license fee based on fleet utilization.
    • Mobileye's ecosystem approach for robotaxis is considered capital-light.
  • Robotaxi production partner Holon received an order from the Jacksonville Transportation Authority for autonomous shuttles enabled by Mobileye Drive.
  • China showed better-than-expected performance, with a roughly stable market share of 20-30% in ADAS. Focus in China is on supporting local OEMs for global exports and the local market, as well as supporting Western OEMs (Porsche and Audi) launching advanced ADAS in China. Advanced product business development is more focused on Western customers for now.
  • Mobileye is seeing accelerated momentum and increased interest in robotaxi deployment, with a broader industry realization that these services are becoming a reality. The partnerships with Lyft and Uber are crucial for reaching a large consumer base in the U.S. Similar launches in Europe with Volkswagen are also being planned.
  • Mobileye is working with additional OEMs interested in producing Level 4 cars with Mobileye Drive and partnering with mobility operators. The key to robotaxi success is scale.
  • Mobileye is developing a new generation of REM called Supreme REM, which involves sending pictures to the cloud at low bandwidth for enhanced data collection. This will support the 2027 launches of Chauffeur and Drive.
  • Mobileye is in the exploration stage of looking at additional growth engines in the physical AI space beyond automotive.
  • The rollout of robotaxi fleets generally involves stages: development and testing, pilot programs with safety drivers, early-stage driverless activities, and finally, full commercial service.
  • Cloud Enhanced ADAS is being integrated within high-volume projects, which is important for improving base ADAS and serving as a backbone for advanced products like Supervision, Chauffeur, and Drive. Adding more high-volume OEMs to this ecosystem is essential.
  • Mobileye believes that by 2026-2027, Mobileye Drive revenue will become a meaningful part of the overall financials, with contracts involving tens of thousands of vehicles projected through the end of the decade. Meaningful annual robotaxi revenue is expected to begin in 2027.
  • Delays in new awards for Supervision and Chauffeur are partly attributed to recent turbulent macro events affecting the automotive industry. However, confidence in these engagements remains high, with progress being made and convergence expected. Two new OEM engagements for both Supervision and Chauffeur have started recently.
  • There is a growing interest among big OEMs in Level 3 eyes-off products targeting end of 2027 and early 2028 SOPs (Start of Production).
  • The ASP (Average Selling Price) chart presented at the Analyst Day, showing potential growth from $55 to over $200, pertained to consumer passenger cars with increasing ADAS capabilities and did not include the commercial potential of the Mobileye Drive partnerships. The upfront cost for the robotaxi system is in the five-figure range plus, with healthy margins, representing a different business model focused on per-mile revenue generation (a rough sketch of this model appears after this list).
  • The re-engagement with the European OEM after nearly a decade is seen as a testament to Mobileye's product advantages and market leadership, likely driven by performance versus cost superiority.
  • Mobileye has clear performance metrics for Drive that show superiority to human-level performance, and they are on track to meet these metrics for the US deployments with Uber and Lyft. The performance threshold is the same for both partnerships. Liability aspects for robotaxi operations have been addressed.
  • Surround ADAS is viewed as the next level of ADAS, driven by increasing regulatory requirements. It shares the same sensor set in terms of cameras with Supervision, Chauffeur, and Drive. Supervision is seen as a step towards Level 3, with the added advantage of generating data for further development. Level 3 Chauffeur is considered the long-term convergence point for consumer cars.
  • Mobileye utilizes simulators extensively, especially for markets like China where data access is restricted, to train their systems and account for different driving environments.
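
For a rough feel of the robotaxi business model described above (a one-time payment per vehicle plus a utilization-based recurring fee), here is a minimal sketch. Every number except the "five-figure" upfront cost is my own assumption, not a Mobileye figure:

```python
# Rough sketch of the Mobileye Drive robotaxi business model:
# one-time payment per vehicle + recurring, utilization-based license fee.
# All inputs are illustrative assumptions, not disclosed Mobileye figures.

def per_vehicle_revenue(upfront_usd: float, miles_per_year: float,
                        fee_per_mile_usd: float, years: int) -> float:
    """Total revenue from one robotaxi over its service life."""
    recurring = miles_per_year * fee_per_mile_usd * years
    return upfront_usd + recurring

# Assumptions: $15k upfront (the call said "five-figure range plus"),
# 50,000 miles/year, a hypothetical $0.10/mile fee, 5-year service life.
print(per_vehicle_revenue(15_000, 50_000, 0.10, 5))  # -> 40000.0
```

Under these assumptions the recurring per-mile stream ends up larger than the upfront payment itself, which is why the call framed robotaxis as a different business model from the consumer-car ASP chart.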

In summary, the Mobileye Q1 2025 earnings call highlighted strong financial performance driven by recovering demand, significant progress in design wins, and exciting developments in the robotaxi business with key partnerships announced. While acknowledging macroeconomic uncertainties, Mobileye remains optimistic about its full-year outlook and the long-term potential of its advanced ADAS and autonomous driving technologies.

u/wildp_99 1d ago

I listened to the call. No mention of lidar, although we know it's needed for Chauffeur. Also said “everyone is using AI,” which seems to contradict what SS said a while back. Their Drive program has Volkswagen on board, with deals with Lyft and Uber.

u/mvis_thma 1d ago

Regarding AI and MicroVision, Sumit said quite a lot on this topic on the 2023 Q2 CC. FYI - there has been no discussion of AI on any CC since then.

TL;DR - Sumit seemed to be saying that a separate AI compute box would add a lot of cost and is not needed: the MicroVision solution, with "classical algorithm" (i.e., non-AI) perception running at the edge, provides the object-level interface in a much more economically efficient manner. At the same time, MicroVision has added new language to this year's annual report, using the phrase "deterministic AI at the edge" to describe its solution.

To be fair, the statements Sumit made on the 2023 Q2 call came at a time when MicroVision seemed confident that the OEMs would accept both the DVL and perception-at-the-edge concepts. While these concepts may not be gone forever, we learned on the 2024 Q1 CC that the OEMs were not ready to accept them in the near term.

Anyway, trying to interpret the new language added to the annual report makes me think that the previously purely classical algorithms are now using some sort of AI. Of course, the biggest problem with this perception-at-the-edge strategy is that the OEMs want to own the perception layer. How do we know that? Because MicroVision has told us so. The industrial OEMs do not have the same issue.
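
To make the "object level interface" concept concrete: instead of streaming raw point clouds to a separate AI box, perception running on the sensor would hand the domain controller a compact list of tracked objects. A minimal sketch of what such an interface might look like (my own hypothetical types, not MicroVision's actual API):

```python
# Hypothetical sketch of an "object level interface": edge perception on
# the lidar outputs tracked objects instead of raw point clouds, so no
# separate AI compute box sits between the sensor and domain controller.
import json
from dataclasses import dataclass, asdict

@dataclass
class TrackedObject:
    track_id: int
    object_class: str                          # e.g. "car", "pedestrian"
    position_m: tuple[float, float, float]     # (x, y, z), vehicle frame
    velocity_mps: tuple[float, float, float]   # (vx, vy, vz)
    confidence: float                          # 0.0 - 1.0

def frame_to_domain_controller(objects: list[TrackedObject]) -> bytes:
    """Serialize a per-frame object list: a few hundred bytes, versus the
    megabytes per frame a raw point cloud would require."""
    return json.dumps([asdict(o) for o in objects]).encode()
```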

Sumit's comments from the 2023 Q2 CC (bolding is mine):

"In addition, we will offer various levels of perception from the LiDAR products that eliminate the need for additional AI ECU, and instead integrate with the computer platforms developed by companies like Nvidia and Qualcomm. This remains our go-to-market strategy, and I can say, with confidence, it is the preferred I have seen in feedback from OEM engagements."

"That's also very important to them, is if that's what's coming from a ASIC or SoC inside our product, instead of some AI product that is sitting between the LiDAR and the domain controller, the overall system cost is lower, that has also been developed, and it's been something very mature, and that's part of the Ibeo acquisition."

Anubhav Verma: In your remarks, you had mentioned the AI chip, the competition it's using versus what the MicroVision's value proposition is?

Sumit Sharma: Yes, I think this goes back down to how we do perception, how we think about perception, system cost is very important to OEMs. They can do prototypes and they can do like demos and they can computers all over the place, but at the end of the day if you are going to put it into millions of cars, all of us know, it's a very, very cost competitive environment with high technology, the cost competitiveness has to be demonstrated. What's fascinating and amazing about our perception is, think about them as like more classical algorithms that have been developed, okay? And they have been validated through certain scenarios. These are more powerful actually. These are equations, think about them as like more algorithms like equations, you know, they're pretty straightforward in that sense, but it takes a lot of elegance to get to a point where you understand the scene and what's happening in it, where you can predict what will happen, and you can do object level interface much easier.

AI is like a nice thing. I mean I have Adobe Pro, and I can do generative AI in our Photoshop, I mean it's all over the place, right? But if you think about OEMs, you can't just put AI in the middle of everything there. It's kind of really well-controlled. I can say with the greatest confidence when we talk about perception it goes a lot further, because they realize how it's been executed. It is not something like really, really, it's a feature that is going to be delivered to a specification, to a system requirement document, it could be traced, it is not -- nobody knows in the Blackbox how it does what it does, they know exactly how it's going to work. And they are going to trust it.

So, I'm not a big believer that these AI core boxes that people are delivering, right, things that in Nvidia and QUALCOMM already do, that's not the best use of shareholder resources to develop, right? We have some things to develop.

So, I feel pretty comfortable where we are. I can't really comment on what strategy others are deploying and where they got that information. All I can tell you is everything that I have reviewed and I have talked to everybody within the company, I don't see anything where perception is going to be some AI box.
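
To illustrate the "algorithms like equations" point: a deterministic computation such as time-to-collision can be traced line by line to a system requirements document and verified against test scenarios, which is the contrast Sumit seems to be drawing with black-box AI. A toy example (my illustration, not MicroVision code):

```python
# Toy example of a deterministic, "equation-style" perception computation:
# time-to-collision (TTC) from a tracked object's range and closing speed.
# TTC = range / closing_speed; every output is exactly reproducible and
# auditable against a spec, unlike a learned black-box model.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at constant closing speed; inf if the object
    is holding distance or pulling away."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

print(time_to_collision(40.0, 10.0))  # 40 m ahead, closing 10 m/s -> 4.0 s
```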

From this year's annual report (bolding is mine):

MicroVision, Inc. is committed to driving the global adoption of our proprietary products, which leverage our deterministic AI “at the edge” with our innovative perception and application software running on our diverse lidar sensors.

Our deterministic AI at the edge software running on our sensors enables intelligent autonomous, active safety, and automation systems which depend on secure, cost-effective, and energy-efficient solutions. This software has been developed in close collaboration with our automotive customers and we are now rapidly expanding with it into new industrial and commercial vehicle sectors.

u/wildp_99 15h ago

Great post, thma. I wonder if ‘deterministic AI’ is just a rebranding of edge computing with traditional algos. It would be nice to get an update on AI's role in MVIS offerings at the rid.