Tracked as CVE-2026-25874, the flaw carries a CVSS score of 9.8, enabling unauthenticated attackers to execute arbitrary system commands on vulnerable deployments.
With more than 21,500 GitHub stars, LeRobot’s popularity significantly amplifies the potential impact, particularly in production environments leveraging distributed GPU-based inference.
The vulnerability originates in LeRobot’s asynchronous inference architecture, where policy computation is offloaded to a GPU-backed server via a gRPC-based PolicyServer.
The issue stems from the server’s reliance on Python’s unsafe pickle.loads() function to deserialize incoming data across multiple RPC endpoints.
Compounding the risk, the gRPC service is configured using add_insecure_port(), meaning communications lack Transport Layer Security (TLS) and authentication controls.
This combination allows any attacker with network access to send crafted payloads directly to the service.
Because pickle inherently allows execution of arbitrary code during deserialization, this design flaw creates a direct path to full system compromise.
Security researcher chocapikk identified that vulnerable RPC handlers, including SendPolicyInstructions and SendObservations, process raw byte streams from protobuf messages and deserialize them using pickle before enforcing type validation.
This sequence is critical: malicious payloads execute during deserialization, before validation checks like isinstance() are applied. As a result, even malformed or unexpected objects can trigger code execution.
For example, an attacker can craft a malicious Python object embedded in a serialized payload that executes system-level commands upon deserialization.
Since validation occurs too late, the payload runs regardless of whether the object is ultimately rejected.
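As a minimal sketch of this failure mode (class and function names are hypothetical, not LeRobot's actual handler code), an object's `__reduce__` hook runs during `pickle.loads()`, before any `isinstance()` check can reject the result:

```python
import pickle

executed = []  # records whether attacker-controlled code ran

def side_effect():
    # Stand-in for an arbitrary system command; pickle calls this
    # function while reconstructing the object.
    executed.append("ran during pickle.loads")
    return "not-a-valid-observation"

class MaliciousPayload:
    def __reduce__(self):
        # Tells pickle: "to rebuild me, call side_effect()"
        return (side_effect, ())

raw = pickle.dumps(MaliciousPayload())  # attacker-crafted byte stream

obj = pickle.loads(raw)        # side_effect() fires here
valid = isinstance(obj, dict)  # type check happens too late

print(executed)  # ['ran during pickle.loads']
print(valid)     # False
```

The side effect executes whether or not the server ultimately rejects `obj`, which is exactly why validation after deserialization offers no protection.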
Notably, the affected code sections contained #nosec comments suppressing security linter warnings, suggesting developers were aware of the risks associated with unsafe deserialization but bypassed safeguards.
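A `# nosec` annotation tells linters such as Bandit to skip a flagged line. A hypothetical handler shaped like the reported pattern would look roughly like this:

```python
import pickle

def handle_raw_message(raw: bytes):
    # Bandit flags pickle.loads (check B301) as unsafe deserialization;
    # the "# nosec" comment silences the warning but does nothing to
    # remove the underlying risk.
    return pickle.loads(raw)  # nosec

# The handler works fine for benign data, which is how the pattern
# can survive review once the warning is suppressed.
print(handle_raw_message(pickle.dumps({"joint": 0.5})))
```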
By default, LeRobot binds its gRPC server to localhost, limiting exposure in isolated environments. However, real-world deployments commonly bind services to 0.0.0.0 to enable communication with external GPU servers.
In such configurations, the attack surface expands significantly. Threat actors can scan networks for exposed instances and deliver malicious payloads without authentication or advanced targeting, making the vulnerability highly exploitable at scale.
Organizations using LeRobot should take immediate steps to mitigate CVE-2026-25874:

- Replace pickle with secure alternatives such as JSON, native protobuf fields, or Hugging Face’s safetensors.
- Change add_insecure_port() to add_secure_port() with TLS.

This vulnerability highlights a recurring issue in the machine learning ecosystem: prioritizing rapid prototyping over secure coding practices.
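For simple observation payloads, the pickle-replacement step above can be sketched as follows (the function name and payload shape are illustrative, not LeRobot's API). Unlike pickle, `json.loads` can only produce plain data types, so no code runs during parsing and the type check becomes meaningful:

```python
import json

def safe_deserialize(raw: bytes) -> dict:
    """Parse a payload without pickle: JSON parsing cannot trigger
    code execution, so validation happens before anything else does."""
    obj = json.loads(raw.decode("utf-8"))
    if not isinstance(obj, dict):
        raise ValueError(f"unexpected payload type: {type(obj).__name__}")
    return obj

print(safe_deserialize(b'{"joint_positions": [0.1, 0.2]}'))
# Non-dict or non-JSON input raises an error instead of executing anything.
```

On the transport side, gRPC's `add_secure_port()` takes server credentials (e.g. from `grpc.ssl_server_credentials()`), adding TLS to the channel that `add_insecure_port()` leaves unencrypted and unauthenticated.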
Despite Hugging Face’s development of safetensors to address serialization risks, the presence of a pickle-based RCE flaw in LeRobot underscores inconsistent security implementation.
As ML frameworks continue to integrate into production and robotics systems, secure design principles must become foundational rather than optional, particularly in distributed architectures handling untrusted network input.
The post Hugging Face LeRobot Vulnerability Enables Unauthenticated Remote Code Execution Attacks appeared first on Cyber Security News.