CVE-2026-22778
vLLM leaks a heap address when PIL throws an error
Description
vLLM is an inference and serving engine for large language models (LLMs). From version 0.8.3 up to but not including 0.14.1, when an invalid image is sent to vLLM's multimodal endpoint, PIL throws an error. vLLM returns this error to the client, leaking a heap address. With this leak, ASLR is reduced from roughly 4 billion guesses to about 8 guesses. This vulnerability can be chained with a heap overflow in the JPEG2000 decoder in OpenCV/FFmpeg to achieve remote code execution. This vulnerability is fixed in 0.14.1.
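To illustrate the root cause: when PIL cannot decode a buffer, its `UnidentifiedImageError` message interpolates the repr of the file object it was given, and for an in-memory buffer that repr contains a heap address. The following is a minimal sketch (not vLLM's actual code) of how echoing that message back to a client discloses the address; the payload bytes are purely illustrative.

```python
import io
from PIL import Image, UnidentifiedImageError

# Illustrative payload: non-image bytes, as a multimodal endpoint would
# receive when a client uploads a malformed image.
bad_payload = io.BytesIO(b"not a real image")

try:
    Image.open(bad_payload)
except UnidentifiedImageError as exc:
    # The exception message embeds repr(bad_payload), e.g.
    # "cannot identify image file <_io.BytesIO object at 0x7f3c2a1b4d60>".
    # If a server echoes this text back to the client, the 0x... value
    # discloses a heap address and weakens ASLR.
    print(str(exc))
```

On a typical CPython build this prints a message containing an `at 0x...` address, which is the kind of value the advisory describes being returned to clients.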
INFO
Published Date: Feb. 2, 2026, 11:16 p.m.
Last Modified: Feb. 3, 2026, 4:44 p.m.
Remotely Exploitable: Yes
Source: [email protected]
CVSS Scores
| Score | Version | Severity | Vector | Exploitability Score | Impact Score | Source |
|---|---|---|---|---|---|---|
| 9.8 | CVSS 3.1 | CRITICAL | AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H | 3.9 | 5.9 | [email protected] |
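For reference, the 9.8 base score can be reproduced from the recorded vector (AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H). The sketch below walks through the CVSS v3.1 base-score arithmetic, using a simplified Roundup helper (the specification uses a fixed-point variant to avoid floating-point artifacts, which gives the same result here):

```python
import math

# CVSS v3.1 metric weights for AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.85   # Network / Low / None / None
C = I = A = 0.56                           # High impact on C, I, A (scope unchanged)

def roundup(x: float) -> float:
    """Simplified CVSS Roundup: smallest value with one decimal place >= x."""
    return math.ceil(x * 10) / 10

iss = 1 - (1 - C) * (1 - I) * (1 - A)      # impact sub-score
impact = 6.42 * iss                        # ~5.9 (scope unchanged)
exploitability = 8.22 * AV * AC * PR * UI  # ~3.9

# impact > 0 here, so the base score is the capped, rounded-up sum.
base = roundup(min(impact + exploitability, 10))
print(base)                                # 9.8 -> CRITICAL
```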
Solution
- Update vLLM to version 0.14.1 or later.
- Apply security patches promptly.
- Validate input for multimodal endpoints and avoid returning raw decoder error messages to clients (see the sketch below).
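As one possible hardening measure (a sketch, not vLLM's actual fix), an image-handling path can decode uploads inside a wrapper that maps decoder exceptions to a generic client-facing error, so object reprs containing heap addresses never leave the server. The helper name below is hypothetical:

```python
import io
from PIL import Image, UnidentifiedImageError

def load_image_safely(data: bytes) -> Image.Image:
    """Hypothetical helper (not part of vLLM): decode an uploaded image and
    raise a generic error instead of exposing the raw PIL exception text."""
    try:
        img = Image.open(io.BytesIO(data))
        img.load()  # force full decoding so errors surface here, not later
        return img
    except (UnidentifiedImageError, OSError):
        # Do not interpolate str(exc) into the response body: the original
        # message may contain repr() output with a heap address.
        raise ValueError("invalid or unsupported image payload") from None
```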
References to Advisories, Solutions, and Tools
The following external links provide in-depth information, practical solutions, and tools related to CVE-2026-22778:
- https://github.com/vllm-project/vllm/pull/31987
- https://github.com/vllm-project/vllm/pull/32319
- https://github.com/vllm-project/vllm/releases/tag/v0.14.1
- https://github.com/vllm-project/vllm/security/advisories/GHSA-4r2x-xpjr-7cvv
CWE - Common Weakness Enumeration
While CVE identifies specific instances of vulnerabilities, CWE categorizes the common flaws or weaknesses that can lead to vulnerabilities. CVE-2026-22778 is associated with the following CWE:
- CWE-532: Insertion of Sensitive Information into Log File
Common Attack Pattern Enumeration and Classification (CAPEC)
CAPEC stores attack patterns, which are descriptions of the common attributes and approaches employed by adversaries to exploit the CVE-2026-22778 weaknesses.
We scan GitHub repositories to detect new proof-of-concept exploits. The following is a collection of public exploits and proof-of-concepts that have been published on GitHub (sorted by most recently updated).
The following is a list of news articles that mention the CVE-2026-22778 vulnerability anywhere in the article.
- Daily CyberSecurity: Video of Death: Critical vLLM Flaw (CVSS 9.8) Grants Remote Code Execution
  A new critical vulnerability has been discovered in vLLM, a widely used high-performance library for Large Language Model (LLM) inference. Tracked as CVE-2026-22778, this flaw carries a devastating CV ...
- The Cyber Express: Foxit Releases Security Updates for PDF Editor Cloud XSS Vulnerabilities
  Foxit Software has released security updates addressing multiple cross-site scripting (XSS) vulnerabilities affecting Foxit PDF Editor Cloud and Foxit eSign, closing gaps that could have allowed attac ...
- The Cyber Express: Critical vLLM Flaw Exposes Millions of AI Servers to Remote Code Execution
  A newly disclosed security flaw has placed millions of AI servers at risk after researchers identified a critical vulnerability in vLLM, a widely deployed Python package for serving large language mod ...
The following table lists the changes that have been made to the
CVE-2026-22778 vulnerability over time.
Vulnerability history details can be useful for understanding the evolution of a vulnerability, and for identifying the most recent changes that may impact the vulnerability's severity, exploitability, or other characteristics.
- New CVE Received by [email protected] (Feb. 02, 2026)

| Action | Type | Old Value | New Value |
|---|---|---|---|
| Added | Description | | vLLM is an inference and serving engine for large language models (LLMs). From 0.8.3 to before 0.14.1, when an invalid image is sent to vLLM's multimodal endpoint, PIL throws an error. vLLM returns this error to the client, leaking a heap address. With this leak, we reduce ASLR from 4 billion guesses to ~8 guesses. This vulnerability can be chained a heap overflow with JPEG2000 decoder in OpenCV/FFmpeg to achieve remote code execution. This vulnerability is fixed in 0.14.1. |
| Added | CVSS V3.1 | | AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H |
| Added | CWE | | CWE-532 |
| Added | Reference | | https://github.com/vllm-project/vllm/pull/31987 |
| Added | Reference | | https://github.com/vllm-project/vllm/pull/32319 |
| Added | Reference | | https://github.com/vllm-project/vllm/releases/tag/v0.14.1 |
| Added | Reference | | https://github.com/vllm-project/vllm/security/advisories/GHSA-4r2x-xpjr-7cvv |