CVE-2025-52566

Publication date 24 June 2025

Last updated 24 June 2025


Ubuntu priority

CVSS 3 severity score

8.6 · High


llama.cpp is a C/C++ inference engine for several LLM models. Prior to version b5721, a signed vs. unsigned integer overflow in llama.cpp's tokenizer implementation (llama_vocab::tokenize, src/llama-vocab.cpp:3036) causes unintended behaviour in the size comparison that guards the token-copying step, allowing a heap overflow of the llama.cpp inference engine via carefully crafted text input during tokenization. This issue has been patched in version b5721.
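
The following is a minimal, hypothetical C++ sketch of this bug class only, not the actual llama.cpp code: a caller-supplied signed token limit is compared against an unsigned token count, so the bound check can be bypassed and the copy overflows the destination buffer. The function names copy_tokens_buggy and copy_tokens_fixed, and the buffer sizes, are illustrative assumptions.

#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

using llama_token = int32_t;

// Buggy pattern (illustrative): the signed limit is cast to size_t for the
// comparison, so a negative limit becomes a huge unsigned value, the guard
// never fires, and memcpy writes past the end of `out` (heap overflow).
static bool copy_tokens_buggy(const std::vector<llama_token> & res,
                              llama_token * out, int32_t n_tokens_max) {
    if (res.size() > (size_t) n_tokens_max) {
        return false;
    }
    std::memcpy(out, res.data(), res.size() * sizeof(llama_token));
    return true;
}

// Safer pattern: reject negative limits first, then compare in the unsigned
// domain so no value can wrap before the bound check.
static bool copy_tokens_fixed(const std::vector<llama_token> & res,
                              llama_token * out, int32_t n_tokens_max) {
    if (n_tokens_max < 0 || res.size() > (size_t) n_tokens_max) {
        return false;
    }
    std::memcpy(out, res.data(), res.size() * sizeof(llama_token));
    return true;
}

int main() {
    std::vector<llama_token> res = {1, 2, 3};
    llama_token out[4];
    std::printf("copy within limit accepted: %d\n", copy_tokens_fixed(res, out, 4));
    std::printf("negative limit rejected:    %d\n", !copy_tokens_fixed(res, out, -1));
    (void) copy_tokens_buggy; // shown only to illustrate the vulnerable shape
    return 0;
}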

Status

Package     Ubuntu Release      Status
llama.cpp   25.04 plucky        Not in release
llama.cpp   24.10 oracular      Not in release
llama.cpp   24.04 LTS noble     Not in release
llama.cpp   22.04 LTS jammy     Not in release

Severity score breakdown

Parameter               Value
Base score              8.6 · High
Attack vector           Local
Attack complexity       Low
Privileges required     None
User interaction        Required
Scope                   Changed
Confidentiality impact  High
Integrity impact        High
Availability impact     High
Vector                  CVSS:3.1/AV:L/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H