[PR #656] [CLOSED] Custom fetch wrapper #1676

Closed
opened 2026-03-02 11:58:40 +03:00 by kerem · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/karakeep-app/karakeep/pull/656
Author: @sbarbett
Created: 11/14/2024
Status: Closed

Base: main ← Head: custom-fetch-wrapper


📝 Commits (2)

  • c72ebfa Add configurable fetch timeout for Ollama client
  • d7c5208 Worker service needs access to the .env file

📊 Changes

4 files changed (+19 additions, -0 deletions)


📝 docker/docker-compose.dev.yml (+4 -0)
📝 packages/shared/config.ts (+2 -0)
➕ packages/shared/customFetch.ts (+11 -0)
📝 packages/shared/inference.ts (+2 -0)

📄 Description

I added a custom wrapper around fetch to control the `headersTimeout` value. The default is 5 minutes, and slower Ollama instances can take longer than that, so the fetch fails. The following environment variable lets you raise it (the default remains 5 minutes).

```yaml
services:
  web:
    environment:
      - INFERENCE_FETCH_TIMEOUT_SEC=3600  # 1 hour fetch timeout
```
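For context, here is a minimal sketch of how a wrapper could honor that variable. This is illustrative, not the PR's actual code: the helper name `fetchTimeoutSec` is made up, and the commented wiring assumes the `undici` package, whose `Agent` takes `headersTimeout` in milliseconds (default 300000, i.e. 5 minutes).

```typescript
// Hypothetical sketch of a configurable fetch timeout (not the PR's actual code).

// Pure helper: parse INFERENCE_FETCH_TIMEOUT_SEC, falling back to the
// 5-minute default when the variable is unset or not a positive number.
function fetchTimeoutSec(env: Record<string, string | undefined>): number {
  const parsed = Number(env.INFERENCE_FETCH_TIMEOUT_SEC);
  return Number.isFinite(parsed) && parsed > 0 ? parsed : 300;
}

// Wiring it into fetch would look roughly like this (requires the `undici`
// package; the Ollama client accepts a custom fetch function):
//
//   import { Agent } from "undici";
//   const dispatcher = new Agent({
//     headersTimeout: fetchTimeoutSec(process.env) * 1000,
//   });
//   const customFetch: typeof fetch = (input, init) =>
//     fetch(input, { ...(init ?? {}), dispatcher } as RequestInit);
```

The point of keeping the parsing in a small pure function is that the default stays in one place and bad values degrade gracefully to undici's own default.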

There's more info on this in [issue 628](https://github.com/hoarder-app/hoarder/issues/628). I built this locally and tested it against my ollama instance. With the default timeout:

```sh
Nov 14 02:35:33 llama ollama[3733]: [GIN] 2024/11/14 - 02:35:33 | 200 |          5m0s |        10.0.0.2 | POST     "/api/chat"
```

Hard stop at 5 min. The inference job fails.

```sh
workers-1      | 2024-11-14T02:35:33.346Z error: [inference][10] inference job failed: TypeError: fetch failed
workers-1      | TypeError: fetch failed
workers-1      |     at node:internal/deps/undici/undici:13392:13
workers-1      |     at async post (/app/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)
workers-1      |     at async Ollama.processStreamableRequest (/app/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
workers-1      |     at async OllamaInferenceClient.runModel (/app/packages/shared/inference.ts:2:3386)
workers-1      |     at async OllamaInferenceClient.inferFromText (/app/packages/shared/inference.ts:2:4136)
workers-1      |     at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3135)
workers-1      |     at async inferTags (/app/apps/workers/openaiWorker.ts:6:3370)
workers-1      |     at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6863)
workers-1      |     at async Runner.runOnce (/app/node_modules/liteque/dist/runner.js:2:2578)
```

Here is the same job after upping the fetch timeout.

```sh
Nov 14 02:52:23 llama ollama[3733]: [GIN] 2024/11/14 - 02:52:23 | 200 |         7m31s |        10.0.0.2 | POST     "/api/chat"
```

```sh
workers-1      | 2024-11-14T02:55:55.614Z info: [search][13] Completed successfully
```

One other small tweak: the `docker-compose.dev.yml` file needed the env file declared under the worker service.
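For anyone applying the same fix by hand, that compose change presumably looks something like the following (the service name `workers` and the `.env` path are assumptions; check the repo's actual `docker/docker-compose.dev.yml`):

```yaml
services:
  workers:
    env_file:
      - .env  # gives the worker service the same environment variables as web
```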


Thanks for making this useful app.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
