[GH-ISSUE #185] How to verify hoarder app is working with the local ollama #141

Open
opened 2026-03-02 11:47:02 +03:00 by kerem · 19 comments

Originally created by @lihw on GitHub (May 28, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/185

I modified the docker-compose.yml a bit to make Hoarder use local Ollama inference. Here is the modified YAML file:

```yaml
version: "3.8"
services:
  web:
    image: ghcr.io/hoarder-app/hoarder-web:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    ports:
      - 3000:3000
    env_file:
      - .env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      DATA_DIR: /data
  redis:
    image: redis:7.2-alpine
    restart: unless-stopped
    volumes:
      - redis:/data
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
  meilisearch:
    image: getmeili/meilisearch:v1.6
    restart: unless-stopped
    env_file:
      - .env
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - meilisearch:/meili_data
  ollama:
    hostname: ollama-container
    image: ollama/ollama:0.1.39-rocm
    restart: unless-stopped
    volumes:
      - ollama:/data/ollama
    ports:
      - 11434:11434
    env_file:
      - .env
  workers:
    image: ghcr.io/hoarder-app/hoarder-workers:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    env_file:
      - .env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      OLLAMA_BASE_URL: http://ollama-container:11434
      INFERENCE_TEXT_MODEL: llama3
      # OPENAI_API_KEY: ...
    depends_on:
      web:
        condition: service_started

volumes:
  redis:
  meilisearch:
  ollama:
  data:
```

I entered the Hoarder worker container and checked whether `ollama-container:11434` is reachable. It is. But when I open the Hoarder web page, tagging does not seem to work. Where is the Hoarder log? How can I verify that the Hoarder app is actually connecting to the local inference service?


@kamtschatka commented on GitHub (May 31, 2024):

Check the worker logs: after navigating, downloading, and potentially taking a screenshot, inference is run. If there is an error, there are also logs about it in the worker.
Maybe you missed it; in that case you can trigger a refresh of the bookmark to re-run the download and inference.
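With Docker Compose, those worker logs go to the container's stdout and can be tailed from the host. A minimal sketch, assuming the service is named `workers` as in the compose file above:

```shell
# Tail the worker logs and watch for [inference] entries.
docker compose logs -f workers

# Or narrow the output to inference-related lines only:
docker compose logs workers | grep -i inference
```

A successful run logs a "Starting an inference job" line followed by a completion; repeated `fetch failed` errors point at the Ollama URL instead.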


@MohamedBassem commented on GitHub (Jun 1, 2024):

@lihw sorry for the late reply, somehow I missed this issue. As @kamtschatka said, please provide us with the `workers` container log so that we can help.


@lihw commented on GitHub (Jun 3, 2024):

> Check the worker logs, after navigating, downloading and potentially taking a screenshot, inference is done. If there is an error, there are also logs about it in the worker. Maybe you have missed it, then you can trigger a refresh of the bookmark to retrigger a download and inference again.

Thanks for the reply. A follow-up question: where are the logs in the worker container? Could you let me know the file path? Thanks.


@kamtschatka commented on GitHub (Jun 3, 2024):

AFAIK they are not stored in a file; they are simply logged to the stdout of the Docker container.


@DmacMcgreg commented on GitHub (Jun 19, 2024):

@MohamedBassem

```
2024-06-19T19:05:39.718Z info: [Crawler][15] Successfully navigated to "https://github.com/hoarder-app/hoarder/issues/8". Waiting for the page to load ...
2024-06-19T19:05:40.742Z info: [Crawler][15] Finished waiting for the page to load.
2024-06-19T19:05:40.845Z info: [Crawler][15] Finished capturing page content and a screenshot. FullPageScreenshot: false
2024-06-19T19:05:40.847Z info: [Crawler][15] Will attempt to extract metadata from page ...
2024-06-19T19:05:41.317Z info: [Crawler][15] Will attempt to extract readable content ...
2024-06-19T19:05:41.608Z info: [Crawler][15] Done extracting readable content.
2024-06-19T19:05:41.614Z info: [Crawler][15] Stored the screenshot as assetId: c4f28878-3e9b-4871-aca5-3087b1f341cb
2024-06-19T19:05:41.693Z info: [Crawler][15] Done extracting metadata from the page.
2024-06-19T19:05:41.693Z info: [Crawler][15] Downloading image from "https://opengraph.githubassets.com/2db47c3e515ccae7e42ad7a70dfd9142fa8cf5d34d8ad9ae05b5f316e51f174a/hoarder-app/hoarder/issues/8"
2024-06-19T19:05:42.388Z info: [Crawler][15] Downloaded the image as assetId: 3402bea0-9b2b-4a69-a710-004a586be8da
2024-06-19T19:05:42.404Z info: [Crawler][15] Completed successfully
2024-06-19T19:05:42.412Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:42.414Z info: [search][38] Attempting to index bookmark with id duclhshurqvnoqbtzuijxeve ...
2024-06-19T19:05:42.420Z error: [inference][15] inference job failed: TypeError: fetch failed
2024-06-19T19:05:42.490Z info: [search][38] Completed successfully
2024-06-19T19:05:42.975Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:42.982Z error: [inference][15] inference job failed: TypeError: fetch failed
2024-06-19T19:05:44.023Z info: [inference][15] Starting an inference job for bookmark with id "duclhshurqvnoqbtzuijxeve"
2024-06-19T19:05:44.027Z error: [inference][15] inference job failed: TypeError: fetch failed
```


@MohamedBassem commented on GitHub (Jun 19, 2024):

@DmacMcgreg this is usually an indication that your ollama URL is incorrect or unreachable from the worker container
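One way to confirm reachability is to issue an HTTP request from inside the worker container itself. A sketch, assuming the compose service is named `workers` and the URL matches your `OLLAMA_BASE_URL`; the worker image may not ship `curl`, so `wget` is tried first, and the network name `hoarder_default` in the fallback is a guess (check `docker network ls`):

```shell
# Probe Ollama's version endpoint from inside the workers container.
docker compose exec workers sh -c \
  'wget -qO- http://ollama-container:11434/api/version || echo unreachable'

# Fallback if the image has neither wget nor curl: run a throwaway
# curl container attached to the same compose network.
docker run --rm --network hoarder_default curlimages/curl \
  -s http://ollama-container:11434/api/version
```

A reachable Ollama answers with a small JSON payload containing its version; anything else means the worker cannot see it at that URL.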


@DmacMcgreg commented on GitHub (Jun 19, 2024):

![CleanShot 2024-06-19 at 15 39 14@2x](https://github.com/hoarder-app/hoarder/assets/3266023/0356d3d0-572d-456d-bfae-30c70bbdf4db)

![CleanShot 2024-06-19 at 15 39 36@2x](https://github.com/hoarder-app/hoarder/assets/3266023/cfa516eb-a53e-45f8-ae10-857819114554)

@MohamedBassem I've also confirmed both inference models are working as expected locally.


@MohamedBassem commented on GitHub (Jun 19, 2024):

From within the worker container, "127.0.0.1" refers to the localhost of the worker container, not the Docker host. Either add Ollama to the same network as the worker and refer to it by the container name, or replace 127.0.0.1 with `host.docker.internal` to point to the IP of the Docker host.
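On Linux hosts, `host.docker.internal` does not resolve inside containers out of the box; Docker's special `host-gateway` value (Docker 20.10+) can provide it via `extra_hosts`. A sketch of the relevant `workers` fragment, assuming Ollama listens on the host's port 11434:

```yaml
  workers:
    # ...existing settings from the compose file...
    environment:
      OLLAMA_BASE_URL: http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"
```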


@DmacMcgreg commented on GitHub (Jun 19, 2024):

That works, thanks!


@kamtschatka commented on GitHub (Sep 26, 2024):

this is finished, right?


@debsidian commented on GitHub (Oct 4, 2024):

How would you integrate with ollama running on bare metal, not in a docker container?

i.e. hoarder is running in docker but ollama is on bare-metal.


@kamtschatka commented on GitHub (Oct 4, 2024):

There is no difference; Ollama is accessed via an IP address anyway, so simply put the IP address (or domain, if you happen to have one) of the bare-metal Ollama host.
I am running Ollama on my PC with a 4070 Ti, and I simply put 10.0.0.<don't know exactly> in the config and that was it.
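One common pitfall with a bare-metal Ollama: by default it binds only to 127.0.0.1, so containers and other machines cannot reach it over the LAN IP. Ollama's documented `OLLAMA_HOST` variable controls the bind address; a sketch for a systemd-managed Linux install (the unit name `ollama.service` is the installer default and may differ on your system):

```shell
# Make Ollama listen on all interfaces instead of loopback only.
sudo systemctl edit ollama.service
#   In the override file, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Then verify from another machine or from a container:
curl http://<host-ip>:11434/api/version
```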


@debsidian commented on GitHub (Oct 4, 2024):

> I simply put 10.0.0.<don't know exactly> in the config and that was it

In your docker-compose, what is your specified network? Is your container just on the host network?

I'm getting this error:

```
2024-10-04T19:31:39.746Z error: [inference][8] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:12500:13
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.0/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.0/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:183:22)
    at async OllamaInferenceClient.runModel (/app/apps/workers/inference.ts:2:3086)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/inference.ts:2:3726)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:32:158)
    at async inferTags (/app/apps/workers/openaiWorker.ts:32:375)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:32:3805)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/@hoarder+queue@file+packages+queue/node_modules/@hoarder/queue/runner.ts:2:2567)
```

@kamtschatka commented on GitHub (Oct 18, 2024):

I have not changed the compose file regarding the network.


@debsidian commented on GitHub (Nov 7, 2024):

My issue has been sorted. The problem was with my ollama config and had nothing to do with the hoarder app. Sorry for the bother.


@gamegoofs2 commented on GitHub (Dec 14, 2024):

I'm having a similar issue, except it seems I am getting a connection to Ollama and it is just timing out. I have the default context length, and the timeout is set to 3 minutes.

```
2024-12-14T19:09:36.048Z info: [inference][8] Starting an inference job for bookmark with id "xx4bpal9fhhaostqkpmeyfhe"
2024-12-14T19:12:36.040Z error: [inference][8] inference job failed: Error: Timeout
Error: Timeout
    at Timeout._onTimeout (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2672)
    at listOnTimeout (node:internal/timers:594:17)
    at process.processTimers (node:internal/timers:529:7)
```

@gamegoofs2 commented on GitHub (Dec 14, 2024):

Never mind. It seems to have fixed itself after I came to it later.


@michaelblasius commented on GitHub (Dec 25, 2024):

@debsidian:

> My issue has been sorted. The problem was with my ollama config and had nothing to do with the hoarder app. Sorry for the bother.

I am getting similar errors below. You mentioned that the issue was with your ollama config. What was it that you changed?

```
TypeError: fetch failed
    at node:internal/deps/undici/undici:13484:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)
```


@michaelblasius commented on GitHub (Dec 29, 2024):

I found my solution here:
https://github.com/hoarder-app/hoarder/issues/294

I found the IPv4 IPAM gateway and entered that address in the Hoarder docker-compose file:

```
    extra_hosts:
      - host.docker.internal:172.20.0.1
```

and added the following to the `.env`:

```
OLLAMA_BASE_URL=http://172.20.0.1:11434
```
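The gateway address above is specific to the compose network, so it is safer to look it up than to assume `172.20.0.1`. A sketch; the network name `hoarder_default` is a guess (Compose derives it from the project directory name):

```shell
# List networks and spot the one created for the compose project.
docker network ls

# Print that network's IPv4 IPAM gateway.
docker network inspect hoarder_default \
  --format '{{range .IPAM.Config}}{{.Gateway}}{{end}}'
```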