[GH-ISSUE #455] feature: import from Omnivore #294

Closed
opened 2026-03-02 11:48:31 +03:00 by kerem · 6 comments
Owner

Originally created by @mrinc on GitHub (Oct 3, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/455

Omnivore doesn't have an export; it's a real PITA. So either docs on how to simply export from Omnivore, or an import feature, would be awesome....

I'll see if I can do something and push a PR; otherwise, if you have any ideas on it, that would be great.

Thanks!

kerem closed this issue 2026-03-02 11:48:32 +03:00

@kamtschatka commented on GitHub (Oct 3, 2024):

Shouldn't you rather create an issue with Omnivore to get an export feature?


@mrinc commented on GitHub (Oct 3, 2024):

There are a bunch of discussions already on this.

But this is two-fold: if someone comes along trying to work out how to do it and searches the issues, this issue will show up.

So regardless of the outcome, it can at least help someone in the future :)


@kamtschatka commented on GitHub (Oct 3, 2024):

OK, but then just link the discussion here and we'll close this issue. The Hoarder maintainers are not going to add a bookmark export to Omnivore ;-)


@mrinc commented on GitHub (Oct 3, 2024):

That is unacceptable! I expect them to do it! (jk) :P

Annnddd .... here we go :)

```bash
#!/bin/bash
# Export all Omnivore bookmarks to a Netscape-format bookmarks file.
# Requires: bash, curl, jq.

if [ "$#" -ne 1 ]; then
    echo "Usage: $0 <API_KEY>"
    exit 1
fi

API_KEY="$1"
ENDPOINT="https://api-prod.omnivore.app/api/graphql"
OUTPUT_FILE="omnivore_bookmarks.html"

# Initialize the bookmarks file
echo '<!DOCTYPE NETSCAPE-Bookmark-file-1>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=UTF-8">
<TITLE>Bookmarks</TITLE>
<H1>Bookmarks</H1>
<DL><p>' > "$OUTPUT_FILE"

# Fetch one page of up to 100 results, optionally continuing after a cursor.
fetch_page() {
    local after="$1"
    local query
    if [ -z "$after" ]; then
        query='{"query": "query { search(first: 100, query: \"\", includeContent: false) { ... on SearchSuccess { edges { node { id url title } } pageInfo { hasNextPage endCursor } } } }"}'
    else
        query="{\"query\": \"query { search(first: 100, after: \\\"$after\\\", query: \\\"\\\", includeContent: false) { ... on SearchSuccess { edges { node { id url title } } pageInfo { hasNextPage endCursor } } } }\"}"
    fi

    curl --silent --location "$ENDPOINT" \
         --header "Authorization: $API_KEY" \
         --header 'Content-Type: application/json' \
         --header 'Accept: application/json' \
         --data "$query"
}

# Append one <DT><A> line per bookmark; @html escapes quotes and angle
# brackets in titles so they can't break the generated HTML.
process_results() {
    local json="$1"
    echo "$json" | jq -r '.data.search.edges[] | "<DT><A HREF=\"\(.node.url)\">\(.node.title | @html)</A>"' >> "$OUTPUT_FILE"
}

after=""
has_next_page=true

while [ "$has_next_page" = "true" ]; do
    response=$(fetch_page "$after")
    process_results "$response"

    has_next_page=$(echo "$response" | jq -r '.data.search.pageInfo.hasNextPage')
    after=$(echo "$response" | jq -r '.data.search.pageInfo.endCursor')

    echo "Processed a page. More pages: $has_next_page"

    # Optional: add a small delay to avoid hitting rate limits
    sleep 1
done

# Close the bookmarks file
echo '</DL><p>' >> "$OUTPUT_FILE"

echo "Bookmarks have been saved to $OUTPUT_FILE"
```
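For reference, the output is the classic Netscape bookmark file format that browser importers expect; with two saved links the generated file would look roughly like this (URLs and titles invented for illustration):

```html
<!DOCTYPE NETSCAPE-Bookmark-file-1>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=UTF-8">
<TITLE>Bookmarks</TITLE>
<H1>Bookmarks</H1>
<DL><p>
<DT><A HREF="https://example.com/a">Article A</A>
<DT><A HREF="https://example.com/b">Article B</A>
</DL><p>
```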

Create the script (bash) and run it, passing in your API key as the argument.

It'll generate a bookmarks file that you can import into Hoarder.
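If you want to sanity-check the jq step without hitting the API, here's a minimal sketch against a hand-made sample response (the JSON below is invented, but shaped like the `search` result the script parses):

```shell
# Hand-made sample shaped like the GraphQL search response (data is invented)
response='{"data":{"search":{"edges":[{"node":{"id":"1","url":"https://example.com/a","title":"Article A"}}],"pageInfo":{"hasNextPage":false,"endCursor":"abc"}}}}'

# The jq filter the script uses to emit one bookmark line per edge
echo "$response" | jq -r '.data.search.edges[] | "<DT><A HREF=\"\(.node.url)\">\(.node.title)</A>"'
# → <DT><A HREF="https://example.com/a">Article A</A>

# The pagination fields that drive the while loop
echo "$response" | jq -r '.data.search.pageInfo.hasNextPage, .data.search.pageInfo.endCursor'
# → false
# → abc
```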


@mrinc commented on GitHub (Oct 3, 2024):

I didn't expect them to do it - hence it was more a question of ideas.

But anyhow, there is now a bash script that will export Omnivore links in a bookmark format that can be easily imported into Hoarder :)

So if anyone has to migrate later on, they can just run this ....

Only bash, curl, and jq are required - so a non-Python dev can also use it.

Going to close this now, since a record exists for anyone who gets stuck the same way in the future.


@sylvesterroos commented on GitHub (Oct 30, 2024):

Omnivore is shutting down, and they have added an export feature.
