Mirror of https://github.com/ArchiveBox/ArchiveBox.git, synced 2026-04-25 09:06:02 +03:00
[GH-ISSUE #1319] Feature Request: Automatically rewrite URLs to use alternative frontends for difficult-to-archive sites (e.g. using benbusby/farside) #2319
Originally created by @pirate on GitHub (Jan 12, 2024).
Original GitHub issue: https://github.com/ArchiveBox/ArchiveBox/issues/1319
What is the problem that your feature request solves?
Sites like Facebook, Instagram, Twitter, Tiktok, etc. are difficult to archive and frequently block bot traffic or require logged-in sessions to simply view content.
Describe the ideal specific solution you'd want, and whether it fits into any broader scope of changes
Many alternative frontends exist that display social media content with less clutter and in a more easily archivable form, e.g.:
twitter.com/ArchiveBoxApp → nitter.net/ArchiveBoxApp
ArchiveBox should be configurable to rewrite URLs for sites the user chooses, so that they point at alternative frontends.
Ideally it should be a general solution to URL rewriting and cleanup that can take over from URL_ALLOWLIST/DENYLIST and also handle merging duplicate URLs.
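As a rough illustration of the idea (none of this is an existing ArchiveBox option), the rewrite step could be a set of per-site rules that map hard-to-archive hosts to alternative frontends, optionally routed through benbusby/farside so a live instance is picked automatically. The `farside.link/<service>/` URL shapes and service names below are illustrative assumptions, not confirmed config:

```shell
# Hypothetical sketch: rewrite rules mapping social-media hosts to
# alternative frontends via farside. Service paths are assumptions.
rewrite_frontends() {
  sed -E \
    -e 's#^https?://(www\.|mobile\.)?twitter\.com/#https://farside.link/nitter/#' \
    -e 's#^https?://(www\.)?tiktok\.com/#https://farside.link/proxitok/#'
}

# Demo: print the rewritten URLs (normally these would feed archivebox).
rewrite_frontends <<'EOF'
https://twitter.com/ArchiveBoxApp
EOF
```

In a real implementation these rules would live in ArchiveBox config alongside (or replacing) URL_ALLOWLIST/DENYLIST, rather than in an ad-hoc shell function.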
What hacks or alternative solutions have you tried to solve the problem?
Manually replacing URL fragments before piping them into archivebox:
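The original snippet was not preserved in this mirror; a plausible reconstruction of the workaround is a one-line sed rewrite applied to the URL list before handing it to `archivebox add` (nitter.net here is just an example instance):

```shell
# Illustrative reconstruction of the manual workaround: rewrite
# twitter.com URLs to a nitter instance before archiving them.
rewrite_to_nitter() {
  sed -E 's#^https?://(www\.)?twitter\.com/#https://nitter.net/#'
}

# Demo: print the rewritten URL.
rewrite_to_nitter <<'EOF'
https://www.twitter.com/ArchiveBoxApp
EOF
```

In practice this would run as something like `cat urls.txt | rewrite_to_nitter | archivebox add`.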
How badly do you want this new feature?