[GH-ISSUE #195] Non-link-following archiving #1645
Originally created by @diego898 on GitHub (Mar 26, 2019).
Original GitHub issue: https://github.com/ArchiveBox/ArchiveBox/issues/195
(please fill out the following information, feel free to delete sections if they're not applicable)
Describe the bug
While archiving this link:
https://gradientscience.org/batchnorm/
I realized it suddenly started trying to archive YouTube links, Reddit links, GitHub themes, etc. Previous blog posts I've archived were not archived this way. How can I make it only archive the current page/URL I've passed?
Steps to reproduce
./archive https://gradientscience.org/batchnorm/

Software versions
@pirate commented on GitHub (Mar 27, 2019):
https://github.com/pirate/ArchiveBox/wiki/Usage#import-a-single-url-or-list-of-urls-via-stdin
https://github.com/pirate/ArchiveBox/wiki/Usage#import-list-of-links-exported-from-browser-or-another-service
Links piped in via stdin are added to the archive directly; URLs passed as arguments are treated like a feed of URLs to import, not as an individual page.
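For example, a minimal sketch of the two invocation styles, assuming the legacy ./archive entrypoint used above (exact behavior may vary between ArchiveBox versions):

# archives only the single page, since the URL arrives via stdin
echo 'https://gradientscience.org/batchnorm/' | ./archive

# fetches the page and imports every link found on it, since the URL is passed as an argument
./archive https://gradientscience.org/batchnorm/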