[GH-ISSUE #1066] Bug: archivebox update and archivebox list are slow to start due to unnecessary calls to merge_links() when reading from disk #2178

Open
opened 2026-03-01 17:57:04 +03:00 by kerem · 1 comment

Originally created by @ntevenhere on GitHub (Dec 22, 2022).
Original GitHub issue: https://github.com/ArchiveBox/ArchiveBox/issues/1066

Describe the bug

On a large archive, archivebox update or archivebox list immediately starts a CPU-intensive process that takes a long time to complete.

Steps to reproduce

  1. Have a large archive (I have 1800+ links)
  2. archivebox update or list

Screenshots or log output

docker-compose run archivebox update
[i] [2022-12-21 23:02:15] ArchiveBox v0.6.2: archivebox update
    > /data

[▶] [2022-12-21 23:05:07] Starting archiving of 1847 snapshots in index...

The user is simply left waiting with no explanation, so they have to take it on faith that something is happening. The fans are screeching; it could be an infinite loop. If they wait it out, it takes about 3 minutes. If they kill the program instead (and who could blame them?), they find the infinite-loop suspicion was wrong:

Traceback (most recent call last):
  File "/usr/local/bin/archivebox", line 33, in <module>
    sys.exit(load_entry_point('archivebox', 'console_scripts', 'archivebox')())
  File "/app/archivebox/cli/__init__.py", line 140, in main
    run_subcommand(
  File "/app/archivebox/cli/__init__.py", line 80, in run_subcommand
    module.main(args=subcommand_args, stdin=stdin, pwd=pwd)    # type: ignore
  File "/app/archivebox/cli/archivebox_update.py", line 119, in main
    update(
  File "/app/archivebox/util.py", line 114, in typechecked_function
    return func(*args, **kwargs)
  File "/app/archivebox/main.py", line 788, in update
    matching_folders = list_folders(
  File "/app/archivebox/util.py", line 114, in typechecked_function
    return func(*args, **kwargs)
  File "/app/archivebox/main.py", line 929, in list_folders
    return STATUS_FUNCTIONS[status](links, out_dir=out_dir)
  File "/app/archivebox/index/__init__.py", line 411, in get_indexed_folders
    links = [snapshot.as_link_with_details() for snapshot in snapshots.iterator()]
  File "/app/archivebox/index/__init__.py", line 411, in <listcomp>
    links = [snapshot.as_link_with_details() for snapshot in snapshots.iterator()]
  File "/app/archivebox/core/models.py", line 127, in as_link_with_details
    return load_link_details(self.as_link())
  File "/app/archivebox/util.py", line 114, in typechecked_function
    return func(*args, **kwargs)
  File "/app/archivebox/index/__init__.py", line 348, in load_link_details
    existing_link = parse_json_link_details(out_dir)
  File "/app/archivebox/util.py", line 114, in typechecked_function
    return func(*args, **kwargs)
  File "/app/archivebox/index/json.py", line 110, in parse_json_link_details
    return Link.from_json(link_json, guess)
  File "/app/archivebox/index/schema.py", line 246, in from_json
    cast_result = ArchiveResult.from_json(json_result, guess)
  File "/app/archivebox/index/schema.py", line 97, in from_json
    info['end_ts'] = parse_date(info['end_ts'])
  File "/app/archivebox/util.py", line 114, in typechecked_function
    return func(*args, **kwargs)
  File "/app/archivebox/util.py", line 157, in parse_date
    return dateparser(date, settings={'TIMEZONE': 'UTC'}).replace(tzinfo=timezone.utc)
  File "/usr/local/lib/python3.9/site-packages/dateparser/conf.py", line 89, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/dateparser/__init__.py", line 54, in parse
    data = parser.get_date_data(date_string, date_formats)
  File "/usr/local/lib/python3.9/site-packages/dateparser/date.py", line 421, in get_date_data
    parsed_date = _DateLocaleParser.parse(
  File "/usr/local/lib/python3.9/site-packages/dateparser/date.py", line 178, in parse
    return instance._parse()
  File "/usr/local/lib/python3.9/site-packages/dateparser/date.py", line 182, in _parse
    date_data = self._parsers[parser_name]()
  File "/usr/local/lib/python3.9/site-packages/dateparser/date.py", line 196, in _try_freshness_parser
    return freshness_date_parser.get_date_data(self._get_translated_date(), self._settings)
  File "/usr/local/lib/python3.9/site-packages/dateparser/freshness_date_parser.py", line 159, in get_date_data
    date, period = self.parse(date_string, settings)
  File "/usr/local/lib/python3.9/site-packages/dateparser/freshness_date_parser.py", line 88, in parse
    self.now = apply_timezone(_now, settings.TIMEZONE)
  File "/usr/local/lib/python3.9/site-packages/dateparser/utils/__init__.py", line 115, in apply_timezone
    new_datetime = apply_dateparser_timezone(date_time, tz_string)
  File "/usr/local/lib/python3.9/site-packages/dateparser/utils/__init__.py", line 103, in apply_dateparser_timezone
    if info['regex'].search(' %s' % offset_or_timezone_abb):
KeyboardInterrupt

Long story short, I set up the dev environment and snooped around. Before telling the user anything, ArchiveBox iterates over every matching link, all 1847 of them. But the iteration itself isn't what's slow: it's one particular function, merge_links(), which, run 1847 times, adds up to a lot of waiting.
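Incidentally, the traceback above shows the interrupt landing deep inside dateparser, which is a very general (and therefore slow) library to be calling once per ArchiveResult. A possible mitigation, sketched below under the assumption that ArchiveBox's own JSON indexes store ISO-8601 timestamps, would be to try the strict stdlib parser first and only fall back to dateparser for free-form strings. The function name parse_date_fast is hypothetical; this is not the project's actual fix.

```python
from datetime import datetime, timezone


def parse_date_fast(date: str) -> datetime:
    """Parse a timestamp, trying the cheap strict parser before dateparser."""
    try:
        # Fast path: strict ISO-8601 strings, as written by ArchiveBox itself
        return datetime.fromisoformat(date).replace(tzinfo=timezone.utc)
    except ValueError:
        # Slow path: free-form strings from imported or legacy indexes
        import dateparser
        return dateparser.parse(date, settings={'TIMEZONE': 'UTC'}).replace(tzinfo=timezone.utc)
```

For the common case (an index ArchiveBox wrote itself), this skips dateparser entirely.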

merge_links() is called by load_link_details(), seemingly to combine the on-disk information about the link currently being processed, and to prettify it. So far so good, but why do this in bulk? archivebox list, for example, iterates over each link in order to print it, so why not merge as you go? Do we really need a complete list of merged links before doing anything? Perhaps I'm not seeing the whole picture, though...
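The "merge as you go" idea could look something like the sketch below: a generator that yields each merged link lazily instead of materializing the whole list up front (as the list comprehension in get_indexed_folders in the traceback does). The method name as_link_with_details mirrors the traceback; the generator itself is hypothetical, not ArchiveBox's actual API.

```python
from typing import Any, Iterator


def iter_links_with_details(snapshots) -> Iterator[Any]:
    """Lazily merge on-disk details into each snapshot's link.

    Each merge happens just-in-time, so a consumer like `archivebox list`
    can start printing immediately instead of waiting for all 1847 merges.
    """
    for snapshot in snapshots.iterator():
        yield snapshot.as_link_with_details()
```

A caller that only prints links would then show output (and visible progress) from the first link onward.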


@pirate commented on GitHub (Jan 19, 2024):

The merge is done before the list in order to dedupe them, as sometimes there are duplicate snapshots (between sqlite db and disk folder, or disk folder with another disk folder) from an older install or other archive getting merged in. https://github.com/ArchiveBox/ArchiveBox/wiki/Upgrading-or-Merging-Archives#merge-two-or-more-existing-archives

I'll likely improve this in the future, but it might require splitting out the import/dedupe step into an explicit user-run command in order to unblock the performance changes.
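If the dedupe were split out as described, the remaining per-command dedupe could plausibly also be done streaming rather than up front. A minimal sketch, assuming snapshots can be keyed by something stable like their timestamp (which ArchiveBox uses to name snapshot folders): keep a set of seen keys and skip repeats as they stream past. The helper dedupe_stream is hypothetical.

```python
from typing import Callable, Hashable, Iterable, Iterator, TypeVar

T = TypeVar('T')


def dedupe_stream(items: Iterable[T], key: Callable[[T], Hashable]) -> Iterator[T]:
    """Yield items lazily, skipping any whose key was already seen.

    Memory stays proportional to the number of distinct keys,
    not to the size of the fully-merged link objects.
    """
    seen: set = set()
    for item in items:
        k = key(item)
        if k not in seen:
            seen.add(k)
            yield item
```

This preserves the dedupe guarantee the comment describes while letting the first results flow through immediately.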
