[GH-ISSUE #103] librespot crashes - main thread panic #93

Closed
opened 2026-02-27 19:28:47 +03:00 by kerem · 11 comments
Owner

Originally created by @sashahilton00 on GitHub (Jan 29, 2018).
Original GitHub issue: https://github.com/librespot-org/librespot/issues/103

Issue by pwitka2
Monday Jan 22, 2018 at 20:10 GMT
Originally opened as https://github.com/plietar/librespot/issues/273


librespot crashes with such error message:

INFO:librespot::player: Track "Sweet Pea" loaded
thread 'main' panicked at 'Box<Any>', core/src/session.rs:83:42
note: Run with RUST_BACKTRACE=1 for a backtrace.

It happened after long idle (paused) time.

I have raspbian wheezy
raspotify 0.9.1~librespot.20171005T125424Z.8971d3a


@sashahilton00 commented on GitHub (Jan 29, 2018):

Comment by kingosticks
Monday Jan 22, 2018 at 20:52 GMT


I would suggest you run RUST_BACKTRACE=1 and provide the backtrace.


@sashahilton00 commented on GitHub (Jan 29, 2018):

Comment by pwitka2
Wednesday Jan 24, 2018 at 19:16 GMT


Here is a backtrace, but the app crashes in a different place:
INFO:librespot::player: Track "Magic Wind" loaded
thread 'main' panicked at 'connection closed', /checkout/src/libcore/option.rs:819:4
stack backtrace:
0: <unknown>
1: <unknown>
2: <unknown>
3: <unknown>
4: <unknown>
5: <unknown>
6: <unknown>
7: <unknown>
8: <unknown>
9: <unknown>
10: <unknown>
11: <unknown>
12: <unknown>
13: <unknown>
14: <unknown>
15: <unknown>
16: __libc_start_main


@sashahilton00 commented on GitHub (Jan 29, 2018):

Comment by kingosticks
Wednesday Jan 24, 2018 at 19:23 GMT


Raspotify seems to have decided to strip the binary. That's not helpful.


@ComlOnline commented on GitHub (Jan 29, 2018):

Following a conversation with the dev of raspotify, he was going to release a new version based on this fork. I'd wait for that and try again.


@ComlOnline commented on GitHub (Jan 29, 2018):

Just checked he's already done it: https://github.com/dtcooper/raspotify/releases/tag/0.10
Try to cause the error again and let us know.


@sashahilton00 commented on GitHub (Jan 29, 2018):

This issue is still present; I had it this morning. It's like he said: it crops up after no music has been playing for a while (and potentially after the internet has disconnected/reconnected). I'll run with a backtrace later and see if I can get a trace.


@sashahilton00 commented on GitHub (Jan 29, 2018):

@kingosticks @ComlOnline here is a backtrace for this issue. I triggered it by playing a song through librespot, then disconnecting the internet and waiting for the buffered music to finish playing. What appears to happen is that at the end of the song the track load is called, and since the connection is down, the session is dropped, and this is not handled. Full trace below:

DEBUG:librespot::spirc: kMessageTypeNotify "LRS" dc956dabe1c2e64e61d0aa7b96d4093c70896df9 21 1517251454413
DEBUG:librespot_core::session: Session[0] strong=3 weak=3
DEBUG:librespot::player: command=Load(SpotifyId(u128 { high: 15501048872228962569, low: 12963269621848591226 }), true, 0)
DEBUG:librespot_core::session: drop Dispatch
thread 'main' panicked at 'Box<Any>', core/src/session.rs:83:42
stack backtrace:
   0: std::sys::imp::backtrace::tracing::imp::unwind_backtrace
             at src/libstd/sys/unix/backtrace/tracing/gcc_s.rs:49
   1: std::sys_common::backtrace::_print
             at src/libstd/sys_common/backtrace.rs:68
   2: std::panicking::default_hook::{{closure}}
             at src/libstd/sys_common/backtrace.rs:57
             at src/libstd/panicking.rs:381
   3: std::panicking::default_hook
             at src/libstd/panicking.rs:397
   4: std::panicking::begin_panic
             at src/libstd/panicking.rs:577
   5: std::panicking::begin_panic
             at /Users/travis/build/rust-lang/rust/src/libstd/panicking.rs:538
   6: librespot_core::session::Session::connect::{{closure}}::{{closure}}
             at core/src/session.rs:83
   7: <core::result::Result<T, E>>::map_err
             at /Users/travis/build/rust-lang/rust/src/libcore/result.rs:500
   8: <futures::future::map_err::MapErr<A, F> as futures::future::Future>::poll
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/future/map_err.rs:34
   9: <alloc::boxed::Box<F> as futures::future::Future>::poll
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/future/mod.rs:107
  10: <futures::task_impl::Spawn<F>>::poll_future_notify::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:291
  11: <futures::task_impl::Spawn<T>>::enter::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:352
  12: futures::task_impl::std::set
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/std/mod.rs:90
  13: <futures::task_impl::Spawn<T>>::enter
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:352
  14: <futures::task_impl::Spawn<F>>::poll_future_notify
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.14/src/task_impl/mod.rs:291
  15: tokio_core::reactor::Core::dispatch_task::{{closure}}
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:356
  16: <scoped_tls::ScopedKey<T>>::set
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/scoped-tls-0.1.0/src/lib.rs:135
  17: tokio_core::reactor::Core::dispatch_task
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:355
  18: tokio_core::reactor::Core::dispatch
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:316
  19: futures::task_impl::Notify::clone_id
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:304
  20: tokio_core::reactor::Core::run
             at /Users/sashahilton/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.9/src/reactor/mod.rs:241
  21: librespot::main
             at src/main.rs:377
  22: panic_unwind::dwarf::eh::read_encoded_pointer
             at src/libpanic_unwind/lib.rs:101
  23: rust_panic
             at src/libstd/panicking.rs:459
             at src/libstd/panic.rs:365
             at src/libstd/rt.rs:58
  24: librespot::main
DEBUG:librespot::spirc: drop Spirc[0]
DEBUG:librespot::player: Shutting down player thread ...

@michalfita commented on GitHub (Feb 6, 2018):

I can add that this happens when the connection to the Internet is dropped (router up, but DSL dies).

Would it at least be possible for the process:

pi@raspberrypi:~$ ps aux | grep respot
raspoti+   872  3.5  1.3  33716 12184 ?        Ssl  lut05  75:03 /usr/bin/librespot --name Malina ∏ 3 --backend alsa --bitrate 320 --disable-audio-cache
pi        2049  0.0  0.2   5992  1884 pts/0    S+   22:44   0:00 grep respot

to really die on crash, so that systemd could restart it?

If it's fixed, we need to wait for the next release of raspotify.
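Until the panic itself is fixed, a systemd drop-in override could get the service restarted automatically when the process dies. This is only a sketch; the actual unit name and file path depend on how the raspotify package installs its service:

```ini
# Hypothetical drop-in, e.g. /etc/systemd/system/raspotify.service.d/restart.conf
[Service]
# Restart the process whenever it exits with a non-zero status (a panic/abort)
Restart=on-failure
# Wait a few seconds before restarting to avoid a tight crash loop
RestartSec=5
```

After adding a drop-in like this, `systemctl daemon-reload` followed by `systemctl restart <unit>` would apply it.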


@awiouy commented on GitHub (Feb 8, 2018):

From
https://docs.rs/tokio-core/0.1.12/tokio_core/reactor/struct.Handle.html#method.spawn and
https://docs.rs/tokio-core/0.1.12/tokio_core/reactor/struct.Remote.html#method.spawn:
This method will not catch panics from polling the future f. If the future panics then it's the responsibility of the caller to catch that panic and handle it as appropriate.

Is that done in session.rs? If so, where? If not, where should it be done and what should it do?
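For reference, the standard-library mechanism a caller could use for this is `std::panic::catch_unwind`, which converts a panic into an `Err` the caller can handle. The sketch below is illustrative only: `poll_task` is a made-up stand-in for the error path at core/src/session.rs:83, not librespot's actual API.

```rust
use std::panic;

// Stand-in for polling a future whose error handler panics, as the
// backtrace shows happening in Session::connect (hypothetical name).
fn poll_task() -> Result<(), ()> {
    let io_result: Result<(), &str> = Err("connection closed");
    // This mirrors the pattern in the trace: map_err panics on a
    // connection error instead of propagating it.
    io_result.map_err(|e| panic!("session error: {}", e))
}

fn main() {
    // catch_unwind turns the panic into an Err, so the caller can tear
    // the session down cleanly instead of aborting the main thread.
    match panic::catch_unwind(|| poll_task()) {
        Ok(_) => println!("task completed"),
        Err(_) => println!("task panicked; session should be torn down"),
    }
}
```

Whether this belongs at the spawn site or inside the session future is exactly the open question here; `catch_unwind` only demonstrates the mechanism the tokio-core docs say the caller is responsible for.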


@sashahilton00 commented on GitHub (Feb 9, 2018):

I'm not sure where this is implemented exactly, as I haven't looked at the code that handles this, but it's my understanding that the session logic is due for a rewrite to enable handling of things like reconnection, as it has been mentioned several times in the past, most recently by @plietar in Gitter on Feb 8:

Is there a tracking issue for reconnecting automatically when the connection is broken? I had a few ideas for implementation from a while ago I'd like to write down.

Hence, I've created #134 to track suggestions/improvements that can be made to the existing session logic, or in a rewrite of it.
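A reconnection-capable session would replace the panic with a retry loop. The following is a minimal sketch under stated assumptions: `connect` is a hypothetical stand-in for `Session::connect` (here it fails twice before succeeding, purely to exercise the retry path), and the backoff policy is arbitrary.

```rust
use std::{thread, time::Duration};

// Hypothetical connect function; fails on the first two attempts to
// simulate a dropped connection coming back.
fn connect(attempt: u32) -> Result<(), String> {
    if attempt < 3 {
        Err(format!("connection refused (attempt {})", attempt))
    } else {
        Ok(())
    }
}

// Retry with a capped number of attempts and a growing delay,
// instead of panicking on the first dropped connection.
fn connect_with_retry(max_attempts: u32) -> Result<(), String> {
    let mut last_err = String::from("no attempts made");
    for attempt in 1..=max_attempts {
        match connect(attempt) {
            Ok(()) => return Ok(()),
            Err(e) => {
                last_err = e;
                // Linear backoff; a real implementation might use
                // exponential backoff with jitter.
                thread::sleep(Duration::from_millis(10 * attempt as u64));
            }
        }
    }
    Err(last_err)
}

fn main() {
    match connect_with_retry(5) {
        Ok(()) => println!("reconnected"),
        Err(e) => println!("gave up: {}", e),
    }
}
```

The real design questions (resuming playback state, re-authenticating, surfacing errors to Spirc) are what #134 is for; this only shows the control flow that would replace the panic.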


@plietar commented on GitHub (Feb 14, 2018):

Since this is just a re-connection issue, I'm going to close this in favour of #134.
