  1. #1
    Junior Member
    Join Date
    Dec 2016
    Posts
    5

    Forcing playback state and offset from a protocol handler plugin

    I have (foolishly) decided that the right way to listen to podcasts through my Squeezebox is to hack together a ChromeCast Audio controller plugin. Presumably this would also let me cast other sources were I so inclined.

    The ChromeCast device is connected to the (Mac) server's audio input, and a modified version of the WavInput plugin is feeding data from a build of sox reading from CoreAudio, piping that data in as a Slim::Player::Pipeline. So far so good, but the apparent duration of the track is 0, and the playback offset just keeps counting up.

    Next I got brave and wired up a Python child process and some toy IPC using the pychromecast library. Now when the pipeline starts, a "play" is sent to the ChromeCast, and when it stops, a "pause" is sent. canDoAction for "pause" returns false, because my reading is that "pause" means "the server stops reading from the pipe", not "can the remote endpoint pause and send the server a stream of zeros". Corrections welcome.
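
    In case it helps, here's roughly the shape of that hook - a minimal sketch of my canDoAction; _sendToCast() is my own (hypothetical) IPC shim to the Python child, not an LMS API:

        # Standard protocol handler hook: LMS asks whether it may perform
        # $action ('pause', 'rew', 'stop', ...) on the given URL.
        sub canDoAction {
            my ($class, $client, $url, $action) = @_;

            if ($action eq 'pause') {
                # Tell the ChromeCast to pause via my IPC shim, but tell
                # LMS it can't pause the pipe itself.
                _sendToCast('pause');
                return 0;
            }

            return 1;
        }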

    A little more plumbing and getMetadataFor is returning track/album info fetched from the Python child, but duration is ignored _unless_ I set it on the song in getNextTrack. However the duration alone isn't helpful: it's the real duration, but the current offset is wrong.
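
    For concreteness, this is roughly what that plumbing looks like (fetchCastState() is my hypothetical IPC call that asks the Python child for the current cast state):

        # Metadata for the now-playing display, pulled from the Python child.
        sub getMetadataFor {
            my ($class, $client, $url) = @_;

            my $state = fetchCastState();    # hypothetical IPC call

            return {
                title    => $state->{title},
                album    => $state->{album},
                artist   => $state->{artist},
                duration => $state->{duration},    # appears to be ignored here
            };
        }

        # Duration only "sticks" if I also set it on the song object here.
        sub getNextTrack {
            my ($class, $song, $successCb, $errorCb) = @_;

            my $state = fetchCastState();
            $song->duration($state->{duration}) if $state->{duration};

            $successCb->();
        }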

    There are a lot of questions elided in the above (is this an isRepeatingStream? etc.), but the basics are there. However, before I take this any further I realize I need to answer a core question: Is there a way to push external playback state into the server? Or must the server always control the state of the stream?

    Some examples:

    - When playback starts I may already be midway through a track; can I force the time (but not byte) offset in the UI?

    - At every getMetadataFor there's a chance the play/pause state or the time offset will have been changed externally via the casting device (or even another cast controller app). I have all this state available, but it's not clear to me that I can force the server into the expected state. Can I tell the server the offset changed or the playback state changed without actually trying to tell the server to play or pause? In this scenario LMS is a passive listener; it's not really in control.

    - I can actually get notifications of state changes on the device; can I push state updates then? Or must I wait for the next getMetadataFor()?

    - A canDoAction() for "rew" seems to want to be able to set the seek offset to zero itself, but the ChromeCast already has its own meaning for skip back that varies by the casting app. Is there a way to turn the rew button click into just the ChromeCast command and let it handle recomputing the seek offset?

    - By the same token a skip forward (canDoAction() "stop"?) appears to want to trigger the getNextTrack() plumbing, but again all I really need is to send the skip forward command and let the ChromeCast handle it.


    Any insight appreciated. I have the sense this isn't likely to work, or at least I can't find anything similar in the plugins I've looked at, but I wanted to check before I gave up.

    Axel

  2. #2
    Senior Member
    Join Date
    Oct 2005
    Location
    Ireland
    Posts
    15,798
    UI controls for jump-to will depend on the device.

    The WebUI ffw/rew will do next track / prev track. On the WebUI, click on the timeline to jump to a specific time. If you overload ffw/rew with your own control you will have problems with interactions with other LMS functions.

    On older players such as the SB3 and the Touch, rew+hold will scan in time (or the knob on the Boom).

    LMS will not do any specific time-related function (i.e. offset) with the ff/rew above unless the playing track has a defined duration.

    If the playing track does not have a duration set (and it can be set after the track starts playing) - there is no point in trying to use the LMS ff/rew and timebar.

    BTW your solution sounds too complicated, but first versions are always complicated as you explore the limits.

    There was a plugin a while ago which did VLC control - it may be worth looking at for ideas but TBH I never used it. http://forums.slimdevices.com/showth...r-Plugin-2-21b

    edit:

    BBC iPlayer has the ability to start anywhere within the last 3 hours of a live stream - since a live stream has no duration the timebar couldn't be used, so I implemented jump back/forward x minutes as a "More" menu.
    Last edited by bpa; 2018-01-31 at 03:19.

  3. #3
    Junior Member
    Join Date
    Dec 2016
    Posts
    5
    UI controls for jump-to will depend on the device. <snip>
    OK, that makes sense in that different device types have different implementations, but all seem to assume that if a duration is available, then they can control the absolute seek position, albeit in different ways.

    Based on that it sounds like there's no way to tell LMS the "real" position in the track? Can I just force a position that all player types will understand and then just not allow seeking?

    If the playing track does not have a duration set (and it can be set after the track starts playing) - there is no point in trying to use the LMS ff/rew and timebar.
    It's not obvious to me how to set the duration later. Should getMetadataFor() work? I set a "duration" on the returned hash, but it seemed to have no effect.

    BTW your solution sounds too complicated, but first versions are always complicated as you explore the limits.
    I'd be interested if you'd elaborate; given the sparse docs and general lack of code comments, I'd love to be pointed in an easier direction.

    There was a plugin a while ago which did VLC control - it may be worth looking at for ideas but TBH I never used it. http://forums.slimdevices.com/showth...r-Plugin-2-21b
    It seems like the plugin archive is gone? Or did I miss something?

    BBC iPlayer has the ability to start anywhere within the last 3 hours of a live stream - since a live stream has no duration the timebar couldn't be used, so I implemented jump back/forward x minutes as a "More" menu.
    I'll look at that plugin next, thanks.

    In this case it's not really a live stream; I have accurate duration and offset info. Even if I can't allow skip/seek operations it would be nice to reflect the current state in the display. Is there a way to push just that info if I'm willing to give up the skip operations?

    Thanks for the help,

    Axel

  4. #4
    Senior Member
    Join Date
    Oct 2005
    Location
    Ireland
    Posts
    15,798
    Quote Originally Posted by AxxelH View Post
    Based on that it sounds like there's no way to tell LMS the "real" position in the track? Can I just force a position that all player types will understand and then just not allow seeking?
    You still don't understand.

    When a track is played, an offset is usually passed. So when a track is started the offset will be zero. If a click on the timebar is halfway along, then the offset will be half the duration. If you have a custom menu or other event, you can provide the offset.

    The playing track's protocol handler will interpret the offset according to the protocol of the source material (e.g. HTTP uses byte-range headers).
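
    As a rough illustration only (the real code in Slim/Player/Protocols/HTTP.pm also deals with headers, VBR etc.), an HTTP-style handler maps a seek time to a byte offset along these lines:

        # Simplified sketch: turn a requested time offset into a byte offset
        # so the server can issue a ranged request / seek in the source.
        sub getSeekData {
            my ($class, $client, $song, $newtime) = @_;

            my $bitrate = $song->bitrate() || return undef;

            return {
                sourceStreamOffset => int($bitrate / 8 * $newtime),
                timeOffset         => $newtime,
            };
        }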

    It's not obvious to me how to set the duration later. Should getMetadataFor() work? I set a "duration" on the returned hash, but it seemed to have no effect.
    You need to understand the LMS track and song objects (e.g. look at the end of Slim/Player/Song.pm - many routines change song attributes but call into your protocol handler).

    Set the value in $song->duration.
    The protocol handler's canSeek needs this value to be set, as it enables the ff/rew functionality.
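
    A minimal sketch of the idea (check Song.pm for the real accessors):

        # Once your plugin knows the real duration (e.g. in getNextTrack):
        $song->duration($secs);

        # canSeek tells LMS whether ff/rew and the timebar can be used.
        sub canSeek {
            my ($class, $client, $song) = @_;
            return $song->duration ? 1 : 0;
        }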

    I'd be interested if you'd elaborate, given the poor docs and general lack of code commenting I'd love to be pointed in an easier direction.
    Better docs wouldn't help. Look at plugins. Find one which is similar to what you want and understand how it works.

    It seems like the plugin archive is gone? Or did I miss something?
    Seems so. The author has not been on the forum since 2009. Sorry.

    I'll look at that plugin next, thanks.
    It's a mess - sorry. It's still the first version for DASH - too complicated.

    In this case it's not really a live stream; I have accurate duration and offset info. Even if I can't allow skip/seek operations it would be nice to reflect the current state in the display. Is there a way to push just that info if I'm willing to give up the skip operations?
    Everything is interconnected. If you get the offset correct when playing starts, then the display will match.

  5. #5
    Junior Member
    Join Date
    Dec 2016
    Posts
    5
    You still don't understand.

    When a track is played, an offset is usually passed. So when a track is started the offset will be zero. If a click on the timebar is halfway along, then the offset will be half the duration. If you have a custom menu or other event, you can provide the offset.

    The playing track's protocol handler will interpret the offset according to the protocol of the source material (e.g. HTTP uses byte-range headers).
    I think we're talking past each other... I understand that in typical operation LMS expects a song to be a stream of bytes from some source, and the protocol handler to interpret LMS commands as operations within that stream - rew/ff being seek time offsets that must be translated into byte ranges, etc. I also understand that in more complex protocols it's up to the handler to translate server operations into more complex operations (say, computing a new HTTP request with a range header).

    I'm specifically asking if I can invert that relationship. I have a continuous stream of bytes that represents a sequence of songs, or even no song at all if the casting is paused. Within that stream I know the current song, the position within that song, and the duration of the current song. When LMS begins observing that stream it is effectively joining an in-progress song, and it will be carried along through those songs indefinitely; it has no control over the seek position, or the relationship between the current byte offset and the current play position and duration.

    I'd like to force the player state to use those values, rather than compute its own position within the duration. In effect I want it to be a passive receiver of position information. Is that possible? I realize this may simply not be possible, as LMS is not designed with this model in mind.

    Axel

  6. #6
    Senior Member
    Join Date
    Oct 2005
    Location
    Ireland
    Posts
    15,798
    Quote Originally Posted by AxxelH View Post
    I'd like to force the player state to use those values, rather than compute its own position within the duration. In effect I want it to be a passive receiver of position information. Is that possible? I realize this may simply not be possible, as LMS is not designed with this model in mind.
    I don't think you can use this model but I have never tried - it feels wrong as LMS is in control and with the proposed model you'll be fighting it at every turn.

    For example, if LMS thinks it is playing a song with a known duration, LMS uses info from the player on the number of seconds played. So position information depends on LMS knowing the duration of the track (possibly set using Slim::Schema::RemoteTrack->updateOrCreate), knowing the offset it starts from (set after a seek via Slim::Player::Source::gototime) and how much audio has been played (from Slim::Player::Squeezebox2::songElapsedSeconds).
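
    Duration can be pushed onto the remote track object along these lines (attribute names from memory - check Slim/Schema/RemoteTrack.pm):

        # Update (or create) the library-side object for a remote URL,
        # setting its length in seconds so LMS knows the duration.
        Slim::Schema::RemoteTrack->updateOrCreate( $url, {
            title => $title,
            secs  => $durationInSeconds,
        } );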

    Pausing does not stop the server reading from the source - it stops the player from playing - the server only stops reading from the source when buffers are full. Some protocol handlers will close the source (e.g. live streams, as some live streams disconnect after x mins); others (e.g. podcasts) will keep the TCP connection open and will resume when play is pressed.

    I think to do what you want, it should be a protocol handler faking a stream of tracks and chopping the input stream up into virtual tracks.

    I understand the stream of bytes but I don't understand what level of control you want or have. I don't know, with your source, if you can jump back to the beginning of a track possibly 30 mins earlier, or jump forward 2 hours.

    There is a Source object which has an API.
    Slim::Player::Source::gototime($client, $newPos) - it calls the controller's jumpToTime

    The LMS Controller object controls the streaming - the "API" is in the file Slim/Player/StreamingController.pm
    So to ask LMS to go to a specific time, a call like Slim::Player::StreamingController::jumpToTime is used.
    To find how much of the current song has been played, use PlayerSongElapsed().
    The controller also does pause, play, stop, flush etc.
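
    So if your plugin learns from the cast device that the position has changed, the closest thing to "pushing" that into LMS is to ask LMS to jump there itself - a rough sketch (castPositionChanged is your hypothetical notification hook, not an LMS call):

        use Slim::Player::Source;

        # Hypothetical hook, called when the Python child reports that the
        # cast position changed outside of LMS's control.
        sub castPositionChanged {
            my ($client, $newPosSecs) = @_;

            # LMS stays in control: this goes via the StreamingController's
            # jumpToTime, which repositions the stream itself.
            Slim::Player::Source::gototime($client, $newPosSecs);
        }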

  7. #7
    Junior Member
    Join Date
    Dec 2016
    Posts
    5
    Just to follow up and bring the thread to a conclusion...

    I don't think you can use this model but I have never tried - it feels wrong as LMS is in control and with the proposed model you'll be fighting it at every turn.
    Indeed, this turned out to be the case. I have pretty fine control of the Chromecast stream, but trying to let the Chromecast drive the position information was a complete failure. Although one can set many of the elapsed values, the LMS playback code doesn't actually pay attention to the changes made. Everything wants to go through jumpToTime to actually update state. There's probably a way to fake it by ignoring some seek operations opportunistically, but that looks fragile and a lot of work for little benefit.

    So I ended up with a "live" stream that starts and stops, but can't seek, and has no position information. Very much like the wave input. It does, however, update the metadata for the current track. Good enough for podcast listening.

    Thanks for all the help,

    Axel
