Change in Slim::Formats::HTTP breaks SqueezeScrobbler

  • James
    Senior Member
    • Apr 2005
    • 647

    Change in Slim::Formats::HTTP breaks SqueezeScrobbler

    Hi all,
    bit of a weird one here...

    It seems that change 21854 to Slim::Formats::HTTP in 7.1 breaks my LastFM plugin in SqueezeScrobbler.

    The requests to get a session from LastFM now always get a 400 error from their servers. Sending the same URL in a web browser works, and reverting to the previous version of the file fixes it.

    The only change is to add an extra CRLF on the end of the request! Any idea why this is a problem? If it breaks the LastFM requests it could be a problem with other servers as well...

    James
  • Andy Grundman
    Former Squeezebox Guy
    • Jan 2006
    • 7395

    #2
    Change in Slim::Formats::HTTP breaks SqueezeScrobbler

    On Jul 25, 2008, at 4:08 PM, James wrote:

    >
    > Hi all,
    > bit of a weird one here...
    >
    > It seems that change 21854 to Slim::Formats::HTTP in 7.1 breaks my
    > LastFM plugin in SqueezeScrobbler.
    >
    > The requests to get a session from LastFM now always get a 400 error
    > from their servers. Sending the same URL in a web browser works, and
    > reverting to the previous version of the file fixes it.
    >
    > The only change is to add an extra CRLF on the end of the request! Any
    > idea why this is a problem? If it breaks the LastFM requests it could
    > be a problem with other servers as well...
    >
    > James


    Hmm, this shouldn't cause a problem at all, as the HTTP request is
    filtered through HTTP::Request, which would add the needed CRLF
    anyway. Can you get a log of the failure with network.asynchttp?
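For context, a well-formed HTTP/1.0 request ends the header block with exactly one blank line, i.e. the bytes CRLF CRLF. A minimal Python sketch of that framing (the host and headers are illustrative, not taken from the plugin):

```python
# Sketch: a well-formed HTTP/1.0 GET ends the header block with
# exactly one blank line, i.e. CRLF CRLF.
# Host and headers below are illustrative, not from SqueezeScrobbler.

CRLF = "\r\n"

def build_get(host, path="/"):
    lines = [
        f"GET {path} HTTP/1.0",
        f"Host: {host}",
        "Connection: close",
    ]
    # join headers with CRLF, then one empty line terminates the headers
    return (CRLF.join(lines) + CRLF + CRLF).encode("ascii")

req = build_get("example.com")
assert req.endswith(b"\r\n\r\n")          # exactly two trailing CRLFs
assert not req.endswith(b"\r\n\r\n\r\n")  # not three
```

This is the framing HTTP::Request produces when it serializes a request, which is why a hand-built string with a different number of trailing CRLFs can behave differently.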


    • Craig, James (IT)

      #3
      Change in Slim::Formats::HTTP breaks SqueezeScrobbler

      I'll try, although I am not using the code asynchronously.

      When I change the plugin to use the SimpleAsync... classes instead, the same request works fine.
      But I was deliberately not doing it asynchronously as I wanted to know the web request status before proceeding.

      James
      --------------------------------------------------------

      NOTICE: If received in error, please destroy and notify sender. Sender does not intend to waive confidentiality or privilege. Use of this email is prohibited when received in error.


      • Andy Grundman
        Former Squeezebox Guy
        • Jan 2006
        • 7395

        #4
        Change in Slim::Formats::HTTP breaks SqueezeScrobbler

        On Jul 28, 2008, at 8:36 AM, Craig, James (IT) wrote:

        > I'll try, although I am not using the code asynchronously.
        >
        > When I change the plugin to use the SimpleAsync... classes instead,
        > the same request works fine.
        > But I was deliberately not doing it asynchronously as I wanted to
        > know the web request status before proceeding.


        Ah, well you should definitely think about using async HTTP instead.


        • volpone
          Senior Member
          • Mar 2008
          • 204

          #5
          Originally posted by Craig, James (IT)
          I'll try, although I am not using the code asynchronously.

          When I change the plugin to use the SimpleAsync... classes instead, the same request works fine.
          But I was deliberately not doing it asynchronously as I wanted to know the web request status before proceeding.

          James
          --------------------------------------------------------
          James,
          Do you think you can fix this? I've got the same 400 error from the Last.fm server with the newly released SC 7.1 and the SqueezeScrobbler plugin when trying to play Last.FM Radio via SC.

          Also, the Last.FM service via the "official" 7.1 plugin seems unusable outside the US/GB/DK. Apart from the current request problem, is SqueezeScrobbler a workaround for using Last.FM from France?

          Regards

          Volpone
          SqueezeBoxServer 7.9 / ReadyNas Pro (x86) | SBTouch - SB3 - Duet - Boom - Ipeng
          SBTouch => Rega DAC => Rega Brio R amp => Harbeth SLH5 speakers
          see details & photos here, 4 slides


          • Craig, James (IT)

            #6
            Change in Slim::Formats::HTTP breaks SqueezeScrobbler

            Yes, I do have a fix, but SourceForge wouldn't let me check it in over the weekend!

            I will try to upload it later today. I guessed that some people would still want to use SqueezeScrobbler due to the regional restriction
            (which is really bizarre, because I believe it doesn't exist for any other players!).

            James


            • volpone
              Senior Member
              • Mar 2008
              • 204

              #7
              Originally posted by Craig, James (IT)

              I will try and upload it later today, I guessed that some people will still want to use SqueezeScrobbler due to the regional restriction
              (which is really bizarre, because I believe it doesn't exist for any other players!)

              James
              Thanks a lot, James, for this and all your work.

              Volpone


              • mavit
                Senior Member
                • Feb 2007
                • 148

                #8
                Originally posted by andyg
                On Jul 25, 2008, at 4:08 PM, James wrote:

                > It seems that change 21854 to Slim::Formats::HTTP in 7.1 breaks my
                > LastFM plugin in SqueezeScrobbler.
                >
                > The requests to get a session from LastFM now always get a 400 error
                > from their servers. Sending the same URL in a web browser works, and
                > reverting to the previous version of the file fixes it.
                >
                > The only change is to add an extra CRLF on the end of the request! Any
                > idea why this is a problem? If it breaks the LastFM requests it could
                > be a problem with other servers as well...


                Hmm this shouldn't cause a problem at all as the HTTP request is
                filtered through HTTP::Request which would add the needed CRLF
                anyway. Can you get a log of the failure with network.asynchttp?
                Doesn't this change result in two blank lines after the HTTP header when you're not seeking? Don't you only want one?


                • Andy Grundman
                  Former Squeezebox Guy
                  • Jan 2006
                  • 7395

                  #9
                   Change in Slim::Formats::HTTP breaks SqueezeScrobbler

                  On Aug 12, 2008, at 4:36 PM, mavit wrote:

                   > andyg;323472 Wrote:
                   >> On Jul 25, 2008, at 4:08 PM, James wrote:
                   >>
                   >>> It seems that change 21854 to Slim::Formats::HTTP in 7.1 breaks my
                   >>> LastFM plugin in SqueezeScrobbler.
                   >>>
                   >>> The requests to get a session from LastFM now always get a 400 error
                   >>> from their servers. Sending the same URL in a web browser works, and
                   >>> reverting to the previous version of the file fixes it.
                   >>>
                   >>> The only change is to add an extra CRLF on the end of the request!
                   >>> Any idea why this is a problem? If it breaks the LastFM requests it
                   >>> could be a problem with other servers as well...
                   >>
                   >> Hmm this shouldn't cause a problem at all as the HTTP request is
                   >> filtered through HTTP::Request which would add the needed CRLF
                   >> anyway. Can you get a log of the failure with network.asynchttp?
                   >
                   > Doesn't this change result in two blank lines after the HTTP header
                   > when you're not seeking? Don't you only want one?

                   An HTTP header ends with two CRLFs; we were only adding one before
                   this change. This is moot on SC because the whole request gets
                   filtered through HTTP::Request, which will fix any problem like that
                   anyway. The problem only became apparent on SN, where HTTP::Request
                   isn't used here.
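The header/body split at issue here can be sketched in Python: a parser takes everything up to the first blank line (CRLF CRLF) as headers, and anything after it as the body. A correctly terminated GET has an empty body; one extra CRLF becomes an unexpected body. The request below is illustrative, not the actual plugin traffic:

```python
# Sketch: split an HTTP request at the first blank line (CRLF CRLF).
# A correctly terminated request has an empty body; one extra CRLF
# becomes a stray body, which a server may reject for GET with 400.
# The request below is illustrative, not the actual plugin traffic.

def split_request(raw):
    head, sep, body = raw.partition(b"\r\n\r\n")
    return head, body

good = b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n"
bad = good + b"\r\n"  # the extra CRLF added by change 21854

assert split_request(good)[1] == b""      # empty body: fine
assert split_request(bad)[1] == b"\r\n"   # one-CRLF body: 400 territory
```

This is the mechanical reading behind the disagreement in the next two posts: whether the request ends with two CRLFs (a clean header terminator) or three (terminator plus a stray body).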


                  • mavit
                    Senior Member
                    • Feb 2007
                    • 148

                    #10
                    Originally posted by andyg
                    An HTTP header ends with 2 CRLF's. We were only adding 1 before this
                    change. This is moot on SC because the whole request gets filtered
                    through HTTP::Request which will fix any problem like that anyway.
                    The problem only became apparent on SN where HTTP::Request isn't used
                    here.
                    Ah, but I count three CRLFs. This effectively gives you a normal header followed by a body consisting of a single CRLF. GET requests aren't supposed to have bodies, hence the 400 Bad Request response.

                    From Wireshark:
                    Code:
                    0000   47 45 54 20 2f 20 48 54 54 50 2f 31 2e 30 0d 0a  GET / HTTP/1.0..
                    0010   43 61 63 68 65 2d 43 6f 6e 74 72 6f 6c 3a 20 6e  Cache-Control: n
                    0020   6f 2d 63 61 63 68 65 0d 0a 43 6f 6e 6e 65 63 74  o-cache..Connect
                    0030   69 6f 6e 3a 20 63 6c 6f 73 65 0d 0a 41 63 63 65  ion: close..Acce
                    0040   70 74 3a 20 2a 2f 2a 0d 0a 48 6f 73 74 3a 20 68  pt: */*..Host: h
                    0050   61 6e 64 6d 61 67 69 63 2e 6d 75 78 74 61 70 65  andmagic.muxtape
                    0060   2e 63 6f 6d 0d 0a 55 73 65 72 2d 41 67 65 6e 74  .com..User-Agent
                    0070   3a 20 69 54 75 6e 65 73 2f 34 2e 37 2e 31 20 28  : iTunes/4.7.1 (
                    0080   4c 69 6e 75 78 3b 20 4e 3b 20 52 65 64 20 48 61  Linux; N; Red Ha
                    0090   74 3b 20 69 36 38 36 2d 6c 69 6e 75 78 3b 20 45  t; i686-linux; E
                    00a0   4e 3b 20 75 74 66 38 29 20 53 71 75 65 65 7a 65  N; utf8) Squeeze
                    00b0   43 65 6e 74 65 72 2f 37 2e 31 2f 32 32 31 37 30  Center/7.1/22170
                    00c0   0d 0a 49 63 79 2d 4d 65 74 61 64 61 74 61 3a 20  ..Icy-Metadata: 
                    00d0   31 0d 0a 0d 0a 0d 0a                             1......


                    • Andy Grundman
                      Former Squeezebox Guy
                      • Jan 2006
                      • 7395

                      #11
                       Change in Slim::Formats::HTTP breaks SqueezeScrobbler

                      On Aug 12, 2008, at 5:21 PM, mavit wrote:

                       > andyg;328802 Wrote:
                       >> An HTTP header ends with 2 CRLF's. We were only adding 1 before this
                       >> change. This is moot on SC because the whole request gets filtered
                       >> through HTTP::Request which will fix any problem like that anyway.
                       >> The problem only became apparent on SN where HTTP::Request isn't used
                       >> here.
                       >
                       > Ah, but I count three CRLFs. This effectively gives you a normal
                       > header followed by a body consisting of a single CRLF. GET requests
                       > aren't supposed to have bodies, hence the 400 Bad Request response.
                       >
                       > From Wireshark:
                       > [hex dump snipped; see the capture in post #10 above]


                      You're right, I stand corrected. Fixed in 7.2 change 22563.
