Home of the Squeezebox™ & Transporter® network music players.
  1. #11
    Senior Member erland's Avatar
    Join Date
    Dec 2005
    Location
    Sweden
    Posts
    11,285
    Quote Originally Posted by mamema View Post
    the involved sql queries are:

    UPDATE tracks,track_statistics SET track_statistics.url=tracks.url,track_statistics.urlmd5=tracks.urlmd5 where tracks.musicbrainz_id is not null and tracks.musicbrainz_id=track_statistics.musicbrainz_id and track_statistics.url!=tracks.url and length(tracks.url)<".($useLongUrls?512:256);
    The statement above is only used for MySQL, and I don't think LMS supports MySQL these days, so you can probably ignore it.

    Quote Originally Posted by mamema View Post
    CREATE temp table temp_track_statistics as select tracks.url,tracks.urlmd5,tracks.musicbrainz_id from tracks join track_statistics on tracks.musicbrainz_id=track_statistics.musicbrainz_id where track_statistics.musicbrainz_id is not null and track_statistics.urlmd5!=tracks.urlmd5";

    UPDATE track_statistics SET url=(select url from temp_track_statistics where musicbrainz_id=track_statistics.musicbrainz_id),urlmd5=(select urlmd5 from temp_track_statistics where musicbrainz_id=track_statistics.musicbrainz_id) where exists (select url from temp_track_statistics where musicbrainz_id=track_statistics.musicbrainz_id)";

    I'll try to test now whether an INNER JOIN instead of a WHERE clause will speed things up.
    If you'd like to go the quick and dirty route and just optimise these SQL statements a bit, I wonder whether some indexes need to be created for the temp table. For the permanent track_statistics table I create a number of indexes here in the code: https://github.com/erland/lms-tracks...torage.pm#L365
    It could be that these indexes aren't transferred automatically to the temp table, and in that case that could be the issue. However, it is a bit strange, because looking at the SQL I would expect the temp table to be empty unless you have renamed or moved music files. I wonder if you have multiple tracks with the same musicbrainz id in your database, because that would probably cause the temp table to contain some data. Multiple tracks with the same musicbrainz id are known to cause data in the track_statistics table to become incorrectly duplicated, so I've suggested that people who have these kinds of duplicates in their library disable musicbrainz support in TrackStat.
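    To check for such duplicates outside LMS, a query along these lines could be run against library.db. This is only a sketch in Python against SQLite; the `tracks` table and `musicbrainz_id` column names are taken from the queries above, and the database path is an assumption you would adjust to your LMS cache directory.

    ```python
    import sqlite3

    def find_duplicate_musicbrainz_ids(db_path):
        """Return (musicbrainz_id, count) pairs that occur more than once in tracks."""
        conn = sqlite3.connect(db_path)
        try:
            return conn.execute(
                """
                SELECT musicbrainz_id, COUNT(*) AS n
                FROM tracks
                WHERE musicbrainz_id IS NOT NULL
                GROUP BY musicbrainz_id
                HAVING COUNT(*) > 1
                """
            ).fetchall()
        finally:
            conn.close()
    ```

    An empty result would rule out the duplicate-musicbrainz-id theory; any rows returned are candidates for the duplicated track_statistics data described above.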
    Erland Isaksson (My homepage)
    Developer of many plugins/applets
    Starting with LMS 8.0 I no longer support my plugins/applets (see here for more information )

  2. #12
    Senior Member erland's Avatar
    Join Date
    Dec 2005
    Location
    Sweden
    Posts
    11,285
    Quote Originally Posted by mamema View Post
    Need to go later through your extensive reply first, but one point bugs me:

    >- The issue is that when a track is moved or renamed, the standard LMS scanner will delete the old track and add a new track, and the rating, play counts and last played time are lost.

    I don't move stuff around, nor do I rename anything.
    But the extensive 3.5-hour run is always triggered when update scanning takes place.

    The only thing happening at the moment is heavy musicip fingerprinting, but with preserved file change date.

    So either I do not understand the "rename" sentence you mentioned, or there is something here to investigate...
    The SQL statement in the refresh operation still executes but isn't supposed to do anything. However, as mentioned in my post a few seconds ago, I would expect the temp table to be empty in your case. If you have multiple tracks with the same musicbrainz id, that could explain it: the temp table would not be empty, and that would likely cause a lot of duplicates in the track_statistics table.

    You could run the SELECT that creates the temp table and see if it returns anything against your database. If you aren't able to connect directly to the database, you can use the Database Query plugin and create a free-form query to run the SELECT statement.
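    For anyone with shell access to the server, the same check can be done outside LMS by attaching both database files and running the SELECT that builds the temp table. A sketch in Python (the file names library.db and persist.db follow the thread; the actual paths under your LMS cache directory are an assumption):

    ```python
    import sqlite3

    def rows_for_temp_table(library_db, persist_db):
        """Run the SELECT that builds temp_track_statistics.

        Returns the rows that would end up in the temp table; an empty
        list means the expensive UPDATE has nothing to do.
        """
        conn = sqlite3.connect(library_db)
        try:
            # track_statistics lives in the persist database, so attach it.
            conn.execute("ATTACH DATABASE ? AS persistdb", (persist_db,))
            return conn.execute(
                """
                SELECT tracks.url, tracks.urlmd5, tracks.musicbrainz_id
                FROM tracks
                JOIN persistdb.track_statistics
                  ON tracks.musicbrainz_id = persistdb.track_statistics.musicbrainz_id
                WHERE persistdb.track_statistics.musicbrainz_id IS NOT NULL
                  AND persistdb.track_statistics.urlmd5 != tracks.urlmd5
                """
            ).fetchall()
        finally:
            conn.close()
    ```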

  3. #13
    Senior Member
    Join Date
    Mar 2011
    Posts
    164
    Quote Originally Posted by erland View Post
    You could run the SELECT that creates the temp table and see if it returns anything against your database. If you aren't able to connect directly to the database, you can use the Database Query plugin and create a free-form query to run the SELECT statement.
    I'm just now doing my 3-hour scanning run, and I've modified the plugin not to drop the temp table if it is created, so I can look into it.

    If you're right, and I'm sure you are, then the import scan module approach would be best.
    Last edited by mamema; 2021-02-26 at 06:36.

  4. #14
    Babelfish's Best Boy mherger's Avatar
    Join Date
    Apr 2005
    Location
    Switzerland
    Posts
    20,530

    Working on a plugin - Scanning question

    > the involved sql queries are:

    Time each of these queries to figure out which one burns the most CPU cycles.

    my $t = time();
    runSQL1(); # whatever
    warn "SQL1: " . (time() - $t);

    Or similar. Poor man's instrumentation.

    If you want to go fancy, you can look into installing NYTProf and
    profile the runs (https://metacpan.org/pod/Devel::NYTProf). This will
    further slow down the process and take additional time to generate
    useful reports. You might want to try with a smaller collection first.
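    The same poor-man's instrumentation works for anyone poking at the databases outside LMS. A sketch in Python against SQLite (the labels and statements are placeholders, not the plugin's actual code):

    ```python
    import sqlite3
    import time

    def time_statements(conn, statements):
        """Execute each (label, sql) pair and report how long it took."""
        for label, sql in statements:
            t0 = time.monotonic()
            conn.execute(sql)
            elapsed = time.monotonic() - t0
            print(f"{label}: {elapsed:.3f}s")
    ```

    Running each of the refresh statements through something like this quickly shows which one dominates the 3.5-hour run.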

  5. #15
    Babelfish's Best Boy mherger's Avatar
    Join Date
    Apr 2005
    Location
    Switzerland
    Posts
    20,530

    Working on a plugin - Scanning question

    >> the involved sql queries are:
    >
    > Time each of these queries to figure out which one burns the most CPU
    > cycle.
    >
    > my $t = time();
    > runSQL1(); # whatever
    > warn "SQL1: " . (time() - $t);
    >
    > Or similar. Poor man's instrumentation.


    Oh, I just saw that Erland already added a lot of timing measurements.
    Run LMS with debug mode for plugin.trackstat, then provide the
    server.log file.

  6. #16
    Babelfish's Best Boy mherger's Avatar
    Join Date
    Apr 2005
    Location
    Switzerland
    Posts
    20,530

    Working on a plugin - Scanning question

    > Part 3: Analyze command
    > https://github.com/erland/lms-tracks...orage.pm#L1406
    > - Run analyze command to optimise the performance of TrackStat table
    > - Not sure analyse is possible to run on per track basis so this should
    > probably run as a post scan action. Don't think it's needed at LMS
    > startup.


    Analyze should certainly be run, but only once, at the end of a scan -
    which LMS already does.

    > Part 8: Update ratings in LMS database
    > https://github.com/erland/lms-tracks...orage.pm#L1496
    > - The purpose of this part is to update ratings in LMS tracks_persistent
    > table with the value from TrackStat table.


    So are you saying that you keep a separate database file around, then
    duplicate the data to the persist database? That certainly would be one
    good place to optimize...

    > - Not really sure this is required any more; it might be there for
    > historical reasons, from when tracks_persistent didn't always survive upgrades


    Yep.

  7. #17
    Senior Member
    Join Date
    Mar 2011
    Posts
    164
    At the moment I'm looking around for SQLite query tools that can execute those queries. It's a little difficult because those tools (DBeaver, for example) execute against ONE db file, and I have to use both persistent.db and library.db.

    And Michael, by the way: I've enabled debug level, and the snippet about the 3.5 hours (above in the thread) is the only time-related message from TrackStat.

  8. #18
    Senior Member erland's Avatar
    Join Date
    Dec 2005
    Location
    Sweden
    Posts
    11,285
    Quote Originally Posted by mamema View Post
    At the moment I'm looking around for SQLite query tools that can execute those queries. It's a little difficult because those tools (DBeaver, for example) execute against ONE db file, and I have to use both persistent.db and library.db.
    If you want to run a SELECT statement, it's probably easiest to do that by creating a free-form query in the Database Query plugin. That way you don't have to care whether the tables are in the persist or library db.

    Quote Originally Posted by mamema View Post
    I've enabled debug level, and the snippet about the 3.5 hours (above in the thread) is the only time-related message from TrackStat.
    That's for the musicbrainz part. Do you have the log entries for the other parts of the refresh operation as well?

  9. #19
    Babelfish's Best Boy mherger's Avatar
    Join Date
    Apr 2005
    Location
    Switzerland
    Posts
    20,530
    Quote Originally Posted by mamema View Post
    At the moment I'm looking around for SQLite query tools that can execute those queries. It's a little difficult because those tools (DBeaver, for example) execute against ONE db file, and I have to use both persistent.db and library.db.
    I'm using SQLiteStudio (https://sqlitestudio.pl). It allows you to "attach" additional files:

    [Attached screenshot: Screenshot 2021-02-26 at 16.18.18.png]


    Quote Originally Posted by mamema View Post
    And Michael, by the way: I've enabled debug level, and the snippet about the 3.5 hours (above in the thread) is the only time-related message from TrackStat.
    Wouldn't there be more information, like what step took how long?
    Michael

    "It doesn't work - what shall I do?" - "Please check your server.log and/or scanner.log file!"
    (LMS: Settings/Information)

  10. #20
    Senior Member erland's Avatar
    Join Date
    Dec 2005
    Location
    Sweden
    Posts
    11,285
    Quote Originally Posted by mherger View Post

    > Part 8: Update ratings in LMS database
    > https://github.com/erland/lms-tracks...orage.pm#L1496
    > - The purpose of this part is to update ratings in LMS tracks_persistent
    > table with the value from TrackStat table.


    So are you saying that you keep a separate database file around, then
    duplicate the data to the persist database? That certainly would be one
    good place to optimize...
    It's a separate table in persist.db. So it's the same database file, but a separate table.
    I'm not sure it's possible to avoid a separate table as long as LMS deletes the data in tracks_persistent if you rename or move a music file. Part 1 in my description takes care of recovering the old rating, play count and last played time by using the musicbrainz id, but that's only possible if the old data hasn't been deleted when this logic executes.
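    The recovery idea described here amounts to re-keying the old statistics on musicbrainz id when the url has changed. A minimal sketch of that idea in Python against SQLite, for illustration only; the table and column names follow the queries earlier in the thread, and this is not TrackStat's actual code:

    ```python
    import sqlite3

    def recover_statistics(conn):
        """Re-point track_statistics rows at the current track url, matching on
        musicbrainz id, so rating and play data survive a file rename or move."""
        conn.execute(
            """
            UPDATE track_statistics
            SET url = (SELECT url FROM tracks
                       WHERE tracks.musicbrainz_id = track_statistics.musicbrainz_id),
                urlmd5 = (SELECT urlmd5 FROM tracks
                          WHERE tracks.musicbrainz_id = track_statistics.musicbrainz_id)
            WHERE musicbrainz_id IS NOT NULL
              AND EXISTS (SELECT 1 FROM tracks
                          WHERE tracks.musicbrainz_id = track_statistics.musicbrainz_id
                            AND tracks.urlmd5 != track_statistics.urlmd5)
            """
        )
        conn.commit()
    ```

    Note that the correlated subqueries silently pick one row when several tracks share a musicbrainz id, which is exactly why duplicates in the library make this step misbehave.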
