Merge branch 'master' into heif

This commit is contained in:
Paul Friederichsen 2023-08-02 16:50:10 -05:00
commit 22aa6ebe3b
46 changed files with 712 additions and 232 deletions


@ -7,6 +7,38 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 537](https://github.com/hydrusnetwork/hydrus/releases/tag/v537)
### new filetype selector
* I rewrote the expanding checkbox list that selects filetypes in 'system:filetype' and File Import Options into a more normal tree view with checkboxes. it is more compact and scrolls neatly, letting us stack it with all these new filetypes we've been adding and more in future. the 'clicking a category selects all children' logic is preserved
* I re-ordered the actual filetypes in each sublist here. I tried to put the most common filetypes at the top and listed the rest in alphabetical order below, going for the best of both worlds. you don't want to scroll down to find webm, but you don't want to hunt through a giant hydev-written 'popularity' list to find realmedia either. let's see how it works out
* I split all the archive types away from 'applications' into a new 'archives' group
* and I did the same for the 'image project files' like krita and xcf. svg and psd may be significantly more renderable soon, so this category may get further shake-up
* this leaves 'applications' as just flash and pdf for now
* it isn't a big deal, but these new groups are reflected in _options->media_ too
* all file import options and filetype system predicates that previously said 'all applications' _should_ now say 'all applications, image project files, or archives'
### fast database delete
* I have long planned a fix for 'the PTR takes ages to delete' problem. today marks the first step in this
* deleting a huge service like the PTR and deleting/resetting/regenerating a variety of other large data stores are now essentially instant. the old tables are not deleted instantly, but renamed and moved to a deferred delete zone
* the maintenance task that actually does the deferred background delete is not yet ready, so for now these jobs sit in the landing zone taking up their original hard disk space. I expect to have it done for next week, so bear with me if you need to delete a lot this week
* as this system gets fleshed out, the new UI under _database>db maintenance->review deferred delete table data_ will finish up too
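The rename-and-record scheme above can be sketched roughly like this (a minimal hypothetical illustration of the idea, not the actual hydrus module code; the table and function names here are invented):

```python
import os
import sqlite3

def deferred_drop_table( db: sqlite3.Connection, table_name: str ) -> None:
    # instead of an expensive DROP TABLE, rename the table into a
    # 'deferred delete' zone and record it for later background deletion
    new_table_name = 'deferred_delete_{}_{}'.format( table_name, os.urandom( 16 ).hex() )
    db.execute( f'ALTER TABLE {table_name} RENAME TO {new_table_name};' )
    db.execute( 'INSERT INTO deferred_delete_tables ( name, num_rows ) VALUES ( ?, ? );', ( new_table_name, None ) )

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE deferred_delete_tables ( name TEXT, num_rows INTEGER );' )
db.execute( 'CREATE TABLE huge_mappings ( tag_id INTEGER, hash_id INTEGER );' )

deferred_drop_table( db, 'huge_mappings' ) # returns instantly, regardless of table size
```

The rename itself is a metadata-only operation in SQLite, which is why the user-facing delete finishes immediately while the real space reclamation waits for the background maintenance task.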
### misc
* fixed a bitrot issue in the v534 update code related to the file maintenance manager not existing at the time of db update. if you got the 'some exif scanning failed to schedule!' popup on update, don't worry about it. everything actually worked ok, it was just a final unimportant reporting step that failed (issue #1414)
* fixed the grid layout on 'migrate tags', which at some point in the recent past went completely bananas
* tightened up some of the code that calculates and schedules deferred physical file delete. it now catches a couple of cases it previously missed and skips some work it should have been skipping
* reduced some overhead in the hover window show/hide logic. in very-heavy-session clients, this was causing significant (7ms multiple times a second) lag
* when you ok the 'manage login scripts' dialog, it no longer re-links new entries for all those scripts into the 'manage logins' system. this now only happens once on database initialisation
* the manage login scripts test routine no longer spams test errors to popup dialogs. they are still written to log if you need more data
* silenced a bit of PIL warning logspam when a file with unusual or broken EXIF data is loaded
* silenced the long-standing logspam that often happens when generating flash thumbnails
* fixed a stupid typo error in the routine that schedules downloading files from file repositories
* `nose`, `six`, and `zope` are no longer in any of the requirements.txts. I think these were needed a million years ago as PyInstaller hacks, but the situation is much better these days
## [Version 536](https://github.com/hydrusnetwork/hydrus/releases/tag/v536)
### more new filetypes
@ -334,46 +366,3 @@ title: Changelog
* cleaned up the PyInstall spec files a little more, removing some 'hidden-import' stuff from the pyinstaller spec files that was no longer used and pushing the server executables to the binaries section
* added a short section to the Windows 'running from source' help regarding pinning a shortcut to a bat to Start--there's a neat way to do it, if Windows won't let you
* updated a couple little more areas in the help for client->hydrus_client
## [Version 527](https://github.com/hydrusnetwork/hydrus/releases/tag/v527)
### important updates
* There are important technical updates this week that will require most users to update differently!
* first, OpenCV is updated to a new version, and this causes a dll conflict on at least one platform, necessitating a clean install
* second, the program executables are renamed from 'client' and 'server' to 'hydrus_client' and 'hydrus_server', necessitating shortcut updates
* as always, but doubly so this week, I strongly recommend you make a backup before updating. the instructions are simple, but if there is a problem, you'll always be able to roll back
* so, in summary, for each install type--
* - if you use the windows installer, install as normal. your start menu 'hydrus client' shortcut should be overwritten with one to the new executable, so you don't have to do anything there, but if you use a custom shortcut, you will need to update that too
* - if you use one of the normal extract builds, you will have to do a 'clean install', as here https://hydrusnetwork.github.io/hydrus/getting_started_installing.html#clean_installs . you also need to update your program shortcuts
* - macOS users have no special instructions. update as normal
* - source users, git pull as normal. if you haven't already, feel free to run setup_venv again to get the new OpenCV. update your launch scripts to point at the new 'hydrus_client.py' scripts
* - if you have patched my code, particularly the boot code, obviously update your patches! the 'hydrus_client.py' scripts just under 'hydrus' module all got renamed to '\_boot' too!
* also, some related stuff like firewall rules (if you run the Client API) may need updating!
### boring related update stuff
* the Windows build's sqlite3.dll and exe command line interface are updated to the latest, 3.41.2
* the 'updating' help now has a short section for the 526->527 update step, reiterating the above
* the builds no longer include the hydrus source in the 'hydrus' subdirectory. this was an old failed test in dual-booting that was mostly forgotten about and now cleaned up. if you want to run from source, get the source
* the windows hydrus_client and hydrus_server executables now have proper version info if you right-click->properties and look at the details tab
### Qt Media Player
* THIS IS VERY BUGGY AND SOMETIMES CRASHY; DISABLED FOR MOST USERS; NOT FOR NORMAL USE YET
* I have integrated Qt's Media Player into hydrus. it is selectable in _options->media_ (if you are an advanced user and running from source) and it works like my native viewer or mpv. it has good pixels-on-screen performance and audio support, but it is buggy and my implementation is experimental. for some reason, it crashes instantly when running from a frozen executable, so it is only available for source users atm. I would like feedback from advanced source users who have had trouble with mpv--does it work? how well? any crashes?
* this widget appears to be under active development by the Qt guys. the differences between 6.4.1 vs 6.5.0 are significant. I hope the improvements continue!
* current limitations are:
* - It is only available on Qt6, sorry legacy Qt5 source users
* - this thing crashed the program like hell during development. I tightened it up and can't get it to crash any more with my test files on source, but be careful
* - the video renderer is OpenGL and in Qt world that seems to mean it is ALWAYS ON TOP at all times. although it doesn't interfere with click events if you aim for the scanbar (so Qt's z-indexing logic is still correct), its pixels nonetheless cover the scanbar and my media viewer hover windows (I will have to figure out a different scanbar layout with this thing)
* - longer audio-only files often stutter intolerably
* - many videos can't scan beyond the start
* - some videos turn into pixel wash mess
* - some videos seem to be cropped wrong with green bars in the spare space
* - it spams a couple lines of file parsing error/warning info to the log for many videos. sometimes it spams a lot continuously. no idea how to turn it off!
* anyway, despite the bugs and crashing, I found this thing impressive and I hope it can be a better fallback than my rubbish native viewer in future. it is a shame it crashes when built, but I'll see what I can do. maybe it'll be ready for our purposes by Qt7
### misc
* if twisted fails to load, its exact error is saved, and if you try to launch a server, that error is printed to the log along with the notification popup


@ -34,6 +34,35 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_537"><a href="#version_537">version 537</a></h2>
<ul>
<li><h3>new filetype selector</h3></li>
<li>I rewrote the expanding checkbox list that selects filetypes in 'system:filetype' and File Import Options into a more normal tree view with checkboxes. it is more compact and scrolls neatly, letting us stack it with all these new filetypes we've been adding and more in future. the 'clicking a category selects all children' logic is preserved</li>
<li>I re-ordered the actual filetypes in each sublist here. I tried to put the most common filetypes at the top and listed the rest in alphabetical order below, going for the best of both worlds. you don't want to scroll down to find webm, but you don't want to hunt through a giant hydev-written 'popularity' list to find realmedia either. let's see how it works out</li>
<li>I split all the archive types away from 'applications' into a new 'archives' group</li>
<li>and I did the same for the 'image project files' like krita and xcf. svg and psd may be significantly more renderable soon, so this category may get further shake-up</li>
<li>this leaves 'applications' as just flash and pdf for now</li>
<li>it isn't a big deal, but these new groups are reflected in _options->media_ too</li>
<li>all file import options and filetype system predicates that previously said 'all applications' _should_ now say 'all applications, image project files, or archives'</li>
<li><h3>fast database delete</h3></li>
<li>I have long planned a fix for 'the PTR takes ages to delete' problem. today marks the first step in this</li>
<li>deleting a huge service like the PTR and deleting/resetting/regenerating a variety of other large data stores are now essentially instant. the old tables are not deleted instantly, but renamed and moved to a deferred delete zone</li>
<li>the maintenance task that actually does the deferred background delete is not yet ready, so for now these jobs sit in the landing zone taking up their original hard disk space. I expect to have it done for next week, so bear with me if you need to delete a lot this week</li>
<li>as this system gets fleshed out, the new UI under _database>db maintenance->review deferred delete table data_ will finish up too</li>
<li><h3>misc</h3></li>
<li>fixed a bitrot issue in the v534 update code related to the file maintenance manager not existing at the time of db update. if you got the 'some exif scanning failed to schedule!' popup on update, don't worry about it. everything actually worked ok, it was just a final unimportant reporting step that failed (issue #1414)</li>
<li>fixed the grid layout on 'migrate tags', which at some point in the recent past went completely bananas</li>
<li>tightened up some of the code that calculates and schedules deferred physical file delete. it now catches a couple of cases it previously missed and skips some work it should have been skipping</li>
<li>reduced some overhead in the hover window show/hide logic. in very-heavy-session clients, this was causing significant (7ms multiple times a second) lag</li>
<li>when you ok the 'manage login scripts' dialog, it no longer re-links new entries for all those scripts into the 'manage logins' system. this now only happens once on database initialisation</li>
<li>the manage login scripts test routine no longer spams test errors to popup dialogs. they are still written to log if you need more data</li>
<li>silenced a bit of PIL warning logspam when a file with unusual or broken EXIF data is loaded</li>
<li>silenced the long-standing logspam that often happens when generating flash thumbnails</li>
<li>fixed a stupid typo error in the routine that schedules downloading files from file repositories</li>
<li>`nose`, `six`, and `zope` are no longer in any of the requirements.txts. I think these were needed a million years ago as PyInstaller hacks, but the situation is much better these days</li>
</ul>
</li>
<li>
<h2 id="version_536"><a href="#version_536">version 536</a></h2>
<ul>


@ -217,7 +217,9 @@ media_viewer_capabilities = {
HC.GENERAL_IMAGE : static_full_support,
HC.GENERAL_VIDEO : animated_full_support,
HC.GENERAL_AUDIO : audio_full_support,
HC.GENERAL_APPLICATION : no_support
HC.GENERAL_APPLICATION : no_support,
HC.GENERAL_APPLICATION_ARCHIVE : no_support,
HC.GENERAL_IMAGE_PROJECT : no_support
}
for mime in HC.SEARCHABLE_MIMES:


@ -719,9 +719,10 @@ def SetDefaultFavouriteSearchManagerData( favourite_search_manager ):
favourite_search_manager.SetFavouriteSearchRows( rows )
def SetDefaultLoginManagerScripts( login_manager ):
default_login_scripts = GetDefaultLoginScripts()
login_manager.SetLoginScripts( default_login_scripts )
login_manager.SetLoginScripts( default_login_scripts, auto_link = True )


@ -262,7 +262,7 @@ class QuickDownloadManager( object ):
continue
hash = random.sample( hashes_still_to_download_in_this_run, 1 )[0]
hash = random.sample( list( hashes_still_to_download_in_this_run ), 1 )[0]
hashes_still_to_download_in_this_run.discard( hash )
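The `list(...)` conversion in the change above works around a Python standard library change: sampling directly from a set with `random.sample` was deprecated in Python 3.9 and raises `TypeError` from Python 3.11, so the set has to be converted to a sequence first. A minimal standalone illustration of the same pattern:

```python
import random

# random.sample() no longer accepts sets on Python 3.11+,
# hence the explicit list() conversion before sampling
hashes_still_to_download = { 'deadbeef', 'cafef00d', 'c0ffee11' }

hash = random.sample( list( hashes_still_to_download ), 1 )[0]

hashes_still_to_download.discard( hash )
```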


@ -1261,7 +1261,7 @@ class ClientFilesManager( object ):
except HydrusExceptions.FileMissingException:
HydrusData.Print( 'Wanted to physically delete the "{}" file, with expected mime "{}", but it was not found!'.format( file_hash.hex(), expected_mime ) )
HydrusData.Print( 'Wanted to physically delete the "{}" file, with expected mime "{}", but it was not found!'.format( file_hash.hex(), HC.mime_string_lookup[ expected_mime ] ) )


@ -76,6 +76,10 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
media_view[ HC.GENERAL_APPLICATION ] = ( CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, media_start_paused, media_start_with_embed, CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, preview_start_paused, preview_start_with_embed, null_zoom_info )
media_view[ HC.GENERAL_APPLICATION_ARCHIVE ] = ( CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, media_start_paused, media_start_with_embed, CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, preview_start_paused, preview_start_with_embed, null_zoom_info )
media_view[ HC.GENERAL_IMAGE_PROJECT ] = ( CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, media_start_paused, media_start_with_embed, CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, preview_start_paused, preview_start_with_embed, null_zoom_info )
return media_view


@ -4645,7 +4645,7 @@ class DB( HydrusDB.HydrusDB ):
#
self.modules_files_storage = ClientDBFilesStorage.ClientDBFilesStorage( self._c, self._cursor_transaction_wrapper, self.modules_services, self.modules_hashes, self.modules_texts )
self.modules_files_storage = ClientDBFilesStorage.ClientDBFilesStorage( self._c, self._cursor_transaction_wrapper, self.modules_db_maintenance, self.modules_services, self.modules_hashes, self.modules_texts )
self._modules.append( self.modules_files_storage )
@ -4663,7 +4663,7 @@ class DB( HydrusDB.HydrusDB ):
#
self.modules_mappings_counts = ClientDBMappingsCounts.ClientDBMappingsCounts( self._c, self.modules_services )
self.modules_mappings_counts = ClientDBMappingsCounts.ClientDBMappingsCounts( self._c, self.modules_db_maintenance, self.modules_services )
self._modules.append( self.modules_mappings_counts )
@ -4685,7 +4685,7 @@ class DB( HydrusDB.HydrusDB ):
#
self.modules_mappings_storage = ClientDBMappingsStorage.ClientDBMappingsStorage( self._c, self.modules_services )
self.modules_mappings_storage = ClientDBMappingsStorage.ClientDBMappingsStorage( self._c, self.modules_db_maintenance, self.modules_services )
self._modules.append( self.modules_mappings_storage )
@ -4697,11 +4697,11 @@ class DB( HydrusDB.HydrusDB ):
#
self.modules_tag_siblings = ClientDBTagSiblings.ClientDBTagSiblings( self._c, self.modules_services, self.modules_tags, self.modules_tags_local_cache )
self.modules_tag_siblings = ClientDBTagSiblings.ClientDBTagSiblings( self._c, self.modules_db_maintenance, self.modules_services, self.modules_tags, self.modules_tags_local_cache )
self._modules.append( self.modules_tag_siblings )
self.modules_tag_parents = ClientDBTagParents.ClientDBTagParents( self._c, self.modules_services, self.modules_tags_local_cache, self.modules_tag_siblings )
self.modules_tag_parents = ClientDBTagParents.ClientDBTagParents( self._c, self.modules_db_maintenance, self.modules_services, self.modules_tags_local_cache, self.modules_tag_siblings )
self._modules.append( self.modules_tag_parents )
@ -4712,11 +4712,11 @@ class DB( HydrusDB.HydrusDB ):
# when you do the mappings caches, storage and display, consider carefully how you want them slotting in here
# don't rush into it
self.modules_tag_search = ClientDBTagSearch.ClientDBTagSearch( self._c, self.modules_services, self.modules_tags, self.modules_tag_display, self.modules_tag_siblings, self.modules_mappings_counts )
self.modules_tag_search = ClientDBTagSearch.ClientDBTagSearch( self._c, self.modules_db_maintenance, self.modules_services, self.modules_tags, self.modules_tag_display, self.modules_tag_siblings, self.modules_mappings_counts )
self._modules.append( self.modules_tag_search )
self.modules_mappings_counts_update = ClientDBMappingsCountsUpdate.ClientDBMappingsCountsUpdate( self._c, self.modules_services, self.modules_mappings_counts, self.modules_tags_local_cache, self.modules_tag_display, self.modules_tag_search )
self.modules_mappings_counts_update = ClientDBMappingsCountsUpdate.ClientDBMappingsCountsUpdate( self._c, self.modules_db_maintenance, self.modules_services, self.modules_mappings_counts, self.modules_tags_local_cache, self.modules_tag_display, self.modules_tag_search )
self._modules.append( self.modules_mappings_counts_update )
@ -4730,7 +4730,7 @@ class DB( HydrusDB.HydrusDB ):
self._modules.append( self.modules_mappings_cache_combined_files_storage )
self.modules_mappings_cache_specific_display = ClientDBMappingsCacheSpecificDisplay.ClientDBMappingsCacheSpecificDisplay( self._c, self.modules_services, self.modules_mappings_counts, self.modules_mappings_counts_update, self.modules_mappings_storage, self.modules_tag_display )
self.modules_mappings_cache_specific_display = ClientDBMappingsCacheSpecificDisplay.ClientDBMappingsCacheSpecificDisplay( self._c, self.modules_db_maintenance, self.modules_services, self.modules_mappings_counts, self.modules_mappings_counts_update, self.modules_mappings_storage, self.modules_tag_display )
self._modules.append( self.modules_mappings_cache_specific_display )
@ -4758,7 +4758,7 @@ class DB( HydrusDB.HydrusDB ):
# how about a module for 'local file services', it can do various filtering
self.modules_repositories = ClientDBRepositories.ClientDBRepositories( self._c, self._cursor_transaction_wrapper, self.modules_services, self.modules_files_storage, self.modules_files_metadata_basic, self.modules_hashes_local_cache, self.modules_tags_local_cache, self.modules_files_maintenance_queue )
self.modules_repositories = ClientDBRepositories.ClientDBRepositories( self._c, self._cursor_transaction_wrapper, self.modules_db_maintenance, self.modules_services, self.modules_files_storage, self.modules_files_metadata_basic, self.modules_hashes_local_cache, self.modules_tags_local_cache, self.modules_files_maintenance_queue )
self._modules.append( self.modules_repositories )
@ -6278,6 +6278,7 @@ class DB( HydrusDB.HydrusDB ):
if action == 'autocomplete_predicates': result = self.modules_tag_search.GetAutocompletePredicates( *args, **kwargs )
elif action == 'boned_stats': result = self._GetBonedStats( *args, **kwargs )
elif action == 'client_files_locations': result = self.modules_files_physical_storage.GetClientFilesLocations( *args, **kwargs )
elif action == 'deferred_delete_data': result = self.modules_db_maintenance.GetDeferredDeleteTableData( *args, **kwargs )
elif action == 'deferred_physical_delete': result = self.modules_files_storage.GetDeferredPhysicalDelete( *args, **kwargs )
elif action == 'duplicate_pairs_for_filtering': result = self._DuplicatesGetPotentialDuplicatePairsForFiltering( *args, **kwargs )
elif action == 'file_duplicate_hashes': result = self.modules_files_duplicates.GetFileHashesByDuplicateType( *args, **kwargs )
@ -9614,6 +9615,11 @@ class DB( HydrusDB.HydrusDB ):
if version == 536:
self._Execute( 'CREATE TABLE IF NOT EXISTS main.deferred_delete_tables ( name TEXT, num_rows INTEGER );' )
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )


@ -42,7 +42,18 @@ class ClientDBFilesMaintenanceQueue( ClientDBModule.ClientDBModule ):
self._ExecuteMany( 'REPLACE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) VALUES ( ?, ?, ? );', ( ( hash_id, job_type, time_can_start ) for hash_id in hash_ids ) )
HG.client_controller.files_maintenance_manager.Wake()
if HG.client_controller.IsBooted():
try:
# if this happens during boot db update, this doesn't exist lol
HG.client_controller.files_maintenance_manager.Wake()
except:
pass
def AddJobsHashes( self, hashes, job_type, time_can_start = 0 ):


@ -12,6 +12,7 @@ from hydrus.core import HydrusTime
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client import ClientTime
from hydrus.client.db import ClientDBMaintenance
from hydrus.client.db import ClientDBMaster
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBServices
@ -232,9 +233,10 @@ class DBLocationContextBranch( DBLocationContext, ClientDBModule.ClientDBModule
class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
def __init__( self, cursor: sqlite3.Cursor, cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper, modules_services: ClientDBServices.ClientDBMasterServices, modules_hashes: ClientDBMaster.ClientDBMasterHashes, modules_texts: ClientDBMaster.ClientDBMasterTexts ):
def __init__( self, cursor: sqlite3.Cursor, cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices, modules_hashes: ClientDBMaster.ClientDBMasterHashes, modules_texts: ClientDBMaster.ClientDBMasterTexts ):
self._cursor_transaction_wrapper = cursor_transaction_wrapper
self.modules_db_maintenance = modules_db_maintenance
self.modules_services = modules_services
self.modules_hashes = modules_hashes
self.modules_texts = modules_texts
@ -330,6 +332,8 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( pending_files_table_name ), ( ( hash_id, ) for ( hash_id, timestamp ) in insert_rows ) )
pending_changed = self._GetRowCount() > 0
if service_id == self.modules_services.combined_local_file_service_id:
for ( hash_id, timestamp ) in insert_rows:
@ -337,14 +341,12 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
self.ClearDeferredPhysicalDeleteIds( file_hash_id = hash_id, thumbnail_hash_id = hash_id )
elif self.modules_services.GetService( service_id ).GetServiceType() == HC.FILE_REPOSITORY:
elif self.modules_services.GetService( service_id ).GetServiceType() in ( HC.FILE_REPOSITORY, HC.IPFS ):
# it may be the case the files were just uploaded after being deleted
self.DeferFilesDeleteIfNowOrphan( [ hash_id for ( hash_id, timestamp ) in insert_rows ] )
pending_changed = self._GetRowCount() > 0
return pending_changed
@ -505,10 +507,10 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
self._Execute( 'DROP TABLE IF EXISTS {};'.format( current_files_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( deleted_files_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( pending_files_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( petitioned_files_table_name ) )
self.modules_db_maintenance.DeferredDropTable( current_files_table_name )
self.modules_db_maintenance.DeferredDropTable( deleted_files_table_name )
self.modules_db_maintenance.DeferredDropTable( pending_files_table_name )
self.modules_db_maintenance.DeferredDropTable( petitioned_files_table_name )
def FilterAllCurrentHashIds( self, hash_ids, just_these_service_ids = None ):
@ -661,7 +663,7 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
if len( orphan_hash_ids ) > 0:
just_these_service_ids = self.modules_services.GetServiceIds( ( HC.FILE_REPOSITORY, ) )
just_these_service_ids = self.modules_services.GetServiceIds( ( HC.FILE_REPOSITORY, HC.IPFS ) )
if ignore_service_id is not None:
@ -1289,13 +1291,13 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( petitioned_files_table_name ), ( ( hash_id, ) for hash_id in hash_ids ) )
if self.modules_services.GetService( service_id ).GetServiceType() in ( HC.COMBINED_LOCAL_FILE, HC.FILE_REPOSITORY ):
pending_changed = self._GetRowCount() > 0
if self.modules_services.GetService( service_id ).GetServiceType() == HC.COMBINED_LOCAL_FILE:
self.DeferFilesDeleteIfNowOrphan( hash_ids )
pending_changed = self._GetRowCount() > 0
return pending_changed


@ -27,7 +27,8 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
return {
'main.last_shutdown_work_time' : ( 'CREATE TABLE IF NOT EXISTS {} ( last_shutdown_work_time INTEGER );', 400 ),
'main.analyze_timestamps' : ( 'CREATE TABLE IF NOT EXISTS {} ( name TEXT, num_rows INTEGER, timestamp INTEGER );', 400 ),
'main.vacuum_timestamps' : ( 'CREATE TABLE IF NOT EXISTS {} ( name TEXT, timestamp INTEGER );', 400 )
'main.vacuum_timestamps' : ( 'CREATE TABLE IF NOT EXISTS {} ( name TEXT, timestamp INTEGER );', 400 ),
'main.deferred_delete_tables' : ( 'CREATE TABLE IF NOT EXISTS {} ( name TEXT, num_rows INTEGER );', 537 )
}
@ -143,6 +144,50 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
self._Execute( 'INSERT OR IGNORE INTO analyze_timestamps ( name, num_rows, timestamp ) VALUES ( ?, ?, ? );', ( name, num_rows, HydrusTime.GetNow() ) )
def DeferredDropTable( self, table_name: str ):
try:
self._Execute( f'SELECT 1 FROM {table_name};' ).fetchone()
except:
# table doesn't exist I guess!
return
table_name_without_schema = table_name
if '.' in table_name:
table_name_without_schema = table_name.split( '.' )[-1]
new_table_name = 'deferred_delete_{}_{}'.format( table_name_without_schema, os.urandom( 16 ).hex() )
self._Execute( f'ALTER TABLE {table_name} RENAME TO {new_table_name};' )
result = self._Execute( 'SELECT num_rows FROM analyze_timestamps WHERE name = ?;', ( table_name_without_schema, ) ).fetchone()
if result is None:
num_rows = None
else:
( num_rows, ) = result
self._Execute( 'INSERT INTO deferred_delete_tables ( name, num_rows ) VALUES ( ?, ? );', ( new_table_name, num_rows ) )
def GetDeferredDeleteTableData( self ):
data = self._Execute( 'SELECT name, num_rows FROM deferred_delete_tables;' ).fetchall()
return data
def GetLastShutdownWorkTime( self ):
result = self._Execute( 'SELECT last_shutdown_work_time FROM last_shutdown_work_time;' ).fetchone()
@ -170,6 +215,8 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
all_names.discard( 'sqlite_stat1' )
all_names = { name for name in all_names if not name.startswith( 'deferred_delete_' ) }
if force_reanalyze:
names_to_analyze = list( all_names )


@ -8,6 +8,7 @@ from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusTime
from hydrus.client.db import ClientDBMaintenance
from hydrus.client.db import ClientDBMappingsCounts
from hydrus.client.db import ClientDBMappingsCountsUpdate
from hydrus.client.db import ClientDBMappingsStorage
@ -20,8 +21,9 @@ class ClientDBMappingsCacheSpecificDisplay( ClientDBModule.ClientDBModule ):
CAN_REPOPULATE_ALL_MISSING_DATA = True
def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts, modules_mappings_counts_update: ClientDBMappingsCountsUpdate.ClientDBMappingsCountsUpdate, modules_mappings_storage: ClientDBMappingsStorage.ClientDBMappingsStorage, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay ):
def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts, modules_mappings_counts_update: ClientDBMappingsCountsUpdate.ClientDBMappingsCountsUpdate, modules_mappings_storage: ClientDBMappingsStorage.ClientDBMappingsStorage, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay ):
self.modules_db_maintenance = modules_db_maintenance
self.modules_services = modules_services
self.modules_mappings_counts = modules_mappings_counts
self.modules_mappings_counts_update = modules_mappings_counts_update
@ -308,8 +310,8 @@ class ClientDBMappingsCacheSpecificDisplay( ClientDBModule.ClientDBModule ):
( cache_display_current_mappings_table_name, cache_display_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificDisplayMappingsCacheTableNames( file_service_id, tag_service_id )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_display_current_mappings_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_display_pending_mappings_table_name ) )
self.modules_db_maintenance.DeferredDropTable( cache_display_current_mappings_table_name )
self.modules_db_maintenance.DeferredDropTable( cache_display_pending_mappings_table_name )
self.modules_mappings_counts.DropTables( ClientTags.TAG_DISPLAY_ACTUAL, file_service_id, tag_service_id )


@ -335,9 +335,9 @@ class ClientDBMappingsCacheSpecificStorage( ClientDBModule.ClientDBModule ):
( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_current_mappings_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_deleted_mappings_table_name ) )
self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_pending_mappings_table_name ) )
self.modules_db_maintenance.DeferredDropTable( cache_current_mappings_table_name )
self.modules_db_maintenance.DeferredDropTable( cache_deleted_mappings_table_name )
self.modules_db_maintenance.DeferredDropTable( cache_pending_mappings_table_name )
self.modules_mappings_counts.DropTables( ClientTags.TAG_DISPLAY_STORAGE, file_service_id, tag_service_id )


@@ -6,6 +6,7 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusDBBase
 from hydrus.client import ClientData
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBModule
 from hydrus.client.db import ClientDBServices
 from hydrus.client.metadata import ClientTags
@@ -48,8 +49,9 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
     CAN_REPOPULATE_ALL_MISSING_DATA = True
-    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices ):
+    def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         ClientDBModule.ClientDBModule.__init__( self, 'client mappings counts', cursor )
@@ -207,7 +209,7 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
         table_name = self.GetCountsCacheTableName( tag_display_type, file_service_id, tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( table_name )
     def FilterExistingTagIds( self, tag_display_type, file_service_id, tag_service_id, tag_ids_table_name ):


@@ -2,6 +2,7 @@ import sqlite3
 import typing
 from hydrus.client.db import ClientDBDefinitionsCache
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBMappingsCounts
 from hydrus.client.db import ClientDBModule
 from hydrus.client.db import ClientDBServices
@@ -11,8 +12,9 @@ from hydrus.client.metadata import ClientTags
 class ClientDBMappingsCountsUpdate( ClientDBModule.ClientDBModule ):
-    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts, modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay, modules_tag_search: ClientDBTagSearch.ClientDBTagSearch ):
+    def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts, modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay, modules_tag_search: ClientDBTagSearch.ClientDBTagSearch ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         self.modules_mappings_counts = modules_mappings_counts
         self.modules_tags_local_cache = modules_tags_local_cache


@@ -3,6 +3,7 @@ import typing
 from hydrus.core import HydrusConstants as HC
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBModule
 from hydrus.client.db import ClientDBServices
@@ -22,6 +23,7 @@ def DoingAFileJoinTagSearchIsFaster( estimated_file_row_count, estimated_tag_row
return estimated_file_row_count * ( file_lookup_speed_ratio + temp_table_overhead ) < estimated_tag_row_count
def GenerateMappingsTableNames( service_id: int ) -> typing.Tuple[ str, str, str, str ]:
suffix = str( service_id )
@@ -36,6 +38,7 @@ def GenerateMappingsTableNames( service_id: int ) -> typing.Tuple[ str, str, str
return ( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name )
def GenerateSpecificDisplayMappingsCacheTableNames( file_service_id, tag_service_id ):
suffix = '{}_{}'.format( file_service_id, tag_service_id )
@@ -46,6 +49,7 @@ def GenerateSpecificDisplayMappingsCacheTableNames( file_service_id, tag_service
return ( cache_display_current_mappings_table_name, cache_display_pending_mappings_table_name )
def GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id ):
suffix = '{}_{}'.format( file_service_id, tag_service_id )
@@ -58,10 +62,12 @@ def GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id ):
     return ( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name )
 class ClientDBMappingsStorage( ClientDBModule.ClientDBModule ):
-    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices ):
+    def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         ClientDBModule.ClientDBModule.__init__( self, 'client mappings storage', cursor )
@@ -123,10 +129,10 @@ class ClientDBMappingsStorage( ClientDBModule.ClientDBModule ):
         ( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( current_mappings_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( deleted_mappings_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( pending_mappings_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( petitioned_mappings_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( current_mappings_table_name )
+        self.modules_db_maintenance.DeferredDropTable( deleted_mappings_table_name )
+        self.modules_db_maintenance.DeferredDropTable( pending_mappings_table_name )
+        self.modules_db_maintenance.DeferredDropTable( petitioned_mappings_table_name )
     def FilterExistingUpdateMappings( self, tag_service_id, mappings_ids, action ):


@@ -13,6 +13,7 @@ from hydrus.core import HydrusTime
 from hydrus.core.networking import HydrusNetwork
 from hydrus.client import ClientFiles
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBDefinitionsCache
 from hydrus.client.db import ClientDBFilesMaintenanceQueue
 from hydrus.client.db import ClientDBFilesMetadataBasic
@@ -36,18 +37,21 @@ def GenerateRepositoryDefinitionTableNames( service_id: int ):
return ( hash_id_map_table_name, tag_id_map_table_name )
def GenerateRepositoryFileDefinitionTableName( service_id: int ):
( hash_id_map_table_name, tag_id_map_table_name ) = GenerateRepositoryDefinitionTableNames( service_id )
return hash_id_map_table_name
def GenerateRepositoryTagDefinitionTableName( service_id: int ):
( hash_id_map_table_name, tag_id_map_table_name ) = GenerateRepositoryDefinitionTableNames( service_id )
return tag_id_map_table_name
def GenerateRepositoryUpdatesTableNames( service_id: int ):
repository_updates_table_name = '{}{}'.format( REPOSITORY_UPDATES_PREFIX, service_id )
@@ -56,12 +60,14 @@ def GenerateRepositoryUpdatesTableNames( service_id: int ):
     return ( repository_updates_table_name, repository_unregistered_updates_table_name, repository_updates_processed_table_name )
 class ClientDBRepositories( ClientDBModule.ClientDBModule ):
     def __init__(
         self,
         cursor: sqlite3.Cursor,
         cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper,
+        modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance,
         modules_services: ClientDBServices.ClientDBMasterServices,
         modules_files_storage: ClientDBFilesStorage.ClientDBFilesStorage,
         modules_files_metadata_basic: ClientDBFilesMetadataBasic.ClientDBFilesMetadataBasic,
@@ -75,6 +81,7 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         ClientDBModule.ClientDBModule.__init__( self, 'client repositories', cursor )
         self._cursor_transaction_wrapper = cursor_transaction_wrapper
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         self.modules_files_storage = modules_files_storage
         self.modules_files_metadata_basic = modules_files_metadata_basic
@@ -305,14 +312,14 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         ( repository_updates_table_name, repository_unregistered_updates_table_name, repository_updates_processed_table_name ) = GenerateRepositoryUpdatesTableNames( service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( repository_updates_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( repository_unregistered_updates_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( repository_updates_processed_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( repository_updates_table_name )
+        self.modules_db_maintenance.DeferredDropTable( repository_unregistered_updates_table_name )
+        self.modules_db_maintenance.DeferredDropTable( repository_updates_processed_table_name )
         ( hash_id_map_table_name, tag_id_map_table_name ) = GenerateRepositoryDefinitionTableNames( service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( hash_id_map_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( tag_id_map_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( hash_id_map_table_name )
+        self.modules_db_maintenance.DeferredDropTable( tag_id_map_table_name )
         self._ClearOutstandingWorkCache( service_id )


@@ -6,9 +6,9 @@ import typing
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusTime
from hydrus.client.db import ClientDBDefinitionsCache
from hydrus.client.db import ClientDBMaintenance
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBServices
from hydrus.client.db import ClientDBTagSiblings
@@ -42,11 +42,13 @@ class ClientDBTagParents( ClientDBModule.ClientDBModule ):
     def __init__(
         self,
         cursor: sqlite3.Cursor,
+        modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance,
         modules_services: ClientDBServices.ClientDBMasterServices,
         modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags,
         modules_tag_siblings: ClientDBTagSiblings.ClientDBTagSiblings
     ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         self.modules_tags_local_cache = modules_tags_local_cache
         self.modules_tag_siblings = modules_tag_siblings
@@ -172,8 +174,8 @@ class ClientDBTagParents( ClientDBModule.ClientDBModule ):
         ( cache_ideal_tag_parents_lookup_table_name, cache_actual_tag_parents_lookup_table_name ) = GenerateTagParentsLookupCacheTableNames( tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_actual_tag_parents_lookup_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_ideal_tag_parents_lookup_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( cache_actual_tag_parents_lookup_table_name )
+        self.modules_db_maintenance.DeferredDropTable( cache_ideal_tag_parents_lookup_table_name )
         self._Execute( 'DELETE FROM tag_parent_application WHERE master_service_id = ? OR application_service_id = ?;', ( tag_service_id, tag_service_id ) )


@@ -13,6 +13,7 @@ from hydrus.core import HydrusTags
 from hydrus.core import HydrusTime
 from hydrus.client import ClientConstants as CC
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBMappingsCounts
 from hydrus.client.db import ClientDBMappingsStorage
 from hydrus.client.db import ClientDBMaster
@@ -38,6 +39,7 @@ def ConvertWildcardToSQLiteLikeParameter( wildcard ):
return like_param
def GenerateCombinedFilesIntegerSubtagsTableName( tag_service_id ):
name = 'combined_files_integer_subtags_cache'
@@ -46,6 +48,7 @@ def GenerateCombinedFilesIntegerSubtagsTableName( tag_service_id ):
return integer_subtags_table_name
def GenerateCombinedFilesSubtagsFTS4TableName( tag_service_id ):
name = 'combined_files_subtags_fts4_cache'
@@ -54,6 +57,7 @@ def GenerateCombinedFilesSubtagsFTS4TableName( tag_service_id ):
return subtags_fts4_table_name
def GenerateCombinedFilesSubtagsSearchableMapTableName( tag_service_id ):
name = 'combined_files_subtags_searchable_map_cache'
@@ -62,6 +66,7 @@ def GenerateCombinedFilesSubtagsSearchableMapTableName( tag_service_id ):
return subtags_searchable_map_table_name
def GenerateCombinedFilesTagsTableName( tag_service_id ):
name = 'combined_files_tags_cache'
@@ -70,6 +75,7 @@ def GenerateCombinedFilesTagsTableName( tag_service_id ):
return tags_table_name
def GenerateCombinedTagsTagsTableName( file_service_id ):
name = 'combined_tags_tags_cache'
@@ -78,6 +84,7 @@ def GenerateCombinedTagsTagsTableName( file_service_id ):
return tags_table_name
def GenerateSpecificIntegerSubtagsTableName( file_service_id, tag_service_id ):
name = 'specific_integer_subtags_cache'
@@ -88,6 +95,7 @@ def GenerateSpecificIntegerSubtagsTableName( file_service_id, tag_service_id ):
return integer_subtags_table_name
def GenerateSpecificSubtagsFTS4TableName( file_service_id, tag_service_id ):
name = 'specific_subtags_fts4_cache'
@@ -98,6 +106,7 @@ def GenerateSpecificSubtagsFTS4TableName( file_service_id, tag_service_id ):
return subtags_fts4_table_name
def GenerateSpecificSubtagsSearchableMapTableName( file_service_id, tag_service_id ):
name = 'specific_subtags_searchable_map_cache'
@@ -108,6 +117,7 @@ def GenerateSpecificSubtagsSearchableMapTableName( file_service_id, tag_service_
return subtags_searchable_map_table_name
def GenerateSpecificTagsTableName( file_service_id, tag_service_id ):
name = 'specific_tags_cache'
@@ -118,6 +128,7 @@ def GenerateSpecificTagsTableName( file_service_id, tag_service_id ):
return tags_table_name
def WildcardHasFTS4SearchableCharacters( wildcard: str ):
# fts4 says it can do alphanumeric or unicode with a value >= 128
@@ -132,12 +143,14 @@ def WildcardHasFTS4SearchableCharacters( wildcard: str ):
     return False
 class ClientDBTagSearch( ClientDBModule.ClientDBModule ):
     CAN_REPOPULATE_ALL_MISSING_DATA = True
-    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices, modules_tags: ClientDBMaster.ClientDBMasterTags, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay, modules_tag_siblings: ClientDBTagSiblings.ClientDBTagSiblings, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts ):
+    def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices, modules_tags: ClientDBMaster.ClientDBMasterTags, modules_tag_display: ClientDBTagDisplay.ClientDBTagDisplay, modules_tag_siblings: ClientDBTagSiblings.ClientDBTagSiblings, modules_mappings_counts: ClientDBMappingsCounts.ClientDBMappingsCounts ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         self.modules_tags = modules_tags
         self.modules_tag_display = modules_tag_display
@@ -401,19 +414,19 @@ class ClientDBTagSearch( ClientDBModule.ClientDBModule ):
         tags_table_name = self.GetTagsTableName( file_service_id, tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( tags_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( tags_table_name )
         subtags_fts4_table_name = self.GetSubtagsFTS4TableName( file_service_id, tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( subtags_fts4_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( subtags_fts4_table_name )
         subtags_searchable_map_table_name = self.GetSubtagsSearchableMapTableName( file_service_id, tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( subtags_searchable_map_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( subtags_searchable_map_table_name )
         integer_subtags_table_name = self.GetIntegerSubtagsTableName( file_service_id, tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( integer_subtags_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( integer_subtags_table_name )
     def FilterExistingTagIds( self, file_service_id, tag_service_id, tag_ids_table_name ):


@@ -10,6 +10,7 @@ from hydrus.core import HydrusTime
 from hydrus.client import ClientConstants as CC
 from hydrus.client.db import ClientDBDefinitionsCache
+from hydrus.client.db import ClientDBMaintenance
 from hydrus.client.db import ClientDBMaster
 from hydrus.client.db import ClientDBModule
 from hydrus.client.db import ClientDBServices
@@ -29,6 +30,7 @@ def GenerateTagSiblingsLookupCacheTableName( display_type: int, service_id: int
return cache_actual_tag_siblings_lookup_table_name
def GenerateTagSiblingsLookupCacheTableNames( service_id ):
cache_ideal_tag_siblings_lookup_table_name = 'external_caches.ideal_tag_siblings_lookup_cache_{}'.format( service_id )
@@ -36,12 +38,14 @@ def GenerateTagSiblingsLookupCacheTableNames( service_id ):
     return ( cache_ideal_tag_siblings_lookup_table_name, cache_actual_tag_siblings_lookup_table_name )
 class ClientDBTagSiblings( ClientDBModule.ClientDBModule ):
     CAN_REPOPULATE_ALL_MISSING_DATA = True
-    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices, modules_tags: ClientDBMaster.ClientDBMasterTags, modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags ):
+    def __init__( self, cursor: sqlite3.Cursor, modules_db_maintenance: ClientDBMaintenance.ClientDBMaintenance, modules_services: ClientDBServices.ClientDBMasterServices, modules_tags: ClientDBMaster.ClientDBMasterTags, modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags ):
+        self.modules_db_maintenance = modules_db_maintenance
         self.modules_services = modules_services
         self.modules_tags_local_cache = modules_tags_local_cache
         self.modules_tags = modules_tags
@@ -186,8 +190,8 @@ class ClientDBTagSiblings( ClientDBModule.ClientDBModule ):
         ( cache_ideal_tag_siblings_lookup_table_name, cache_actual_tag_siblings_lookup_table_name ) = GenerateTagSiblingsLookupCacheTableNames( tag_service_id )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_actual_tag_siblings_lookup_table_name ) )
-        self._Execute( 'DROP TABLE IF EXISTS {};'.format( cache_ideal_tag_siblings_lookup_table_name ) )
+        self.modules_db_maintenance.DeferredDropTable( cache_actual_tag_siblings_lookup_table_name )
+        self.modules_db_maintenance.DeferredDropTable( cache_ideal_tag_siblings_lookup_table_name )
         self._Execute( 'DELETE FROM tag_sibling_application WHERE master_service_id = ? OR application_service_id = ?;', ( tag_service_id, tag_service_id ) )


@@ -3064,6 +3064,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
         maintenance_submenu = ClientGUIMenus.GenerateMenu( menu )
         ClientGUIMenus.AppendMenuItem( maintenance_submenu, 'analyze', 'Optimise slow queries by running statistical analyses on the database.', self._AnalyzeDatabase )
+        ClientGUIMenus.AppendMenuItem( maintenance_submenu, 'review deferred delete table data', 'See how many tables are being deleted in the background.', self._ReviewDeferredDeleteTableData )
         ClientGUIMenus.AppendMenuItem( maintenance_submenu, 'review vacuum data', 'See whether it is worth rebuilding the database to reformat tables and recover disk space.', self._ReviewVacuumData )
         ClientGUIMenus.AppendSeparator( maintenance_submenu )
@@ -5518,6 +5519,42 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
         frame.SetPanel( panel )
+    def _ReviewDeferredDeleteTableData( self ):
+        job_key = ClientThreading.JobKey( cancellable = True )
+        def work_callable():
+            deferred_delete_data = self._controller.Read( 'deferred_delete_data' )
+            return deferred_delete_data
+        def publish_callable( deferred_delete_data ):
+            if job_key.IsCancelled():
+                return
+            frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'review vacuum data' )
+            panel = ClientGUIScrolledPanelsReview.ReviewDeferredDeleteTableData( frame, self._controller, deferred_delete_data )
+            frame.SetPanel( panel )
+            job_key.Delete()
+        job_key.SetStatusText( 'loading database data' )
+        self._controller.pub( 'message', job_key )
+        job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable )
+        job.start()
     def _ReviewFileMaintenance( self ):
         frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'file maintenance' )
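The new `_ReviewDeferredDeleteTableData` follows the usual async-UI shape in this file: a `work_callable` does the slow database read off the UI thread, then a `publish_callable` receives the result to build widgets. Stripped of Qt, and with a hypothetical `async_job` helper standing in for `ClientGUIAsync.AsyncQtJob` (whose real implementation also marshals the publish back onto the main thread), the pattern is roughly:

```python
import threading

def async_job( work_callable, publish_callable ):
    # run the slow work on a background thread, then hand the result to the publisher
    # ( the real helper would invoke the publisher back on the Qt main thread )
    def run():
        result = work_callable()
        publish_callable( result )
    thread = threading.Thread( target = run )
    thread.start()
    return thread

results = []

def work_callable():
    return { 'deferred_table_1': 12345 }  # stand-in for a database read

def publish_callable( deferred_delete_data ):
    results.append( deferred_delete_data )  # stand-in for building the review panel

thread = async_job( work_callable, publish_callable )
thread.join()
print( results )
```

The design point is simply that the menu handler returns immediately; the blocking `Read` never runs on the UI thread.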


@@ -132,6 +132,7 @@ def ConvertTextToPixelWidth( window, char_cols ) -> int:
return round( char_cols * one_char_width )
def DialogIsOpen():
tlws = QW.QApplication.topLevelWidgets()
@@ -412,6 +413,7 @@ def TLWOrChildIsActive( win ):
return False
def UpdateAppDisplayName():
app_display_name = HG.client_controller.new_options.GetString( 'app_display_name' )
@@ -430,6 +432,7 @@ def UpdateAppDisplayName():
def WidgetOrAnyTLWChildHasFocus( window ):
active_window = QW.QApplication.activeWindow()


@@ -1672,7 +1672,8 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
             login_result = str( e )
-            HydrusData.ShowException( e )
+            HydrusData.Print( 'During login test, encountered this halt/error:' )
+            HydrusData.PrintException( e )
         finally:


@@ -1,7 +1,10 @@
+import typing
+from qtpy import QtCore as QC
 from qtpy import QtWidgets as QW
 from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.client import ClientConstants as CC
 from hydrus.client.gui import ClientGUIFunctions
@@ -11,10 +14,119 @@ from hydrus.client.search import ClientSearch
 class OptionsPanel( QW.QWidget ):
-    def GetValue( self ): raise NotImplementedError()
+    def GetValue( self ):
+        raise NotImplementedError()
-    def SetValue( self, info ): raise NotImplementedError()
+    def SetValue( self, info ):
+        raise NotImplementedError()
+class OptionsPanelMimesTree( OptionsPanel ):
+    def __init__( self, parent, selectable_mimes ):
+        OptionsPanel.__init__( self, parent )
+        self._selectable_mimes = set( selectable_mimes )
+        self._mimes_to_items = {}
+        self._general_mime_types_to_items = {}
+        general_mime_types = []
+        general_mime_types.append( HC.GENERAL_IMAGE )
+        general_mime_types.append( HC.GENERAL_ANIMATION )
+        general_mime_types.append( HC.GENERAL_VIDEO )
+        general_mime_types.append( HC.GENERAL_AUDIO )
+        general_mime_types.append( HC.GENERAL_APPLICATION )
+        general_mime_types.append( HC.GENERAL_IMAGE_PROJECT )
+        general_mime_types.append( HC.GENERAL_APPLICATION_ARCHIVE )
+        self._my_tree = QP.TreeWidgetWithInheritedCheckState( self )
+        self._my_tree.setHeaderHidden( True )
+        for general_mime_type in general_mime_types:
+            mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
+            if len( mimes_in_type ) == 0:
+                continue
+            general_mime_item = QW.QTreeWidgetItem()
+            general_mime_item.setText( 0, HC.mime_string_lookup[ general_mime_type ] )
+            general_mime_item.setFlags( general_mime_item.flags() | QC.Qt.ItemIsUserCheckable )
+            general_mime_item.setCheckState( 0, QC.Qt.Unchecked )
+            general_mime_item.setData( 0, QC.Qt.UserRole, general_mime_type )
+            self._my_tree.addTopLevelItem( general_mime_item )
+            self._general_mime_types_to_items[ general_mime_type ] = general_mime_item
+            for mime in mimes_in_type:
+                mime_item = QW.QTreeWidgetItem()
+                mime_item.setText( 0, HC.mime_string_lookup[ mime ] )
+                mime_item.setFlags( mime_item.flags() | QC.Qt.ItemIsUserCheckable )
+                mime_item.setData( 0, QC.Qt.UserRole, mime )
+                general_mime_item.addChild( mime_item )
+                self._mimes_to_items[ mime ] = mime_item
+        #
+        vbox = QP.VBoxLayout()
+        QP.AddToLayout( vbox, self._my_tree )
+        self.setLayout( vbox )
+        #self._my_tree.itemClicked.connect( self._ItemClicked )
+    def _GetMimesForGeneralMimeType( self, general_mime_type ):
+        mimes_in_type = HC.general_mimetypes_to_mime_groups[ general_mime_type ]
+        mimes_in_type = [ mime for mime in mimes_in_type if mime in self._selectable_mimes ]
+        return mimes_in_type
+    def GetValue( self ) -> typing.Tuple[ int ]:
+        mimes = tuple( [ mime for ( mime, item ) in self._mimes_to_items.items() if item.checkState( 0 ) == QC.Qt.Checked ] )
+        return mimes
+    def SetValue( self, checked_mimes: typing.Collection[ int ] ):
+        checked_mimes = ClientSearch.ConvertSummaryFiletypesToSpecific( checked_mimes, only_searchable = False )
+        for ( mime, item ) in self._mimes_to_items.items():
+            if mime in checked_mimes:
+                check_state = QC.Qt.Checked
+            else:
+                check_state = QC.Qt.Unchecked
+            item.setCheckState( 0, check_state )
 class OptionsPanelMimes( OptionsPanel ):
     BUTTON_CURRENTLY_HIDDEN = '\u25B6'
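The new tree panel leans on a group-to-members lookup (`general_mimetypes_to_mime_groups`) so a checked category stands for all of its specific filetypes, which is also what `ConvertSummaryFiletypesToSpecific` does for stored values. The selection logic, independent of Qt and using hypothetical mime constants purely for illustration, reduces to:

```python
# hypothetical mime constants and groups, for illustration only
GENERAL_IMAGE = 'image'
GENERAL_VIDEO = 'video'

general_mimetypes_to_mime_groups = {
    GENERAL_IMAGE: [ 'image/jpeg', 'image/png', 'image/gif' ],
    GENERAL_VIDEO: [ 'video/mp4', 'video/webm' ],
}

def expand_summary_filetypes( checked_mimes ):
    # a checked general category stands for all of its specific members;
    # specific mimes pass through unchanged
    specific = set()
    for mime in checked_mimes:
        if mime in general_mimetypes_to_mime_groups:
            specific.update( general_mimetypes_to_mime_groups[ mime ] )
        else:
            specific.add( mime )
    return specific

print( sorted( expand_summary_filetypes( { GENERAL_IMAGE, 'video/mp4' } ) ) )
# prints: ['image/gif', 'image/jpeg', 'image/png', 'video/mp4']
```

Storing the summary form keeps saved File Import Options stable as new specific filetypes are added to a category.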


@@ -3019,6 +3019,8 @@ class EditMediaViewOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         QP.AddToLayout( vbox, ClientGUICommon.BetterStaticText(self,text), CC.FLAGS_EXPAND_PERPENDICULAR )
+        # TODO: Yo this layout sucks, figure out some better dynamic presentation of these options based on mime viewing capability, atm doing enable/disable and weird hide/show here is bad
         rows = []
         rows.append( ( 'media viewer show action: ', self._media_show_action ) )
@@ -3050,7 +3052,7 @@ class EditMediaViewOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         rows.append( ( 'if the media is smaller than the media viewer canvas: ', self._media_scale_up ) )
         rows.append( ( 'if the media is larger than the media viewer canvas: ', self._media_scale_down ) )
-        rows.append( ( 'if the media is smaller than the preview canvas: ', self._preview_scale_up) )
+        rows.append( ( 'if the media is smaller than the preview canvas: ', self._preview_scale_up ) )
         rows.append( ( 'if the media is larger than the preview canvas: ', self._preview_scale_down ) )
         gridbox = ClientGUICommon.WrapInGrid( self, rows )
@@ -3131,6 +3133,10 @@ class EditMediaViewOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         is_application = self._mime == HC.GENERAL_APPLICATION or self._mime in HC.general_mimetypes_to_mime_groups[ HC.GENERAL_APPLICATION ]
+        is_archive = self._mime == HC.GENERAL_APPLICATION_ARCHIVE or self._mime in HC.general_mimetypes_to_mime_groups[ HC.GENERAL_APPLICATION_ARCHIVE ]
+        # this is the one that is likely to get tricky, with SVG and PSD. maybe we'll move to 'renderable image projects' something
+        is_image_project = self._mime == HC.GENERAL_IMAGE_PROJECT or self._mime in HC.general_mimetypes_to_mime_groups[ HC.GENERAL_IMAGE_PROJECT ]
         is_image = self._mime == HC.GENERAL_IMAGE or self._mime in HC.general_mimetypes_to_mime_groups[ HC.GENERAL_IMAGE ]
         is_audio = self._mime == HC.GENERAL_AUDIO or self._mime in HC.general_mimetypes_to_mime_groups[ HC.GENERAL_AUDIO ]
@@ -3140,7 +3146,7 @@ class EditMediaViewOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         self._scale_down_quality.setEnabled( False )
-        if is_image or is_application:
+        if is_image or is_application or is_archive or is_image_project:
             self._media_start_paused.setEnabled( False )
             self._preview_start_paused.setEnabled( False )


@@ -1196,7 +1196,7 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):
         QP.AddToLayout( gridbox, ClientGUICommon.BetterStaticText( self._migration_panel, 'filter' ), CC.FLAGS_CENTER )
         QP.AddToLayout( gridbox, ClientGUICommon.BetterStaticText( self._migration_panel, 'action' ), CC.FLAGS_CENTER )
         QP.AddToLayout( gridbox, ClientGUICommon.BetterStaticText( self._migration_panel, 'destination' ), CC.FLAGS_CENTER )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         QP.AddToLayout( gridbox, self._migration_content_type, CC.FLAGS_EXPAND_BOTH_WAYS )
         QP.AddToLayout( gridbox, self._migration_source, CC.FLAGS_EXPAND_BOTH_WAYS )
@@ -1205,37 +1205,24 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):
         QP.AddToLayout( gridbox, self._migration_destination, CC.FLAGS_EXPAND_BOTH_WAYS )
         QP.AddToLayout( gridbox, self._migration_go, CC.FLAGS_EXPAND_BOTH_WAYS )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         QP.AddToLayout( gridbox, self._migration_source_archive_path_button, CC.FLAGS_EXPAND_BOTH_WAYS )
         QP.AddToLayout( gridbox, file_left_vbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         QP.AddToLayout( gridbox, self._migration_destination_archive_path_button, CC.FLAGS_EXPAND_BOTH_WAYS )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         QP.AddToLayout( gridbox, self._migration_source_hash_type_st, CC.FLAGS_CENTER_PERPENDICULAR )
         QP.AddToLayout( gridbox, tag_right_vbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         QP.AddToLayout( gridbox, dest_hash_type_hbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
-        ClientGUICommon.AddGridboxStretchSpacer( gridbox )
+        ClientGUICommon.AddGridboxStretchSpacer( self._migration_panel, gridbox )
         self._migration_panel.Add( gridbox )
         #
-        vbox = QP.VBoxLayout()
-        message = 'Regarding '
-        if self._hashes is None:
-            message += 'all'
-        else:
-            message += HydrusData.ToHumanInt( len( self._hashes ) )
+        message = 'The content from the SOURCE that the FILTER ALLOWS is applied using the ACTION to the DESTINATION.'
         message += os.linesep * 2
         message += 'To delete content en masse from one location, select what you want to delete with the filter and set the source and destination the same.'
@@ -1247,8 +1234,12 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):
         st = ClientGUICommon.BetterStaticText( self, message )
         st.setWordWrap( True )
-        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
-        QP.AddToLayout( vbox, self._migration_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        vbox = QP.VBoxLayout()
+        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_BOTH_WAYS )
+        QP.AddToLayout( vbox, self._migration_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        #vbox.addStretch( 1 )
         self.widget().setLayout( vbox )
@@ -3892,6 +3883,101 @@ class ThreadsPanel( QW.QWidget ):
         self._list_ctrl.SetData( threads )
+class ReviewDeferredDeleteTableData( ClientGUIScrolledPanels.ReviewPanel ):
+    def __init__( self, parent, controller, deferred_delete_data ):
+        ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent )
+        self._controller = controller
+        #
+        info_message = '''When large database objects are no longer needed, they are not deleted immediately. This is an evolving, in-work system.'''
+        st = ClientGUICommon.BetterStaticText( self, label = info_message )
+        st.setWordWrap( True )
+        deferred_delete_listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
+        self._vacuum_listctrl = ClientGUIListCtrl.BetterListCtrl( deferred_delete_listctrl_panel, CGLC.COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ID, 24, self._ConvertRowToListCtrlTuples )
+        deferred_delete_listctrl_panel.SetListCtrl( self._vacuum_listctrl )
+        # TODO: refresh button?
+        deferred_delete_listctrl_panel.AddButton( 'work hard now', self._DoWorkNow, enabled_check_func = self._CanWork )
+        #
+        self._vacuum_listctrl.SetData( deferred_delete_data )
+        self._vacuum_listctrl.Sort()
+        #
+        vbox = QP.VBoxLayout()
+        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
+        QP.AddToLayout( vbox, deferred_delete_listctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        self.widget().setLayout( vbox )
+    def _CanWork( self ):
+        # TODO: anything in the list?
+        return False
+    def _ConvertRowToListCtrlTuples( self, row ):
+        ( name, num_rows ) = row
+        sort_name = name
+        pretty_name = name
+        if num_rows is None:
+            sort_num_rows = -1
+            pretty_num_rows = 'unknown'
+        else:
+            sort_num_rows = num_rows
+            pretty_num_rows = HydrusData.ToHumanInt( sort_num_rows )
+        display_tuple = ( pretty_name, pretty_num_rows )
+        sort_tuple = ( sort_name, sort_num_rows )
+        return ( display_tuple, sort_tuple )
+    def _DoWorkNow( self ):
+        # TODO: wake up the maintenance lad and tell it to burn time
+        # switch button to 'slow down'
+        pass
+    def _GetVacuumTimeEstimate( self, db_size ):
+        from hydrus.core import HydrusDB
+        vacuum_time_estimate = HydrusDB.GetApproxVacuumDuration( db_size )
+        pretty_vacuum_time_estimate = '{} to {}'.format( HydrusTime.TimeDeltaToPrettyTimeDelta( vacuum_time_estimate / 40 ), HydrusTime.TimeDeltaToPrettyTimeDelta( vacuum_time_estimate ) )
+        return ( vacuum_time_estimate, pretty_vacuum_time_estimate )
 class ReviewThreads( ClientGUIScrolledPanels.ReviewPanel ):
     def __init__( self, parent, controller ):
@@ -3917,6 +4003,7 @@ class ReviewThreads( ClientGUIScrolledPanels.ReviewPanel ):
self.widget().setLayout( vbox )
class ReviewVacuumData( ClientGUIScrolledPanels.ReviewPanel ):
def __init__( self, parent, controller, vacuum_data ):
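`_ConvertRowToListCtrlTuples` above is the standard BetterListCtrl idiom of returning a display tuple and a separate sort tuple, so a row with an unknown count can show 'unknown' while still sorting sensibly. In isolation, with a plain `'{:,}'` format standing in for `HydrusData.ToHumanInt`, the idea looks like this:

```python
def convert_row( row ):
    # returns ( display_tuple, sort_tuple ): pretty strings for the UI,
    # raw comparable values for sorting
    ( name, num_rows ) = row
    if num_rows is None:
        sort_num_rows = -1  # unknown counts sort below any real count
        pretty_num_rows = 'unknown'
    else:
        sort_num_rows = num_rows
        pretty_num_rows = '{:,}'.format( num_rows )  # stand-in for HydrusData.ToHumanInt
    display_tuple = ( name, pretty_num_rows )
    sort_tuple = ( name, sort_num_rows )
    return ( display_tuple, sort_tuple )

rows = [ ( 'deferred_mappings', 20000 ), ( 'deferred_siblings', None ) ]

# sort by the numeric sort key, so 'unknown' ( None -> -1 ) rows come first
rows.sort( key = lambda row: convert_row( row )[ 1 ][ 1 ] )

print( [ convert_row( row )[ 0 ] for row in rows ] )
```

Keeping the sort key separate from the rendered string is what lets 'unknown' avoid lexicographic sorting accidents.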


@@ -987,7 +987,7 @@ class GridLayout( QW.QGridLayout ):
return self._col_count
def setMargin( self, val ):
self.setContentsMargins( val, val, val, val )
@ -2167,22 +2167,63 @@ class TreeWidgetWithInheritedCheckState( QW.QTreeWidget ):
QW.QTreeWidget.__init__( self, *args, **kwargs )
- self.itemClicked.connect( self._HandleItemClickedForCheckStateUpdate )
+ self.itemChanged.connect( self._HandleItemCheckStateUpdate )
- def _HandleItemClickedForCheckStateUpdate( self, item, column ):
-     self._UpdateCheckState( item, item.checkState( 0 ) )
- def _UpdateCheckState( self, item, check_state ):
-     item.setCheckState( 0, check_state )
-     for i in range( item.childCount() ):
-         self._UpdateCheckState( item.child( i ), check_state )
+ def _GetChildren( self, item: QW.QTreeWidgetItem ) -> typing.List[ QW.QTreeWidgetItem ]:
+     children = [ item.child( i ) for i in range( item.childCount() ) ]
+     return children
+ def _HandleItemCheckStateUpdate( self, item, column ):
+     # this is an int, should be a checkstate
+     self.blockSignals( True )
+     self._UpdateChildrenCheckState( item, item.checkState( 0 ) )
+     self._UpdateParentCheckState( item )
+     self.blockSignals( False )
+ def _UpdateChildrenCheckState( self, item, check_state ):
+     for child in self._GetChildren( item ):
+         child.setCheckState( 0, check_state )
+         self._UpdateChildrenCheckState( child, check_state )
+ def _UpdateParentCheckState( self, item: QW.QTreeWidgetItem ):
+     parent = item.parent()
+     if isinstance( parent, QW.QTreeWidgetItem ):
+         all_values = { child.checkState( 0 ) for child in self._GetChildren( parent ) }
+         if all_values == { QC.Qt.Checked }:
+             end_state = QC.Qt.Checked
+         elif all_values == { QC.Qt.Unchecked }:
+             end_state = QC.Qt.Unchecked
+         else:
+             end_state = QC.Qt.PartiallyChecked
+         if end_state != parent.checkState( 0 ):
+             parent.setCheckState( 0, end_state )
+             self._UpdateParentCheckState( parent )
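The new tree propagates check state both ways: checking an item pushes that state down to every descendant, and each parent then recomputes itself as checked, unchecked, or partial from the set of its children's states. A minimal Qt-free model of that logic, with a plain `Node` class standing in for `QTreeWidgetItem` (names here are illustrative, not hydrus code):

```python
CHECKED, UNCHECKED, PARTIAL = 'checked', 'unchecked', 'partial'

class Node:
    def __init__(self, state=UNCHECKED, children=None):
        self.state = state
        self.parent = None
        self.children = children or []
        for c in self.children:
            c.parent = self

def update_children(node, state):
    # same shape as _UpdateChildrenCheckState: push the state down recursively
    for child in node.children:
        child.state = state
        update_children(child, state)

def update_parent(node):
    # same shape as _UpdateParentCheckState: derive the parent's state
    # from the set of distinct child states, then recurse upwards
    parent = node.parent
    if parent is None:
        return
    states = {child.state for child in parent.children}
    if states == {CHECKED}:
        end_state = CHECKED
    elif states == {UNCHECKED}:
        end_state = UNCHECKED
    else:
        end_state = PARTIAL
    if end_state != parent.state:
        parent.state = end_state
        update_parent(parent)

jpeg, png = Node(), Node()
images = Node(children=[jpeg, png])

# checking one child leaves the parent partially checked
jpeg.state = CHECKED
update_parent(jpeg)
```

Using a *set* of child states makes the three-way decision a simple equality test, which is exactly the trick `_UpdateParentCheckState` uses above.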

View File

@ -528,26 +528,6 @@ class CanvasHoverFrame( QW.QFrame ):
focus_is_good = current_focus_tlw == self.window()
- mouse_is_over_self_or_child = False
- for tlw in list( QW.QApplication.topLevelWidgets() ):
-     if not tlw.isVisible():
-         continue
-     if tlw == self or ClientGUIFunctions.IsQtAncestor( tlw, self, through_tlws = True ):
-         if ClientGUIFunctions.MouseIsOverWidget( tlw ):
-             mouse_is_over_self_or_child = True
-             break
if self._ShouldBeShown():
self._RaiseHover()
@ -620,7 +600,7 @@ class CanvasHoverFrame( QW.QFrame ):
hide_focus_is_good = focus_is_good or current_focus_tlw is None # don't hide if focus is either gone to another problem or temporarily sperging-out due to a click-transition or similar
ready_to_show = in_position and not mouse_is_over_something_else_important and focus_is_good and not dialog_is_open and not menu_open
- ready_to_hide = not menu_open and not mouse_is_over_self_or_child and ( not in_position or dialog_is_open or not hide_focus_is_good )
+ ready_to_hide = not menu_open and ( not in_position or dialog_is_open or not hide_focus_is_good )
def get_logic_report_string():

View File

@ -138,7 +138,7 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._allow_decompression_bombs.setToolTip( tt )
- self._mimes = ClientGUIOptionsPanels.OptionsPanelMimes( pre_import_panel, HC.ALLOWED_MIMES )
+ self._mimes = ClientGUIOptionsPanels.OptionsPanelMimesTree( pre_import_panel, HC.ALLOWED_MIMES )
self._min_size = ClientGUIControls.NoneableBytesControl( pre_import_panel )
self._min_size.SetValue( 5 * 1024 )

View File

@ -1547,3 +1547,18 @@ register_column_type( COLUMN_LIST_PETITIONS_SUMMARY.ID, COLUMN_LIST_PETITIONS_SU
register_column_type( COLUMN_LIST_PETITIONS_SUMMARY.ID, COLUMN_LIST_PETITIONS_SUMMARY.CONTENT, 'content', False, 16, True )
default_column_list_sort_lookup[ COLUMN_LIST_PETITIONS_SUMMARY.ID ] = ( COLUMN_LIST_PETITIONS_SUMMARY.ACCOUNT_KEY, True )
+ class COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA( COLUMN_LIST_DEFINITION ):
+     ID = 72
+     NAME = 0
+     ROWS = 1
+ column_list_type_name_lookup[ COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ID ] = 'deferred delete table data'
+ register_column_type( COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ID, COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.NAME, 'name', False, 64, True )
+ register_column_type( COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ID, COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ROWS, 'num rows', False, 12, True )
+ default_column_list_sort_lookup[ COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.ID ] = ( COLUMN_LIST_DEFERRED_DELETE_TABLE_DATA.NAME, True )
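This column list backs the new deferred-delete review panel. The underlying trick, per the changelog, is that dropping a huge table is slow but renaming one is a catalogue-only operation, so the expensive drop can be deferred to maintenance time. A rough sqlite3 sketch of that idea (the table name, prefix, and helper functions are hypothetical, not the hydrus schema):

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE current_mappings ( tag_id INTEGER, hash_id INTEGER )')
con.execute('INSERT INTO current_mappings VALUES ( 1, 2 )')

def deferred_delete(con, name):
    # effectively instant: just a catalogue update, no row churn
    con.execute(f'ALTER TABLE {name} RENAME TO deferred_delete_{name}')

def do_deferred_work(con):
    # the slow part, done later during idle/maintenance time
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE name LIKE 'deferred_delete_%'")]
    for t in tables:
        con.execute(f'DROP TABLE {t}')
    return tables

deferred_delete(con, 'current_mappings')
```

The 'num rows' column in the panel would then just report the size of each parked table so the user can see how much work the maintenance job still owes.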

View File

@ -1654,7 +1654,7 @@ class PanelPredicateSystemMime( PanelPredicateSystemSingle ):
PanelPredicateSystemSingle.__init__( self, parent )
- self._mimes = ClientGUIOptionsPanels.OptionsPanelMimes( self, HC.SEARCHABLE_MIMES )
+ self._mimes = ClientGUIOptionsPanels.OptionsPanelMimesTree( self, HC.SEARCHABLE_MIMES )
#

View File

@ -23,10 +23,13 @@ from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.widgets import ClientGUIColourPicker
- def AddGridboxStretchSpacer( layout: QW.QGridLayout ):
-     layout.addItem( QW.QSpacerItem( 10, 10, QW.QSizePolicy.Expanding, QW.QSizePolicy.Fixed ) )
+ def AddGridboxStretchSpacer( win: QW.QWidget, layout: QW.QGridLayout ):
+     widget = QW.QWidget( win )
+     QP.AddToLayout( layout, widget, CC.FLAGS_CENTER_PERPENDICULAR_EXPAND_DEPTH )
def WrapInGrid( parent, rows, expand_text = False, add_stretch_at_end = True ):
gridbox = QP.GridLayout( cols = 2 )

View File

@ -41,7 +41,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
SERIALISABLE_NAME = 'File Import Options'
- SERIALISABLE_VERSION = 9
+ SERIALISABLE_VERSION = 10
def __init__( self ):
@ -278,6 +278,33 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return ( 9, new_serialisable_info )
+ if version == 9:
+     ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default ) = old_serialisable_info
+     ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context ) = pre_import_options
+     filetype_filter_predicate = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_filetype_filter_predicate )
+     mimes = list( filetype_filter_predicate.GetValue() )
+     if HC.GENERAL_APPLICATION in mimes:
+         mimes.append( HC.GENERAL_APPLICATION_ARCHIVE )
+         mimes.append( HC.GENERAL_IMAGE_PROJECT )
+     filetype_filter_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, value = mimes )
+     serialisable_filetype_filter_predicate = filetype_filter_predicate.GetSerialisableTuple()
+     pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
+     new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default )
+     return ( 10, new_serialisable_info )
def AllowsDecompressionBombs( self ):
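This version 9→10 update rewrites the stored filetype filter so anyone who had the old 'applications' group selected also gets the two groups split out of it. The list transformation at its core looks like this (the constant values are placeholders, not the real HC ids):

```python
# placeholder mime ids, standing in for the HC.GENERAL_* constants
GENERAL_APPLICATION = 32
GENERAL_APPLICATION_ARCHIVE = 59
GENERAL_IMAGE_PROJECT = 60

def update_mimes(mimes):
    # the old 'application' group was split, so a selection that included it
    # should also include the two new groups carved out of it
    mimes = list(mimes)
    if GENERAL_APPLICATION in mimes:
        mimes.append(GENERAL_APPLICATION_ARCHIVE)
        mimes.append(GENERAL_IMAGE_PROJECT)
    return tuple(mimes)
```

The same transformation is applied again in the `Predicate` version 6→7 update further down, so saved searches and saved import options both keep their old meaning after the group split.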

View File

@ -631,7 +631,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
- def SetLoginScripts( self, login_scripts ):
+ def SetLoginScripts( self, login_scripts, auto_link = False ):
with self._lock:
@ -663,18 +663,21 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
else:
credentials = {}
- # if there is nothing to enter, turn it on by default, like HF click-through
- active = len( login_script.GetCredentialDefinitions() ) == 0
- validity = VALIDITY_UNTESTED
- validity_error_text = ''
- no_work_until = 0
- no_work_until_reason = ''
- self._domains_to_login_info[ login_domain ] = ( login_script_key_and_name, credentials, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
+ if auto_link:
+     credentials = {}
+     # if there is nothing to enter, turn it on by default, like HF click-through
+     active = len( login_script.GetCredentialDefinitions() ) == 0
+     validity = VALIDITY_UNTESTED
+     validity_error_text = ''
+     no_work_until = 0
+     no_work_until_reason = ''
+     self._domains_to_login_info[ login_domain ] = ( login_script_key_and_name, credentials, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )

View File

@ -1626,7 +1626,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PREDICATE
SERIALISABLE_NAME = 'File Search Predicate'
- SERIALISABLE_VERSION = 6
+ SERIALISABLE_VERSION = 7
def __init__(
self,
@ -1984,6 +1984,30 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
return ( 6, new_serialisable_info )
+ if version == 6:
+     ( predicate_type, serialisable_value, inclusive ) = old_serialisable_info
+     if predicate_type == PREDICATE_TYPE_SYSTEM_MIME:
+         mimes = list( serialisable_value )
+         if HC.GENERAL_APPLICATION in mimes:
+             mimes.append( HC.GENERAL_APPLICATION_ARCHIVE )
+             mimes.append( HC.GENERAL_IMAGE_PROJECT )
+         mimes = tuple( mimes )
+         serialisable_value = mimes
+     new_serialisable_info = ( predicate_type, serialisable_value, inclusive )
+     return ( 7, new_serialisable_info )
def GetCopy( self ):

View File

@ -100,7 +100,7 @@ options = {}
# Misc
NETWORK_VERSION = 20
- SOFTWARE_VERSION = 536
+ SOFTWARE_VERSION = 537
CLIENT_API_VERSION = 49
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -718,12 +718,14 @@ APPLICATION_KRITA = 55
IMAGE_SVG = 56
APPLICATION_XCF = 57
APPLICATION_GZIP = 58
- IMAGE_HEIF = 59
- IMAGE_HEIF_SEQUENCE = 60
- IMAGE_HEIC = 61
- IMAGE_HEIC_SEQUENCE = 62
- IMAGE_AVIF = 63
- IMAGE_AVIF_SEQUENCE = 64
+ GENERAL_APPLICATION_ARCHIVE = 59
+ GENERAL_IMAGE_PROJECT = 60
+ IMAGE_HEIF = 61
+ IMAGE_HEIF_SEQUENCE = 62
+ IMAGE_HEIC = 63
+ IMAGE_HEIC_SEQUENCE = 64
+ IMAGE_AVIF = 65
+ IMAGE_AVIF_SEQUENCE = 66
APPLICATION_OCTET_STREAM = 100
APPLICATION_UNKNOWN = 101
@ -732,7 +734,9 @@ GENERAL_FILETYPES = {
GENERAL_AUDIO,
GENERAL_IMAGE,
GENERAL_VIDEO,
- GENERAL_ANIMATION
+ GENERAL_ANIMATION,
+ GENERAL_APPLICATION_ARCHIVE,
+ GENERAL_IMAGE_PROJECT
}
SEARCHABLE_MIMES = {
@ -793,7 +797,8 @@ DECOMPRESSION_BOMB_IMAGES = {
IMAGE_PNG
}
- IMAGES = {
+ # Keep these as ordered lists, bro--we use them in a couple places in UI
+ IMAGES = [
IMAGE_JPEG,
IMAGE_PNG,
IMAGE_BMP,
@ -803,60 +808,68 @@ IMAGES = {
IMAGE_HEIF,
IMAGE_HEIC,
IMAGE_AVIF,
- }
+ ]
- ANIMATIONS = {
+ ANIMATIONS = [
IMAGE_GIF,
IMAGE_APNG,
IMAGE_HEIF_SEQUENCE,
IMAGE_HEIC_SEQUENCE,
IMAGE_AVIF_SEQUENCE,
- }
+ ]
- AUDIO = {
-     AUDIO_M4A,
+ AUDIO = [
AUDIO_MP3,
AUDIO_OGG,
AUDIO_FLAC,
-     AUDIO_WAVE,
-     AUDIO_WMA,
-     AUDIO_REALMEDIA,
-     AUDIO_TRUEAUDIO,
+     AUDIO_M4A,
AUDIO_MKV,
AUDIO_MP4,
-     AUDIO_WAVPACK
-     }
+     AUDIO_REALMEDIA,
+     AUDIO_TRUEAUDIO,
+     AUDIO_WAVE,
+     AUDIO_WAVPACK,
+     AUDIO_WMA
+ ]
- VIDEO = {
+ VIDEO = [
+     VIDEO_MP4,
+     VIDEO_WEBM,
+     VIDEO_MKV,
VIDEO_AVI,
VIDEO_FLV,
VIDEO_MOV,
-     VIDEO_MP4,
-     VIDEO_WMV,
-     VIDEO_MKV,
-     VIDEO_REALMEDIA,
-     VIDEO_WEBM,
+     VIDEO_MPEG,
VIDEO_OGV,
-     VIDEO_MPEG
-     }
+     VIDEO_REALMEDIA,
+     VIDEO_WMV
+ ]
- APPLICATIONS = {
-     IMAGE_SVG,
+ APPLICATIONS = [
APPLICATION_FLASH,
+     APPLICATION_PDF
+ ]
+ IMAGE_PROJECT_FILES = [
APPLICATION_PSD,
APPLICATION_CLIP,
APPLICATION_SAI2,
APPLICATION_KRITA,
-     APPLICATION_XCF,
-     APPLICATION_PDF,
-     APPLICATION_ZIP,
-     APPLICATION_RAR,
+     IMAGE_SVG,
+     APPLICATION_XCF
+ ]
+ ARCHIVES = [
APPLICATION_7Z,
-     APPLICATION_GZIP
-     }
+     APPLICATION_GZIP,
+     APPLICATION_RAR,
+     APPLICATION_ZIP
+ ]
general_mimetypes_to_mime_groups = {
GENERAL_APPLICATION : APPLICATIONS,
+ GENERAL_APPLICATION_ARCHIVE : ARCHIVES,
+ GENERAL_IMAGE_PROJECT : IMAGE_PROJECT_FILES,
GENERAL_AUDIO : AUDIO,
GENERAL_IMAGE : IMAGES,
GENERAL_VIDEO : VIDEO,
@ -882,8 +895,6 @@ PIL_HEIF_MIMES = {
MIMES_THAT_DEFINITELY_HAVE_AUDIO = tuple( [ APPLICATION_FLASH ] + list( AUDIO ) )
MIMES_THAT_MAY_HAVE_AUDIO = tuple( list( MIMES_THAT_DEFINITELY_HAVE_AUDIO ) + list( VIDEO ) )
- ARCHIVES = { APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR, APPLICATION_7Z, APPLICATION_GZIP }
MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( { IMAGE_SVG, APPLICATION_FLASH, APPLICATION_CLIP, APPLICATION_PSD, APPLICATION_KRITA } )
FILES_THAT_CAN_HAVE_ICC_PROFILE = { IMAGE_JPEG, IMAGE_PNG, IMAGE_GIF, IMAGE_TIFF }.union( PIL_HEIF_MIMES )
@ -1033,6 +1044,8 @@ mime_string_lookup = {
UNDETERMINED_PNG : 'png or apng',
APPLICATION_UNKNOWN : 'unknown filetype',
GENERAL_APPLICATION : 'application',
+ GENERAL_APPLICATION_ARCHIVE : 'archive',
+ GENERAL_IMAGE_PROJECT : 'image project file',
GENERAL_AUDIO : 'audio',
GENERAL_IMAGE : 'image',
GENERAL_VIDEO : 'video',
@ -1100,6 +1113,8 @@ mime_mimetype_string_lookup = {
VIDEO_WEBM : 'video/webm',
APPLICATION_UNKNOWN : 'unknown filetype',
GENERAL_APPLICATION : 'application',
+ GENERAL_APPLICATION_ARCHIVE : 'archive',
+ GENERAL_IMAGE_PROJECT : 'image project file',
GENERAL_AUDIO : 'audio',
GENERAL_IMAGE : 'image',
GENERAL_VIDEO : 'video',
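`general_mimetypes_to_mime_groups` maps each 'general' filetype to its member list, which is how a selection like 'all images' in a filter expands into concrete mimes. A toy sketch of that expansion (the mime values here are illustrative stand-ins, not the real integer constants):

```python
# hypothetical subset of the mapping in HydrusConstants
GENERAL_IMAGE, IMAGE_JPEG, IMAGE_PNG = 'general_image', 'image/jpeg', 'image/png'
GENERAL_VIDEO, VIDEO_MP4, VIDEO_WEBM = 'general_video', 'video/mp4', 'video/webm'

general_mimetypes_to_mime_groups = {
    GENERAL_IMAGE: [IMAGE_JPEG, IMAGE_PNG],
    GENERAL_VIDEO: [VIDEO_MP4, VIDEO_WEBM],
}

def expand_selection(selection):
    # a general filetype stands in for all of its members; concrete mimes pass through
    expanded = []
    for mime in selection:
        expanded.extend(general_mimetypes_to_mime_groups.get(mime, [mime]))
    return expanded
```

This is also why the member groups were switched from sets to ordered lists in this commit: the expansion order now shows up directly in the UI.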

View File

@ -21,6 +21,7 @@ elif HC.PLATFORM_WINDOWS:
SWFRENDER_PATH = os.path.join( HC.BIN_DIR, 'swfrender_win32.exe' )
# to all out there who write libraries:
# hexagonit.swfheader is a perfect library. it is how you are supposed to do it.
def GetFlashProperties( path ):
@ -47,6 +48,7 @@ def GetFlashProperties( path ):
return ( ( width, height ), duration, num_frames )
def RenderPageToFile( path, temp_path, page_index ):
cmd = [ SWFRENDER_PATH, path, '-o', temp_path, '-p', str( page_index ) ]
@ -55,6 +57,9 @@ def RenderPageToFile( path, temp_path, page_index ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
sbp_kwargs[ 'stdout' ] = subprocess.DEVNULL
sbp_kwargs[ 'stderr' ] = subprocess.DEVNULL
+ HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, **sbp_kwargs )

View File

@ -57,6 +57,7 @@ def EnableLoadTruncatedImages():
return False
if not hasattr( PILImage, 'DecompressionBombError' ):
# super old versions don't have this, so let's just make a stub, wew
@ -68,6 +69,7 @@ if not hasattr( PILImage, 'DecompressionBombError' ):
PILImage.DecompressionBombError = DBEStub
if not hasattr( PILImage, 'DecompressionBombWarning' ):
# super old versions don't have this, so let's just make a stub, wew
@ -79,9 +81,13 @@ if not hasattr( PILImage, 'DecompressionBombWarning' ):
PILImage.DecompressionBombWarning = DBWStub
warnings.simplefilter( 'ignore', PILImage.DecompressionBombWarning )
+ warnings.simplefilter( 'ignore', PILImage.DecompressionBombError )
+ # PIL moaning about weirdo TIFFs
+ warnings.filterwarnings( "ignore", "(Possibly )?corrupt EXIF data", UserWarning )
OLD_PIL_MAX_IMAGE_PIXELS = PILImage.MAX_IMAGE_PIXELS
PILImage.MAX_IMAGE_PIXELS = None # this turns off decomp check entirely, wew

View File

@ -9,7 +9,6 @@ cloudscraper>=1.2.33
html5lib>=1.0.1
lxml>=4.5.0
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -19,7 +18,6 @@ PySocks>=1.7.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
opencv-python-headless==4.5.5.64
@ -27,6 +25,6 @@ python-mpv==1.0.3
requests==2.31.0
QtPy==2.3.0
- PySide6==6.4.1
+ PySide6==6.5.2
setuptools==65.5.1

View File

@ -9,7 +9,6 @@ cloudscraper>=1.2.33
html5lib>=1.0.1
lxml>=4.5.0
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -19,7 +18,6 @@ PySocks>=1.7.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
opencv-python-headless==4.5.5.64

View File

@ -9,7 +9,6 @@ cloudscraper>=1.2.33
html5lib>=1.0.1
lxml>=4.5.0
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -19,15 +18,12 @@ PySocks>=1.7.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
opencv-python-headless==4.5.5.64
python-mpv==1.0.3
requests==2.31.0
zope==5.5.0
QtPy==2.3.0
PyQt6==6.4.1
PyQt6-Qt6==6.5.0

View File

@ -9,7 +9,6 @@ cloudscraper>=1.2.33
html5lib>=1.0.1
lxml>=4.5.0
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -19,7 +18,6 @@ PySocks>=1.7.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
opencv-python-headless==4.5.5.64

View File

@ -9,7 +9,6 @@ cloudscraper>=1.2.33
html5lib>=1.0.1
lxml>=4.5.0
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -19,7 +18,6 @@ PySocks>=1.7.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
requests==2.31.0

View File

@ -1,2 +1,2 @@
- QtPy==2.3.0
- PySide6==6.4.1
+ QtPy==2.3.1
+ PySide6==6.5.2

View File

@ -1,2 +1,2 @@
QtPy==2.3.1
- PySide6==6.5.0
+ PySide6==6.5.2

View File

@ -3,7 +3,6 @@ cryptography
cloudscraper>=1.2.33
html5lib>=1.0.1
lz4>=3.0.0
- nose>=1.3.0
numpy>=1.16.0
Pillow>=9.1.1
pillow-heif>=0.12.0
@ -12,7 +11,6 @@ pyOpenSSL>=19.1.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
- six>=1.14.0
Twisted>=20.3.0
opencv-python-headless==4.5.5.64