Version 574

This commit is contained in:
Hydrus Network Developer 2024-05-08 15:25:53 -05:00
parent 1a06dc1824
commit 686dee1b84
No known key found for this signature in database
GPG Key ID: 76249F053212133C
44 changed files with 534 additions and 803 deletions

View File

@ -1,8 +1,7 @@
DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
Version 3, May 2010
Copyright (C) 2010 by Kris Craig
Olympia, WA USA
Copyright (C) 2011 Hydrus Developer
Everyone is permitted to copy and distribute verbatim or modified
copies of this license document, and changing it is allowed as long
@ -21,4 +20,4 @@ where otherwise explicitly stated.
DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION, AND MODIFICATION
0. You just DO WHAT THE FUCK YOU WANT TO.
0. You just DO WHAT THE FUCK YOU WANT TO.

View File

@ -7,6 +7,37 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 574](https://github.com/hydrusnetwork/hydrus/releases/tag/v574)
### local hashes cache
* we finally figured out the 'update 404' issue that some PTR-syncing users were getting, where PTR processing would halt with an error about an update file not being available on the server. long story short, SQLite was sometimes crossing a wire in the database on a crash, and this week I added some new maintenance code to fix this and catch it in future
* the local hash cache has a bunch of new resync/recovery code. it can now efficiently recover from missing hash_ids, excess hash_ids, desynced hash_ids, and even repopulate the master hash table if that guy has missing hash_ids (which can happen after severe db damage due to hard drive failure). it records all recovery info to the log
* the normal _database->regenerate->local hashes cache_ function now works entirely in this new resync code, making it significantly faster (previously it just deleted and re-added everything). this job also gets a nicer popup with a summary of any problems found
* when the client recovers from a bad shutdown, it now runs a quick sync on the latest hash_ids added to the local hashes cache to ensure that desync did not occur. fingers crossed, this will work super fast and ensure that we don't get the 404 problem (or related hash_id cross-wire problems) again
* on repository processing failure and a scheduling of update file maintenance, we now resync the update files in the local hash cache, meaning the 404 problem, if it does happen again, will now fix itself in the normal recovery code
* on update, everyone is going to get a full local hash cache resync, just to catch any lingering issues here. it should now work super fast!
* fixed an issue where the local hash and tags caches would not fully reset desynced results on a 'regenerate' call until a client restart
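The block-by-block recovery pass described above can be sketched roughly as follows. This is a hypothetical illustration, not hydrus's actual API: `resync_cache`, its set-based arguments, and the block size are all stand-ins for the real table-backed logic.

```python
# Hypothetical sketch of the resync pass: walk the union of cached and
# authoritative hash_ids in fixed-size blocks, dropping ids the cache
# should not have and re-adding ids it is missing.

def resync_cache(cached_ids: set, authoritative_ids: set, block_size: int = 10000):
    """Sync cached_ids to authoritative_ids in place; return (excess, missing)."""
    all_ids = sorted(cached_ids | authoritative_ids)
    excess, missing = set(), set()
    for i in range(0, len(all_ids), block_size):
        block = set(all_ids[i:i + block_size])
        # ids in the cache but no longer in the file store: drop them
        excess.update((block & cached_ids) - authoritative_ids)
        # ids in the file store but absent from the cache: repopulate them
        missing.update((block & authoritative_ids) - cached_ids)
    cached_ids -= excess
    cached_ids |= missing
    return excess, missing
```

Working in blocks keeps memory and transaction sizes bounded, which is what lets the real job report progress and be cancelled mid-way.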
### misc
* thanks to a user, the default twitter downloader I added last week now gets full-size images. if you spammed a bunch of URLs last week, I apologise: please do a search for 'imported within the last 7 days/has a twitter url/height=1200px' and then copy/paste the results' tweet URLs into a new urls downloader. because of some special twitter settings, you shouldn't have to set 'download the file even if known url match' in the file import options; the downloader will discover the larger versions and download the full size files with no special settings needed. once done, assuming the file count is the same on both pages, go back to your first page and delete the 1200px tall files. then repeat for width=1200px!
* the filetype selector in system:filetype now expands to eat extra vertical space if the dialog is resized
* the filetype selector in file import options is moved a bit and also now expands to eat extra vertical space
* thanks to a user, the Microsoft document recognition now has fewer false negatives (it was detecting some docs as zips)
* when setting up an import folder, the dialog will now refuse to OK if you set a path that is 1) above the install dir or db dir or 2) above or below any of your file storage locations. shouldn't be possible to set up an import from your own file storage folder by accident any more
* added a new 'apply image ICC Profile colour adjustments' checkbox to _options->media_. this simply turns off ICC profile loading and application, for debug purposes
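The import-folder safety check above amounts to a pair of path-containment tests. A minimal sketch, with hypothetical names (the real check lives in the client's dialog validation, not in a free function like this):

```python
import os

def import_path_is_safe(candidate, protected_roots, file_storage_dirs) -> bool:
    """Illustrative check: refuse an import folder that sits at or above the
    install/db dirs, or at, above, or below any file storage location."""
    candidate = os.path.abspath(candidate)

    def contains(parent, child):
        parent, child = os.path.abspath(parent), os.path.abspath(child)
        return os.path.commonpath([parent, child]) == parent

    for root in protected_roots:
        if contains(candidate, root):  # candidate is at or above install/db dir
            return False
    for d in file_storage_dirs:
        if contains(candidate, d) or contains(d, candidate):  # above or below storage
            return False
    return True
```

`os.path.commonpath` compares whole path components, so `/mnt/files2` is correctly not treated as being inside `/mnt/files`.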
### boring cleanup
* the default SQLite page size is now 4096 bytes on Linux, the SQLite default. it was 1024 previously, but SQLite now recommends 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this
* fixed the last couple dozen missing layout flags across the program, which were ancient artifacts from the wx->Qt conversion
* fixed the WTFPL licence to be my copyright, lol
* deleted the local booru service management/UI code
* deleted the local booru service db/init code
* deleted the local booru service network code
* on update, the local booru service will be deleted from the database
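The page-size mechanics behind the vacuum note above: setting the `page_size` PRAGMA does not rewrite an existing database by itself; the change only lands when the file is next VACUUMed, which is why affected databases get fixed on their next vacuum. A minimal standalone demonstration:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'example.db')

con = sqlite3.connect(path)
con.execute('PRAGMA page_size = 1024;')       # emulate the old default
con.execute('CREATE TABLE t ( x INTEGER );')  # materialise the file at 1024 bytes/page
con.commit()

con.execute('PRAGMA page_size = 4096;')  # request the new size...
con.execute('VACUUM;')                   # ...which rewrites the whole file now
( page_size, ) = con.execute('PRAGMA page_size;').fetchone()
con.close()
```

Note this only works outside WAL journal mode; a vacuumed database is an exact multiple of its page size on disk.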
## [Version 573](https://github.com/hydrusnetwork/hydrus/releases/tag/v573)
### new autocomplete tab, children
@ -393,53 +424,3 @@ title: Changelog
* wrote new widgets to edit `NumberTest`s of various sorts and spammed them to these (operator, value) system predicate UI panels. we are finally clearing out some 8+-year-old jank here
* rewrote the `num_notes` database search logic to use `NumberTest`s
* the system preds for height, width, and framerate now say 'has x' and 'no x' when set to `>0` or `=0`, although what these really mean is not perfectly defined
## [Version 563](https://github.com/hydrusnetwork/hydrus/releases/tag/v563)
### macOS improvements
* Thanks to a user, we have multiple improvements for macOS!
* There is a new icon for the macOS .app build of hydrus
* The macOS app will now appear as "Hydrus" in the menu bar instead of "Hydrus Network"
* Use the native global menu bar on macOS and some Linux desktop environments
* "options" will now appear as "Preferences..." and be under the Hydrus menu on macOS
* "exit" will now appear as "Quit Hydrus" and be under the Hydrus menu on macOS
* "exit and force shutdown maintenance", "restart", and "shortcuts" will now be under the Hydrus menu on macOS
* The hydrus system tray icon is now enabled for macOS and "minimise to system tray" will be in the Hydrus menu when in advanced mode
* macOS debug dialog menus are now disabled by default
* The macOS build of hydrus now uses pyoxidizer 0.24.0 and Python 3.10
* The command palette and hyperlinks colors in the default Qt stylesheet now use palette based colors that should change based on the Qt style
* one thing hydev did: on macOS, Cmd+W _should_ now close any dialog or non-main-gui window, just like the Escape key
### shortcuts
* by default, Alt+Home/End/Left/Right now does the new thumbnail rearranging. **assuming they do not conflict with an existing mapping, all users will receive this on update**
* by default, the shortcuts system now converts all non-number 'numpad' inputs (e.g. 'numpad Home', 'numpad Return', 'numpad Left') to just be normal inputs. a bunch of different keyboards have whack numpad assignments for non-numpad keys, so if it isn't a number, let's not, by default, make a fuss over the distinction. you can return to the old behaviour by unchecking the new checkbox under _file->shortcuts_
* the default shortcuts now no longer spam numpad variants anywhere. existing users can delete the surplus mappings (under 'thumbnails' and maybe some of the 'media' sets) if they like
### some UI QoL
* the _tag service_ menu button that appears in the autocomplete panel and sometimes some other places in advanced mode now shows a proper check mark in its menu beside its current value
* the _location context_ menu button on the other side of an autocomplete panel and some other places also now shows a check mark in its menu beside its current value
* the `OR` button on search autocomplete that creates new OR predicates now inherits the current file search domain. it was previously defaulting at all times to the fallback file domain and 'all known tags'
* the current search predicates list also now inherits the file search domain when you edit an OR predicate currently in use, same deal
* removed the 'favourites' submenu from the taglist menu when no tags are selected
* in any import context, the file log's arrow menu now supports deleting all the 'unknown' (outstanding, unstarted) items or setting them all to 'skipped'. the 'abort imports' button (with the stop icon) in HDD and urls import pages is removed
### misc
* fixed yet another dumb problem with the datetime control's paste button--although the paste was now 'working' on the UI side, the control wasn't saving that result on dialog ok. this fixes both the datetime button and the modified/file service time multi-column list editing
* a core asynchronous thread-checking timer in the program has been rewritten from a 20ms-resolution busy-wait to a <1ms proper wait/notify system. a bunch of stuff that works in a thread is now much faster to recognise that blocking UI work is done, and it is more thread-polite about how it does it!
* in the `setup_venv` scripts, if it needs to delete an old venv directory but fails to do so, the script now dumps out with an error saying 'hey, you probably have it open in a terminal/IDE, please close that and try again'. previously, it would just charge on and produce an odd file permission error as, e.g., the new venv setup tried to overwrite the in-use python exe
* added a `help->debug->gui->isolate existing mpv widgets` command to force regeneration of mpv windows and help test-out/hack-fix various 'every other of my mpv views has no audio' and 'my mpv loses xxx property after a system sleep/wake cycle' problems. if I've been working with you on this stuff, please give it a go and let me know if new mpv window creation is good or what!
* added a `BUGFIX: Disable off-screen window rescue` checkbox to `options->gui` that stops windows that think they are spawning off-screen from repositioning to a known safe screen. several Qt versions have had trouble with enumerating all the screens in a multiple monitor setup and thus the safe coordinate space, so if you have been hit by false positives here, you can now turn it off! (issue #1511)
* fixed another couple instances of error texts with empty formatting braces `{}`
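The timer rewrite mentioned above is the classic busy-wait-to-event change, shown here in miniature with hypothetical names (this is an illustration of the technique, not hydrus's actual thread machinery): instead of a loop that wakes every 20ms to poll a flag, a `threading.Event` lets the waiter sleep until the worker explicitly notifies it.

```python
import threading
import time

def worker(done: threading.Event) -> None:
    time.sleep(0.05)  # stand-in for the blocking work
    done.set()        # wakes any waiter almost immediately

done = threading.Event()
t = threading.Thread(target=worker, args=(done,))
t.start()

# wait/notify: no polling loop, no 20ms resolution floor -- the waiter
# is woken by the OS as soon as set() is called
was_notified = done.wait(timeout=5.0)
t.join()
```

Besides the latency win, the waiter burns no CPU while parked, which is the "more thread-polite" part.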
### tag repository
* mapping petitions fetched from the server will now max out at 500k mapping rows or 10k unique tags or ten seconds of construction time. we had a 250k-unique-tag petition this last week and it broke something, so I'm slapping a bunch of safety rails on. let me know if these are too strict, too liberal, or if it messes with the fetch workflow at all--I don't _think_ it will, but we'll see
### build stuff
* now that they have had time to breathe, I optimised the recently split Github build scripts. the 'send to an ubuntu runner and then upload' step is now removed from all three, so they are natively uploaded in the first runner step. it works just a little nicer and faster now, although it did require learning how to truncate and export a variable to the Github Environment Variables file in Powershell, aiiieeeee
* also, Github is moving from Node 16 to Node 20 soon, and I have moved two of the four actions we rely on to their newer v20 versions. a third action should be ready to update next week, and another, a general download file function, I have replaced with curl (for macOS) and Powershell's magical Invoke-WebRequest adventure

View File

@ -191,7 +191,7 @@ As a result, if you get a failure on trying to do a big update, try cutting the
If you narrow the gap down to just one version and still get an error, please let me know. I am very interested in these sorts of problems and will be happy to help figure out a fix with you (and everyone else who might be affected).
_All that said, and while updating is complex and every client is different, various user reports over the years suggest this route works and is efficient: 204 > 238 > 246 > 291 > 328 > 335 > 376 > 421 > 466 > 474 ? 480 > 521 > 527 (clean install) ? 558 > 571 (clean install)_
_All that said, and while updating is complex and every client is different, various user reports over the years suggest this route works and is efficient: 204 > 238 > 246 > 291 > 328 > 335 > 376 > 421 > 466 > 474 > 480 > 521 > 527 (clean install) > 535 > 558 > 571 (clean install)_
## Backing up

View File

@ -52,7 +52,7 @@ If, after a few months, you find you enjoy the software and would like to furthe
## license
These programs are free software. Everything I, hydrus dev, have made is under the Do What The Fuck You Want To Public License, Version 3, [as published](https://github.com/sirkris/WTFPL/blob/master/WTFPL.md) by Kris Craig.
These programs are free software. Everything I, hydrus dev, have made is under the Do What The Fuck You Want To Public License, Version 3:
``` title="license.txt"
--8<-- "license.txt"

View File

@ -34,6 +34,34 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_574"><a href="#version_574">version 574</a></h2>
<ul>
<li><h3>local hashes cache</h3></li>
<li>we finally figured out the 'update 404' issue that some PTR-syncing users were getting, where PTR processing would halt with an error about an update file not being available on the server. long story short, SQLite was sometimes crossing a wire in the database on a crash, and this week I added some new maintenance code to fix this and catch it in future</li>
<li>the local hash cache has a bunch of new resync/recovery code. it can now efficiently recover from missing hash_ids, excess hash_ids, desynced hash_ids, and even repopulate the master hash table if that guy has missing hash_ids (which can happen after severe db damage due to hard drive failure). it records all recovery info to the log</li>
<li>the normal _database-&gt;regenerate-&gt;local hashes cache_ function now works entirely in this new resync code, making it significantly faster (previously it just deleted and re-added everything). this job also gets a nicer popup with a summary of any problems found</li>
<li>when the client recovers from a bad shutdown, it now runs a quick sync on the latest hash_ids added to the local hashes cache to ensure that desync did not occur. fingers crossed, this will work super fast and ensure that we don't get the 404 problem (or related hash_id cross-wire problems) again</li>
<li>on repository processing failure and a scheduling of update file maintenance, we now resync the update files in the local hash cache, meaning the 404 problem, if it does happen again, will now fix itself in the normal recovery code</li>
<li>on update, everyone is going to get a full local hash cache resync, just to catch any lingering issues here. it should now work super fast!</li>
<li>fixed an issue where the local hash and tags caches would not fully reset desynced results on a 'regenerate' call until a client restart</li>
<li><h3>misc</h3></li>
<li>thanks to a user, the default twitter downloader I added last week now gets full-size images. if you spammed a bunch of URLs last week, I apologise: please do a search for 'imported within the last 7 days/has a twitter url/height=1200px' and then copy/paste the results' tweet URLs into a new urls downloader. because of some special twitter settings, you shouldn't have to set 'download the file even if known url match' in the file import options; the downloader will discover the larger versions and download the full size files with no special settings needed. once done, assuming the file count is the same on both pages, go back to your first page and delete the 1200px tall files. then repeat for width=1200px!</li>
<li>the filetype selector in system:filetype now expands to eat extra vertical space if the dialog is resized</li>
<li>the filetype selector in file import options is moved a bit and also now expands to eat extra vertical space</li>
<li>thanks to a user, the Microsoft document recognition now has fewer false negatives (it was detecting some docs as zips)</li>
<li>when setting up an import folder, the dialog will now refuse to OK if you set a path that is 1) above the install dir or db dir or 2) above or below any of your file storage locations. shouldn't be possible to set up an import from your own file storage folder by accident any more</li>
<li>added a new 'apply image ICC Profile colour adjustments' checkbox to _options-&gt;media_. this simply turns off ICC profile loading and application, for debug purposes</li>
<li><h3>boring cleanup</h3></li>
<li>the default SQLite page size is now 4096 bytes on Linux, the SQLite default. it was 1024 previously, but SQLite now recommends 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this</li>
<li>fixed the last couple dozen missing layout flags across the program, which were ancient artifacts from the wx-&gt;Qt conversion</li>
<li>fixed the WTFPL licence to be my copyright, lol</li>
<li>deleted the local booru service management/UI code</li>
<li>deleted the local booru service db/init code</li>
<li>deleted the local booru service network code</li>
<li>on update, the local booru service will be deleted from the database</li>
</ul>
</li>
<li>
<h2 id="version_573"><a href="#version_573">version 573</a></h2>
<ul>

View File

@ -248,7 +248,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
def _GetUPnPServices( self ):
return self.services_manager.GetServices( ( HC.LOCAL_BOORU, HC.CLIENT_API_SERVICE ) )
return self.services_manager.GetServices( ( HC.CLIENT_API_SERVICE, ) )
def _GetWakeDelayPeriodMS( self ):
@ -967,6 +967,15 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
return self.new_options
def GetImportSensitiveDirectories( self ):
dirs_that_allow_internal_work = [ HC.BASE_DIR, self.db_dir ]
dirs_that_cannot_be_touched = self.client_files_manager.GetAllDirectoriesInUse()
return ( dirs_that_allow_internal_work, dirs_that_cannot_be_touched )
def InitClientFilesManager( self ):
def qt_code( missing_subfolders ):
@ -1006,6 +1015,15 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
def ReinitGlobalSettings( self ):
from hydrus.core.files.images import HydrusImageHandling
from hydrus.core.files.images import HydrusImageNormalisation
HydrusImageHandling.SetEnableLoadTruncatedImages( self.new_options.GetBoolean( 'enable_truncated_images_pil' ) )
HydrusImageNormalisation.SetDoICCProfileNormalisation( self.new_options.GetBoolean( 'do_icc_profile_normalisation' ) )
def InitModel( self ):
from hydrus.client import ClientManagers
@ -1324,9 +1342,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
self.CallBlockingToQt( self._splash, qt_code_style )
from hydrus.core.files.images import HydrusImageHandling
HydrusImageHandling.SetEnableLoadTruncatedImages( self.new_options.GetBoolean( 'enable_truncated_images_pil' ) )
self.ReinitGlobalSettings()
def qt_code_pregui():
@ -2068,7 +2084,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
previous_services = self.services_manager.GetServices()
upnp_services = [ service for service in services if service.GetServiceType() in ( HC.LOCAL_BOORU, HC.CLIENT_API_SERVICE ) ]
upnp_services = [ service for service in services if service.GetServiceType() in ( HC.CLIENT_API_SERVICE, ) ]
self.CallToThreadLongRunning( self.services_upnp_manager.SetServices, upnp_services )

View File

@ -1567,6 +1567,18 @@ class ClientFilesManager( object ):
def GetAllDirectoriesInUse( self ):
with self._file_storage_rwlock.read:
subfolders = self._GetAllSubfolders()
directories = { subfolder.base_location.path for subfolder in subfolders }
return directories
def GetCurrentFileBaseLocations( self ):
with self._file_storage_rwlock.read:

View File

@ -317,6 +317,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'booleans' ][ 'slideshow_always_play_duration_media_once_through' ] = False
self._dictionary[ 'booleans' ][ 'enable_truncated_images_pil' ] = True
self._dictionary[ 'booleans' ][ 'do_icc_profile_normalisation' ] = True
from hydrus.client.gui.canvas import ClientGUIMPV

View File

@ -77,7 +77,7 @@ def GenerateDefaultServiceDictionary( service_type ):
if service_type in ( HC.LOCAL_BOORU, HC.CLIENT_API_SERVICE ):
if service_type == HC.CLIENT_API_SERVICE:
dictionary[ 'port' ] = None
dictionary[ 'upnp_port' ] = None
@ -93,16 +93,7 @@ def GenerateDefaultServiceDictionary( service_type ):
dictionary[ 'external_host_override' ] = None
dictionary[ 'external_port_override' ] = None
if service_type == HC.LOCAL_BOORU:
allow_non_local_connections = True
elif service_type == HC.CLIENT_API_SERVICE:
allow_non_local_connections = False
dictionary[ 'allow_non_local_connections' ] = allow_non_local_connections
dictionary[ 'allow_non_local_connections' ] = False
dictionary[ 'use_https' ] = False
@ -175,10 +166,6 @@ def GenerateService( service_key, service_type, name, dictionary = None ):
cl = ServiceRemote
elif service_type == HC.LOCAL_BOORU:
cl = ServiceLocalBooru
elif service_type == HC.CLIENT_API_SERVICE:
cl = ServiceClientAPI
@ -495,78 +482,7 @@ class ServiceLocalServerService( Service ):
class ServiceLocalBooru( ServiceLocalServerService ):
def GetExternalShareURL( self, share_key ):
if self._use_https:
scheme = 'https'
else:
scheme = 'http'
if self._external_scheme_override is not None:
scheme = self._external_scheme_override
if self._external_host_override is None:
host = HydrusNATPunch.GetExternalIP()
else:
host = self._external_host_override
if self._external_port_override is None:
if self._upnp_port is None:
port = ':{}'.format( self._port )
else:
port = ':{}'.format( self._upnp_port )
else:
port = self._external_port_override
if port != '':
port = ':{}'.format( port )
url = '{}://{}{}/gallery?share_key={}'.format( scheme, host, port, share_key.hex() )
return url
def GetInternalShareURL( self, share_key ):
internal_ip = '127.0.0.1'
internal_port = self._port
if self._use_https:
scheme = 'https'
else:
scheme = 'http'
url = '{}://{}:{}/gallery?share_key={}'.format( scheme, internal_ip, internal_port, share_key.hex() )
return url
class ServiceClientAPI( ServiceLocalServerService ):
pass

View File

@ -139,6 +139,7 @@ class ImageRendererCache( object ):
self._data_cache = ClientCachesBase.DataCache( self._controller, 'image cache', cache_size, timeout = cache_timeout )
self._controller.sub( self, 'NotifyNewOptions', 'notify_new_options' )
self._controller.sub( self, 'Clear', 'clear_image_cache' )
def Clear( self ):
@ -301,7 +302,7 @@ class ThumbnailCache( object ):
self._controller.CallToThreadLongRunning( self.MainLoop )
self._controller.sub( self, 'Clear', 'reset_thumbnail_cache' )
self._controller.sub( self, 'Clear', 'clear_thumbnail_cache' )
self._controller.sub( self, 'ClearThumbnails', 'clear_thumbnails' )
self._controller.sub( self, 'NotifyNewOptions', 'notify_new_options' )

View File

@ -1278,7 +1278,6 @@ class DB( HydrusDB.HydrusDB ):
( CC.TRASH_SERVICE_KEY, HC.LOCAL_FILE_TRASH_DOMAIN, 'trash' ),
( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, HC.LOCAL_TAG, 'my tags' ),
( CC.DEFAULT_LOCAL_DOWNLOADER_TAG_SERVICE_KEY, HC.LOCAL_TAG, 'downloader tags' ),
( CC.LOCAL_BOORU_SERVICE_KEY, HC.LOCAL_BOORU, 'local booru' ),
( CC.LOCAL_NOTES_SERVICE_KEY, HC.LOCAL_NOTES, 'local notes' ),
( CC.DEFAULT_FAVOURITES_RATING_SERVICE_KEY, HC.LOCAL_RATING_LIKE, 'favourites' ),
( CC.CLIENT_API_SERVICE_KEY, HC.CLIENT_API_SERVICE, 'client api' )
@ -4630,10 +4629,6 @@ class DB( HydrusDB.HydrusDB ):
info_types = { HC.SERVICE_INFO_NUM_FILE_HASHES }
elif service_type == HC.LOCAL_BOORU:
info_types = { HC.SERVICE_INFO_NUM_SHARES }
else:
info_types = set()
@ -4766,13 +4761,6 @@ class DB( HydrusDB.HydrusDB ):
info = self.modules_ratings.GetIncDecServiceCount( service_id )
elif service_type == HC.LOCAL_BOORU:
if info_type == HC.SERVICE_INFO_NUM_SHARES:
( info, ) = self._Execute( 'SELECT COUNT( * ) FROM yaml_dumps WHERE dump_type = ?;', ( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU, ) ).fetchone()
if info is None:
@ -6786,9 +6774,6 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'inbox_hashes': result = self._FilterInboxHashes( *args, **kwargs )
elif action == 'is_an_orphan': result = self._IsAnOrphan( *args, **kwargs )
elif action == 'last_shutdown_work_time': result = self.modules_db_maintenance.GetLastShutdownWorkTime( *args, **kwargs )
elif action == 'local_booru_share_keys': result = self.modules_serialisable.GetYAMLDumpNames( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU )
elif action == 'local_booru_share': result = self.modules_serialisable.GetYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
elif action == 'local_booru_shares': result = self.modules_serialisable.GetYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU )
elif action == 'maintenance_due': result = self._GetMaintenanceDue( *args, **kwargs )
elif action == 'media_predicates': result = self.modules_tag_display.GetMediaPredicates( *args, **kwargs )
elif action == 'media_result': result = self._GetMediaResultFromHash( *args, **kwargs )
@ -6933,22 +6918,20 @@ class DB( HydrusDB.HydrusDB ):
try:
job_status.SetStatusTitle( 'regenerating local hash cache' )
job_status.SetStatusTitle( 'resynchronising local hashes cache' )
self._controller.pub( 'modal_message', job_status )
message = 'generating local hash cache'
message = 'generating local hashes cache'
job_status.SetStatusText( message )
self._controller.frame_splash_status.SetSubtext( message )
self.modules_hashes_local_cache.Repopulate()
self.modules_hashes_local_cache.Resync( job_status )
finally:
job_status.SetStatusText( 'done!' )
job_status.FinishAndDismiss( 5 )
job_status.Finish()
@ -10255,6 +10238,89 @@ class DB( HydrusDB.HydrusDB ):
if version == 573:
try:
self.modules_hashes_local_cache.Resync()
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to force a local hashes resync failed! Please let hydrus dev know!'
self.pub_initial_message( message )
try:
domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
domain_manager.Initialise()
parsers = domain_manager.GetParsers()
parser_names = { parser.GetName() for parser in parsers }
# checking for floog's downloader
if 'fxtwitter api status parser' not in parser_names and 'vxtwitter api status parser' not in parser_names:
domain_manager.OverwriteDefaultURLClasses( [
'twitter image (with format)',
'twitter image (without format)'
])
#
domain_manager.TryToLinkURLClassesAndParsers()
#
self.modules_serialisable.SetJSONDump( domain_manager )
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to update some downloaders failed! Please let hydrus dev know!'
self.pub_initial_message( message )
try:
service_id = self.modules_services.GetServiceId( CC.LOCAL_BOORU_SERVICE_KEY )
try:
self._DeleteService( service_id )
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to delete the local booru stub failed! Please let hydrus dev know!'
self.pub_initial_message( message )
except HydrusExceptions.DataMissing:
# idempotency
pass
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to delete the local booru stub failed! Please let hydrus dev know!'
self.pub_initial_message( message )
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
@ -10744,7 +10810,6 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'content_updates': self._ProcessContentUpdatePackage( *args, **kwargs )
elif action == 'cull_file_viewing_statistics': self.modules_files_viewing_stats.CullFileViewingStatistics( *args, **kwargs )
elif action == 'db_integrity': self.modules_db_maintenance.CheckDBIntegrity( *args, **kwargs )
elif action == 'delete_local_booru_share': self.modules_serialisable.DeleteYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
elif action == 'delete_pending': self._DeletePending( *args, **kwargs )
elif action == 'delete_serialisable_named': self.modules_serialisable.DeleteJSONDumpNamed( *args, **kwargs )
elif action == 'delete_service_info': self._DeleteServiceInfo( *args, **kwargs )
@ -10764,7 +10829,6 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'ideal_client_files_locations': self.modules_files_physical_storage.SetIdealClientFilesLocations( *args, **kwargs )
elif action == 'import_file': result = self._ImportFile( *args, **kwargs )
elif action == 'import_update': self._ImportUpdate( *args, **kwargs )
elif action == 'local_booru_share': self.modules_serialisable.SetYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
elif action == 'maintain_hashed_serialisables': result = self.modules_serialisable.MaintainHashedStorage( *args, **kwargs )
elif action == 'maintain_similar_files_search_for_potential_duplicates': result = self._PerceptualHashesSearchForPotentialDuplicates( *args, **kwargs )
elif action == 'maintain_similar_files_tree': self.modules_similar_files.MaintainTree( *args, **kwargs )

View File

@ -6,11 +6,11 @@ from hydrus.core import HydrusData
from hydrus.core import HydrusDB
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusLists
from hydrus.core import HydrusTags
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientThreading
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBMappingsCounts
from hydrus.client.db import ClientDBMaster
@ -33,6 +33,15 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
ClientDBModule.ClientDBModule.__init__( self, 'client hashes local cache', cursor )
def _DoLastShutdownWasBadWork( self ):
# We just had a crash, oh no! There is a chance we are desynced here, so let's see what was recently added and make sure we are good.
last_twenty_hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM local_hashes_cache ORDER BY hash_id DESC LIMIT 20;' ) )
self.SyncHashIds( last_twenty_hash_ids )
def _GetInitialTableGenerationDict( self ) -> dict:
return {
@ -87,7 +96,7 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
def _RepairRepopulateTables( self, table_names, cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper ):
self.Repopulate()
self.Resync()
cursor_transaction_wrapper.CommitAndBegin()
@ -103,6 +112,8 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
self._Execute( 'DELETE FROM local_hashes_cache;' )
self._hash_ids_to_hashes_cache = {}
def DropHashIdsFromCache( self, hash_ids ):
@ -175,6 +186,8 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
def GetHashIdsToHashes( self, hash_ids = None, hashes = None, create_new_hash_ids = True ) -> typing.Dict[ int, bytes ]:
hash_ids_to_hashes = {}
if hash_ids is not None:
self._PopulateHashIdsToHashesCache( hash_ids )
@ -215,23 +228,149 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
return result is not None
def Repopulate( self ):
def Resync( self, job_status = None ):
self.ClearCache()
if job_status is None:
job_status = ClientThreading.JobStatus( cancellable = True )
CG.client_controller.frame_splash_status.SetSubtext( 'reading local file data' )
text = 'fetching local file hashes'
local_hash_ids = self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id )
job_status.SetStatusText( text )
CG.client_controller.frame_splash_status.SetSubtext( text )
all_hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM local_hashes_cache;' ) )
all_hash_ids.update( self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id ) )
self.SyncHashIds( all_hash_ids, job_status = job_status )
def SyncHashIds( self, all_hash_ids: typing.Collection[ int ], job_status = None ):
if job_status is None:
job_status = ClientThreading.JobStatus( cancellable = True )
if not isinstance( all_hash_ids, list ):
all_hash_ids = list( all_hash_ids )
BLOCK_SIZE = 10000
num_to_do = len( local_hash_ids )
num_to_do = len( all_hash_ids )
for ( i, block_of_hash_ids ) in enumerate( HydrusLists.SplitListIntoChunks( local_hash_ids, BLOCK_SIZE ) ):
all_excess_hash_ids = set()
all_missing_hash_ids = set()
all_incorrect_hash_ids = set()
for ( i, block_of_hash_ids ) in enumerate( HydrusLists.SplitListIntoChunks( all_hash_ids, BLOCK_SIZE ) ):
CG.client_controller.frame_splash_status.SetSubtext( 'caching local file data {}'.format( HydrusData.ConvertValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) ) )
if job_status.IsCancelled():
break
self.AddHashIdsToCache( block_of_hash_ids )
block_of_hash_ids = set( block_of_hash_ids )
text = 'syncing local hashes {}'.format( HydrusData.ConvertValueRangeToPrettyString( i * BLOCK_SIZE, num_to_do ) )
CG.client_controller.frame_splash_status.SetSubtext( text )
job_status.SetStatusText( text )
with self._MakeTemporaryIntegerTable( block_of_hash_ids, 'hash_id' ) as temp_table_name:
table_join = self.modules_files_storage.GetTableJoinLimitedByFileDomain( self.modules_services.combined_local_file_service_id, temp_table_name, HC.CONTENT_STATUS_CURRENT )
local_hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {table_join};' ) )
excess_hash_ids = block_of_hash_ids.difference( local_hash_ids )
if len( excess_hash_ids ) > 0:
self.DropHashIdsFromCache( excess_hash_ids )
all_excess_hash_ids.update( excess_hash_ids )
missing_hash_ids = { hash_id for hash_id in local_hash_ids if not self.HasHashId( hash_id ) }
if len( missing_hash_ids ) > 0:
self.AddHashIdsToCache( missing_hash_ids )
all_missing_hash_ids.update( missing_hash_ids )
present_local_hash_ids = local_hash_ids.difference( missing_hash_ids )
my_hash_ids_to_hashes = self.GetHashIdsToHashes( hash_ids = present_local_hash_ids )
master_hash_ids_to_hashes = self.modules_hashes.GetHashIdsToHashes( hash_ids = present_local_hash_ids )
incorrect_hash_ids = { hash_id for hash_id in list( my_hash_ids_to_hashes.keys() ) if my_hash_ids_to_hashes[ hash_id ] != master_hash_ids_to_hashes[ hash_id ] }
if len( incorrect_hash_ids ) > 0:
self.DropHashIdsFromCache( incorrect_hash_ids )
self.AddHashIdsToCache( incorrect_hash_ids )
all_incorrect_hash_ids.update( incorrect_hash_ids )
status_text_info = []
if len( all_excess_hash_ids ) > 0:
bad_hash_ids_text = ', '.join( ( str( hash_id ) for hash_id in sorted( all_excess_hash_ids ) ) )
HydrusData.Print( f'Deleted excess desynced local hash_ids: {bad_hash_ids_text}' )
status_text_info.append( f'{HydrusData.ToHumanInt( len( all_excess_hash_ids ) ) } excess hash records' )
if len( all_missing_hash_ids ) > 0:
bad_hash_ids_text = ', '.join( ( str( hash_id ) for hash_id in sorted( all_missing_hash_ids ) ) )
HydrusData.Print( f'Added missing desynced local hash_ids: {bad_hash_ids_text}' )
status_text_info.append( f'{HydrusData.ToHumanInt( len( all_missing_hash_ids ) ) } missing hash records' )
if len( all_incorrect_hash_ids ) > 0:
bad_hash_ids_text = ', '.join( ( str( hash_id ) for hash_id in sorted( all_incorrect_hash_ids ) ) )
HydrusData.Print( f'Fixed incorrect desynced local hash_ids: {bad_hash_ids_text}' )
status_text_info.append( f'{HydrusData.ToHumanInt( len( all_incorrect_hash_ids ) ) } incorrect hash records' )
if len( status_text_info ) > 0:
job_status.SetStatusText( '\n'.join( status_text_info ) )
else:
job_status.SetStatusText( 'Done with no errors found!' )
job_status.Finish()
def SyncHashes( self, hashes: typing.Collection[ bytes ] ):
"""
This guy double-checks the hashes against the local store and the master store, because they may well differ in a desync!
"""
all_hash_ids = set( self.GetHashIds( hashes ) )
all_hash_ids.update( self.modules_hashes.GetHashIds( hashes ) )
self.SyncHashIds( all_hash_ids )
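The block loop in `SyncHashIds` above reduces each chunk to three disjoint repair sets: excess rows to drop, missing rows to add, and incorrect rows to drop and re-add. A minimal, DB-free sketch of that classification (the function and parameter names here are hypothetical, not the hydrus API):

```python
def classify_desyncs( cache: dict, authority: dict ):
    """Compare a cache mapping hash_id->hash against the authoritative mapping.
    
    Returns ( excess, missing, incorrect ):
      excess    - ids present in the cache but absent from the authority (drop them)
      missing   - ids the authority has but the cache lacks (add them)
      incorrect - ids present in both whose stored hash differs (drop and re-add)
    """
    
    cache_ids = set( cache )
    authority_ids = set( authority )
    
    excess = cache_ids - authority_ids
    missing = authority_ids - cache_ids
    
    # only ids present on both sides can be 'incorrect'
    incorrect = { hash_id for hash_id in cache_ids & authority_ids if cache[ hash_id ] != authority[ hash_id ] }
    
    return ( excess, missing, incorrect )
```

The real method does the same thing in 10,000-row blocks via a temporary integer table so the comparison stays cheap against the file storage tables.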
@@ -320,6 +459,8 @@ class ClientDBCacheLocalTags( ClientDBModule.ClientDBModule ):
self._Execute( 'DELETE FROM local_tags_cache;' )
self._tag_ids_to_tags_cache = {}
def DropTagIdsFromCache( self, tag_ids ):
@@ -369,6 +510,8 @@ class ClientDBCacheLocalTags( ClientDBModule.ClientDBModule ):
def GetTagIdsToTags( self, tag_ids = None, tags = None ) -> typing.Dict[ int, str ]:
tag_ids_to_tags = {}
if tag_ids is not None:
self._PopulateTagIdsToTagsCache( tag_ids )


@@ -90,7 +90,7 @@ class ClientDBMasterHashes( ClientDBModule.ClientDBModule ):
if hash_id not in uncached_hash_ids_to_hashes:
# TODO: ultimately move this to the 'recover from missing definitions' stuff I am building in ClientDB, since the local hash cache may have it
# TODO: ultimately move this to the 'recover from missing definitions' stuff I am building in ClientDB, since the local hashes cache may have it
# for now though, screw it
# I shouldn't be able to see this here, but this is emergency code, screw it.
@@ -120,7 +120,7 @@ class ClientDBMasterHashes( ClientDBModule.ClientDBModule ):
pubbed_error = True
HydrusData.DebugPrint( 'Database master hash definition error: hash_id {} was missing! Recovered from local hash cache with hash {}.'.format( hash_id, hash.hex() ) )
HydrusData.DebugPrint( 'Database master hash definition error: hash_id {} was missing! Recovered from local hashes cache with hash {}.'.format( hash_id, hash.hex() ) )
self._Execute( 'INSERT OR IGNORE INTO hashes ( hash_id, hash ) VALUES ( ?, ? );', ( hash_id, sqlite3.Binary( hash ) ) )
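The recovery path here is essentially an idempotent `INSERT OR IGNORE` backfill of the master table from the local cache. A toy `sqlite3` sketch of the idea (table schema simplified from the real client db):

```python
import sqlite3

def recover_missing_hash( db: sqlite3.Connection, hash_id: int, hash_bytes: bytes ):
    
    # INSERT OR IGNORE is idempotent: if the row somehow reappeared in the
    # meantime, this is a harmless no-op rather than a constraint error
    db.execute( 'INSERT OR IGNORE INTO hashes ( hash_id, hash ) VALUES ( ?, ? );', ( hash_id, sqlite3.Binary( hash_bytes ) ) )
    

db = sqlite3.connect( ':memory:' )

db.execute( 'CREATE TABLE hashes ( hash_id INTEGER PRIMARY KEY, hash BLOB UNIQUE );' )
db.execute( 'INSERT INTO hashes VALUES ( 1, ? );', ( sqlite3.Binary( b'\x01' * 32 ), ) )

# hash_id 2 is missing from the master table; backfill it from the local cache's copy
recover_missing_hash( db, 2, b'\x02' * 32 )
recover_missing_hash( db, 2, b'\x02' * 32 ) # a repeated call is safe
```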


@@ -273,6 +273,8 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
update_hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM {};'.format( table_join ) ) )
self.modules_hashes_local_cache.SyncHashIds( update_hash_ids )
# so we are also going to pull from here in case there are orphan records!!!
other_table_join = self.modules_files_storage.GetTableJoinLimitedByFileDomain( self.modules_services.combined_local_file_service_id, repository_updates_table_name, HC.CONTENT_STATUS_CURRENT )


@@ -33,6 +33,7 @@ from hydrus.core.files import HydrusFileHandling
from hydrus.core.files import HydrusPSDHandling
from hydrus.core.files import HydrusVideoHandling
from hydrus.core.files.images import HydrusImageHandling
from hydrus.core.files.images import HydrusImageNormalisation
from hydrus.core.networking import HydrusNetwork
from hydrus.client import ClientApplicationCommand as CAC
@@ -3233,8 +3234,8 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
ClientGUIMenus.AppendSeparator( regen_submenu )
ClientGUIMenus.AppendMenuItem( regen_submenu, 'all deleted files' + HC.UNICODE_ELLIPSIS, 'Resynchronise the store of all known deleted files.', self._RegenerateCombinedDeletedFiles )
ClientGUIMenus.AppendMenuItem( regen_submenu, 'local hash cache' + HC.UNICODE_ELLIPSIS, 'Repopulate the cache hydrus uses for fast hash lookup for local files.', self._RegenerateLocalHashCache )
ClientGUIMenus.AppendMenuItem( regen_submenu, 'local tag cache' + HC.UNICODE_ELLIPSIS, 'Repopulate the cache hydrus uses for fast tag lookup for local files.', self._RegenerateLocalTagCache )
ClientGUIMenus.AppendMenuItem( regen_submenu, 'local hashes cache' + HC.UNICODE_ELLIPSIS, 'Repopulate the cache hydrus uses for fast hash lookup for local files.', self._RegenerateLocalHashCache )
ClientGUIMenus.AppendMenuItem( regen_submenu, 'local tags cache' + HC.UNICODE_ELLIPSIS, 'Repopulate the cache hydrus uses for fast tag lookup for local files.', self._RegenerateLocalTagCache )
ClientGUIMenus.AppendSeparator( regen_submenu )
@@ -3480,7 +3481,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
ClientGUIMenus.AppendMenuItem( memory_actions, 'run fast memory maintenance', 'Tell all the fast caches to maintain themselves.', self._controller.MaintainMemoryFast )
ClientGUIMenus.AppendMenuItem( memory_actions, 'run slow memory maintenance', 'Tell all the slow caches to maintain themselves.', self._controller.MaintainMemorySlow )
ClientGUIMenus.AppendMenuItem( memory_actions, 'clear all rendering caches', 'Tell the image rendering system to forget all current images, tiles, and thumbs. This will often free up a bunch of memory immediately.', self._controller.ClearCaches )
ClientGUIMenus.AppendMenuItem( memory_actions, 'clear thumbnail cache', 'Tell the thumbnail cache to forget everything and redraw all current thumbs.', self._controller.pub, 'reset_thumbnail_cache' )
ClientGUIMenus.AppendMenuItem( memory_actions, 'clear thumbnail cache', 'Tell the thumbnail cache to forget everything and redraw all current thumbs.', self._controller.pub, 'clear_thumbnail_cache' )
if HydrusMemory.PYMPLER_OK:
@@ -4418,7 +4419,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
self._controller.pub( 'notify_new_colourset' )
self._controller.pub( 'notify_new_favourite_tags' )
HydrusImageHandling.SetEnableLoadTruncatedImages( self._controller.new_options.GetBoolean( 'enable_truncated_images_pil' ) )
CG.client_controller.ReinitGlobalSettings()
self._menu_item_help_darkmode.setChecked( CG.client_controller.new_options.GetString( 'current_colourset' ) == 'darkmode' )
@@ -5231,9 +5232,9 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
def _RegenerateLocalHashCache( self ):
message = 'This will delete and then recreate the local hash cache, which keeps a small record of hashes for files on your hard drive. It isn\'t super important, but it speeds most operations up, and this routine fixes it when broken.'
message = 'This will check and repair any bad rows in the local hashes cache, which keeps a small record of hashes for files on your hard drive. The cache isn\'t super important, but it speeds most operations up, and this routine fixes it when broken/desynced.'
message += '\n' * 2
message += 'If you have a lot of files, it can take a long time, during which the gui may hang.'
message += 'If you have a lot of files, it can take a minute, during which the gui may hang.'
message += '\n' * 2
message += 'If you do not have a specific reason to run this, it is pointless.'
@@ -6681,7 +6682,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
HG.blurhash_mode = not HG.blurhash_mode
self._controller.pub( 'reset_thumbnail_cache' )
self._controller.pub( 'clear_thumbnail_cache' )
elif name == 'cache_report_mode':


@@ -401,7 +401,7 @@ class DialogInputTags( Dialog ):
QP.AddToLayout( vbox, self._tags, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( vbox, self._tag_autocomplete )
QP.AddToLayout( vbox, self._tag_autocomplete, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, b_box, CC.FLAGS_ON_RIGHT )
self.setLayout( vbox )


@@ -83,7 +83,7 @@ class EditLoginCredentialsPanel( ClientGUIScrolledPanels.EditPanel ):
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, control, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, control_st )
QP.AddToLayout( hbox, control_st, CC.FLAGS_CENTER_PERPENDICULAR )
rows.append( ( credential_definition.GetName() + ': ', hbox ) )


@@ -84,12 +84,10 @@ class OptionsPanelMimesTree( OptionsPanel ):
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._my_tree )
QP.AddToLayout( vbox, self._my_tree, CC.FLAGS_EXPAND_BOTH_WAYS )
self.setLayout( vbox )
#self._my_tree.itemClicked.connect( self._ItemClicked )
def _GetMimesForGeneralMimeType( self, general_mime_type ):
@@ -126,226 +124,3 @@
class OptionsPanelMimes( OptionsPanel ):
BUTTON_CURRENTLY_HIDDEN = '\u25B6'
BUTTON_CURRENTLY_SHOWING = '\u25BC'
def __init__( self, parent, selectable_mimes ):
OptionsPanel.__init__( self, parent )
self._selectable_mimes = set( selectable_mimes )
self._mimes_to_checkboxes = {}
self._general_mime_types_to_checkboxes = {}
self._general_mime_types_to_buttons = {}
general_mime_types = []
general_mime_types.append( HC.GENERAL_IMAGE )
general_mime_types.append( HC.GENERAL_ANIMATION )
general_mime_types.append( HC.GENERAL_VIDEO )
general_mime_types.append( HC.GENERAL_AUDIO )
general_mime_types.append( HC.GENERAL_APPLICATION )
gridbox = QP.GridLayout( cols = 3 )
gridbox.setColumnStretch( 2, 1 )
for general_mime_type in general_mime_types:
mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
if len( mimes_in_type ) == 0:
continue
general_mime_checkbox = QW.QCheckBox( HC.mime_string_lookup[ general_mime_type ], self )
general_mime_checkbox.clicked.connect( self.EventMimeGroupCheckbox )
self._general_mime_types_to_checkboxes[ general_mime_type ] = general_mime_checkbox
QP.AddToLayout( gridbox, general_mime_checkbox, CC.FLAGS_CENTER_PERPENDICULAR )
show_hide_button = ClientGUICommon.BetterButton( self, self.BUTTON_CURRENTLY_HIDDEN, self._ButtonShowHide, general_mime_type )
max_width = ClientGUIFunctions.ConvertTextToPixelWidth( show_hide_button, 5 )
show_hide_button.setMaximumWidth( max_width )
self._general_mime_types_to_buttons[ general_mime_type ] = show_hide_button
QP.AddToLayout( gridbox, show_hide_button, CC.FLAGS_CENTER_PERPENDICULAR )
vbox = QP.VBoxLayout()
for mime in mimes_in_type:
m_checkbox = QW.QCheckBox( HC.mime_string_lookup[ mime ], self )
m_checkbox.clicked.connect( self.EventMimeCheckbox )
m_checkbox.setVisible( False )
self._mimes_to_checkboxes[ mime ] = m_checkbox
QP.AddToLayout( vbox, m_checkbox, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( gridbox, vbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
self.setLayout( gridbox )
def _DoInitialHideShow( self ):
for ( general_mime_type, general_mime_checkbox ) in list( self._general_mime_types_to_checkboxes.items() ):
mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
should_show = general_mime_checkbox.checkState() == QC.Qt.PartiallyChecked
if not should_show:
self._ButtonShowHide( general_mime_type )
def _GetMimesForGeneralMimeType( self, general_mime_type ):
mimes_in_type = HC.general_mimetypes_to_mime_groups[ general_mime_type ]
mimes_in_type = [ mime for mime in mimes_in_type if mime in self._selectable_mimes ]
return mimes_in_type
def _ButtonShowHide( self, general_mime_type ):
button = self._general_mime_types_to_buttons[ general_mime_type ]
mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
should_show = button.text() == self.BUTTON_CURRENTLY_HIDDEN
for mime in mimes_in_type:
self._mimes_to_checkboxes[ mime ].setVisible( should_show )
if should_show:
button.setText( self.BUTTON_CURRENTLY_SHOWING )
else:
button.setText( self.BUTTON_CURRENTLY_HIDDEN )
def _UpdateMimeGroupCheckboxes( self ):
for ( general_mime_type, general_mime_checkbox ) in self._general_mime_types_to_checkboxes.items():
mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
all_checkbox_values = { self._mimes_to_checkboxes[ mime ].isChecked() for mime in mimes_in_type }
all_false = True not in all_checkbox_values
all_true = False not in all_checkbox_values
if all_false:
check_state = QC.Qt.Unchecked
elif all_true:
check_state = QC.Qt.Checked
else:
check_state = QC.Qt.PartiallyChecked
if check_state == QC.Qt.PartiallyChecked:
general_mime_checkbox.setTristate( True )
general_mime_checkbox.setCheckState( check_state )
if check_state != QC.Qt.PartiallyChecked:
general_mime_checkbox.setTristate( False )
def EventMimeCheckbox( self ):
self._UpdateMimeGroupCheckboxes()
def EventMimeGroupCheckbox( self ):
for ( general_mime_type, general_mime_checkbox ) in list( self._general_mime_types_to_checkboxes.items() ):
check_state = general_mime_checkbox.checkState()
mime_check_state = None
if check_state == QC.Qt.Unchecked:
mime_check_state = False
elif check_state == QC.Qt.Checked:
mime_check_state = True
if mime_check_state is not None:
general_mime_checkbox.setTristate( False )
mimes_in_type = self._GetMimesForGeneralMimeType( general_mime_type )
for mime in mimes_in_type:
self._mimes_to_checkboxes[ mime ].setChecked( mime_check_state )
def GetValue( self ):
mimes = tuple( [ mime for ( mime, checkbox ) in list( self._mimes_to_checkboxes.items() ) if checkbox.isChecked() ] )
return mimes
def SetValue( self, checked_mimes ):
checked_mimes = ClientSearch.ConvertSummaryFiletypesToSpecific( checked_mimes, only_searchable = False )
for ( mime, checkbox ) in self._mimes_to_checkboxes.items():
if mime in checked_mimes:
checkbox.setChecked( True )
else:
checkbox.setChecked( False )
self._UpdateMimeGroupCheckboxes()
#self._DoInitialHideShow()
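The group-checkbox bookkeeping in the class removed above (`_UpdateMimeGroupCheckboxes`) boils down to a three-way aggregate over the child checkboxes. A minimal Qt-free sketch of that rule (the `CheckState` enum here stands in for `QC.Qt.CheckState` and is hypothetical):

```python
from enum import Enum

class CheckState( Enum ):
    
    UNCHECKED = 0
    PARTIAL = 1
    CHECKED = 2
    

def aggregate_check_state( child_states ) -> CheckState:
    
    # mirrors the removed _UpdateMimeGroupCheckboxes logic: the parent checkbox is
    # checked only when every child is, unchecked only when none are, and
    # tristate/partial otherwise
    values = set( child_states )
    
    if True not in values:
        
        return CheckState.UNCHECKED
        
    elif False not in values:
        
        return CheckState.CHECKED
        
    else:
        
        return CheckState.PARTIAL
        
    
```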


@@ -1274,6 +1274,7 @@ class PopupMessageManager( QW.QFrame ):
# This was originally a reviewpanel subclass which is a scroll area subclass, but having it in a scroll area didn't work out with dynamically updating size as the widget contents change.
class PopupMessageDialogPanel( QW.QWidget ):
@@ -1291,7 +1292,7 @@ class PopupMessageDialogPanel( QW.QWidget ):
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._message_window )
QP.AddToLayout( vbox, self._message_window, CC.FLAGS_EXPAND_BOTH_WAYS )
self.setLayout( vbox )


@@ -1982,11 +1982,11 @@ class EditFileNotesPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolle
button_hbox = QP.HBoxLayout()
QP.AddToLayout( button_hbox, self._add_button )
QP.AddToLayout( button_hbox, self._edit_button )
QP.AddToLayout( button_hbox, self._delete_button )
QP.AddToLayout( button_hbox, self._copy_button )
QP.AddToLayout( button_hbox, self._paste_button )
QP.AddToLayout( button_hbox, self._add_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._edit_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._delete_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._copy_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._paste_button, CC.FLAGS_CENTER_PERPENDICULAR )
vbox = QP.VBoxLayout()
@@ -2520,8 +2520,8 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
self._copy_button.hide()
QP.AddToLayout( button_hbox, self._copy_button )
QP.AddToLayout( button_hbox, self._paste_button )
QP.AddToLayout( button_hbox, self._copy_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._paste_button, CC.FLAGS_CENTER_PERPENDICULAR )
vbox = QP.VBoxLayout()


@@ -2429,6 +2429,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._load_images_with_pil = QW.QCheckBox( system_panel )
self._load_images_with_pil.setToolTip( ClientGUIFunctions.WrapToolTip( 'We are expecting to drop CV and move to PIL exclusively. This used to be a test option but is now default true and may soon be retired.' ) )
self._do_icc_profile_normalisation = QW.QCheckBox( system_panel )
self._do_icc_profile_normalisation.setToolTip( ClientGUIFunctions.WrapToolTip( 'Should PIL attempt to load ICC Profiles and normalise the colours of an image? This is usually fine, but when it janks out due to an additional OS/GPU ICC Profile, we can turn it off here.' ) )
self._enable_truncated_images_pil = QW.QCheckBox( system_panel )
self._enable_truncated_images_pil.setToolTip( ClientGUIFunctions.WrapToolTip( 'Should PIL be allowed to load broken images that are missing some data? This is usually fine, but some years ago we had stability problems when this was mixed with OpenCV. Now it is default on, but if you need to, you can disable it here.' ) )
@@ -2516,6 +2519,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._hide_uninteresting_modified_time.setChecked( self._new_options.GetBoolean( 'hide_uninteresting_modified_time' ) )
self._load_images_with_pil.setChecked( self._new_options.GetBoolean( 'load_images_with_pil' ) )
self._enable_truncated_images_pil.setChecked( self._new_options.GetBoolean( 'enable_truncated_images_pil' ) )
self._do_icc_profile_normalisation.setChecked( self._new_options.GetBoolean( 'do_icc_profile_normalisation' ) )
self._use_system_ffmpeg.setChecked( self._new_options.GetBoolean( 'use_system_ffmpeg' ) )
self._always_loop_animations.setChecked( self._new_options.GetBoolean( 'always_loop_gifs' ) )
self._draw_transparency_checkerboard_media_canvas.setChecked( self._new_options.GetBoolean( 'draw_transparency_checkerboard_media_canvas' ) )
@@ -2617,6 +2621,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows.append( ( 'Set a new mpv.conf on dialog ok?:', self._mpv_conf_path ) )
rows.append( ( 'Prefer system FFMPEG:', self._use_system_ffmpeg ) )
rows.append( ( 'Apply image ICC Profile colour adjustments:', self._do_icc_profile_normalisation ) )
rows.append( ( 'Allow loading of truncated images:', self._enable_truncated_images_pil ) )
rows.append( ( 'Load images with PIL:', self._load_images_with_pil ) )
@@ -2858,6 +2863,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._new_options.SetBoolean( 'hide_uninteresting_modified_time', self._hide_uninteresting_modified_time.isChecked() )
self._new_options.SetBoolean( 'load_images_with_pil', self._load_images_with_pil.isChecked() )
self._new_options.SetBoolean( 'enable_truncated_images_pil', self._enable_truncated_images_pil.isChecked() )
self._new_options.SetBoolean( 'do_icc_profile_normalisation', self._do_icc_profile_normalisation.isChecked() )
self._new_options.SetBoolean( 'use_system_ffmpeg', self._use_system_ffmpeg.isChecked() )
self._new_options.SetBoolean( 'always_loop_gifs', self._always_loop_animations.isChecked() )
self._new_options.SetBoolean( 'draw_transparency_checkerboard_media_canvas', self._draw_transparency_checkerboard_media_canvas.isChecked() )
@@ -5137,6 +5143,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
CG.client_controller.WriteSynchronous( 'serialisable', self._new_options )
# TODO: move all this, including 'original options' gubbins, to the manageoptions call. this dialog shouldn't care about these signals
# we do this to convert tuples to lists and so on
test_new_options = self._new_options.Duplicate()
@@ -5151,7 +5158,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
if res_changed or type_changed or dpr_changed:
CG.client_controller.pub( 'reset_thumbnail_cache' )
CG.client_controller.pub( 'clear_thumbnail_cache' )
except Exception as e:


@@ -2592,7 +2592,9 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
self._i_am_local_tag_service = self._service.GetServiceType() == HC.LOCAL_TAG
self._tags_box_sorter = ClientGUIListBoxes.StaticBoxSorterForListBoxTags( self, 'tags', self._tag_presentation_location, show_siblings_sort = True )
tags_panel = QW.QWidget( self )
self._tags_box_sorter = ClientGUIListBoxes.StaticBoxSorterForListBoxTags( tags_panel, 'tags', self._tag_presentation_location, show_siblings_sort = True )
self._tags_box = ClientGUIListBoxes.ListBoxTagsMediaTagsDialog( self._tags_box_sorter, self._tag_presentation_location, self.EnterTags, self.RemoveTags )
@@ -2661,7 +2663,7 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
#
self._add_tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.AddTags, self._location_context, self._tag_service_key )
self._add_tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( tags_panel, self.AddTags, self._location_context, self._tag_service_key )
self._add_tag_box.movePageLeft.connect( self.movePageLeft )
self._add_tag_box.movePageRight.connect( self.movePageRight )
@@ -2691,14 +2693,16 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._tags_box_sorter, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( vbox, self._add_tag_box )
QP.AddToLayout( vbox, self._add_tag_box, CC.FLAGS_EXPAND_PERPENDICULAR )
tags_panel.setLayout( vbox )
#
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, self._suggested_tags, CC.FLAGS_EXPAND_BOTH_WAYS_POLITE )
QP.AddToLayout( hbox, vbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
QP.AddToLayout( hbox, self._suggested_tags, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( hbox, tags_panel, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
#
@@ -4746,8 +4750,8 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
input_box = QP.HBoxLayout()
QP.AddToLayout( input_box, self._old_input )
QP.AddToLayout( input_box, self._new_input )
QP.AddToLayout( input_box, self._old_input, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( input_box, self._new_input, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox = QP.VBoxLayout()


@@ -769,8 +769,8 @@ class DateTimesCtrl( QW.QWidget ):
button_hbox = QP.HBoxLayout()
QP.AddToLayout( button_hbox, self._copy_button )
QP.AddToLayout( button_hbox, self._paste_button )
QP.AddToLayout( button_hbox, self._copy_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( button_hbox, self._paste_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( vbox, button_hbox, CC.FLAGS_ON_RIGHT )


@@ -82,7 +82,7 @@ class DialogNullipotent( DialogThatTakesScrollablePanel ):
buttonbox = QP.HBoxLayout()
QP.AddToLayout( buttonbox, self._close )
QP.AddToLayout( buttonbox, self._close, CC.FLAGS_CENTER_PERPENDICULAR )
return buttonbox
@@ -105,8 +105,8 @@ class DialogApplyCancel( DialogThatTakesScrollablePanel ):
buttonbox = QP.HBoxLayout()
QP.AddToLayout( buttonbox, self._apply )
QP.AddToLayout( buttonbox, self._cancel )
QP.AddToLayout( buttonbox, self._apply, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( buttonbox, self._cancel, CC.FLAGS_CENTER_PERPENDICULAR )
return buttonbox


@@ -3,6 +3,7 @@ from qtpy import QtCore as QC
from hydrus.core import HydrusConstants as HC
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.gui import ClientGUITopLevelWindows
@@ -167,7 +168,7 @@ class CanvasFrame( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindow
vbox = QP.VBoxLayout( margin = 0 )
QP.AddToLayout( vbox, self._canvas_window )
QP.AddToLayout( vbox, self._canvas_window, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
self.setLayout( vbox )


@@ -476,9 +476,24 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):
ClientGUIDialogsMessage.ShowWarning( self, f'The path you have entered--"{path}"--does not exist! The dialog will not force you to correct it, but this import folder will do no work as long as the location is missing!' )
if HC.BASE_DIR.startswith( path ) or CG.client_controller.GetDBDir().startswith( path ):
( dirs_that_allow_internal_work, dirs_that_cannot_be_touched ) = CG.client_controller.GetImportSensitiveDirectories()
sensitive_paths = list( dirs_that_allow_internal_work ) + list( dirs_that_cannot_be_touched )
for sensitive_path in sensitive_paths:
raise HydrusExceptions.VetoException( 'You cannot set an import path that includes your install or database directory!' )
if sensitive_path.startswith( path ):
raise HydrusExceptions.VetoException( f'You cannot set an import path that includes certain sensitive directories. The problem directory in this case was "{sensitive_path}". Please choose another location.' )
if sensitive_path not in dirs_that_allow_internal_work:
if path.startswith( sensitive_path ):
raise HydrusExceptions.VetoException( f'You cannot set an import path that is inside certain sensitive directories. The problem directory in this case was "{sensitive_path}". Please choose another location.' )
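The new veto logic above runs two directional containment checks per sensitive directory: the import path may not contain any sensitive directory, and it may not sit inside a directory that forbids all work. A standalone sketch using normalised paths (the function name and the `os.sep`-aware `contains` helper are hypothetical; the real code uses plain `startswith` on the raw strings):

```python
import os
from typing import List, Optional

def veto_reason( path: str, dirs_that_allow_internal_work: List[ str ], dirs_that_cannot_be_touched: List[ str ] ) -> Optional[ str ]:
    """Return a human-readable reason to reject `path` as an import folder, or None if it is ok."""
    
    def contains( parent: str, child: str ) -> bool:
        
        # compare whole path components so '/a/b' does not spuriously 'contain' '/a/bc'
        parent = os.path.normpath( parent )
        child = os.path.normpath( child )
        
        return child == parent or child.startswith( parent + os.sep )
        
    
    sensitive_paths = list( dirs_that_allow_internal_work ) + list( dirs_that_cannot_be_touched )
    
    for sensitive_path in sensitive_paths:
        
        # check 1: the import folder may not contain a sensitive directory
        if contains( path, sensitive_path ):
            
            return f'import path contains sensitive directory "{sensitive_path}"'
            
        
        # check 2: the import folder may not live inside a directory that forbids internal work
        if sensitive_path not in dirs_that_allow_internal_work and contains( sensitive_path, path ):
            
            return f'import path is inside sensitive directory "{sensitive_path}"'
            
        
    
    return None
```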
if self._action_successful.GetValue() == CC.IMPORT_FOLDER_MOVE:


@@ -83,6 +83,8 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
pre_import_panel = ClientGUICommon.StaticBox( self._specific_options_panel, 'pre-import checks' )
filetype_selector_panel = ClientGUICommon.StaticBox( pre_import_panel, 'allowed filetypes' )
self._exclude_deleted = QW.QCheckBox( pre_import_panel )
tt = 'By default, the client will not try to reimport files that it knows were deleted before. This is a good setting and should be left on in general.'
@@ -131,14 +133,16 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
#
self._mimes = ClientGUIOptionsPanels.OptionsPanelMimesTree( pre_import_panel, HC.ALLOWED_MIMES )
#
self._allow_decompression_bombs = QW.QCheckBox( pre_import_panel )
tt = 'This is an old setting, it basically just rejects all jpegs and pngs with more than a 1GB bitmap, or about 250-350 Megapixels. It can be useful if you have an older computer that will die at a 16,000x22,000 png.'
self._allow_decompression_bombs.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
self._mimes = ClientGUIOptionsPanels.OptionsPanelMimesTree( pre_import_panel, HC.ALLOWED_MIMES )
self._min_size = ClientGUIBytes.NoneableBytesControl( pre_import_panel )
self._min_size.SetValue( 5 * 1024 )
@@ -222,6 +226,12 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
default_panel.hide()
#
filetype_selector_panel.Add( self._mimes, CC.FLAGS_EXPAND_BOTH_WAYS )
pre_import_panel.Add( filetype_selector_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
#
rows = []
@@ -241,7 +251,6 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._preimport_url_check_looks_for_neighbours.setVisible( False )
rows.append( ( 'allowed filetypes: ', self._mimes ) )
rows.append( ( 'allow decompression bombs: ', self._allow_decompression_bombs ) )
rows.append( ( 'minimum filesize: ', self._min_size ) )
rows.append( ( 'maximum filesize: ', self._max_size ) )
@@ -281,13 +290,13 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
#
presentation_static_box.Add( self._presentation_import_options_edit_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
presentation_static_box.Add( self._presentation_import_options_edit_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
#
specific_vbox = QP.VBoxLayout()
QP.AddToLayout( specific_vbox, pre_import_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( specific_vbox, pre_import_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( specific_vbox, destination_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( specific_vbox, post_import_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( specific_vbox, presentation_static_box, CC.FLAGS_EXPAND_PERPENDICULAR )
@@ -301,9 +310,7 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
QP.AddToLayout( vbox, help_hbox, CC.FLAGS_ON_RIGHT )
QP.AddToLayout( vbox, default_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._load_default_options, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._specific_options_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
vbox.addStretch( 1 )
QP.AddToLayout( vbox, self._specific_options_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
self.widget().setLayout( vbox )
@@ -929,19 +936,19 @@ class EditPresentationImportOptions( ClientGUIScrolledPanels.EditPanel ):
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, self._presentation_status, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._presentation_inbox, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._presentation_location, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._presentation_status, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._presentation_inbox, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._presentation_location, CC.FLAGS_CENTER_PERPENDICULAR )
#
QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.addStretch( 1 )
self.widget().setLayout( vbox )
vbox.addStretch( 1 )
#
self._presentation_status.currentIndexChanged.connect( self._UpdateInboxChoices )


@@ -522,6 +522,7 @@ class COLUMN_LIST_LOCAL_BOORU_SHARES( COLUMN_LIST_DEFINITION ):
INFO = 1
EXPIRES = 2
FILES = 3
column_list_type_name_lookup[ COLUMN_LIST_LOCAL_BOORU_SHARES.ID ] = 'local booru shares'


@@ -108,15 +108,15 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
gridbox = QP.GridLayout( cols = 3 )
QP.AddToLayout( gridbox, self._button_7 )
QP.AddToLayout( gridbox, self._button_8 )
QP.AddToLayout( gridbox, self._button_9 )
QP.AddToLayout( gridbox, self._button_4 )
QP.AddToLayout( gridbox, self._button_5 )
QP.AddToLayout( gridbox, self._button_6 )
QP.AddToLayout( gridbox, self._button_1 )
QP.AddToLayout( gridbox, self._button_2 )
QP.AddToLayout( gridbox, self._button_3 )
QP.AddToLayout( gridbox, self._button_7, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_8, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_9, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_4, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_5, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_6, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_1, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_2, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( gridbox, self._button_3, CC.FLAGS_EXPAND_BOTH_WAYS )
self.setLayout( gridbox )

View File

@ -1670,9 +1670,7 @@ class PanelPredicateSystemMime( PanelPredicateSystemSingle ):
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText( self, 'system:filetype' ), CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._mimes, CC.FLAGS_CENTER_PERPENDICULAR )
hbox.addStretch( 1 )
QP.AddToLayout( hbox, self._mimes, CC.FLAGS_EXPAND_BOTH_WAYS )
self.setLayout( hbox )

View File

@ -413,12 +413,16 @@ class EditPredicatesPanel( ClientGUIScrolledPanels.EditPanel ):
QP.AddToLayout( vbox, button, CC.FLAGS_EXPAND_PERPENDICULAR )
stretch_needed = True
for panel in self._editable_pred_panels:
if isinstance( panel, ClientGUIPredicatesOR.ORPredicateControl ):
if isinstance( panel, ( ClientGUIPredicatesOR.ORPredicateControl, ClientGUIPredicatesSingle.PanelPredicateSystemMime ) ):
flags = CC.FLAGS_EXPAND_BOTH_WAYS
stretch_needed = False
else:
flags = CC.FLAGS_EXPAND_PERPENDICULAR
@ -427,6 +431,11 @@ class EditPredicatesPanel( ClientGUIScrolledPanels.EditPanel ):
QP.AddToLayout( vbox, panel, flags )
if stretch_needed:
vbox.addStretch( 1 )
self.widget().setLayout( vbox )
@ -810,20 +819,34 @@ class FleshOutPredicatePanel( ClientGUIScrolledPanels.EditPanel ):
for button in static_pred_buttons:
preds = button.GetPredicates()
QP.AddToLayout( page_vbox, button, CC.FLAGS_EXPAND_PERPENDICULAR )
button.predicatesChosen.connect( self.StaticButtonClicked )
button.predicatesRemoved.connect( self.StaticRemoveButtonClicked )
stretch_needed = True
for panel in editable_pred_panels:
QP.AddToLayout( page_vbox, panel, CC.FLAGS_EXPAND_PERPENDICULAR )
if isinstance( panel, self._PredOKPanel ) and isinstance( panel.GetPredicatePanel(), ClientGUIPredicatesSingle.PanelPredicateSystemMime ):
flags = CC.FLAGS_EXPAND_BOTH_WAYS
stretch_needed = False
else:
flags = CC.FLAGS_EXPAND_PERPENDICULAR
QP.AddToLayout( page_vbox, panel, flags )
page_vbox.addStretch( 1 )
if stretch_needed:
page_vbox.addStretch( 1 )
page_panel.setLayout( page_vbox )
@ -946,6 +969,11 @@ class FleshOutPredicatePanel( ClientGUIScrolledPanels.EditPanel ):
self._parent.SubPanelOK( predicates )
def GetPredicatePanel( self ):
return self._predicate_panel
def keyPressEvent( self, event ):
( modifier, key ) = ClientGUIShortcuts.ConvertKeyEventToSimpleTuple( event )

View File

@ -350,7 +350,7 @@ class EditClientServicePanel( ClientGUIScrolledPanels.EditPanel ):
self._panels.append( EditServiceTagSubPanel( self, self._dictionary ) )
if self._service_type in ( HC.CLIENT_API_SERVICE, HC.LOCAL_BOORU ):
if self._service_type == HC.CLIENT_API_SERVICE:
self._panels.append( EditServiceClientServerSubPanel( self, self._service_type, self._dictionary ) )
@ -1105,12 +1105,7 @@ class EditServiceClientServerSubPanel( ClientGUICommon.StaticBox ):
self._client_server_options_panel = ClientGUICommon.StaticBox( self, 'options' )
if service_type == HC.LOCAL_BOORU:
name = 'local booru'
default_port = 45868
elif service_type == HC.CLIENT_API_SERVICE:
if service_type == HC.CLIENT_API_SERVICE:
name = 'client api'
default_port = 45869
@ -1182,7 +1177,7 @@ class EditServiceClientServerSubPanel( ClientGUICommon.StaticBox ):
rows.append( ( 'normie-friendly welcome page', self._use_normie_eris ) )
rows.append( ( 'upnp port', self._upnp ) )
if service_type == HC.LOCAL_BOORU:
if False: # some old local booru gubbins--maybe delete?
rows.append( ( 'scheme (http/https) override when copying external links', self._external_scheme_override ) )
rows.append( ( 'host override when copying external links', self._external_host_override ) )
@ -4046,7 +4041,6 @@ class ReviewServicesPanel( ClientGUIScrolledPanels.ReviewPanel ):
elif service_type == HC.LOCAL_RATING_LIKE: service_type_name = 'like/dislike ratings'
elif service_type == HC.LOCAL_RATING_NUMERICAL: service_type_name = 'numerical ratings'
elif service_type == HC.LOCAL_RATING_INCDEC: service_type_name = 'inc/dec ratings'
elif service_type == HC.LOCAL_BOORU: service_type_name = 'booru'
elif service_type == HC.CLIENT_API_SERVICE: service_type_name = 'client api'
elif service_type == HC.IPFS: service_type_name = 'ipfs'
else: continue

View File

@ -61,14 +61,7 @@ from hydrus.client.search import ClientSearchAutocomplete
from hydrus.client.search import ClientSearchParseSystemPredicates
from hydrus.client.gui import ClientGUIPopupMessages
LOCAL_BOORU_INT_PARAMS = set()
LOCAL_BOORU_BYTE_PARAMS = { 'share_key', 'hash' }
LOCAL_BOORU_STRING_PARAMS = set()
LOCAL_BOORU_JSON_PARAMS = set()
LOCAL_BOORU_JSON_BYTE_LIST_PARAMS = set()
# if a variable name isn't defined here, a GET with it won't work
CLIENT_API_INT_PARAMS = { 'file_id', 'file_sort_type', 'potentials_search_type', 'pixel_duplicates', 'max_hamming_distance', 'max_num_pairs' }
CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'service_key', 'Hydrus-Client-API-Access-Key', 'Hydrus-Client-API-Session-Key', 'file_service_key', 'deleted_file_service_key', 'tag_service_key', 'tag_service_key_1', 'tag_service_key_2', 'rating_service_key', 'job_status_key' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'search', 'service_name', 'reason', 'tag_display_type', 'source_hash_type', 'desired_hash_type' }
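The comment above states the contract: every GET parameter name must be registered in one of these typed whitelists, or parsing simply drops it. A minimal standalone sketch of that idea (the parameter names and set names here are illustrative, not hydrus's real tables):

```python
# Hypothetical sketch of the typed GET-arg whitelist pattern: each known
# parameter is registered with a type, and anything unregistered is
# discarded, so a GET using it 'won't work'.
INT_PARAMS = { 'file_id' }
BYTE_PARAMS = { 'hash' } # hashes arrive hex-encoded
STRING_PARAMS = { 'name' }

def parse_get_args( raw_args: dict ) -> dict:
    
    parsed = {}
    
    for ( name, value ) in raw_args.items():
        
        if name in INT_PARAMS:
            
            parsed[ name ] = int( value )
            
        elif name in BYTE_PARAMS:
            
            parsed[ name ] = bytes.fromhex( value )
            
        elif name in STRING_PARAMS:
            
            parsed[ name ] = value
            
        
        # unregistered names fall through and are silently dropped
        
    
    return parsed
```

The payoff of this design is that a typo'd or unsupported parameter never reaches a handler as an untyped string; it just vanishes at the parse step.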
@ -273,13 +266,6 @@ def GetServiceKeyFromName( service_name: str ):
return service_key
def ParseLocalBooruGETArgs( requests_args ):
args = HydrusNetworkVariableHandling.ParseTwistedRequestGETArgs( requests_args, LOCAL_BOORU_INT_PARAMS, LOCAL_BOORU_BYTE_PARAMS, LOCAL_BOORU_STRING_PARAMS, LOCAL_BOORU_JSON_PARAMS, LOCAL_BOORU_JSON_BYTE_LIST_PARAMS )
return args
def ParseClientLegacyArgs( args: dict ):
# adding this v514, so delete when appropriate
@ -975,285 +961,6 @@ def ConvertTagListToPredicates( request, tag_list, do_permission_check = True, e
return predicates
class HydrusResourceBooru( HydrusServerResources.HydrusResource ):
def _callbackParseGETArgs( self, request: HydrusServerRequest.HydrusRequest ):
parsed_request_args = ParseLocalBooruGETArgs( request.args )
request.parsed_request_args = parsed_request_args
return request
def _callbackParsePOSTArgs( self, request: HydrusServerRequest.HydrusRequest ):
return request
def _reportDataUsed( self, request, num_bytes ):
self._service.ReportDataUsed( num_bytes )
def _checkService( self, request: HydrusServerRequest.HydrusRequest ):
HydrusServerResources.HydrusResource._checkService( self, request )
if not self._service.BandwidthOK():
raise HydrusExceptions.BandwidthException( 'This service has run out of bandwidth. Please try again later.' )
class HydrusResourceBooruFile( HydrusResourceBooru ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
share_key = request.parsed_request_args[ 'share_key' ]
hash = request.parsed_request_args[ 'hash' ]
is_attachment = request.parsed_request_args.GetValue( 'download', bool, default_value = False )
CG.client_controller.local_booru_manager.CheckFileAuthorised( share_key, hash )
media_result = CG.client_controller.local_booru_manager.GetMediaResult( share_key, hash )
try:
mime = media_result.GetMime()
path = CG.client_controller.client_files_manager.GetFilePath( hash, mime )
except HydrusExceptions.FileMissingException:
raise HydrusExceptions.NotFoundException( 'Could not find that file!' )
response_context = HydrusServerResources.ResponseContext( 200, mime = mime, path = path, is_attachment = is_attachment )
return response_context
class HydrusResourceBooruGallery( HydrusResourceBooru ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
# in future, make this a standard frame with a search key that'll load xml or yaml AJAX stuff
# with file info included, so the page can sort and whatever
share_key = request.parsed_request_args.GetValue( 'share_key', bytes )
local_booru_manager = CG.client_controller.local_booru_manager
local_booru_manager.CheckShareAuthorised( share_key )
( name, text, timeout, media_results ) = local_booru_manager.GetGalleryInfo( share_key )
body = '''<html>
<head>'''
if name == '': body += '''
<title>hydrus network local booru share</title>'''
else: body += '''
<title>''' + name + '''</title>'''
body += '''
<link href="hydrus.ico" rel="shortcut icon" />
<link href="style.css" rel="stylesheet" type="text/css" />'''
( thumbnail_width, thumbnail_height ) = HC.options[ 'thumbnail_dimensions' ]
body += '''
<style>
.thumbnail_container { width: ''' + str( thumbnail_width ) + '''px; height: ''' + str( thumbnail_height ) + '''px; }
</style>'''
body += '''
</head>
<body>'''
body += '''
<div class="timeout">This share ''' + HydrusTime.TimestampToPrettyExpires( timeout ) + '''.</div>'''
if name != '': body += '''
<h3>''' + name + '''</h3>'''
if text != '':
newline = '''</p>
<p>'''
body += '''
<p>''' + text.replace( '\n', newline ).replace( '\n', newline ) + '''</p>'''
body+= '''
<div class="media">'''
for media_result in media_results:
hash = media_result.GetHash()
mime = media_result.GetMime()
# if mime in flash or pdf or whatever, get other thumbnail
body += '''
<span class="thumbnail">
<span class="thumbnail_container">
<a href="page?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''">
<img src="thumbnail?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''" />
</a>
</span>
</span>'''
body += '''
</div>
<div class="footer"><a href="https://hydrusnetwork.github.io/hydrus/">hydrus network</a></div>
</body>
</html>'''
response_context = HydrusServerResources.ResponseContext( 200, mime = HC.TEXT_HTML, body = body )
return response_context
class HydrusResourceBooruPage( HydrusResourceBooru ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
share_key = request.parsed_request_args.GetValue( 'share_key', bytes )
hash = request.parsed_request_args.GetValue( 'hash', bytes )
local_booru_manager = CG.client_controller.local_booru_manager
local_booru_manager.CheckFileAuthorised( share_key, hash )
( name, text, timeout, media_result ) = local_booru_manager.GetPageInfo( share_key, hash )
body = '''<html>
<head>'''
if name == '': body += '''
<title>hydrus network local booru share</title>'''
else: body += '''
<title>''' + name + '''</title>'''
body += '''
<link href="hydrus.ico" rel="shortcut icon" />
<link href="style.css" rel="stylesheet" type="text/css" />'''
body += '''
</head>
<body>'''
body += '''
<div class="timeout">This share ''' + HydrusTime.TimestampToPrettyExpires( timeout ) + '''.</div>'''
if name != '': body += '''
<h3>''' + name + '''</h3>'''
if text != '':
newline = '''</p>
<p>'''
body += '''
<p>''' + text.replace( '\n', newline ).replace( '\n', newline ) + '''</p>'''
body+= '''
<div class="media">'''
mime = media_result.GetMime()
if mime in HC.IMAGES or mime in HC.VIEWABLE_ANIMATIONS:
( width, height ) = media_result.GetResolution()
body += '''
<img width="''' + str( width ) + '''" height="''' + str( height ) + '''" src="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''" />'''
elif mime in HC.VIDEO:
( width, height ) = media_result.GetResolution()
body += '''
<video width="''' + str( width ) + '''" height="''' + str( height ) + '''" controls="" loop="" autoplay="" src="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''" />
<p><a href="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''">link to ''' + HC.mime_string_lookup[ mime ] + ''' file</a></p>'''
elif mime == HC.APPLICATION_FLASH:
( width, height ) = media_result.GetResolution()
body += '''
<embed width="''' + str( width ) + '''" height="''' + str( height ) + '''" src="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''" />
<p><a href="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''">link to ''' + HC.mime_string_lookup[ mime ] + ''' file</a></p>'''
else:
body += '''
<p><a href="file?share_key=''' + share_key.hex() + '''&hash=''' + hash.hex() + '''">link to ''' + HC.mime_string_lookup[ mime ] + ''' file</a></p>'''
body += '''
</div>
<div class="footer"><a href="https://hydrusnetwork.github.io/hydrus/">hydrus network</a></div>
</body>
</html>'''
response_context = HydrusServerResources.ResponseContext( 200, mime = HC.TEXT_HTML, body = body )
return response_context
class HydrusResourceBooruThumbnail( HydrusResourceBooru ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
share_key = request.parsed_request_args.GetValue( 'share_key', bytes )
hash = request.parsed_request_args.GetValue( 'hash', bytes )
local_booru_manager = CG.client_controller.local_booru_manager
local_booru_manager.CheckFileAuthorised( share_key, hash )
media_result = local_booru_manager.GetMediaResult( share_key, hash )
mime = media_result.GetMime()
response_context_mime = HC.IMAGE_PNG
if mime in HC.MIMES_WITH_THUMBNAILS:
try:
path = CG.client_controller.client_files_manager.GetThumbnailPath( media_result )
if not os.path.exists( path ):
# not _supposed_ to happen, but it seems in odd situations it can
raise HydrusExceptions.FileMissingException()
except HydrusExceptions.FileMissingException:
path = HydrusFileHandling.mimes_to_default_thumbnail_paths[ mime ]
else:
path = HydrusFileHandling.mimes_to_default_thumbnail_paths[ mime ]
response_mime = HydrusFileHandling.GetThumbnailMime( path )
response_context = HydrusServerResources.ResponseContext( 200, mime = response_mime, path = path )
return response_context
class HydrusResourceClientAPI( HydrusServerResources.HydrusResource ):
BLOCKED_WHEN_BUSY = True

View File

@ -105,7 +105,7 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 573
SOFTWARE_VERSION = 574
CLIENT_API_VERSION = 64
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -458,7 +458,7 @@ LOCAL_FILE_SERVICES = SPECIFIC_LOCAL_FILE_SERVICES + ( COMBINED_LOCAL_FILE, COMB
LOCAL_FILE_SERVICES_IN_NICE_ORDER = ( LOCAL_FILE_DOMAIN, COMBINED_LOCAL_MEDIA, LOCAL_FILE_TRASH_DOMAIN, LOCAL_FILE_UPDATE_DOMAIN, COMBINED_LOCAL_FILE )
LOCAL_TAG_SERVICES = ( LOCAL_TAG, )
LOCAL_SERVICES = LOCAL_FILE_SERVICES + LOCAL_TAG_SERVICES + ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, LOCAL_RATING_INCDEC, LOCAL_BOORU, LOCAL_NOTES, CLIENT_API_SERVICE )
LOCAL_SERVICES = LOCAL_FILE_SERVICES + LOCAL_TAG_SERVICES + ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, LOCAL_RATING_INCDEC, LOCAL_NOTES, CLIENT_API_SERVICE )
STAR_RATINGS_SERVICES = ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, RATING_LIKE_REPOSITORY, RATING_NUMERICAL_REPOSITORY )
RATINGS_SERVICES = ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, LOCAL_RATING_INCDEC, RATING_LIKE_REPOSITORY, RATING_NUMERICAL_REPOSITORY )

View File

@ -108,17 +108,12 @@ def VacuumDB( db_path ):
c.execute( 'PRAGMA journal_mode = TRUNCATE;' )
if HC.PLATFORM_WINDOWS:
ideal_page_size = 4096
else:
ideal_page_size = 1024
# this used to be 1024 for Linux users, so we do want to check and coerce back to SQLite default, 4096
( page_size, ) = c.execute( 'PRAGMA page_size;' ).fetchone()
ideal_page_size = 4096
if page_size != ideal_page_size:
c.execute( 'PRAGMA journal_mode = TRUNCATE;' )
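As the comment in the hunk above notes, Linux installs used to get a 1024-byte page size, and this change coerces everything back to SQLite's 4096 default. The key mechanic is that `PRAGMA page_size` only takes effect on the next VACUUM, and the page size cannot change while the database is in WAL journal mode. A standalone sketch of that coercion, assuming a plain `sqlite3` connection (the function name is illustrative):

```python
import sqlite3

def coerce_page_size( db_path, ideal_page_size = 4096 ):
    
    # autocommit mode: VACUUM cannot run inside an open transaction
    db = sqlite3.connect( db_path, isolation_level = None )
    
    try:
        
        ( page_size, ) = db.execute( 'PRAGMA page_size;' ).fetchone()
        
        if page_size != ideal_page_size:
            
            # page_size cannot change under WAL, so drop to a rollback journal first
            db.execute( 'PRAGMA journal_mode = TRUNCATE;' )
            
            # this only marks the desired size; the VACUUM rebuild applies it
            db.execute( 'PRAGMA page_size = {};'.format( ideal_page_size ) )
            
            db.execute( 'VACUUM;' )
            
        
    finally:
        
        db.close()
        
    
```

A database created at 1024-byte pages and run through this comes out rebuilt at 4096.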
@ -131,6 +126,7 @@ def VacuumDB( db_path ):
c.execute( 'PRAGMA journal_mode = {};'.format( HG.db_journal_mode ) )
class HydrusDB( HydrusDBBase.DBBase ):
READ_WRITE_ACTIONS = []
@ -678,6 +674,14 @@ class HydrusDB( HydrusDBBase.DBBase ):
module.Repair( version, self._cursor_transaction_wrapper )
if HG.controller.LastShutdownWasBad():
for module in self._modules:
module.DoLastShutdownWasBadWork()
def _ReportOverupdatedDB( self, version ):
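The new `LastShutdownWasBad` hook above gives every DB module a chance to run recovery work (such as the local hashes cache resync from this release's changelog) when the previous session died uncleanly. One common way to detect that — shown here purely as an illustration, hydrus's actual detection is not part of this hunk — is a sentinel file written at boot and removed at clean shutdown:

```python
import os

class DBModule( object ):
    
    def DoLastShutdownWasBadWork( self ):
        
        pass # subclasses override with their recovery/resync work
        
    

class LocalHashesCacheModule( DBModule ):
    
    def __init__( self ):
        
        self.resynced = False
        
    
    def DoLastShutdownWasBadWork( self ):
        
        self.resynced = True # e.g. resync cache tables against the master hash table
        
    

class Controller( object ):
    
    def __init__( self, sentinel_path ):
        
        self._sentinel_path = sentinel_path
        
        # if the sentinel survived, the last session never shut down cleanly
        self._last_shutdown_was_bad = os.path.exists( sentinel_path )
        
        open( sentinel_path, 'w' ).close()
        
    
    def LastShutdownWasBad( self ):
        
        return self._last_shutdown_was_bad
        
    
    def BootDB( self, modules ):
        
        if self.LastShutdownWasBad():
            
            for module in modules:
                
                module.DoLastShutdownWasBadWork()
                
            
        
    
    def ShutdownClean( self ):
        
        os.remove( self._sentinel_path )
        
    
```

The nice property is that recovery is opt-in per module: modules with nothing to repair inherit the no-op default.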

View File

@ -68,6 +68,11 @@ class HydrusDBModule( HydrusDBBase.DBBase ):
self._Execute( create_query_without_name.format( table_name ) )
def _DoLastShutdownWasBadWork( self ):
pass
def _GetCriticalTableNames( self ) -> typing.Collection[ str ]:
return set()
@ -162,6 +167,11 @@ class HydrusDBModule( HydrusDBBase.DBBase ):
def DoLastShutdownWasBadWork( self ):
self._DoLastShutdownWasBadWork()
def GetExpectedServiceTableNames( self ) -> typing.Collection[ str ]:
table_generation_dict = self._GetServicesTableGenerationDict()

View File

@ -41,9 +41,14 @@ def SetEnableLoadTruncatedImages( value: bool ):
if hasattr( PILImageFile, 'LOAD_TRUNCATED_IMAGES' ):
# this can now cause load hangs due to the trunc load code adding infinite fake EOFs to the file stream, wew lad
# hence debug only
PILImageFile.LOAD_TRUNCATED_IMAGES = value
if PILImageFile.LOAD_TRUNCATED_IMAGES != value:
# this has previously caused load hangs due to the trunc load code adding infinite fake EOFs to the file stream, wew lad
PILImageFile.LOAD_TRUNCATED_IMAGES = value
HG.controller.pub( 'clear_image_cache' )
HG.controller.pub( 'clear_image_tile_cache' )
return True

View File

@ -9,6 +9,7 @@ from PIL import ImageCms as PILImageCms
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core.files.images import HydrusImageColours
from hydrus.core.files.images import HydrusImageMetadata
@ -21,6 +22,21 @@ except:
PIL_SRGB_PROFILE = PILImageCms.createProfile( 'sRGB' )
DO_ICC_PROFILE_NORMALISATION = True
def SetDoICCProfileNormalisation( value: bool ):
global DO_ICC_PROFILE_NORMALISATION
if value != DO_ICC_PROFILE_NORMALISATION:
DO_ICC_PROFILE_NORMALISATION = value
HG.controller.pub( 'clear_image_cache' )
HG.controller.pub( 'clear_image_tile_cache' )
def NormaliseNumPyImageToUInt8( numpy_image: numpy.array ):
if numpy_image.dtype == numpy.uint16:
@ -110,7 +126,7 @@ def DequantizeFreshlyLoadedNumPyImage( numpy_image: numpy.array ) -> numpy.array
def DequantizePILImage( pil_image: PILImage.Image ) -> PILImage.Image:
if HydrusImageMetadata.HasICCProfile( pil_image ):
if HydrusImageMetadata.HasICCProfile( pil_image ) and DO_ICC_PROFILE_NORMALISATION:
try:

View File

@ -29,14 +29,9 @@ class TestManagers( unittest.TestCase ):
repo = ClientServices.GenerateService( repo_key, repo_type, repo_name )
other_key = HydrusData.GenerateKey()
other = ClientServices.GenerateService( other_key, HC.LOCAL_BOORU, 'booru' )
services = []
services.append( repo )
services.append( other )
HG.test_controller.SetRead( 'services', services )
@ -48,8 +43,6 @@ class TestManagers( unittest.TestCase ):
test_service( service, repo_key, repo_type, repo_name )
service = services_manager.GetService( other_key )
#
services = services_manager.GetServices( ( HC.TAG_REPOSITORY, ) )
@ -62,13 +55,11 @@ class TestManagers( unittest.TestCase ):
services = []
services.append( repo )
HG.test_controller.SetRead( 'services', services )
services_manager.RefreshServices()
self.assertRaises( Exception, services_manager.GetService, other_key )
self.assertRaises( Exception, services_manager.GetService, repo_key )
def test_undo( self ):

View File

@ -2023,7 +2023,7 @@ class TestClientDB( unittest.TestCase ):
#
NUM_DEFAULT_SERVICES = 14
NUM_DEFAULT_SERVICES = 13
services = self._read( 'services' )

View File

@ -218,7 +218,6 @@ class Controller( object ):
self._name_read_responses = {}
self._name_read_responses[ 'local_booru_share_keys' ] = []
self._name_read_responses[ 'messaging_sessions' ] = []
self._name_read_responses[ 'options' ] = ClientDefaults.GetClientDefaultOptions()
self._name_read_responses[ 'file_system_predicates' ] = []
@ -674,6 +673,11 @@ class Controller( object ):
return False
def LastShutdownWasBad( self ):
return False
def PageAlive( self, page_key ):
return False


Binary file not shown.

After

Width: | Height: | Size: 2.5 KiB

Binary file not shown.

After

Width: | Height: | Size: 2.5 KiB