Version 457
This commit is contained in:
parent 6cd2995275
commit 61ea185821

Binary file not shown. (new image, 15 KiB)
@@ -10,16 +10,6 @@
 <p>I am always changing and adding little things. The best way to learn is just to look around. If you think a shortcut should probably do something, try it out! If you can't find something, let me know and I'll try to add it!</p>
 <h3 id="advanced_mode"><a href="#advanced_mode">advanced mode</a></h3>
 <p>To avoid confusing clutter, several advanced menu items and buttons are hidden by default. When you are comfortable with the program, hit <i>help->advanced mode</i> to reveal them!</p>
-<h3 id="wildcards"><a href="#wildcards">searching with wildcards</a></h3>
-<p>The autocomplete tag dropdown supports wildcard searching with '*'.</p>
-<p><img src="wildcard_gelion.png"/></p>
-<p>The '*' will match any number of characters. Every normal autocomplete search has a secret '*' on the end that you don't see, which is how full words get matched from you only typing in a few letters.</p>
-<p>This is useful when you can only remember part of a word, or can't spell part of it. You can put '*' characters anywhere, but you should experiment to get used to the exact way these searches work. Some results can be surprising!</p>
-<p><img src="wildcard_vage.png"/></p>
-<p>You can select the special predicate inserted at the top of your autocomplete results (the highlighted '*gelion' and '*va*ge*' above). <b>It will return all files that match that wildcard,</b> i.e. every file for every other tag in the dropdown list.</p>
-<p>This is particularly useful if you have a number of files with commonly structured over-informationed tags, like this:</p>
-<p><img src="wildcard_cool_pic.png"/></p>
-<p>In this case, selecting the 'title:cool pic*' predicate will return all three images in the same search, where you can conveniently give them some more-easily searched tags like 'series:cool pic' and 'page:1', 'page:2', 'page:3'.</p>
 <h3 id="exclude_deleted_files"><a href="#exclude_deleted_files">exclude deleted files</a></h3>
 <p>In the client's options is a checkbox to exclude deleted files. It recurs pretty much anywhere you can import, under 'import file options'. If you select this, any file you ever deleted will be excluded from all future remote searches and import operations. This can stop you from importing/downloading and filtering out the same bad files several times over. The default is off. You may wish to have it set one way most of the time, but switch it the other just for one specific import or search.</p>
 <h3 id="ime"><a href="#ime">inputting non-english languages</a></h3>
@@ -8,6 +8,38 @@
 <div class="content">
 <h3 id="changelog"><a href="#changelog">changelog</a></h3>
 <ul>
+<li><h3 id="version_457"><a href="#version_457">version 457</a></h3></li>
+<ul>
+<li>smoother menubar updates:</li>
+<li>improved the way the menubar menus update. rather than generating a whole new (e.g. 'pages') menu and replacing the existing out of date one, now there is a static menu skeleton that has subsections or labels updated in place. this means fewer objects changing, less flicker/jank, and should allow you to upload pending even if you have, say, a bunch of subscriptions running</li>
+<li>.</li>
+<li>misc:</li>
+<li>thanks to a user's help, the filetype parser now detects pngs (this mostly happens during import) much faster! the problem previously was determining if a png is actually an apng--figuring out if they are truly apngs is now done with very fast file header scanning, rather than the previous method that booted ffmpeg. this brings filetype parse time for pngs down from 50-150ms to 1-2ms</li>
+<li>getting apng metadata is also now faster. num_frames is now pulled from the file header; it no longer has to be manually counted by ffmpeg</li>
+<li>clicking the session weight item in the 'pages' menu now gives you more detailed info on your session weight, including on currently closed pages in the undo list</li>
+<li>stripped out a lot of ancient wx-era safety code that stops the client from doing certain UI work while it is minimised or minimised to tray. also brushed up some ugly update routines for menus refresh and modal message presentation that could lead to a pile-up of updates as soon as the client was unminimised, causing lag or freezes. with luck, the client should be better about restoring itself from minimised to system tray. if you minimise to tray, feedback on how this works out for you would be appreciated</li>
+<li>when a network job stalls with the 'this domain had some errors recently' message, the cog menu on the widget now allows you to 'scrub domain errors' and try again immediately</li>
+<li>if your search has system:limit, then any tag search you type in the autocomplete will now search the database, not your thumbnails. previously, the hack to enable this behaviour was to flip 'searching immediately' off. let's see if this new behaviour is ultimately confusing/annoying, I am mixed on it and think this subtle search option needs more thought and UI to make it more obvious and user friendly</li>
+<li>if you have autocomplete tag search typed, and results from thumbnails displayed, and you flip 'searching immediately' off, the search will now automatically update and give you full database numbers immediately</li>
+<li>.</li>
+<li>help:</li>
+<li>I moved 'searching with wildcards' from the advanced help to the 'more getting started with files' help here: https://hydrusnetwork.github.io/hydrus/help/getting_started_more_files.html</li>
+<li>I also wrote a more detailed description of what the autocomplete dropdown buttons do in that page</li>
+<li>I also wrote a brief description of how a system:limit query will try to clip according to the current file sort, getting the n 'biggest files' and so on</li>
+<li>.</li>
+<li>boring code cleanup:</li>
+<li>cleaned some network job widget update calls</li>
+<li>improved some misc autocomplete search status tracking</li>
+<li>improved some account object permission checking and tests. accounts now never say they have permissions (e.g. if you click the 'see account permissions' button on review services) if they are banned or expired</li>
+<li>file and pages menus now use the new update routine</li>
+<li>pending menu now uses the new update routine, with an emphasis on anti-jitter so you can interact while it is updating</li>
+<li>database, network, service, and undo menus now use newer async update code and also use the new update routine</li>
+<li>cleaned up help and tags menu init code</li>
+<li>the signal that causes the pending menu to update is now only sent on tag changes if the tag service is a repository (previously, local-only updates were janking this for no reason)</li>
+<li>the pending menu now updates its sibling/parent numbers when repository processing causes a clever row change to stuff you have pending</li>
+<li>also, some menubar items that only show when in advanced mode now update their visibility when advanced mode is flipped on or off</li>
+<li>misc menubar code cleanup and improvements</li>
+</ul>
 <li><h3 id="version_456"><a href="#version_456">version 456</a></h3></li>
 <ul>
 <li>misc:</li>
@@ -7,6 +7,47 @@
 <body>
 <div class="content">
 <p><a href="getting_started_files.html"><---- Back</a></p>
+<h3 id="wildcards"><a href="#wildcards">searching with wildcards</a></h3>
+<p>The autocomplete tag dropdown supports wildcard searching with '*'.</p>
+<p><img src="wildcard_gelion.png"/></p>
+<p>The '*' will match any number of characters. Every normal autocomplete search has a secret '*' on the end that you don't see, which is how full words get matched from you only typing in a few letters.</p>
+<p>This is useful when you can only remember part of a word, or can't spell part of it. You can put '*' characters anywhere, but you should experiment to get used to the exact way these searches work. Some results can be surprising!</p>
+<p><img src="wildcard_vage.png"/></p>
+<p>You can select the special predicate inserted at the top of your autocomplete results (the highlighted '*gelion' and '*va*ge*' above). <b>It will return all files that match that wildcard,</b> i.e. every file for every other tag in the dropdown list.</p>
+<p>This is particularly useful if you have a number of files with commonly structured over-informationed tags, like this:</p>
+<p><img src="wildcard_cool_pic.png"/></p>
+<p>In this case, selecting the 'title:cool pic*' predicate will return all three images in the same search, where you can conveniently give them some more-easily searched tags like 'series:cool pic' and 'page:1', 'page:2', 'page:3'.</p>
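As a rough illustration of that secret trailing '*' (a sketch using Python's fnmatch glob matching, not hydrus's actual search code):

```python
from fnmatch import fnmatchcase

def wildcard_matches( tag: str, search_text: str ) -> bool:
    
    # every autocomplete search gets a hidden '*' appended, which is
    # why typing just 'evan' already matches the full tag 'evangelion'
    return fnmatchcase( tag, search_text + '*' )

print( wildcard_matches( 'evangelion', 'evan' ) )     # True
print( wildcard_matches( 'evangelion', '*gelion' ) )  # True
print( wildcard_matches( 'savage', '*va*ge*' ) )      # True
print( wildcard_matches( 'evangelion', 'gelion' ) )   # False, no leading '*'
```

Note the last case: without an explicit leading '*', the search text has to match from the start of the tag, which is one of the 'surprising results' mentioned above.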
<h3 id="more_searching"><a href="#more_searching">more searching</a></h3>
|
||||
<p>Let's look at the tag autocomplete dropdown again:</p>
|
||||
<p><img src="ac_dropdown.png"/></p>
|
||||
<ul>
|
||||
<li>
|
||||
<p><b>favourite searches star</b></p>
|
||||
<p>Once you get experience with the client, have a play with this. Rather than leaving common search pages open, save them in here and load them up as needed. You will keep your client lightweight and save time.</p>
|
||||
</li>
|
||||
<li>
|
||||
<p><b>include current/pending tags</b></p>
|
||||
<p>Turn these on and off to control whether tag <i>search predicates</i> apply to tags the exist, or limit just to those pending to be uploaded to a tag repository. Just searching 'pending' tags is useful if you want to scan what you have pending to go up to the PTR--just turn off 'current' tags and search 'system:num tags > 0'.</p>
|
||||
</li>
|
||||
<li>
|
||||
<p><b>searching immediately</b></p>
|
||||
<p>This controls whether a change to the search tags will instantly run the new search and get new results. Turning this off is helpful if you want to add, remove, or replace several heavy search terms in a row without getting UI lag.</p>
|
||||
</li>
|
||||
<li>
|
||||
<p><b>OR</b></p>
|
||||
<p>You only see this if you have 'advanced mode' on. It is an experimental module. Have a play with it--it lets you enter some pretty complicated tags!</p>
|
||||
</li>
|
||||
<li>
|
||||
<p><b>file/tag domains</b></p>
|
||||
<p>By default, you will search in 'my files' and 'all known tags' domain. This is the intersection of your local media files (on your hard disk) and the union of all known tag searches. If you search for 'character:samus aran', then you will get file results from your 'my files' domain that have 'character:samus aran' in any tag service. For most purposes, this search domain is fine, but as you use the client more, you may want to access different search domains.</p>
|
||||
<p>For instance, if you change the file domain to 'trash', then you will instead get files that are in your trash. Setting the tag domain to 'my tags' will ignore other tag services (e.g. the PTR) for all tag search predicates, so a 'system:num_tags' or a 'character:samus aran' will only look 'my tags'.</p>
|
||||
<p>Turning on 'advanced mode' gives access to more search domains. Some of them are subtly complicated and only useful for clever jobs--most of the time, you still want 'my files' and 'all known tags'.</p>
|
||||
</li>
|
||||
</ul>
|
||||
<h3 id="sorting_with_system_limit"><a href="#sorting_with_system_limit">sorting with system limit</a></h3>
|
||||
<p>If you add system:limit to a search, the client will consider what that page's file sort currently is. If it is simple enough--something like file size or import time--then it will sort your results before they come back and clip the limit according to that sort, getting the n 'largest file size' or 'newest imports' and so on. This can be a great way to set up a lightweight filtering page for 'the 256 biggest videos in my inbox'.</p>
|
||||
<p>If you change the sort, hydrus will not refresh the search, it'll just re-sort the n files you have. Hit F5 to refresh the search with a new sort.</p>
|
||||
<p>Not all sorts are supported. Anything complicated like tag sort will result in a random sample instead.</p>
|
||||
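The clip-by-sort behaviour can be sketched in a few lines (hypothetical file records, not hydrus's database schema): with a simple sort like file size, a limit of n returns the n largest files rather than a random n.

```python
import heapq

files = [
    { 'name': 'a.webm', 'size': 900 },
    { 'name': 'b.webm', 'size': 150 },
    { 'name': 'c.webm', 'size': 500 },
    { 'name': 'd.webm', 'size': 700 },
]

# system:limit=2 with sort 'file size, largest first' clips to the
# two biggest files before the results come back
limited = heapq.nlargest( 2, files, key = lambda f: f[ 'size' ] )

print( [ f[ 'name' ] for f in limited ] )  # ['a.webm', 'd.webm']
```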
 <h3 id="intro"><a href="#intro">exporting and uploading</a></h3>
 <p>There are many ways to export files from the client:</p>
 <ul>
@@ -787,6 +787,11 @@ class FileSystemPredicates( object ):
     return self._has_system_everything
 
+def HasSystemLimit( self ):
+    
+    return self._limit is not None
 
 def MustBeArchive( self ): return self._archive
 
 def MustBeInbox( self ): return self._inbox
@@ -12250,7 +12250,14 @@ class DB( HydrusDB.HydrusDB ):
     self.modules_files_storage.RescindPendFiles( service_id, hash_ids )
 
-    notify_new_pending = True
+    if service_key == CC.COMBINED_LOCAL_FILE_SERVICE_KEY:
+        notify_new_downloads = True
+    else:
+        notify_new_pending = True
 
 elif action == HC.CONTENT_UPDATE_RESCIND_PETITION:

@@ -12439,6 +12446,11 @@ class DB( HydrusDB.HydrusDB ):
     changed_parent_tag_ids.update( ( child_tag_id, parent_tag_id ) )
 
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 elif action in ( HC.CONTENT_UPDATE_PEND, HC.CONTENT_UPDATE_PETITION ):
 
     ( child_tag, parent_tag ) = row

@@ -12471,7 +12483,10 @@ class DB( HydrusDB.HydrusDB ):
     changed_parent_tag_ids.update( ( child_tag_id, parent_tag_id ) )
 
-    notify_new_pending = True
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 elif action in ( HC.CONTENT_UPDATE_RESCIND_PEND, HC.CONTENT_UPDATE_RESCIND_PETITION ):

@@ -12501,7 +12516,10 @@ class DB( HydrusDB.HydrusDB ):
     changed_parent_tag_ids.update( ( child_tag_id, parent_tag_id ) )
 
-    notify_new_pending = True
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 notify_new_parents = True

@@ -12536,6 +12554,11 @@ class DB( HydrusDB.HydrusDB ):
     changed_sibling_tag_ids.update( ( bad_tag_id, good_tag_id ) )
 
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 elif action in ( HC.CONTENT_UPDATE_PEND, HC.CONTENT_UPDATE_PETITION ):
 
     ( bad_tag, good_tag ) = row

@@ -12568,7 +12591,10 @@ class DB( HydrusDB.HydrusDB ):
     changed_sibling_tag_ids.update( ( bad_tag_id, good_tag_id ) )
 
-    notify_new_pending = True
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 elif action in ( HC.CONTENT_UPDATE_RESCIND_PEND, HC.CONTENT_UPDATE_RESCIND_PETITION ):

@@ -12598,7 +12624,10 @@ class DB( HydrusDB.HydrusDB ):
     changed_sibling_tag_ids.update( ( bad_tag_id, good_tag_id ) )
 
-    notify_new_pending = True
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 notify_new_siblings = True

@@ -12698,7 +12727,10 @@ class DB( HydrusDB.HydrusDB ):
     self._UpdateMappings( service_id, mappings_ids = ultimate_mappings_ids, deleted_mappings_ids = ultimate_deleted_mappings_ids, pending_mappings_ids = ultimate_pending_mappings_ids, pending_rescinded_mappings_ids = ultimate_pending_rescinded_mappings_ids, petitioned_mappings_ids = ultimate_petitioned_mappings_ids, petitioned_rescinded_mappings_ids = ultimate_petitioned_rescinded_mappings_ids )
 
-    notify_new_pending = True
+    if service_type == HC.TAG_REPOSITORY:
+        notify_new_pending = True
 
 if len( changed_sibling_tag_ids ) > 0:
(file diff suppressed because it is too large)
@@ -1876,9 +1876,23 @@ class GalleryImportPanel( ClientGUICommon.StaticBox ):
     ( file_network_job, gallery_network_job ) = self._gallery_import.GetNetworkJobs()
 
-    self._file_download_control.SetNetworkJob( file_network_job )
+    if file_network_job is None:
+        self._file_download_control.ClearNetworkJob()
+    else:
+        self._file_download_control.SetNetworkJob( file_network_job )
 
-    self._gallery_download_control.SetNetworkJob( gallery_network_job )
+    if gallery_network_job is None:
+        self._gallery_download_control.ClearNetworkJob()
+    else:
+        self._gallery_download_control.SetNetworkJob( gallery_network_job )

@@ -2513,9 +2527,23 @@ class WatcherReviewPanel( ClientGUICommon.StaticBox ):
     ( file_network_job, checker_network_job ) = self._watcher.GetNetworkJobs()
 
-    self._file_download_control.SetNetworkJob( file_network_job )
+    if file_network_job is None:
+        self._file_download_control.ClearNetworkJob()
+    else:
+        self._file_download_control.SetNetworkJob( file_network_job )
 
-    self._checker_download_control.SetNetworkJob( checker_network_job )
+    if checker_network_job is None:
+        self._checker_download_control.ClearNetworkJob()
+    else:
+        self._checker_download_control.SetNetworkJob( checker_network_job )
@@ -1131,7 +1131,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):
     return domains_to_login_info
 
-def GenerateTestNetworkJobPresentationContextFactory( window, network_job_control ):
+def GenerateTestNetworkJobPresentationContextFactory( window: QW.QWidget, network_job_control: ClientGUINetworkJobControl.NetworkJobControl ):
 
     def network_job_presentation_context_factory( network_job ):

@@ -1142,7 +1142,14 @@ def GenerateTestNetworkJobPresentationContextFactory( window, network_job_contro
     return
 
-    network_job_control.SetNetworkJob( nj )
+    if nj is None:
+        network_job_control.ClearNetworkJob()
+    else:
+        network_job_control.SetNetworkJob( nj )
 
 def enter_call():
@@ -180,7 +180,7 @@ def GetEventCallable( callable, *args, **kwargs ):
     return event_callable
 
-def SanitiseLabel( label ):
+def SanitiseLabel( label: str ) -> str:
 
     if label == '':

@@ -189,3 +189,15 @@ def SanitiseLabel( label ):
     return label.replace( '&', '&&' )
 
+def SetMenuItemLabel( menu_item: QW.QAction, label: str ):
+    
+    label = SanitiseLabel( label )
+    
+    menu_item.setText( label )
+    
+def SetMenuTitle( menu: QW.QMenu, label: str ):
+    
+    label = SanitiseLabel( label )
+    
+    menu.setTitle( label )
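For context on the '&&' in SanitiseLabel above: Qt treats a single '&' in menu text as a keyboard-mnemonic marker, so a literal ampersand has to be doubled before display. A standalone sketch of just that escaping step:

```python
def sanitise_label( label: str ) -> str:
    
    # a lone '&' would underline the next character as a Qt mnemonic,
    # so double it to display a literal ampersand in a menu label
    return label.replace( '&', '&&' )

print( sanitise_label( 'tom & jerry' ) )  # tom && jerry
```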
@@ -687,7 +687,10 @@ class PopupMessageManager( QW.QWidget ):
 def _DoDebugHide( self ):
 
-    if not QP.isValid( self ): return
+    if not QP.isValid( self ):
+        
+        return
 
     parent = self.parentWidget()

@@ -1113,6 +1116,7 @@ class PopupMessageManager( QW.QWidget ):
     raise
 
+# This was originally a reviewpanel subclass which is a scroll area subclass, but having it in a scroll area didn't work out with dynamically updating size as the widget contents change.
 class PopupMessageDialogPanel( QW.QWidget ):
@@ -175,6 +175,11 @@ class NetworkJobControl( QW.QFrame ):
     ClientGUIMenus.AppendMenuItem( menu, 'reattempt connection now', 'Stop waiting on a connection error and reattempt the job now.', self._network_job.OverrideConnectionErrorWait )
 
+    if not self._network_job.DomainOK():
+        
+        ClientGUIMenus.AppendMenuItem( menu, 'scrub domain errors', 'Clear recent domain errors and allow this job to go now.', self._network_job.ScrubDomainErrors )
 
 if self._network_job.CurrentlyWaitingOnServersideBandwidth():
 
     ClientGUIMenus.AppendMenuItem( menu, 'reattempt request now (server reports low bandwidth)', 'Stop waiting on a serverside bandwidth delay and reattempt the job now.', self._network_job.OverrideServersideBandwidthWait )

@@ -330,7 +335,16 @@ class NetworkJobControl( QW.QFrame ):
 def ClearNetworkJob( self ):
 
-    self.SetNetworkJob( None )
+    if self._network_job is not None:
+        
+        self._network_job = None
+        
+        self._gauge.setToolTip( '' )
+        
+        self._Update()
+        
+        HG.client_controller.gui.UnregisterUIUpdateWindow( self )
 
 def FlipAutoOverrideBandwidth( self ):

@@ -338,33 +352,17 @@ class NetworkJobControl( QW.QFrame ):
 self._auto_override_bandwidth_rules = not self._auto_override_bandwidth_rules
 
-def SetNetworkJob( self, network_job: typing.Optional[ ClientNetworkingJobs.NetworkJob ] ):
-    
-    if network_job is None:
-        
-        if self._network_job is not None:
-            
-            self._network_job = None
-            
-            self._gauge.setToolTip( '' )
-            
-            self._Update()
-            
-            HG.client_controller.gui.UnregisterUIUpdateWindow( self )
-            
-    else:
-        
-        if self._network_job != network_job:
-            
-            self._network_job = network_job
-            
-            self._gauge.setToolTip( self._network_job.GetURL() )
-            
-            self._Update()
-            
-            HG.client_controller.gui.RegisterUIUpdateWindow( self )
+def SetNetworkJob( self, network_job: ClientNetworkingJobs.NetworkJob ):
+    
+    if self._network_job != network_job:
+        
+        self._network_job = network_job
+        
+        self._gauge.setToolTip( self._network_job.GetURL() )
+        
+        self._Update()
+        
+        HG.client_controller.gui.RegisterUIUpdateWindow( self )
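The refactor above splits the old SetNetworkJob( None ) path into an explicit ClearNetworkJob. A minimal Qt-free sketch of the resulting caller pattern (class and attribute names simplified, not the actual widget code):

```python
class FakeNetworkJob:
    
    def __init__( self, url: str ):
        
        self.url = url

class NetworkJobControlSketch:
    
    # a stand-in for the widget: 'set' and 'clear' are now separate
    # operations instead of SetNetworkJob( None ) doing double duty
    
    def __init__( self ):
        
        self.job = None
        self.tooltip = ''
    
    def SetNetworkJob( self, job: FakeNetworkJob ):
        
        if self.job != job:
            
            self.job = job
            self.tooltip = job.url
    
    def ClearNetworkJob( self ):
        
        if self.job is not None:
            
            self.job = None
            self.tooltip = ''

control = NetworkJobControlSketch()

# callers now branch on None themselves:
for current_job in ( FakeNetworkJob( 'https://example.com/gallery' ), None ):
    
    if current_job is None:
        
        control.ClearNetworkJob()
        
    else:
        
        control.SetNetworkJob( current_job )

print( control.job )  # None, the last 'job' in the loop was None
```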
@@ -3700,9 +3700,23 @@ class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
     ( file_network_job, page_network_job ) = self._simple_downloader_import.GetNetworkJobs()
 
-    self._file_download_control.SetNetworkJob( file_network_job )
+    if file_network_job is None:
+        self._file_download_control.ClearNetworkJob()
+    else:
+        self._file_download_control.SetNetworkJob( file_network_job )
 
-    self._page_download_control.SetNetworkJob( page_network_job )
+    if page_network_job is None:
+        self._page_download_control.ClearNetworkJob()
+    else:
+        self._page_download_control.SetNetworkJob( page_network_job )
 
 def CheckAbleToClose( self ):

@@ -3923,9 +3937,23 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
     ( file_network_job, gallery_network_job ) = self._urls_import.GetNetworkJobs()
 
-    self._file_download_control.SetNetworkJob( file_network_job )
+    if file_network_job is None:
+        self._file_download_control.ClearNetworkJob()
+    else:
+        self._file_download_control.SetNetworkJob( file_network_job )
 
-    self._gallery_download_control.SetNetworkJob( gallery_network_job )
+    if gallery_network_job is None:
+        self._gallery_download_control.ClearNetworkJob()
+    else:
+        self._gallery_download_control.SetNetworkJob( gallery_network_job )
 
 def CheckAbleToClose( self ):
@@ -31,6 +31,18 @@ from hydrus.client.gui.pages import ClientGUIResults
 from hydrus.client.gui.pages import ClientGUISession
 from hydrus.client.gui.pages import ClientGUISessionLegacy # to get serialisable data types loaded
 
+def ConvertNumHashesToWeight( num_hashes: int ) -> int:
+    
+    return num_hashes
+    
+def ConvertNumHashesAndSeedsToWeight( num_hashes: int, num_seeds: int ) -> int:
+    
+    return ConvertNumHashesToWeight( num_hashes ) + ConvertNumSeedsToWeight( num_seeds )
+    
+def ConvertNumSeedsToWeight( num_seeds: int ) -> int:
+    
+    return num_seeds * 20
+    
 class DialogPageChooser( ClientGUIDialogs.Dialog ):
 
     def __init__( self, parent, controller ):
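Those three helpers define the session-weight formula: each file hash counts 1 and each seed (a queued import item) counts 20. Reproduced runnably below (the 1000-thumbnail/50-download figures are a made-up example):

```python
def ConvertNumHashesToWeight( num_hashes: int ) -> int:
    
    return num_hashes

def ConvertNumSeedsToWeight( num_seeds: int ) -> int:
    
    # a seed (a queued import item) is weighted far heavier than a file hash
    return num_seeds * 20

def ConvertNumHashesAndSeedsToWeight( num_hashes: int, num_seeds: int ) -> int:
    
    return ConvertNumHashesToWeight( num_hashes ) + ConvertNumSeedsToWeight( num_seeds )

# a page holding 1000 thumbnails and 50 queued downloads:
print( ConvertNumHashesAndSeedsToWeight( 1000, 50 ) )  # 2000
```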
@@ -784,12 +796,19 @@ class Page( QW.QSplitter ):
     return ( hpos, vpos )
 
-def GetTotalWeight( self ):
+def GetTotalNumHashesAndSeeds( self ):
    
     num_hashes = len( self.GetHashes() )
     num_seeds = self._management_controller.GetNumSeeds()
    
-    return num_hashes + ( num_seeds * 20 )
+    return ( num_hashes, num_seeds )
+    
+def GetTotalWeight( self ) -> int:
+    
+    ( num_hashes, num_seeds ) = self.GetTotalNumHashesAndSeeds()
+    
+    return ConvertNumHashesAndSeedsToWeight( num_hashes, num_seeds )
 
 def IsCurrentSessionPageDirty( self ):

@@ -1241,7 +1260,6 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
 else:
 
     self._controller.pub( 'notify_closed_page', page )
-    self._controller.pub( 'notify_new_undo' )
 
 return True

@@ -2492,7 +2510,23 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
-def GetTotalWeight( self ):
+def GetTotalNumHashesAndSeeds( self ) -> int:
+    
+    total_num_hashes = 0
+    total_num_seeds = 0
+    
+    for page in self._GetPages():
+        
+        ( num_hashes, num_seeds ) = page.GetTotalNumHashesAndSeeds()
+        
+        total_num_hashes += num_hashes
+        total_num_seeds += num_seeds
+        
+    return ( total_num_hashes, total_num_seeds )
+    
+def GetTotalWeight( self ) -> int:
 
     total_weight = sum( ( page.GetTotalWeight() for page in self._GetPages() ) )

@@ -2874,7 +2908,14 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
 WARNING_TOTAL_PAGES = self._controller.new_options.GetInteger( 'total_pages_warning' )
 MAX_TOTAL_PAGES = 500
 
-( total_active_page_count, total_closed_page_count, total_active_weight, total_closed_weight ) = self._controller.gui.GetTotalPageCounts()
+(
+    total_active_page_count,
+    total_active_num_hashes,
+    total_active_num_seeds,
+    total_closed_page_count,
+    total_closed_num_hashes,
+    total_closed_num_seeds
+) = self._controller.gui.GetTotalPageCounts()
 
 if total_active_page_count + total_closed_page_count >= WARNING_TOTAL_PAGES:
@@ -165,7 +165,7 @@ def ReadFetch(
 fetch_from_db = True
 
-if synchronised and qt_media_callable is not None:
+if synchronised and qt_media_callable is not None and not file_search_context.GetSystemPredicates().HasSystemLimit():
 
     try:

@@ -1783,7 +1783,9 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 def _SignalNewSearchState( self ):
 
-    file_search_context = self.GetFileSearchContext()
+    self._file_search_context.SetPredicates( self._predicates_listbox.GetPredicates() )
+    
+    file_search_context = self._file_search_context.Duplicate()
 
     self.searchChanged.emit( file_search_context )

@@ -2040,6 +2042,8 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 HG.client_controller.CallLaterQtSafe( self, 0.2, 'set stub predicates', self.SetStubPredicates, job_key, stub_predicates, parsed_autocomplete_text )
 
+fsc = self.GetFileSearchContext()
 
 if self._under_construction_or_predicate is None:
 
     under_construction_or_predicate = None

@@ -2049,7 +2053,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 under_construction_or_predicate = self._under_construction_or_predicate.Duplicate()
 
-HG.client_controller.CallToThread( ReadFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, self._media_callable, self._file_search_context.Duplicate(), self._search_pause_play.IsOn(), self._include_unusual_predicate_types, self._results_cache, under_construction_or_predicate, self._force_system_everything )
+HG.client_controller.CallToThread( ReadFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, self._media_callable, fsc, self._search_pause_play.IsOn(), self._include_unusual_predicate_types, self._results_cache, under_construction_or_predicate, self._force_system_everything )
 
 def _ShouldTakeResponsibilityForEnter( self ):

@@ -2195,11 +2199,11 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 self._file_search_context = file_search_context.Duplicate()
 
-self._predicates_listbox.SetPredicates( self._file_search_context.GetPredicates() )
 
 self._ChangeFileService( self._file_search_context.GetFileServiceKey() )
 self._ChangeTagService( self._file_search_context.GetTagSearchContext().service_key )
 
+self._predicates_listbox.SetPredicates( self._file_search_context.GetPredicates() )
 
 self._SignalNewSearchState()

@@ -2231,6 +2235,12 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 self._RestoreTextCtrlFocus()
 
+if not self._search_pause_play.IsOn() and not self._file_search_context.GetSystemPredicates().HasSystemLimit():
+    
+    # update if user goes from sync to non-sync
+    self._SetListDirty()
 
 def PausePlaySearch( self ):
@@ -714,6 +714,11 @@ class CheckboxManagerOptions( CheckboxManager ):
     new_options.InvertBoolean( self._boolean_name )
 
+    if self._boolean_name == 'advanced_mode':
+        
+        HG.client_controller.pub( 'notify_advanced_mode' )
 
 HG.client_controller.pub( 'checkbox_manager_inverted' )
 HG.client_controller.pub( 'notify_new_menu_option' )
@@ -2016,6 +2016,26 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
+def ScrubDomainErrors( self, url ):
+    
+    with self._lock:
+        
+        try:
+            
+            domain = ConvertURLIntoSecondLevelDomain( url )
+            
+        except:
+            
+            return
+            
+        if domain in self._second_level_domains_to_network_infrastructure_errors:
+            
+            del self._second_level_domains_to_network_infrastructure_errors[ domain ]
 
 def SetClean( self ):
 
     with self._lock:

@@ -1098,6 +1098,16 @@ class NetworkJob( object ):
+def ScrubDomainErrors( self ):
+    
+    with self._lock:
+        
+        self.engine.domain_manager.ScrubDomainErrors( self._url )
+        
+        self._wake_time_float = 0.0
 
 def SetError( self, e, error ):
 
     with self._lock:
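The 'scrub domain errors' action above just forgets the infrastructure errors recorded against the URL's second-level domain and zeroes the job's wake time so it retries immediately. A toy sketch of the forgetting step (naive domain extraction, not hydrus's ConvertURLIntoSecondLevelDomain):

```python
from urllib.parse import urlparse

# hypothetical store: second-level domain -> recent infrastructure errors
domain_errors = { 'example.com': [ '503 at 12:00', 'timeout at 12:05' ] }

def second_level_domain( url: str ) -> str:
    
    # naive: keep the last two labels of the hostname
    host = urlparse( url ).hostname or ''
    
    return '.'.join( host.split( '.' )[ -2: ] )

def scrub_domain_errors( url: str ) -> None:
    
    domain = second_level_domain( url )
    
    domain_errors.pop( domain, None )  # forget recorded errors, if any

scrub_domain_errors( 'https://cdn.example.com/file/123' )

print( domain_errors )  # {}
```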
@@ -81,7 +81,7 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 456
+SOFTWARE_VERSION = 457
 CLIENT_API_VERSION = 20
 
 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -1,5 +1,6 @@
|
|||
import hashlib
|
||||
import os
|
||||
import struct
|
||||
|
||||
from hydrus.core import HydrusAudioHandling
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
|
@@ -311,7 +312,7 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):
     
     elif mime == HC.UNDETERMINED_PNG:
         
-        if HydrusVideoHandling.HasVideoStream( path ):
+        if IsPNGAnimated( bit_to_check ):
             
             return HC.IMAGE_APNG
@@ -379,3 +380,22 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):
     
     return HC.APPLICATION_UNKNOWN
     
+
+def IsPNGAnimated( file_header_bytes ):
+    
+    if file_header_bytes[ 37: ].startswith( b'acTL' ):
+        
+        # this is an animated png
+        
+        # acTL chunk in an animated png is 4 bytes of num frames, then 4 bytes of num times to loop
+        # https://wiki.mozilla.org/APNG_Specification#.60acTL.60:_The_Animation_Control_Chunk
+        
+        num_frames = HydrusVideoHandling.GetAPNGNumFrames( file_header_bytes )
+        
+        if num_frames > 1:
+            
+            return True
+            
+        
+    
+    return False
+    
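The magic offsets 37 and 41:45 in the new APNG check follow from the PNG file layout: an 8-byte signature, then the 25-byte IHDR chunk (4-byte length, 4-byte type, 13 bytes of data, 4-byte CRC), so the next chunk's type field begins at byte 37 and its data at byte 41. An APNG must put its `acTL` chunk there, before any image data. A rough self-contained sketch of the same check against a hand-built header (not hydrus's actual code):

```python
import struct

def is_png_animated(file_header_bytes):
    # 8-byte signature + 25-byte IHDR chunk = 33 bytes; the next chunk's
    # 4-byte length field follows, putting its type field at byte 37
    if file_header_bytes[37:].startswith(b'acTL'):
        # acTL data: 4 bytes num_frames, then 4 bytes num_plays
        (num_frames,) = struct.unpack('>I', file_header_bytes[41:45])
        return num_frames > 1
    return False

# build a minimal fake APNG header to exercise the check
signature = b'\x89PNG\r\n\x1a\n'
ihdr = struct.pack('>I', 13) + b'IHDR' + bytes(13) + bytes(4)  # dummy data/CRC
actl = struct.pack('>I', 8) + b'acTL' + struct.pack('>II', 10, 0) + bytes(4)

print(is_png_animated(signature + ihdr + actl))  # True: 10 frames
print(is_png_animated(signature + ihdr))         # False: no acTL chunk
```

This replaces the old approach of asking ffmpeg whether the PNG has a video stream, which is why `GetMime` can now decide from `bit_to_check` alone.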
@@ -68,7 +68,6 @@ mpv_report_mode = False
 force_idle_mode = False
 no_page_limit_mode = False
 thumbnail_debug_mode = False
-currently_uploading_pending = False
 
 do_idle_shutdown_work = False
 shutdown_complete = False
@@ -1,6 +1,7 @@
 import numpy
 import os
 import re
+import struct
 import subprocess
 
 from hydrus.core import HydrusAudioHandling
@@ -42,6 +43,12 @@ def CheckFFMPEGError( lines ):
     raise HydrusExceptions.DamagedOrUnusualFileException( 'FFMPEG could not parse.' )
     
+
+def GetAPNGNumFrames( file_header_bytes ):
+    
+    ( num_frames, ) = struct.unpack( '>I', file_header_bytes[ 41 : 45 ] )
+    
+    return num_frames
+    
 def GetFFMPEGVersion():
     
     cmd = [ FFMPEG_PATH, '-version' ]
@@ -217,6 +224,34 @@ def GetFFMPEGInfoLines( path, count_frames_manually = False, only_first_second =
     
     return lines
     
+
+def GetFFMPEGAPNGProperties( path ):
+    
+    with open( path, 'rb' ) as f:
+        
+        file_header_bytes = f.read( 256 )
+        
+    
+    num_frames = GetAPNGNumFrames( file_header_bytes )
+    
+    lines = GetFFMPEGInfoLines( path )
+    
+    resolution = ParseFFMPEGVideoResolution( lines )
+    
+    ( fps, confident_fps ) = ParseFFMPEGFPS( lines )
+    
+    if not confident_fps:
+        
+        fps = 24
+        
+    
+    duration = num_frames / fps
+    
+    duration_in_ms = int( duration * 1000 )
+    
+    has_audio = False
+    
+    return ( resolution, duration_in_ms, num_frames, has_audio )
+    
 def GetFFMPEGVideoProperties( path, force_count_frames_manually = False ):
     
     lines_for_first_second = GetFFMPEGInfoLines( path, count_frames_manually = True, only_first_second = True )
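`GetFFMPEGAPNGProperties` trusts the acTL frame count over ffmpeg's duration estimate and falls back to an assumed 24 fps when the parsed frame rate is not confident. The arithmetic in isolation (function name is illustrative, not hydrus's):

```python
def apng_duration_ms(num_frames, fps, confident_fps):
    # when ffmpeg's reported fps is not trustworthy, assume 24 fps,
    # matching the fallback in the diff above
    if not confident_fps:
        fps = 24
    duration = num_frames / fps  # seconds
    return int(duration * 1000)

print(apng_duration_ms(48, 12.0, True))     # 4000 ms at a trusted 12 fps
print(apng_duration_ms(48, 1000.0, False))  # 2000 ms, ignoring the bogus rate
```

Deriving duration as frames divided by fps means a wildly wrong ffmpeg frame rate can no longer produce a near-zero or enormous duration for an APNG.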
@@ -172,6 +172,22 @@ class Account( object ):
     
     return self.__repr__()
     
+    
+    def _CheckBanned( self ):
+        
+        if self._IsBanned():
+            
+            raise HydrusExceptions.InsufficientCredentialsException( 'This account is banned: ' + self._GetBannedString() )
+            
+        
+    
+    def _CheckExpired( self ):
+        
+        if self._IsExpired():
+            
+            raise HydrusExceptions.InsufficientCredentialsException( 'This account is expired: ' + self._GetExpiresString() )
+            
+        
+    
     def _CheckFunctional( self ):
         
         if self._created == 0:
@@ -179,20 +195,15 @@ class Account( object ):
             
             raise HydrusExceptions.ConflictException( 'account is unsynced' )
             
         
-        if self._account_type.HasPermission( HC.CONTENT_TYPE_SERVICES, HC.PERMISSION_ACTION_MODERATE ):
+        if self._IsAdmin():
             
-            return # admins can do anything
+            # admins can do anything
+            return
             
         
-        if self._IsBanned():
-            
-            raise HydrusExceptions.InsufficientCredentialsException( 'This account is banned: ' + self._GetBannedString() )
-            
-        
+        self._CheckBanned()
         
-        if self._IsExpired():
-            
-            raise HydrusExceptions.InsufficientCredentialsException( 'This account is expired: ' + self._GetExpiresString() )
-            
-        
+        self._CheckExpired()
         
         if not self._account_type.BandwidthOK( self._bandwidth_tracker ):
@@ -219,6 +230,11 @@ class Account( object ):
     
     return HydrusData.ConvertTimestampToPrettyExpires( self._expires )
     
+    
+    def _IsAdmin( self ):
+        
+        return self._account_type.HasPermission( HC.CONTENT_TYPE_SERVICES, HC.PERMISSION_ACTION_MODERATE )
+        
+    
     def _IsBanned( self ):
         
         if self._banned_info is None:
@@ -299,6 +315,15 @@ class Account( object ):
         
         with self._lock:
             
+            if self._IsAdmin():
+                
+                return
+                
+            
+            self._CheckBanned()
+            
+            self._CheckExpired()
+            
             if not self._account_type.HasPermission( content_type, action ):
                 
                 raise HydrusExceptions.InsufficientCredentialsException( 'You do not have permission to do that.' )
@@ -441,6 +466,16 @@ class Account( object ):
         
         with self._lock:
             
+            if self._IsAdmin():
+                
+                return True
+                
+            
+            if self._IsBanned() or self._IsExpired():
+                
+                return False
+                
+            
             return self._account_type.HasPermission( content_type, action )
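The `Account` refactor pulls the duplicated ban and expiry raises into `_CheckBanned`/`_CheckExpired` and adds `_IsAdmin`, so `_CheckFunctional`, `CheckPermission`, and `HasPermission` now all apply the same ordering: admins short-circuit first, then banned or expired accounts are rejected, then the account type's permission table decides. A toy model of that ordering (hypothetical class, not the real one):

```python
class ToyAccount:
    def __init__(self, is_admin=False, is_banned=False,
                 is_expired=False, type_allows=True):
        self.is_admin = is_admin
        self.is_banned = is_banned
        self.is_expired = is_expired
        self.type_allows = type_allows

    def has_permission(self):
        # admins bypass every other check, matching the diff's early return
        if self.is_admin:
            return True
        # banned/expired are checked before the permission table
        if self.is_banned or self.is_expired:
            return False
        return self.type_allows

print(ToyAccount(is_admin=True, is_banned=True).has_permission())  # True
print(ToyAccount(is_banned=True).has_permission())                 # False
print(ToyAccount().has_permission())                               # True
```

Checking admin status before the ban check preserves the old "admins can do anything" behaviour even for a banned admin account, which the check order makes explicit.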
@@ -3017,7 +3017,7 @@ class DB( HydrusDB.HydrusDB ):
     
     
-    def _RepositoryProcessClientToServerUpdate( self, service_key, account, client_to_server_update, timestamp ):
+    def _RepositoryProcessClientToServerUpdate( self, service_key: bytes, account: HydrusNetwork.Account, client_to_server_update: HydrusNetwork.ClientToServerUpdate, timestamp: int ):
         
         service_id = self._GetServiceId( service_key )
@@ -65,7 +65,7 @@ class TestServerDB( unittest.TestCase ):
         
         #
         
-        r_keys = self._read( 'registration_keys', self._tag_service_key, self._tag_service_account, 5, self._deletee_user_account_type.GetAccountTypeKey(), 86400 * 365 )
+        r_keys = self._read( 'registration_keys', self._tag_service_key, self._tag_service_account, 5, self._deletee_user_account_type.GetAccountTypeKey(), HydrusData.GetNow() + 86400 * 365 )
         
         access_keys = [ self._read( 'access_key', self._tag_service_key, r_key ) for r_key in r_keys ]
@@ -97,7 +97,7 @@ class TestServerDB( unittest.TestCase ):
         
         #
         
-        r_keys = self._read( 'registration_keys', self._tag_service_key, self._tag_service_account, 5, self._regular_user_account_type.GetAccountTypeKey(), 86400 * 365 )
+        r_keys = self._read( 'registration_keys', self._tag_service_key, self._tag_service_account, 5, self._regular_user_account_type.GetAccountTypeKey(), HydrusData.GetNow() + 86400 * 365 )
         
         self.assertEqual( len( r_keys ), 5 )
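The test fixes swap a bare duration for `HydrusData.GetNow() + 86400 * 365`, which suggests the `registration_keys` call expects an absolute expiry timestamp rather than a lifetime in seconds: a bare `86400 * 365` reads as a Unix timestamp in 1971, i.e. already expired. The distinction, sketched with `time.time()` standing in for `HydrusData.GetNow()`:

```python
import time

now = int(time.time())
ONE_YEAR = 86400 * 365

expires_as_duration = ONE_YEAR         # wrong: interpreted as a 1971 timestamp
expires_as_timestamp = now + ONE_YEAR  # right: one year from now

print(expires_as_duration < now)   # True: already in the past
print(expires_as_timestamp > now)  # True: valid for a year
```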