Version 401

This commit is contained in:
Hydrus Network Developer 2020-06-17 16:31:54 -05:00
parent d7a0876b43
commit 3126b5d1a1
41 changed files with 678 additions and 284 deletions

View File

@ -4,11 +4,12 @@ The hydrus network client is an application written for Anon and other internet-
I am continually working on the software and try to put out a new release every Wednesday by 8pm EST.
This github repository is currently a weekly sync with my home dev environment, where I work on hydrus by myself. Feel free to fork, but please do not make pull requests. I am also not active on Github, so if you have feedback of any sort, please email me, post on my 8kun or Endchan boards, or message me on tumblr or twitter or the discord.
This github repository is currently a weekly sync with my home dev environment, where I work on hydrus by myself. **Feel free to fork and do whatever you like with my code, but please do not make pull requests.** The [issue tracker here on Github](https://github.com/hydrusnetwork/hydrus/issues) is active and run by blessed volunteer users. I am not active here on Github, and I have difficulty keeping up with social media in general, but I welcome feedback of any sort and will eventually catch up with and reply to email, the 8kun or Endchan boards, tumblr, twitter, or the discord.
The client can do quite a lot! Please check out the help inside the release or [here](http://hydrusnetwork.github.io/hydrus/help), which includes a comprehensive getting started guide.
* [homepage](http://hydrusnetwork.github.io/hydrus/)
* [issue tracker](https://github.com/hydrusnetwork/hydrus/issues)
* [email](mailto:hydrus.admin@gmail.com)
* [8kun board](https://8kun.net/hydrus/index.html)
* [endchan bunker](https://endchan.net/hydrus/)

View File

@ -8,6 +8,35 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 401</h3></li>
<ul>
<li>subscriptions:</li>
<li>as subs can now load more flexibly, previously hardcoded waits are now eliminated:</li>
<li>- the subscriptions manager now only waits three seconds after initial session load to boot (previously 15)</li>
<li>- the subscriptions manager now wakes as soon as the subscriptions dialog is ok'd or cancelled</li>
<li>- a timing calculation that would delay a sub's work by up to five or fifteen minutes if more queries would come due for sync in that window (previously done to batch work and reduce db read/write) is now eliminated--subs now start as soon as any query is due. if you were ever confused why a query that seemed due did not start after dialog ok or another wake-up event, this _should_ no longer happen</li>
<li>re-added the import/export/duplicate buttons to manage subs. export and dupe may need to do db work for a couple of seconds and will have a yes/no confirmation on larger jobs</li>
<li>the import button on manage subs accepts and converts the old 'legacy' subscription object format, including a copy/paste of the objects backed up to disk in the v400 update</li>
<li>fixed an issue where creating a subscription query and then deleting it in the same manage subs dialog session would result in surplus data being written to the db (which the next dialog launch would note and clear out)</li>
<li>fixed an unusual error in pre-run domain checking, exposed by the new subscription code and e621 subs (whose gallery url format has also recently changed)</li>
<li>.</li>
<li>issue tracker:</li>
<li>the Github issue tracker (https://github.com/hydrusnetwork/hydrus/issues) is turned on again! it is now run by a team of volunteer users. the idea is going to be to try to merge duplicate feature suggestions with the proper platform and put some discussion and cognition and prioritisation into idea development before it gets to my desk, so I can be more focused and productive and so 95% of feature suggestions do not simply get banished to the shadow realm of the back of my todo</li>
<li>this is mostly intended for wishlist and other suggestions, as the tsunami was just getting too much for me to handle, but we'll see how it goes for things like bug reports as well. I'll still take any sort of report through my normal channels, if you are uncomfortable with github, or if you wish for me to forward an item to the issue tracker anonymously</li>
<li>the website, help documents, and hydrus help menu links have been updated regarding the issue tracker</li>
<li>.</li>
<li>the rest:</li>
<li>improved how the database 'update default downloader objects' job works, ensuring that new defaults are better at simply taking the place of existing objects without breaking/resetting existing url class to parser links</li>
<li>tightened up how automatic url class to parser linking works, eliminating some surplus and potentially bad data related to api links. furthermore, whenever the links between url classes and parsers update, existing surplus data, which may creep in when api links change, is now cleaned from the data structure</li>
<li>rolling out updated e621 url class and parser to deal with their alternate gallery url format</li>
<li>rolling out an updated derpibooru parser that will link to the new api class correctly</li>
<li>thanks to a user's submission, rolling out updated versions of the new default nitter parsers that pull creator:username tags</li>
<li>the client now tests regularly for program shutdown before every subprocess launch and while waiting for all subprocess communication (e.g. to ffmpeg). if an unusual situation develops where a subscription is doing a file import job while the OS is shutting down, and that system shutdown would hang or is hanging on a 'ffmpeg can't be launched now' dialog, the hydrus client should now notice this and bomb out rather than waiting for that never-running ffmpeg. this may not fix all instances of this issue, and further feedback on the client not closing down cleanly with the OS is welcome</li>
<li>when adding a new path to the 'migrate database' panel, any symbolic links will be converted to canonical equivalents</li>
<li>added some location checks and appropriate errors when the database is doing file storage rebalancing</li>
<li>fixed an issue uploading swfs, video, or audio to the server when it is launched from a frozen executable build</li>
<li>misc code cleanup</li>
</ul>
<li><h3>version 400</h3></li>
<ul>
<li>subscription data overhaul:</li>

View File

@ -7,19 +7,20 @@
<body>
<div class="content">
<h3>contact and links</h3>
<p>Please send me all your bug reports, questions, ideas, and comments. It is always interesting to see how other people are using my software and what they generally think of it. Most of the changes every week are suggested by users.</p>
<p>You can contact me by email, twitter, tumblr, discord, or the 8kun/Endchan boards--I do not mind which. I'm not active on github (I use it mostly as a mirror of my home dev environment) and do not check its messages or issues. I often like to spend a day or so to think before replying to non-urgent messages, but I do try to reply to everything.</p>
<p>I am on the discord on Saturday afternoon, USA time, and Wednesday after I put the release out. If that is not a good time for you, feel free to leave me a DM and I will get to you when I can. There are also plenty of other hydrus users who idle who would be happy to help with any sort of support question.</p>
<p>I welcome all your bug reports, questions, ideas, and comments. It is always interesting to see how other people are using my software and what they generally think of it. Most of the changes every week are suggested by users.</p>
<p>You can contact me by email, twitter, tumblr, discord, or the 8kun/Endchan boards--I do not mind which. Please know that I have difficulty with social media, and while I try to reply to all messages, it sometimes takes me a while to catch up.</p>
<p>The <a href="https://github.com/hydrusnetwork/hydrus/issues">Github Issue Tracker</a> was turned off for some time, as it did not fit my workflow and I could not keep up, but it is now running again, managed by a team of volunteer users. Please feel free to submit feature requests there if you are comfortable with Github. I am not socially active on Github, and it is mostly just a mirror of my home dev environment, where I work alone.</p>
<p>I am on the discord on Saturday afternoon, USA time, if you would like to talk live, and briefly on Wednesday after I put the release out. If that is not a good time for you, feel free to leave me a DM and I will get to you when I can. There are also plenty of other hydrus users who idle who would be happy to help with any sort of support question.</p>
<p>I delete all tweets and resolved email conversations after three months. So, if you think you are waiting for a reply, or I said I was going to work on something you care about and seem to have forgotten, please do nudge me.</p>
<p>If you have a problem with something on someone else's server, please, <span class="warning">do not come to me</span>, as I cannot help. If your ex-gf's nudes have leaked onto the internet or you just find something terribly offensive, I cannot help you at all.</p>
<p>Anyway:</p>
<ul>
<li><a href="https://hydrusnetwork.github.io/hydrus/">homepage</a></li>
<li><a href="https://github.com/hydrusnetwork/hydrus">github</a></li>
<li><a href="https://github.com/hydrusnetwork/hydrus/issues">issue tracker</a></li>
<li><a href="https://8kun.top/hydrus/index.html">8kun board</a> (<a href="https://endchan.net/hydrus/">endchan bunker</a> <a href="https://endchan.org/hydrus/">(.org)</a>)</li>
<li><a href="http://hydrus.tumblr.com">tumblr</a> (<a href="http://hydrus.tumblr.com/rss">rss</a>)</li>
<li><a href="https://github.com/hydrusnetwork/hydrus/releases">new downloads</a></li>
<li><a href="https://www.mediafire.com/hydrus">old downloads</a></li>
<li><a href="https://github.com/hydrusnetwork/hydrus">github</a></li>
<li><a href="https://twitter.com/hydrusnetwork">twitter</a></li>
<li><a href="mailto:hydrus.admin@gmail.com">email</a></li>
<li><a href="https://discord.gg/wPHPCUZ">discord</a></li>

View File

@ -532,6 +532,7 @@ class GlobalPixmaps( object ):
self.tumblr = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'tumblr.png' ) )
self.discord = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'discord.png' ) )
self.patreon = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'patreon.png' ) )
self.github = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'github.png' ) )
self.first = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'first.png' ) )
self.previous = QG.QPixmap( os.path.join( HC.STATIC_DIR, 'previous.png' ) )

View File

@ -133,6 +133,9 @@ class App( QW.QApplication ):
HG.client_controller.gui.SaveAndClose()
HG.view_shutdown = True
HG.model_shutdown = True
class Controller( HydrusController.HydrusController ):

View File

@ -11678,6 +11678,11 @@ class DB( HydrusDB.HydrusDB ):
def _RelocateClientFiles( self, prefix, source, dest ):
if not os.path.exists( dest ):
raise Exception( 'Was commanded to move prefix "{}" from "{}" to "{}", but that destination does not exist!'.format( prefix, source, dest ) )
full_source = os.path.join( source, prefix )
full_dest = os.path.join( dest, prefix )
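The location guard in _RelocateClientFiles above can be sketched as a standalone helper. This is a minimal sketch, not the real method: the function name and the `shutil.move` call are my assumptions, and the actual db code does more bookkeeping around the move.

```python
import os
import shutil

def relocate_prefix( prefix, source, dest ):
    
    # refuse to act if the destination base directory is missing,
    # mirroring the new existence check in _RelocateClientFiles
    if not os.path.exists( dest ):
        
        raise Exception( 'Was commanded to move prefix "{}" from "{}" to "{}", but that destination does not exist!'.format( prefix, source, dest ) )
        
    
    full_source = os.path.join( source, prefix )
    full_dest = os.path.join( dest, prefix )
    
    if os.path.exists( full_source ):
        
        shutil.move( full_source, full_dest )
        
    
    return full_dest
    
```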
@ -14774,6 +14779,39 @@ class DB( HydrusDB.HydrusDB ):
if version == 400:
try:
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
domain_manager.Initialise()
#
domain_manager.OverwriteDefaultURLClasses( [ 'e621 gallery page (alternate format)' ] )
#
domain_manager.OverwriteDefaultParsers( [ 'nitter tweet parser', 'nitter tweet parser (video from koto.reisen)', 'e621 gallery page parser', 'derpibooru gallery page api parser' ] )
#
domain_manager.TryToLinkURLClassesAndParsers()
#
self._SetJSONDump( domain_manager )
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to update some downloaders failed! Please let hydrus dev know!'
self.pub_initial_message( message )
self._controller.pub( 'splash_set_title_text', 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
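The v400 update block above shows a pattern worth noting: the downloader-object refresh is wrapped in try/except so a failure is reported to the user but does not abort the version bump itself. A minimal sketch of that log-and-continue pattern, with hypothetical names:

```python
def run_update_steps( steps, report ):
    
    # run each migration step; report-and-continue on failure so one bad
    # downloader update cannot block the schema version bump itself
    for ( description, step ) in steps:
        
        try:
            
            step()
            
        except Exception as e:
            
            report( '{} failed! Please let hydrus dev know! ({})'.format( description, e ) )
            
        
    
```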

View File

@ -489,8 +489,8 @@ class ServicesManager( object ):
self._controller = controller
self._lock = threading.Lock()
self._keys_to_services: typing.Dict[ bytes, ClientServices.Service ] = {}
self._services_sorted: typing.List[ ClientServices.Service ] = []
self._keys_to_services = {}
self._services_sorted = []
self.RefreshServices()

View File

@ -3648,7 +3648,7 @@ class StringProcessor( HydrusSerialisable.SerialisableBase ):
StringProcessingStep.__init__( self )
self._processing_steps: typing.List[ StringProcessingStep ] = []
self._processing_steps = []
def _GetSerialisableInfo( self ):

View File

@ -798,6 +798,8 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs( hide_terminal = False )
HydrusData.CheckProgramIsNotShuttingDown()
subprocess.Popen( cmd, **sbp_kwargs )
time_waited = 0
@ -2718,6 +2720,10 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
HG.client_controller.subscriptions_manager.SetSubscriptions( subscriptions )
else:
HG.client_controller.subscriptions_manager.Wake()
@ -4650,6 +4656,8 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
links = QW.QMenu( menu )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'site', 'Open hydrus\'s website, which is mostly a mirror of the local help.', CC.global_pixmaps().file_repository, ClientPaths.LaunchURLInWebBrowser, 'https://hydrusnetwork.github.io/hydrus/' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'github repository', 'Open the hydrus github repository.', CC.global_pixmaps().github, ClientPaths.LaunchURLInWebBrowser, 'https://github.com/hydrusnetwork/hydrus' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'issue tracker', 'Open the github issue tracker, which is run by users.', CC.global_pixmaps().github, ClientPaths.LaunchURLInWebBrowser, 'https://github.com/hydrusnetwork/hydrus/issues' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, '8kun board', 'Open hydrus dev\'s 8kun board, where he makes release posts and other status updates.', CC.global_pixmaps().eight_kun, ClientPaths.LaunchURLInWebBrowser, 'https://8kun.top/hydrus/index.html' )
site = ClientGUIMenus.AppendMenuItem( links, 'Endchan board bunker', 'Open hydrus dev\'s Endchan board, the bunker for when 8kun is unavailable. Try .org if .net is unavailable.', ClientPaths.LaunchURLInWebBrowser, 'https://endchan.net/hydrus/index.html' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'twitter', 'Open hydrus dev\'s twitter, where he makes general progress updates and emergency notifications.', CC.global_pixmaps().twitter, ClientPaths.LaunchURLInWebBrowser, 'https://twitter.com/hydrusnetwork' )

View File

@ -363,7 +363,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
#
eligible_url_classes = [ url_class for url_class in url_classes if url_class.GetURLType() in ( HC.URL_TYPE_POST, HC.URL_TYPE_WATCHABLE ) and url_class.GetMatchKey() in self._url_class_keys_to_parser_keys ]
eligible_url_classes = [ url_class for url_class in url_classes if url_class.GetURLType() in ( HC.URL_TYPE_POST, HC.URL_TYPE_WATCHABLE ) and url_class.GetClassKey() in self._url_class_keys_to_parser_keys ]
self._list_ctrl.AddDatas( eligible_url_classes )
@ -388,7 +388,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
def _ConvertDataToListCtrlTuples( self, url_class ):
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
name = url_class.GetName()
url_type = url_class.GetURLType()
@ -422,7 +422,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
for url_class in url_classes_to_clear:
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_tag_import_options:
@ -442,7 +442,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
url_class = selected[0]
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_tag_import_options:
@ -472,7 +472,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
if dlg.exec() == QW.QDialog.Accepted:
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
tag_import_options = panel.GetValue()
@ -490,7 +490,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
def _GetDefaultTagImportOptions( self, url_class ):
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_tag_import_options:
@ -525,7 +525,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
url_class = selected[0]
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_tag_import_options:
@ -560,7 +560,7 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
for url_class in self._list_ctrl.GetData( only_selected = True ):
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
self._url_class_keys_to_tag_import_options[ url_class_key ] = tag_import_options.Duplicate()
@ -929,7 +929,7 @@ class EditDownloaderDisplayPanel( ClientGUIScrolledPanels.EditPanel ):
self._gug_keys_to_gugs = { gug.GetGUGKey() : gug for gug in self._gugs }
self._url_classes = url_classes
self._url_class_keys_to_url_classes = { url_class.GetMatchKey() : url_class for url_class in self._url_classes }
self._url_class_keys_to_url_classes = { url_class.GetClassKey() : url_class for url_class in self._url_classes }
self._network_engine = network_engine
@ -3913,7 +3913,7 @@ class EditTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
url_classes = domain_manager.GetURLClasses()
url_class_keys_to_url_classes = { url_class.GetMatchKey() : url_class for url_class in url_classes }
url_class_keys_to_url_classes = { url_class.GetClassKey() : url_class for url_class in url_classes }
url_class_names_and_default_tag_import_options = sorted( ( ( url_class_keys_to_url_classes[ url_class_key ].GetName(), url_class_keys_to_default_tag_import_options[ url_class_key ] ) for url_class_key in list( url_class_keys_to_default_tag_import_options.keys() ) if url_class_key in url_class_keys_to_url_classes ) )
@ -5117,7 +5117,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
def _GetValue( self ):
url_class_key = self._original_url_class.GetMatchKey()
url_class_key = self._original_url_class.GetClassKey()
name = self._name.text()
url_type = self._url_type.GetValue()
preferred_scheme = self._preferred_scheme.GetValue()
@ -5495,7 +5495,7 @@ class EditURLClassesPanel( ClientGUIScrolledPanels.EditPanel ):
HydrusSerialisable.SetNonDupeName( url_class, self._GetExistingNames() )
url_class.RegenerateMatchKey()
url_class.RegenerateClassKey()
self._list_ctrl.AddDatas( ( url_class, ) )
@ -5614,7 +5614,7 @@ class EditURLClassLinksPanel( ClientGUIScrolledPanels.EditPanel ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
self._url_classes = url_classes
self._url_class_keys_to_url_classes = { url_class.GetMatchKey() : url_class for url_class in self._url_classes }
self._url_class_keys_to_url_classes = { url_class.GetClassKey() : url_class for url_class in self._url_classes }
self._parsers = parsers
self._parser_keys_to_parsers = { parser.GetParserKey() : parser for parser in self._parsers }
@ -5672,7 +5672,7 @@ class EditURLClassLinksPanel( ClientGUIScrolledPanels.EditPanel ):
continue
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in url_class_keys_to_parser_keys:
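The GetMatchKey to GetClassKey changes above are a straight mechanical rename. When third-party code might still call the old name, a transitional alias keeps it working through the rename; a sketch, with the class shape entirely hypothetical:

```python
class URLClass( object ):
    
    def __init__( self, class_key: bytes ):
        
        self._class_key = class_key
        
    
    def GetClassKey( self ) -> bytes:
        
        return self._class_key
        
    
    # transitional alias so callers of the pre-rename API keep working
    GetMatchKey = GetClassKey
    
```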

View File

@ -170,6 +170,8 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
def _AddPath( self, path, starting_weight = 1 ):
path = os.path.realpath( path )
if path in self._locations_to_ideal_weights:
QW.QMessageBox.warning( self, 'Warning', 'You already have that location entered!' )
@ -803,6 +805,8 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
path = dlg.GetPath()
path = os.path.realpath( path )
if path in self._locations_to_ideal_weights:
QW.QMessageBox.warning( self, 'Warning', 'That path already exists as a regular file location! Please choose another.' )
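Both _AddPath call sites above canonicalise the incoming path with os.path.realpath before the duplicate check, so a symlink and its target cannot be entered as two separate storage locations. A standalone sketch of that check (the helper name and dict shape are my invention):

```python
import os

def add_storage_location( locations_to_ideal_weights, path, starting_weight = 1 ):
    
    # resolve symlinks so two spellings of one directory collapse to one entry
    path = os.path.realpath( path )
    
    if path in locations_to_ideal_weights:
        
        return False # already entered
        
    
    locations_to_ideal_weights[ path ] = starting_weight
    
    return True
    
```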

View File

@ -337,7 +337,7 @@ class StringToStringDictButton( ClientGUICommon.BetterButton ):
ClientGUICommon.BetterButton.__init__( self, parent, label, self._Edit )
self._value: typing.Dict[ str, str ] = {}
self._value = {}
def _Edit( self ):

View File

@ -1,4 +1,6 @@
import os
import threading
import time
import typing
from qtpy import QtCore as QC
@ -1197,8 +1199,7 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
subscriptions_panel.AddSeparator()
# disabled for now
#subscriptions_panel.AddImportExportButtons( ( ClientImportSubscriptions.Subscription, ), self._AddSubscription )
subscriptions_panel.AddImportExportButtons( ( ClientImportSubscriptionLegacy.SubscriptionLegacy, ClientImportSubscriptions.SubscriptionContainer ), self._AddSubscription, custom_get_callable = self._GetSelectedSubsAsExportableContainers )
subscriptions_panel.NewButtonRow()
@ -1248,7 +1249,62 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self.widget().setLayout( vbox )
def _AddSubscription( self, subscription ):
def _AddSubscription( self, unknown_subscription ):
if isinstance( unknown_subscription, ( ClientImportSubscriptionLegacy.SubscriptionLegacy, ClientImportSubscriptions.SubscriptionContainer ) ):
if isinstance( unknown_subscription, ClientImportSubscriptionLegacy.SubscriptionLegacy ):
( subscription, query_log_containers ) = ClientImportSubscriptionLegacy.ConvertLegacySubscriptionToNew( unknown_subscription )
elif isinstance( unknown_subscription, ClientImportSubscriptions.SubscriptionContainer ):
subscription = unknown_subscription.subscription
query_log_containers = unknown_subscription.query_log_containers
old_names_to_query_log_containers = { query_log_container.GetName() : query_log_container for query_log_container in query_log_containers }
there_were_missing_query_log_containers = False
for query_header in subscription.GetQueryHeaders():
old_query_log_container_name = query_header.GetQueryLogContainerName()
new_query_log_container_name = ClientImportSubscriptionQuery.GenerateQueryLogContainerName()
query_header.SetQueryLogContainerName( new_query_log_container_name )
if old_query_log_container_name in old_names_to_query_log_containers:
old_names_to_query_log_containers[ old_query_log_container_name ].SetName( new_query_log_container_name )
else:
there_were_missing_query_log_containers = True
if there_were_missing_query_log_containers:
message = 'When importing this subscription, "{}", there was missing log data! I will still let you add it, but some of its queries are incomplete. If you are ok with this, click ok and then immediately re-open the manage subscriptions dialog to reinitialise the missing data back to zero (and clear any orphaned data that came with this). If you are not ok with this, cancel out now or cancel out of the whole manage subs dialog.'.format( subscription.GetName() )
result = ClientGUIDialogsQuick.GetYesNo( self, message, title = 'missing query log data!', yes_label = 'import it anyway', no_label = 'back out now' )
if result != QW.QDialog.Accepted:
return
new_names_to_query_log_containers = { query_log_container.GetName() : query_log_container for query_log_container in query_log_containers }
self._names_to_edited_query_log_containers.update( new_names_to_query_log_containers )
elif isinstance( unknown_subscription, ClientImportSubscriptions.Subscription ):
subscription = unknown_subscription
subscription.SetNonDupeName( self._GetExistingNames() )
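The import path above gives every incoming query header a freshly generated log-container name and renames the accompanying containers to match, flagging any that are missing from the export. With plain dicts standing in for the hydrus objects, the remap looks roughly like this (a sketch, not the real API):

```python
import uuid

def generate_container_name():
    
    return uuid.uuid4().hex
    

def remap_imported_queries( query_headers, old_names_to_containers ):
    
    # fresh names prevent collisions with container data already in the db;
    # a header whose container was not in the export is flagged as missing
    there_were_missing = False
    new_names_to_containers = {}
    
    for header in query_headers:
        
        old_name = header[ 'log_container_name' ]
        new_name = generate_container_name()
        
        header[ 'log_container_name' ] = new_name
        
        if old_name in old_names_to_containers:
            
            container = old_names_to_containers[ old_name ]
            
            container[ 'name' ] = new_name
            
            new_names_to_containers[ new_name ] = container
            
        else:
            
            there_were_missing = True
            
        
    
    return ( new_names_to_containers, there_were_missing )
    
```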
@ -1514,13 +1570,78 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
return names
def _GetExportObject( self ):
def _GetSelectedSubsAsExportableContainers( self ):
subs_to_export = self._subscriptions.GetData( only_selected = True )
required_query_log_headers = []
for sub in subs_to_export:
required_query_log_headers.extend( sub.GetQueryHeaders() )
missing_query_headers = [ query_header for query_header in required_query_log_headers if query_header.GetQueryLogContainerName() not in self._names_to_edited_query_log_containers ]
if len( missing_query_headers ) > 0:
if len( missing_query_headers ) > 25:
message = 'Exporting or duplicating the current selection means reading query data for {} queries from the database. This may take just a couple of seconds, or, for hundreds of thousands of cached URLs, it could be a couple of minutes (and a whack of memory). Do not panic, it will get there in the end. Do you want to do the export?'.format( HydrusData.ToHumanInt( len( missing_query_headers ) ) )
result = ClientGUIDialogsQuick.GetYesNo( self, message )
if result != QW.QDialog.Accepted:
return None
self.setEnabled( False )
done = threading.Event()
done_call = lambda: done.set()
HG.client_controller.CallToThread( AsyncGetQueryLogContainers, self, missing_query_headers, self._CATCHQueryLogContainers, done_call )
while True:
if not QP.isValid( self ):
return None
if done.is_set():
break
else:
time.sleep( 0.25 )
QW.QApplication.instance().processEvents()
self.setEnabled( True )
to_export = HydrusSerialisable.SerialisableList()
for subscription in self._subscriptions.GetData( only_selected = True ):
for sub in subs_to_export:
to_export.append( subscription )
query_log_container_names = [ query_header.GetQueryLogContainerName() for query_header in sub.GetQueryHeaders() ]
query_log_containers = [ self._names_to_edited_query_log_containers[ query_log_container_name ] for query_log_container_name in query_log_container_names ]
subscription_container = ClientImportSubscriptions.SubscriptionContainer()
subscription_container.subscription = sub
subscription_container.query_log_containers = HydrusSerialisable.SerialisableList( query_log_containers )
# duplicate important here to make sure we aren't linked with existing objects on a dupe call
to_export.append( subscription_container.Duplicate() )
if len( to_export ) == 0:
@ -1537,32 +1658,6 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
def _ImportObject( self, obj ):
if isinstance( obj, HydrusSerialisable.SerialisableList ):
for sub_obj in obj:
self._ImportObject( sub_obj )
else:
if isinstance( obj, ClientImportSubscriptions.Subscription ):
subscription = obj
subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( subscription, ) )
else:
QW.QMessageBox.warning( self, 'Warning', 'That was not a subscription--it was a: '+type(obj).__name__ )
def _STARTReset( self ):
message = 'Resetting these subscriptions will delete all their remembered urls, meaning when they next run, they will try to download them all over again. This may be expensive in time and data. Only do it if you are willing to wait. Do you want to do it?'
@ -1762,16 +1857,18 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
subscriptions = self._subscriptions.GetData()
edited_query_log_containers = list( self._names_to_edited_query_log_containers.values() )
new_query_log_container_names = set()
required_query_log_container_names = set()
for subscription in subscriptions:
new_query_log_container_names.update( subscription.GetAllQueryLogContainerNames() )
required_query_log_container_names.update( subscription.GetAllQueryLogContainerNames() )
deletee_query_log_container_names = self._existing_query_log_container_names.difference( new_query_log_container_names )
edited_query_log_containers = list( self._names_to_edited_query_log_containers.values() )
edited_query_log_containers = [ query_log_container for query_log_container in edited_query_log_containers if query_log_container.GetName() in required_query_log_container_names ]
deletee_query_log_container_names = self._existing_query_log_container_names.difference( required_query_log_container_names )
return ( subscriptions, edited_query_log_containers, deletee_query_log_container_names )
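The export path in _GetSelectedSubsAsExportableContainers above disables the panel, hands the db read to a worker thread, and then polls a threading.Event while pumping the Qt event loop. Stripped of the Qt parts, the blocking-wait pattern looks like this, with `pump_ui` standing in for `QApplication.processEvents`:

```python
import threading
import time

def run_on_thread_and_wait( work, pump_ui = lambda: None, poll = 0.05 ):
    
    # block the caller until the worker finishes, keeping the UI
    # responsive by pumping its event loop between polls
    done = threading.Event()
    result = {}
    
    def runner():
        
        try:
            
            result[ 'value' ] = work()
            
        finally:
            
            done.set()
            
        
    
    threading.Thread( target = runner, daemon = True ).start()
    
    while not done.is_set():
        
        pump_ui()
        
        time.sleep( poll )
        
    
    return result.get( 'value' )
    
```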

View File

@ -1500,7 +1500,7 @@ class StatusBar( QW.QStatusBar ):
QW.QStatusBar.__init__( self )
self._labels: typing.List[ QW.QLabel ] = []
self._labels = []
for w in status_widths:

View File

@ -599,7 +599,14 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
post_url = self.file_seed_data
( url_to_check, parser ) = HG.client_controller.network_engine.domain_manager.GetURLToFetchAndParser( post_url )
try:
( url_to_check, parser ) = HG.client_controller.network_engine.domain_manager.GetURLToFetchAndParser( post_url )
except HydrusExceptions.URLClassException:
url_to_check = post_url
else:
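This FileSeed change (and the matching GallerySeed change in the next file) wraps the domain-manager lookup so a url whose class has been redefined falls back to being fetched as-is instead of failing the whole import item. A self-contained sketch of the fallback, with the exception and resolver simplified stand-ins for the hydrus originals:

```python
class URLClassException( Exception ):
    pass

def get_url_to_fetch( url, resolve ):
    
    # if the url no longer matches a known url class, fetch it as-is
    # rather than erroring out the whole import item
    try:
        
        ( url_to_check, parser ) = resolve( url )
        
    except URLClassException:
        
        ( url_to_check, parser ) = ( url, None )
        
    
    return ( url_to_check, parser )
    
```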

View File

@ -192,7 +192,14 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
def GetExampleNetworkJob( self, network_job_factory ):
( url_to_check, parser ) = HG.client_controller.network_engine.domain_manager.GetURLToFetchAndParser( self.url )
try:
( url_to_check, parser ) = HG.client_controller.network_engine.domain_manager.GetURLToFetchAndParser( self.url )
except HydrusExceptions.URLClassException:
url_to_check = self.url
network_job = network_job_factory( 'GET', url_to_check )

View File

@ -609,8 +609,8 @@ class NoteImportOptions( HydrusSerialisable.SerialisableBase ):
self._get_notes = False
self._extend_existing_note_if_possible = True
self._conflict_resolution = NOTE_IMPORT_CONFLICT_IGNORE
self._all_name_override: typing.Optional[ str ] = None
self._names_to_name_overrides: typing.Dict[ str, str ] = dict()
self._all_name_override = None
self._names_to_name_overrides = dict()
def _GetSerialisableInfo( self ):

View File

@ -994,7 +994,7 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
with self._lock:
urls = [u for u in urls if len( u ) > 1] # > _1_ to take out the occasional whitespace
urls = [ u for u in urls if len( u ) > 1 ] # > _1_ to take out the occasional whitespace
file_seeds = []

View File

@ -482,7 +482,7 @@ class SubscriptionLegacy( HydrusSerialisable.SerialisableBaseNamed ):
self._gug_key_and_name = gug_key_and_name
self._queries: typing.List[ SubscriptionQueryLegacy ] = []
self._queries = []
new_options = HG.client_controller.new_options
@ -1883,80 +1883,80 @@ class SubscriptionLegacy( HydrusSerialisable.SerialisableBaseNamed ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION_LEGACY ] = SubscriptionLegacy
def ConvertLegacySubscriptionToNew( legacy_subscription: SubscriptionLegacy ):
(
name,
gug_key_and_name,
queries,
checker_options,
initial_file_limit,
periodic_file_limit,
paused,
file_import_options,
tag_import_options,
no_work_until,
no_work_until_reason
) = legacy_subscription.ToTuple()
subscription = ClientImportSubscriptions.Subscription( name )
subscription.SetTuple(
gug_key_and_name,
checker_options,
initial_file_limit,
periodic_file_limit,
paused,
file_import_options,
tag_import_options,
no_work_until
)
(
show_a_popup_while_working,
publish_files_to_popup_button,
publish_files_to_page,
publish_label_override,
merge_query_publish_events
) = legacy_subscription.GetPresentationOptions()
subscription.SetPresentationOptions(
show_a_popup_while_working,
publish_files_to_popup_button,
publish_files_to_page,
publish_label_override,
merge_query_publish_events
)
query_headers = []
query_log_containers = []
for query in queries:
query_header = ClientImportSubscriptionQuery.SubscriptionQueryHeader()
( query_text, check_now, last_check_time, next_check_time, query_paused, status ) = query.ToTuple()
query_header.SetQueryText( query_text )
query_header.SetDisplayName( query.GetDisplayName() )
query_header.SetCheckNow( check_now )
query_header.SetLastCheckTime( last_check_time )
query_header.SetNextCheckTime( next_check_time )
query_header.SetPaused( query_paused )
query_header.SetCheckerStatus( status )
query_header.SetTagImportOptions( query.GetTagImportOptions() )
query_log_container = ClientImportSubscriptionQuery.SubscriptionQueryLogContainer( query_header.GetQueryLogContainerName() )
query_log_container.SetGallerySeedLog( query.GetGallerySeedLog() )
query_log_container.SetFileSeedCache( query.GetFileSeedCache() )
query_header.SyncToQueryLogContainer( checker_options, query_log_container )
query_headers.append( query_header )
query_log_containers.append( query_log_container )
subscription.SetQueryHeaders( query_headers )
return ( subscription, query_log_containers )
(
name,
gug_key_and_name,
queries,
checker_options,
initial_file_limit,
periodic_file_limit,
paused,
file_import_options,
tag_import_options,
no_work_until,
no_work_until_reason
) = legacy_subscription.ToTuple()
subscription = ClientImportSubscriptions.Subscription( name )
subscription.SetTuple(
gug_key_and_name,
checker_options,
initial_file_limit,
periodic_file_limit,
paused,
file_import_options,
tag_import_options,
no_work_until
)
(
show_a_popup_while_working,
publish_files_to_popup_button,
publish_files_to_page,
publish_label_override,
merge_query_publish_events
) = legacy_subscription.GetPresentationOptions()
subscription.SetPresentationOptions(
show_a_popup_while_working,
publish_files_to_popup_button,
publish_files_to_page,
publish_label_override,
merge_query_publish_events
)
query_headers = []
query_log_containers = []
for query in queries:
query_header = ClientImportSubscriptionQuery.SubscriptionQueryHeader()
( query_text, check_now, last_check_time, next_check_time, query_paused, status ) = query.ToTuple()
query_header.SetQueryText( query_text )
query_header.SetDisplayName( query.GetDisplayName() )
query_header.SetCheckNow( check_now )
query_header.SetLastCheckTime( last_check_time )
query_header.SetNextCheckTime( next_check_time )
query_header.SetPaused( query_paused )
query_header.SetCheckerStatus( status )
query_header.SetTagImportOptions( query.GetTagImportOptions() )
query_log_container = ClientImportSubscriptionQuery.SubscriptionQueryLogContainer( query_header.GetQueryLogContainerName() )
query_log_container.SetGallerySeedLog( query.GetGallerySeedLog() )
query_log_container.SetFileSeedCache( query.GetFileSeedCache() )
query_header.SyncToQueryLogContainer( checker_options, query_log_container )
query_headers.append( query_header )
query_log_containers.append( query_log_container )
subscription.SetQueryHeaders( query_headers )
return ( subscription, query_log_containers )

View File

@ -14,7 +14,7 @@ from hydrus.client.networking import ClientNetworkingContexts
from hydrus.client.networking import ClientNetworkingDomain
from hydrus.client.networking import ClientNetworkingJobs
def GenerateSubQueryName() -> str:
def GenerateQueryLogContainerName() -> str:
return HydrusData.GenerateKey().hex()
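The renamed helper above just turns a fresh random key into a hex filename for the log container. A minimal stand-in sketch, assuming HydrusData.GenerateKey returns a block of 32 random bytes:

```python
import os

def generate_query_log_container_name() -> str:
    """Produce a unique hex name for a query log container.

    Stand-in for HydrusData.GenerateKey().hex(), assuming GenerateKey
    returns 32 fresh random bytes.
    """
    return os.urandom(32).hex()
```

Because the name is derived from 256 random bits, collisions between containers are not a practical concern.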
@ -84,7 +84,7 @@ class SubscriptionQueryHeader( HydrusSerialisable.SerialisableBase ):
HydrusSerialisable.SerialisableBase.__init__( self )
self._query_log_container_name = GenerateSubQueryName()
self._query_log_container_name = GenerateQueryLogContainerName()
self._query_text = 'query'
self._display_name = None
self._check_now = False
@ -694,6 +694,13 @@ class SubscriptionQueryHeader( HydrusSerialisable.SerialisableBase ):
self._paused = paused
def SetQueryLogContainerName( self, query_log_container_name: str ):
self._query_log_container_name = query_log_container_name
self.SetQueryLogContainerStatus( LOG_CONTAINER_UNSYNCED )
def SetQueryLogContainerStatus( self, log_container_status: int ):
self._query_log_container_status = log_container_status

View File

@ -1057,22 +1057,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
return None
# if there are three queries due say fifty seconds after our first one runs, we should wait that little bit longer
LAUNCH_WINDOW = 5 * 60
earliest_next_work_time = min( next_work_times )
latest_nearby_next_work_time = max( ( work_time for work_time in next_work_times if work_time < earliest_next_work_time + LAUNCH_WINDOW ) )
# but if we are expecting to launch it right now (e.g. check_now call), we won't wait
if HydrusData.TimeUntil( earliest_next_work_time ) < 60:
best_next_work_time = earliest_next_work_time
else:
best_next_work_time = latest_nearby_next_work_time
best_next_work_time = min( next_work_times )
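The launch-window heuristic removed above can be sketched in isolation. It batched queries due within five minutes of the earliest one by waiting for the latest of them, unless the earliest was imminent; the replacement simply takes min( next_work_times ). A self-contained sketch of the old selection, with plain timestamps standing in for the HydrusData time helpers:

```python
import time

LAUNCH_WINDOW = 5 * 60  # seconds

def best_next_work_time(next_work_times, now=None):
    """The old launch-window heuristic this commit removes.

    If several queries are due within LAUNCH_WINDOW of the earliest one,
    wait for the latest of them so they run in one batch--unless the
    earliest is due within a minute (e.g. a check_now call), in which
    case launch immediately.
    """
    if now is None:
        now = time.time()

    earliest = min(next_work_times)
    latest_nearby = max(t for t in next_work_times if t < earliest + LAUNCH_WINDOW)

    if earliest - now < 60:
        # expecting to launch right now, so don't wait
        return earliest

    return latest_nearby
```

With the more flexible loading in this version, the batching no longer pays for itself, so the new code just returns the earliest time.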
if not HydrusData.TimeHasPassed( self._no_work_until ):
@ -1395,6 +1380,42 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION ] = Subscription
LOG_CONTAINER_SYNCED = 0
LOG_CONTAINER_UNSYNCED = 1
LOG_CONTAINER_MISSING = 2
class SubscriptionContainer( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION_CONTAINER
SERIALISABLE_NAME = 'Subscription with all data'
SERIALISABLE_VERSION = 1
def __init__( self ):
HydrusSerialisable.SerialisableBase.__init__( self )
self.subscription = Subscription( 'default' )
self.query_log_containers = HydrusSerialisable.SerialisableList()
def _GetSerialisableInfo( self ):
serialisable_subscription = self.subscription.GetSerialisableTuple()
serialisable_query_log_containers = self.query_log_containers.GetSerialisableTuple()
return ( serialisable_subscription, serialisable_query_log_containers )
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( serialisable_subscription, serialisable_query_log_containers ) = serialisable_info
self.subscription = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_subscription )
self.query_log_containers = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_query_log_containers )
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION_CONTAINER ] = SubscriptionContainer
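The container class above bundles a subscription with all its query log containers using hydrus's serialisable-tuple pattern: each object reduces itself to a plain tuple and reconstitutes from one. A minimal generic sketch of that round-trip, with illustrative stand-in classes (JSON here stands in for the real serialisation layer):

```python
import json

class SerialisableBaseSketch:
    """Minimal stand-in for HydrusSerialisable.SerialisableBase."""

    def _GetSerialisableInfo(self):
        raise NotImplementedError()

    def _InitialiseFromSerialisableInfo(self, serialisable_info):
        raise NotImplementedError()

    def DumpToString(self) -> str:
        return json.dumps(self._GetSerialisableInfo())

    def InitialiseFromString(self, s: str):
        self._InitialiseFromSerialisableInfo(json.loads(s))

class SubscriptionContainerSketch(SerialisableBaseSketch):
    """Bundles a subscription name with its query log container names."""

    def __init__(self, subscription_name='default', query_log_container_names=None):
        self.subscription_name = subscription_name
        self.query_log_container_names = query_log_container_names or []

    def _GetSerialisableInfo(self):
        return (self.subscription_name, self.query_log_container_names)

    def _InitialiseFromSerialisableInfo(self, serialisable_info):
        (self.subscription_name, self.query_log_container_names) = serialisable_info
```

In the real class, the members are full serialisable objects rather than names, so the container can move a subscription and all of its heavyweight log data around as one unit.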
class SubscriptionJob( object ):
def __init__( self, controller, subscription ):
@ -1485,7 +1506,7 @@ class SubscriptionsManager( object ):
if len( self._names_to_running_subscription_info ) > 0:
return 1
return 0.5
else:
@ -1493,11 +1514,11 @@ class SubscriptionsManager( object ):
if subscription is not None:
return 1
return 0.5
else:
return 15
return 5
@ -1585,12 +1606,16 @@ class SubscriptionsManager( object ):
else:
if just_finished_work:
p1 = HG.client_controller.options[ 'pause_subs_sync' ]
p2 = HG.client_controller.new_options.GetBoolean( 'pause_all_new_network_traffic' )
stopped_because_pause = p1 or p2
if just_finished_work and not stopped_because_pause:
# don't want to have a load/save cycle repeating over and over
# even with the new data format, we don't want to have a load/save cycle repeating _too_ much, just to stop any weird cascades
# this sets min resolution of a single sub repeat cycle
# we'll clear it when we have data breakup done
BUFFER_TIME = 60 * 60
BUFFER_TIME = 120
next_work_time = max( next_work_time, HydrusData.GetNow() + BUFFER_TIME )
@ -1617,7 +1642,7 @@ class SubscriptionsManager( object ):
try:
self._wake_event.wait( 15 )
self._wake_event.wait( 3 )
while not ( HG.view_shutdown or self._shutdown ):
@ -1687,6 +1712,8 @@ class SubscriptionsManager( object ):
self._UpdateSubscriptionInfo( subscription )
self._wake_event.set()
def ShowSnapshot( self ):

View File

@ -204,7 +204,7 @@ class NetworkEngine( object ):
return True
elif not job.BandwidthOK():
elif not job.TryToStartBandwidth():
return True

View File

@ -492,6 +492,22 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
self._RecalcCache()
def _CleanURLClassKeysToParserKeys( self ):
api_pairs = ConvertURLClassesIntoAPIPairs( self._url_classes )
# anything that goes to an api url will be parsed by that api's parser--it can't have its own
for ( a, b ) in api_pairs:
unparseable_url_class_key = a.GetClassKey()
if unparseable_url_class_key in self._url_class_keys_to_parser_keys:
del self._url_class_keys_to_parser_keys[ unparseable_url_class_key ]
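The new cleanup method above enforces one rule: a url class that redirects to an API URL is parsed by the API's parser, so it must not carry its own parser mapping. A simplified sketch on plain dicts (the real method walks URLClass objects; the names below are illustrative):

```python
def clean_url_class_keys_to_parser_keys(url_class_keys_to_parser_keys, api_pairs):
    """Drop parser mappings for url classes that redirect to an API url.

    api_pairs is a list of (redirecting_key, api_key) tuples. Anything
    that goes to an api url will be parsed by that api's parser, so the
    redirecting class cannot keep a mapping of its own.
    """
    for (redirecting_key, api_key) in api_pairs:
        url_class_keys_to_parser_keys.pop(redirecting_key, None)
```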
def _GetDefaultTagImportOptionsForURL( self, url ):
url_class = self._GetURLClass( url )
@ -516,7 +532,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
return self._file_post_default_tag_import_options
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_default_tag_import_options:
@ -672,7 +688,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
raise HydrusExceptions.URLClassException( 'Could not find a parser for ' + url + '!' + os.linesep * 2 + str( e ) )
url_class_key = parser_url_class.GetMatchKey()
url_class_key = parser_url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_parser_keys:
@ -815,7 +831,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
for url_class in url_classes:
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
name = url_class.GetName()
@ -1046,7 +1062,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
# absent irrelevant variables, do we have the exact same object already in?
name = new_url_class.GetName()
match_key = new_url_class.GetMatchKey()
match_key = new_url_class.GetClassKey()
example_url = new_url_class.GetExampleURL()
dupe_url_classes = [ url_class.Duplicate() for url_class in self._url_classes ]
@ -1054,7 +1070,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
for dupe_url_class in dupe_url_classes:
dupe_url_class.SetName( name )
dupe_url_class.SetMatchKey( match_key )
dupe_url_class.SetClassKey( match_key )
dupe_url_class.SetExampleURL( example_url )
if dupe_url_class.DumpToString() == new_url_class.DumpToString():
@ -1095,7 +1111,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
for url_class in new_url_classes:
url_class.RegenerateMatchKey()
url_class.RegenerateClassKey()
for parser in new_parsers:
@ -1169,6 +1185,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
self._url_class_keys_to_parser_keys.update( new_url_class_keys_to_parser_keys )
self._CleanURLClassKeysToParserKeys()
# let's do a TryToLink just in case there are loose ends due to some dupe being discarded earlier (e.g. url class is new, but parser was not).
@ -1220,7 +1238,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
else:
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in self._url_class_keys_to_display:
@ -1700,9 +1718,20 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
default_gugs = ClientDefaults.GetDefaultGUGs()
existing_gug_names_to_keys = { gug.GetName() : gug.GetGUGKey() for gug in self._gugs }
for gug in default_gugs:
gug.RegenerateGUGKey()
gug_name = gug.GetName()
if gug_name in existing_gug_names_to_keys:
gug.SetGUGKey( existing_gug_names_to_keys[ gug_name ] )
else:
gug.RegenerateGUGKey()
existing_gugs = list( self._gugs )
@ -1722,9 +1751,20 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
default_parsers = ClientDefaults.GetDefaultParsers()
existing_parser_names_to_keys = { parser.GetName() : parser.GetParserKey() for parser in self._parsers }
for parser in default_parsers:
parser.RegenerateParserKey()
name = parser.GetName()
if name in existing_parser_names_to_keys:
parser.SetParserKey( existing_parser_names_to_keys[ name ] )
else:
parser.RegenerateParserKey()
existing_parsers = list( self._parsers )
@ -1744,9 +1784,25 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
default_url_classes = ClientDefaults.GetDefaultURLClasses()
existing_class_names_to_keys = { url_class.GetName() : url_class.GetClassKey() for url_class in self._url_classes }
for url_class in default_url_classes:
url_class.RegenerateMatchKey()
name = url_class.GetName()
if name in existing_class_names_to_keys:
url_class.SetClassKey( existing_class_names_to_keys[ name ] )
else:
url_class.RegenerateClassKey()
for url_class in default_url_classes:
url_class.RegenerateClassKey()
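The three loops above (for GUGs, parsers, and url classes) share one idea: when re-adding a default object, reuse the key of any existing object with the same name so links keyed on it survive, and only mint a fresh key for genuinely new objects. A generic sketch on plain dicts (the real code uses Set…Key/Regenerate…Key methods on the objects themselves):

```python
import os

def assign_keys(default_objects, existing_names_to_keys):
    """Give each default object its existing key (matched by name) or a new one.

    default_objects is a list of dicts with a 'name' entry; keys are
    32-byte values, as in hydrus. Reusing the old key keeps existing
    url-class-to-parser links and subscriptions pointing at the right object.
    """
    for obj in default_objects:
        if obj['name'] in existing_names_to_keys:
            obj['key'] = existing_names_to_keys[obj['name']]
        else:
            obj['key'] = os.urandom(32)
```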
existing_url_classes = list( self._url_classes )
@ -1762,7 +1818,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
with self._lock:
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
parser_key = parser.GetParserKey()
self._url_class_keys_to_parser_keys[ url_class_key ] = parser_key
@ -1899,7 +1955,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
deletee_url_class_keys = set()
for ( url_class_key, parser_key ) in list(self._url_class_keys_to_parser_keys.items()):
for ( url_class_key, parser_key ) in self._url_class_keys_to_parser_keys.items():
if parser_key not in parser_keys:
@ -1926,8 +1982,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
# by default, we will show post urls
old_post_url_class_keys = { url_class.GetMatchKey() for url_class in self._url_classes if url_class.IsPostURL() }
post_url_class_keys = { url_class.GetMatchKey() for url_class in url_classes if url_class.IsPostURL() }
old_post_url_class_keys = { url_class.GetClassKey() for url_class in self._url_classes if url_class.IsPostURL() }
post_url_class_keys = { url_class.GetClassKey() for url_class in url_classes if url_class.IsPostURL() }
added_post_url_class_keys = post_url_class_keys.difference( old_post_url_class_keys )
@ -1945,7 +2001,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
# delete orphans
url_class_keys = { url_class.GetMatchKey() for url_class in url_classes }
url_class_keys = { url_class.GetClassKey() for url_class in url_classes }
self._url_class_keys_to_display.intersection_update( url_class_keys )
@ -1960,7 +2016,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
for ( url_class_original, url_class_api ) in url_class_api_pairs:
url_class_key = url_class_original.GetMatchKey()
url_class_key = url_class_original.GetClassKey()
if url_class_key in self._url_class_keys_to_parser_keys:
@ -1982,6 +2038,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
self._url_class_keys_to_parser_keys.update( url_class_keys_to_parser_keys )
self._CleanURLClassKeysToParserKeys()
self._SetDirty()
@ -2021,6 +2079,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
self._url_class_keys_to_parser_keys.update( new_url_class_keys_to_parser_keys )
self._CleanURLClassKeysToParserKeys()
self._SetDirty()
@ -2096,11 +2156,16 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
for url_class in url_classes:
if url_class in api_pair_unparsable_url_classes:
continue
if url_class.Matches( example_url ):
# we have a match. this is the 'correct' match for this example url, and we should not search any more, so we break below
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
parsable = url_class.IsParsable()
linkable = url_class_key not in existing_url_class_keys_to_parser_keys and url_class_key not in new_url_class_keys_to_parser_keys
@ -2125,7 +2190,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
continue
url_class_key = url_class.GetMatchKey()
url_class_key = url_class.GetClassKey()
if url_class_key in existing_url_class_keys_to_parser_keys:
@ -2547,6 +2612,11 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):
return ( self._url_template, self._replacement_phrase, self._search_terms_separator, self._example_search_text )
def SetGUGKey( self, gug_key: bytes ):
self._gallery_url_generator_key = gug_key
def SetGUGKeyAndName( self, gug_key_and_name ):
( gug_key, name ) = gug_key_and_name
@ -2744,6 +2814,11 @@ class NestedGalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):
self._gug_keys_and_names = good_gug_keys_and_names
def SetGUGKey( self, gug_key: bytes ):
self._gallery_url_generator_key = gug_key
def SetGUGKeyAndName( self, gug_key_and_name ):
( gug_key, name ) = gug_key_and_name
@ -3141,7 +3216,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
return ( self._gallery_index_type, self._gallery_index_identifier, self._gallery_index_delta )
def GetMatchKey( self ):
def GetClassKey( self ):
return self._url_class_key
@ -3359,7 +3434,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
return is_a_direct_file_page or is_a_single_file_post_page
def RegenerateMatchKey( self ):
def RegenerateClassKey( self ):
self._url_class_key = HydrusData.GenerateKey()
@ -3369,7 +3444,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
self._example_url = example_url
def SetMatchKey( self, match_key ):
def SetClassKey( self, match_key ):
self._url_class_key = match_key

View File

@ -689,68 +689,10 @@ class NetworkJob( object ):
if self._ObeysBandwidth():
result = self.engine.bandwidth_manager.TryToStartRequest( self._network_contexts )
if result:
self._bandwidth_tracker.ReportRequestUsed()
else:
( bandwidth_waiting_duration, bandwidth_network_context ) = self.engine.bandwidth_manager.GetWaitingEstimateAndContext( self._network_contexts )
will_override = self._bandwidth_manual_override_delayed_timestamp is not None
override_coming_first = False
if will_override:
override_waiting_duration = self._bandwidth_manual_override_delayed_timestamp - HydrusData.GetNow()
override_coming_first = override_waiting_duration < bandwidth_waiting_duration
just_now_threshold = 2
if override_coming_first:
waiting_duration = override_waiting_duration
waiting_str = 'overriding bandwidth ' + HydrusData.TimestampToPrettyTimeDelta( self._bandwidth_manual_override_delayed_timestamp, just_now_string = 'imminently', just_now_threshold = just_now_threshold )
else:
waiting_duration = bandwidth_waiting_duration
waiting_str = 'bandwidth free ' + HydrusData.TimestampToPrettyTimeDelta( HydrusData.GetNow() + waiting_duration, just_now_string = 'imminently', just_now_threshold = just_now_threshold )
waiting_str += '\u2026 (' + bandwidth_network_context.ToHumanString() + ')'
self._status_text = waiting_str
if waiting_duration > 1200:
self._Sleep( 30 )
elif waiting_duration > 120:
self._Sleep( 10 )
elif waiting_duration > 10:
self._Sleep( 1 )
return result
return self.engine.bandwidth_manager.CanDoWork( self._network_contexts )
else:
self._bandwidth_tracker.ReportRequestUsed()
self.engine.bandwidth_manager.ReportRequestUsed( self._network_contexts )
return True
@ -1404,6 +1346,79 @@ class NetworkJob( object ):
return True
def TryToStartBandwidth( self ):
with self._lock:
if self._ObeysBandwidth():
result = self.engine.bandwidth_manager.TryToStartRequest( self._network_contexts )
if result:
self._bandwidth_tracker.ReportRequestUsed()
else:
( bandwidth_waiting_duration, bandwidth_network_context ) = self.engine.bandwidth_manager.GetWaitingEstimateAndContext( self._network_contexts )
will_override = self._bandwidth_manual_override_delayed_timestamp is not None
override_coming_first = False
if will_override:
override_waiting_duration = self._bandwidth_manual_override_delayed_timestamp - HydrusData.GetNow()
override_coming_first = override_waiting_duration < bandwidth_waiting_duration
just_now_threshold = 2
if override_coming_first:
waiting_duration = override_waiting_duration
waiting_str = 'overriding bandwidth ' + HydrusData.TimestampToPrettyTimeDelta( self._bandwidth_manual_override_delayed_timestamp, just_now_string = 'imminently', just_now_threshold = just_now_threshold )
else:
waiting_duration = bandwidth_waiting_duration
waiting_str = 'bandwidth free ' + HydrusData.TimestampToPrettyTimeDelta( HydrusData.GetNow() + waiting_duration, just_now_string = 'imminently', just_now_threshold = just_now_threshold )
waiting_str += '\u2026 (' + bandwidth_network_context.ToHumanString() + ')'
self._status_text = waiting_str
if waiting_duration > 1200:
self._Sleep( 30 )
elif waiting_duration > 120:
self._Sleep( 10 )
elif waiting_duration > 10:
self._Sleep( 1 )
return result
else:
self._bandwidth_tracker.ReportRequestUsed()
self.engine.bandwidth_manager.ReportRequestUsed( self._network_contexts )
return True
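The status/sleep tiering inside TryToStartBandwidth can be isolated: the longer the estimated wait for free bandwidth, the longer the blocked job sleeps before re-checking. A sketch of just that mapping:

```python
def bandwidth_recheck_sleep(waiting_duration: float) -> float:
    """How long a bandwidth-blocked network job sleeps before re-checking.

    Mirrors the tiering in TryToStartBandwidth: waits over twenty
    minutes re-check every 30s, over two minutes every 10s, over ten
    seconds every 1s, and imminent waits do not sleep at all.
    """
    if waiting_duration > 1200:
        return 30.0
    elif waiting_duration > 120:
        return 10.0
    elif waiting_duration > 10:
        return 1.0

    return 0.0
```

This keeps the status text reasonably fresh for short waits without spinning on jobs that will not be able to start for a long time.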
def WaitUntilDone( self ):
while True:

View File

@ -68,6 +68,8 @@ def VideoHasAudio( path ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
try:
process = subprocess.Popen( cmd, bufsize = 65536, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )

View File

@ -73,7 +73,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 400
SOFTWARE_VERSION = 401
CLIENT_API_VERSION = 12
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

View File

@ -57,6 +57,13 @@ def CalculateScoreFromRating( count, rating ):
return score
def CheckProgramIsNotShuttingDown():
if HG.model_shutdown:
raise HydrusExceptions.ShutdownException( 'Application is shutting down!' )
def CleanRunningFile( db_path, instance ):
# just to be careful
@ -718,7 +725,7 @@ def GetSubprocessEnv():
remove_if_hydrus_base_dir = [ 'QT_PLUGIN_PATH', 'QML2_IMPORT_PATH', 'SSL_CERT_FILE' ]
hydrus_base_dir = HG.client_controller.GetDBDir()
hydrus_base_dir = HG.controller.GetDBDir()
for key in remove_if_hydrus_base_dir:

View File

@ -1,6 +1,7 @@
from hydrus.external import hexagonitswfheader
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusThreading
import os
import subprocess
import time
@ -49,6 +50,8 @@ def RenderPageToFile( path, temp_path, page_index ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, **sbp_kwargs )
while p.poll() is None:
@ -63,5 +66,5 @@ def RenderPageToFile( path, temp_path, page_index ):
time.sleep( 0.5 )
p.communicate()
HydrusThreading.SubprocessCommunicate( p )

View File

@ -2,6 +2,7 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusText
from hydrus.core import HydrusThreading
import os
import socket
import subprocess
@ -35,11 +36,13 @@ def GetExternalIP():
sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
HydrusData.WaitForProcessToFinish( p, 30 )
( stdout, stderr ) = p.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( p )
if stderr is not None and len( stderr ) > 0:
@ -83,11 +86,13 @@ def AddUPnPMapping( internal_client, internal_port, external_port, protocol, des
sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
HydrusData.WaitForProcessToFinish( p, 30 )
( stdout, stderr ) = p.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( p )
if 'x.x.x.x:' + str( external_port ) + ' TCP is redirected to internal ' + internal_client + ':' + str( internal_port ) in stdout:
@ -117,11 +122,13 @@ def GetUPnPMappings():
sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
HydrusData.WaitForProcessToFinish( p, 30 )
( stdout, stderr ) = p.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( p )
if stderr is not None and len( stderr ) > 0:
@ -213,11 +220,13 @@ def RemoveUPnPMapping( external_port, protocol ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
HydrusData.CheckProgramIsNotShuttingDown()
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
HydrusData.WaitForProcessToFinish( p, 30 )
( stdout, stderr ) = p.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( p )
if stderr is not None and len( stderr ) > 0: raise Exception( 'Problem while trying to remove UPnP mapping:' + os.linesep * 2 + stderr )

View File

@ -469,9 +469,11 @@ def LaunchDirectory( path ):
preexec_fn = getattr( os, 'setsid', None )
HydrusData.CheckProgramIsNotShuttingDown()
process = subprocess.Popen( cmd, preexec_fn = preexec_fn, **sbp_kwargs )
process.communicate()
HydrusThreading.SubprocessCommunicate( process )
@ -524,9 +526,11 @@ def LaunchFile( path, launch_path = None ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs( hide_terminal = hide_terminal, text = True )
HydrusData.CheckProgramIsNotShuttingDown()
process = subprocess.Popen( cmd, preexec_fn = preexec_fn, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
( stdout, stderr ) = process.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( process )
if HG.subprocess_report_mode:
@ -850,9 +854,11 @@ def OpenFileLocation( path ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs( hide_terminal = False )
HydrusData.CheckProgramIsNotShuttingDown()
process = subprocess.Popen( cmd, **sbp_kwargs )
process.communicate()
HydrusThreading.SubprocessCommunicate( process )
thread = threading.Thread( target = do_it )

View File

@ -105,6 +105,7 @@ SERIALISABLE_TYPE_SUBSCRIPTION_QUERY_LOG_CONTAINER = 86
SERIALISABLE_TYPE_SUBSCRIPTION_QUERY_HEADER = 87
SERIALISABLE_TYPE_SUBSCRIPTION = 88
SERIALISABLE_TYPE_FILE_SEED_CACHE_STATUS = 89
SERIALISABLE_TYPE_SUBSCRIPTION_CONTAINER = 90
SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}

View File

@ -3,6 +3,7 @@ import collections
from hydrus.core import HydrusExceptions
import queue
import random
import subprocess
import threading
import time
import traceback
@ -69,6 +70,11 @@ def GetThreadInfo( thread = None ):
def IsThreadShuttingDown():
if HG.emergency_exit:
return True
me = threading.current_thread()
if isinstance( me, DAEMON ):
@ -96,6 +102,39 @@ def ShutdownThread( thread ):
thread_info[ 'shutting_down' ] = True
def SubprocessCommunicate( process: subprocess.Popen ):
def do_test():
if HG.model_shutdown:
try:
process.kill()
except:
pass
raise HydrusExceptions.ShutdownException( 'Application is shutting down!' )
do_test()
while True:
try:
return process.communicate( timeout = 10 )
except subprocess.TimeoutExpired:
do_test()
class DAEMON( threading.Thread ):
def __init__( self, controller, name ):

View File

@ -3,6 +3,7 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusText
from hydrus.core import HydrusThreading
import numpy
import os
import re
@ -44,6 +45,8 @@ def GetFFMPEGVersion():
cmd = [ FFMPEG_PATH, '-version' ]
HydrusData.CheckProgramIsNotShuttingDown()
try:
sbp_kwargs = HydrusData.GetSubprocessKWArgs( text = True )
@ -61,7 +64,7 @@ def GetFFMPEGVersion():
return 'unable to execute ffmpeg at path "{}"'.format( FFMPEG_PATH )
( stdout, stderr ) = process.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( process )
del process
@ -135,6 +138,8 @@ def GetFFMPEGInfoLines( path, count_frames_manually = False, only_first_second =
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
try:
process = subprocess.Popen( cmd, bufsize = 10**5, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )
@ -168,7 +173,7 @@ def GetFFMPEGInfoLines( path, count_frames_manually = False, only_first_second =
raise FileNotFoundError( 'Cannot interact with video because FFMPEG not found--are you sure it is installed? Full error: ' + str( e ) )
( stdout, stderr ) = process.communicate()
( stdout, stderr ) = HydrusThreading.SubprocessCommunicate( process )
data_bytes = stderr
@ -793,6 +798,8 @@ class VideoRendererFFMPEG( object ):
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
HydrusData.CheckProgramIsNotShuttingDown()
try:
self.process = subprocess.Popen( cmd, bufsize = self.bufsize, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, **sbp_kwargs )

View File

@ -523,25 +523,25 @@ class TestNetworkingJob( unittest.TestCase ):
job = self._GetJob()
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.ReportDataUsed( [ DOMAIN_NETWORK_CONTEXT ], 50 )
job.engine.bandwidth_manager.SetRules( DOMAIN_NETWORK_CONTEXT, RESTRICTIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), False )
self.assertEqual( job.TryToStartBandwidth(), False )
#
job = self._GetJob( for_login = True )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.ReportDataUsed( [ DOMAIN_NETWORK_CONTEXT ], 50 )
job.engine.bandwidth_manager.SetRules( DOMAIN_NETWORK_CONTEXT, RESTRICTIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
def test_bandwidth_ok( self ):
@ -558,11 +558,11 @@ class TestNetworkingJob( unittest.TestCase ):
job.engine.bandwidth_manager.ReportDataUsed( [ DOMAIN_NETWORK_CONTEXT ], 50 )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.SetRules( DOMAIN_NETWORK_CONTEXT, PERMISSIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
#
@ -570,11 +570,11 @@ class TestNetworkingJob( unittest.TestCase ):
job.engine.bandwidth_manager.ReportDataUsed( [ DOMAIN_NETWORK_CONTEXT ], 50 )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.SetRules( DOMAIN_NETWORK_CONTEXT, PERMISSIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
def test_bandwidth_reported( self ):
@ -585,7 +585,7 @@ class TestNetworkingJob( unittest.TestCase ):
job = self._GetJob()
job.BandwidthOK()
job.TryToStartBandwidth()
job.Start()
@ -695,25 +695,25 @@ class TestNetworkingJobHydrus( unittest.TestCase ):
job = self._GetJob()
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.ReportDataUsed( [ HYDRUS_NETWORK_CONTEXT ], 50 )
job.engine.bandwidth_manager.SetRules( HYDRUS_NETWORK_CONTEXT, RESTRICTIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), False )
self.assertEqual( job.TryToStartBandwidth(), False )
#
job = self._GetJob( for_login = True )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.ReportDataUsed( [ HYDRUS_NETWORK_CONTEXT ], 50 )
job.engine.bandwidth_manager.SetRules( HYDRUS_NETWORK_CONTEXT, RESTRICTIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
def test_bandwidth_ok( self ):
@ -730,11 +730,11 @@ class TestNetworkingJobHydrus( unittest.TestCase ):
job.engine.bandwidth_manager.ReportDataUsed( [ HYDRUS_NETWORK_CONTEXT ], 50 )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.SetRules( HYDRUS_NETWORK_CONTEXT, PERMISSIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
#
@ -742,11 +742,11 @@ class TestNetworkingJobHydrus( unittest.TestCase ):
job.engine.bandwidth_manager.ReportDataUsed( [ HYDRUS_NETWORK_CONTEXT ], 50 )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
job.engine.bandwidth_manager.SetRules( HYDRUS_NETWORK_CONTEXT, PERMISSIVE_DATA_RULES )
self.assertEqual( job.BandwidthOK(), True )
self.assertEqual( job.TryToStartBandwidth(), True )
def test_bandwidth_reported( self ):

View File

@ -22,13 +22,13 @@ import twisted.internet.ssl
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
with open( os.path.join( HC.STATIC_DIR, 'hydrus.png' ), 'rb' ) as f:
with open( os.path.join( HC.STATIC_DIR, 'hydrus.png' ), 'rb' ) as f_g:
EXAMPLE_FILE = f.read()
EXAMPLE_FILE = f_g.read()
with open( os.path.join( HC.STATIC_DIR, 'hydrus_small.png' ), 'rb' ) as f:
with open( os.path.join( HC.STATIC_DIR, 'hydrus_small.png' ), 'rb' ) as f_g:
EXAMPLE_THUMBNAIL = f.read()
EXAMPLE_THUMBNAIL = f_g.read()
class TestServer( unittest.TestCase ):

(binary file changes not shown: several static images updated, and static/github.png added)