Version 397

parent da8c961219, commit 9b31343f6b
@ -14,7 +14,7 @@ The client can do quite a lot! Please check out the help inside the release or [
* [endchan bunker](https://endchan.net/hydrus/)
* [twitter](https://twitter.com/hydrusnetwork)
* [tumblr](http://hydrus.tumblr.com/)
* [discord](https://discord.gg/3H8UTpb)
* [discord](https://discord.gg/wPHPCUZ)
* [patreon](https://www.patreon.com/hydrus_dev)

## Attribution
@ -6,7 +6,7 @@
</head>
<body>
<div class="content">
<p class="warning">The PTR is now run by users with more bandwidth than I had to give, so the bandwidth limits are gone! If you would like to talk with the new management, please check the <a href="https://discord.gg/3H8UTpb">discord</a>.</p>
<p class="warning">The PTR is now run by users with more bandwidth than I had to give, so the bandwidth limits are gone! If you would like to talk with the new management, please check the <a href="https://discord.gg/wPHPCUZ">discord</a>.</p>
<p class="warning">A guide and schema for the new PTR is <a href="https://github.com/Zweibach/text/blob/master/Hydrus/PTR.md">here</a>.</p>
<h3>first off</h3>
<p>I have purposely not pre-baked any default repositories into the client. You have to choose to connect yourself. <b>The client will never connect anywhere until you tell it to.</b></p>
@ -8,6 +8,51 @@
|
|||
<div class="content">
|
||||
<h3>changelog</h3>
|
||||
<ul>
|
||||
<li><h3>version 397</h3></li>
|
||||
<ul>
|
||||
<li>regular changelog:</li>
|
||||
<li>added 'system:has/has no note with name xxx' to search for specific note names</li>
|
||||
<li>in the normal system predicate list, the notes pred is now the generic 'system:notes' to launch a combined dialog for both num notes and named notes</li>
|
||||
<li>favourite tag suggestions are now sorted in manage tags dialog according to the default tag sort</li>
|
||||
<li>page names will now middle...elide when there are too many to fit into a row (and normally left/right buttons would be added). if the elided tabs still do not fit, the buttons will pop up as before. added a checkbox to options->gui pages to turn this text eliding off</li>
|
||||
<li>pulled the 'page name' options on that panel into their own box and added some text regarding the 'my big row of import page tabs keeps scrolling weird' issue</li>
|
||||
<li>when files are pixel duplicates, the filesize and age comparison statements will now have 0 score and thus be coloured neutral blue</li>
|
||||
<li>the standard text entry dialog now always selects any default text it starts with, so you can now type to immediately overwrite. see how you like it and if there are some places where you think an exception should be made</li>
|
||||
<li>updated the IPFS interface to work with the new IPFS 0.5. all api requests are now POST so it doesn't 405, the User-Agent is overridden to one that IPFS will not 403 at, and I fixed a typo the new api is more strict about</li>
|
||||
<li>a hack to get page splitters to lay out correctly on session load is rewritten from a hammer to a scalpel. pages now set their splitter positions on their first individual visible selection. this both reduces some minor ui lag on session/page load and improves splitter positions for clients that open minimised to the system tray</li>
|
||||
<li>a long-time odd issue where loaded sessions would initially select the top-left-most non-page of pages is fixed. now the bottom-left-most page of any kind is selected</li>
|
||||
<li>fixed tag autocomplete selecting the bottom-most pre-loading result. it now correctly selects at the top</li>
|
||||
<li>fixed an issue setting certain values (typically loading a default) to a tag import options panel</li>
|
||||
<li>the client is now more aggressive about clearing subscriptions from memory when they are finished running</li>
|
||||
<li>in windows, the main method that copies files now checks the modified time of the source file. if it is before 1980-01-01 UTC, it does not copy the file metadata, as some Windows systems have trouble with this lmaoooo (a small sketch of this check is below, after this list)</li>
|
||||
<li>cleaned up how some thumbnail 'current focus' media determination code works. should have fixed some weird errors when hitting certain shortcuts on collections</li>
|
||||
<li>cleaned up basic list/sort code across the program</li>
|
||||
<li>the 'queue' and add/edit/delete listboxes now emit change signals when new items are added or imported</li>
|
||||
<li>pyparsing, a helper for cloudscraper, is now correctly bundled in the built releases. a new line in help->about displays this</li>
|
||||
<li>help->about now lists cloudscraper version</li>
|
||||
<li>updated the discord link to the new https://discord.gg/wPHPCUZ</li>
|
||||
<li>.</li>
|
||||
<li>upcoming string processing changes for advanced users:</li>
|
||||
<li>I extended string parsing code this week, but I am not yet ready to turn it on. when it does come on, it will change all formulae from the fixed string match/converter pair to a combined general string processing 'script' of n steps (a rough sketch of the idea is below, after this list)</li>
|
||||
<li>wrote a new 'string splitter' object that takes one string and splits it into up to n strings based on a separator phrase (such as ' ,')</li>
|
||||
<li>wrote an edit panel for string splitters</li>
|
||||
<li>wrote a new 'string processor' object that holds n ordered string match/converter/splitter objects and filters/converts/splits x strings into y strings based on those steps</li>
|
||||
<li>wrote an edit panel for string processors. it has a notebook that live updates with test results for each step on every update</li>
|
||||
<li>wrote unit tests for string match</li>
|
||||
<li>wrote unit tests for string converter</li>
|
||||
<li>wrote unit tests for string splitter</li>
|
||||
<li>wrote unit tests for string processor</li>
|
||||
<li>refactored string conversion edit panels to their own file</li>
|
||||
<li>refactored string conversion controls to their own file</li>
|
||||
<li>misc string processing cleanup and labelling improvements</li>
|
||||
<li>.</li>
|
||||
<li>technical url parsing stuff:</li>
|
||||
<li>urls are now stripped of leading and trailing whitespace during normalisation, just in case a paste contains some extra whitespace. previously, it would sometimes throw a 'doesn't start with http' error</li>
|
||||
<li>the hydrus url normalisation process now normalises the hostname according to the NFKC unicode format, meaning unusual character forms such as an 'e' followed by a combining accent are now replaced with their normalised visual equivalent 'é', and hence these urls will no longer throw errors when they are added (sketched below, after this list)</li>
|
||||
<li>if '?' or '#' end up in a hostname (which are invalid characters), they are now converted to _, just to stop complete parse mangling when weird urls are submitted. this character replacement may become more sophisticated in future</li>
|
||||
<li>the hydrus downloader should now support search terms that include '#'</li>
|
||||
<li>download query parameters that contain '%23' ('#', encoded) are now not unquoted in url normalisation</li>
|
||||
</ul>
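<p>For the curious, here is a minimal, standalone sketch of the 'n step' string processing idea from the list above: each step takes a list of strings and emits a new list, so splitting, filtering and converting can be chained in any order. This is an illustration only, not the actual hydrus implementation (which lands as the StringSplitter/StringProcessor classes further down this diff).</p>
<pre>
# a hedged sketch: three example steps and a tiny driver that chains them

def split_step( strings, separator = ', ' ):
    
    # split every input string on the separator and drop empty pieces
    return [ part for s in strings for part in s.split( separator ) if part != '' ]
    

def filter_step( strings, min_chars = 1 ):
    
    # keep only strings that are long enough
    return [ s for s in strings if len( s ) >= min_chars ]
    

def convert_step( strings ):
    
    # a trivial conversion: trim and lowercase
    return [ s.strip().lower() for s in strings ]
    

def process( strings, steps ):
    
    for step in steps:
        
        strings = step( strings )
        
    
    return strings
    

print( process( [ 'Tag One, Tag Two, ' ], [ split_step, filter_step, convert_step ] ) )
# -> ['tag one', 'tag two']
</pre>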
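<p>And a similarly hedged sketch of the url normalisation items above: strip stray whitespace, NFKC-normalise the hostname, and leave an encoded '#' ('%23') in the query alone. Again, this is illustrative, not the exact hydrus code.</p>
<pre>
import unicodedata
import urllib.parse

def normalise_url_sketch( url ):
    
    # leading/trailing whitespace from a sloppy paste would otherwise fail the 'starts with http' check
    url = url.strip()
    
    parts = urllib.parse.urlsplit( url )
    
    # NFKC collapses visually equivalent forms, e.g. 'e' + combining acute -> 'é'
    hostname = unicodedata.normalize( 'NFKC', parts.hostname or '' )
    
    netloc = hostname if parts.port is None else '{}:{}'.format( hostname, parts.port )
    
    # the query is passed through without unquoting, so an encoded '#' in a search term survives
    return urllib.parse.urlunsplit( ( parts.scheme, netloc, parts.path, parts.query, parts.fragment ) )
    

print( normalise_url_sketch( '  https://example.com/search?tags=%23cute  ' ) )
# -> https://example.com/search?tags=%23cute
</pre>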
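<p>Lastly, a sketch of the windows copy tweak above: if the source file's modified time is before 1980-01-01 UTC, copy only the contents and skip the metadata. The 315532800 constant is the unix timestamp for 1980-01-01 UTC; the function name here is illustrative, not hydrus's actual helper.</p>
<pre>
import os
import shutil

WINDOWS_METADATA_SAFE_EPOCH = 315532800 # 1980-01-01 00:00:00 UTC

def mirror_file_sketch( source, dest ):
    
    if os.path.getmtime( source ) < WINDOWS_METADATA_SAFE_EPOCH:
        
        shutil.copy( source, dest ) # contents (and permission bits) only, no timestamps
        
    else:
        
        shutil.copy2( source, dest ) # contents plus metadata, including timestamps
        
    
</pre>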
|
||||
<li><h3>version 396</h3></li>
|
||||
<ul>
|
||||
<li>notes:</li>
|
||||
|
|
|
@ -22,7 +22,7 @@
|
|||
<li><a href="https://github.com/hydrusnetwork/hydrus">github</a></li>
|
||||
<li><a href="https://twitter.com/hydrusnetwork">twitter</a></li>
|
||||
<li><a href="mailto:hydrus.admin@gmail.com">email</a></li>
|
||||
<li><a href="https://discord.gg/3H8UTpb">discord</a></li>
|
||||
<li><a href="https://discord.gg/wPHPCUZ">discord</a></li>
|
||||
<li><a href="https://www.patreon.com/hydrus_dev">patreon</a></li>
|
||||
<li><a href="https://github.com/CuddleBear92/Hydrus-Presets-and-Scripts">user-run wiki (including download presets for several non-default boorus)</a>
|
||||
</ul>
|
||||
|
|
|
@ -12,12 +12,13 @@
|
|||
<p><a href="https://en.wikipedia.org/wiki/Tag_(metadata)">wiki</a></p>
|
||||
<p>A <i>tag</i> is a small bit of text describing a single property of something. They make searching easy. Good examples are "flower" or "nicolas cage" or "the sopranos" or "2003". By combining several tags together ( e.g. [ 'tiger woods', 'sports illustrated', '2008' ] or [ 'cosplay', 'the legend of zelda' ] ), a huge image collection is reduced to a tiny and easy-to-digest sample.</p>
|
||||
<p>A good word for the connection of a particular tag to a particular file is <i>mapping</i>.</p>
|
||||
<p>In the hydrus network, all tags are automatically converted to lower case. 'Sunset Drive' becomes 'sunset drive'. Why?</p>
|
||||
<p>Hydrus is designed with the intention that tags are for <i>searching</i>, not <i>describing</i>. Workflows and UI are tuned for finding files and other similar files (e.g. by the same artist), and while it is possible to have nice metadata overlays around files, this is not considered their chief purpose. Trying to have 'perfect' descriptions for files is often a rabbit-hole that can consume hours of work with relatively little demonstrable benefit.</p>
|
||||
<p>All tags are automatically converted to lower case. 'Sunset Drive' becomes 'sunset drive'. Why?</p>
|
||||
<ol>
|
||||
<li>Although it may seem preferable to have 'The Lord of the Rings' rather than 'the lord of the rings', there are many, many special cases where style guides differ on which words to capitalise.</li>
|
||||
<li>Searches become far easier when case is not matched. And When case does not matter, what point is there in recording it?</li>
|
||||
<li>Although it is more beautiful to have 'The Lord of the Rings' rather than 'the lord of the rings', there are many, many special cases where style guides differ on which words to capitalise.</li>
|
||||
<li>As 'The Lord of the Rings' and 'the lord of the rings' are semantically identical, it is natural to search in a case insensitive way. When case does not matter, what point is there in recording it?</li>
|
||||
</ol>
|
||||
<p>Secondly, leading and trailing whitespace is removed, and multiple whitespace is collapsed to a single character. <pre>' yellow dress '</pre> becomes <pre>'yellow dress'</pre></p>
|
||||
<p>Furthermore, leading and trailing whitespace is removed, and multiple whitespace is collapsed to a single character. <pre>' yellow dress '</pre> becomes <pre>'yellow dress'</pre></p>
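<p>In other words, a cleaned tag is roughly what this small snippet produces (illustrative, not hydrus's exact code):</p>
<pre>
import re

def clean_tag_sketch( tag ):
    
    tag = tag.lower()                 # 'Sunset Drive' -> 'sunset drive'
    tag = re.sub( r'\s+', ' ', tag )  # collapse runs of whitespace to a single space
    tag = tag.strip()                 # drop leading and trailing whitespace
    
    return tag
    

print( clean_tag_sketch( '  Yellow   Dress ' ) ) # -> 'yellow dress'
</pre>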
|
||||
<a id="namespaces"><h3>what is a namespace?</h3></a>
|
||||
<p>A <i>namespace</i> is a category that in hydrus prefixes a tag. An example is 'person' in the tag 'person:ron paul'--it lets people and software know that 'ron paul' is a name. You can create any namespace you like; just type one or more words and then a colon, and then the next string of text will have that namespace.</p>
|
||||
<p>The hydrus client gives namespaces different colours so you can pick out important tags more easily in a large list, and you can also search by a particular namespace, even creating complicated predicates like 'give all files that do not have any character tags', for instance.</p>
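<p>Mechanically, a namespaced tag is just the text before and after the first colon, something like this rough sketch:</p>
<pre>
def split_namespace_sketch( tag ):
    
    if ':' in tag:
        
        ( namespace, subtag ) = tag.split( ':', 1 ) # split on the first colon only
        
        return ( namespace, subtag )
        
    
    return ( '', tag ) # an unnamespaced tag
    

print( split_namespace_sketch( 'person:ron paul' ) ) # -> ('person', 'ron paul')
</pre>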
|
||||
|
@ -30,7 +31,7 @@
|
|||
<li>A filename is often--for <i>ridiculous</i> reasons--limited to a certain prohibitive character set. Even when utf-8 is supported, some arbitrary ascii characters are usually not, and different localisations, operating systems and formatting conventions only make it worse.</p>
|
||||
<li>Folders can offer context, but they are clunky and time-consuming to change. If you put each chapter of a comic in a different folder, for instance, reading several volumes in one sitting can be a pain. Nesting many folders adds navigation-latency and tends to induce less informative "04.jpg"-type filenames.</li>
|
||||
</ul>
|
||||
<p>So, the client tracks files by their <i>hash</i>.</p>
|
||||
<p>So, the client tracks files by their <i>hash</i>. This technical identifier easily eliminates duplicates and permits the database to robustly attach other metadata like tags and ratings and known urls and notes and everything else, even across multiple clients and even if a file is deleted and later imported.</p>
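<p>A file hash is just a digest of the file's bytes, for example SHA-256 (hydrus's primary file identifier), so the same bytes always produce the same identifier regardless of filename or location. A minimal sketch:</p>
<pre>
import hashlib

def get_file_hash_sketch( path ):
    
    hasher = hashlib.sha256()
    
    with open( path, 'rb' ) as f:
        
        # read in blocks so large files do not have to fit in memory
        for block in iter( lambda: f.read( 65536 ), b'' ):
            
            hasher.update( block )
            
        
    
    return hasher.hexdigest()
</pre>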
|
||||
<p>As a general rule, I suggest you not set up hydrus to parse and display all your imported files' filenames as tags. 'image.jpg' is useless as a tag. <a href="https://www.youtube.com/watch?v=_yYS0ZZdsnA">Shed the concept of filenames as you would chains.</a></p>
|
||||
<a id="external_files"><h3>can the client manage files from their original locations?</h3></a>
|
||||
<p>When the client imports a file, it makes a quickly accessible but human-ugly copy in its internal database, by default under <i>install_dir/db/client_files</i>. When it needs to access that file again, it always knows where it is, and it can be confident it is what it expects it to be. It never accesses the original again.</p>
|
||||
|
|
|
@ -372,9 +372,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
l = [ basic_permission_to_str_lookup[ p ] for p in self._basic_permissions ]
|
||||
|
||||
l.sort()
|
||||
l = sorted( ( basic_permission_to_str_lookup[ p ] for p in self._basic_permissions ) )
|
||||
|
||||
return ', '.join( l )
|
||||
|
||||
|
|
|
@ -228,11 +228,7 @@ class Controller( HydrusController.HydrusController ):
|
|||
|
||||
def _ReportShutdownDaemonsStatus( self ):
|
||||
|
||||
names = { daemon.name for daemon in self._daemons if daemon.is_alive() }
|
||||
|
||||
names = list( names )
|
||||
|
||||
names.sort()
|
||||
names = sorted( { daemon.name for daemon in self._daemons if daemon.is_alive() } )
|
||||
|
||||
self.pub( 'splash_set_status_subtext', ', '.join( names ) )
|
||||
|
||||
|
|
|
@ -1622,11 +1622,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
existing_table_names = { name for name in existing_table_names if True in ( name.startswith( table_prefix ) for table_prefix in table_prefixes ) }
|
||||
|
||||
surplus_table_names = existing_table_names.difference( good_table_names )
|
||||
|
||||
surplus_table_names = list( surplus_table_names )
|
||||
|
||||
surplus_table_names.sort()
|
||||
surplus_table_names = sorted( existing_table_names.difference( good_table_names ) )
|
||||
|
||||
for table_name in surplus_table_names:
|
||||
|
||||
|
@ -3369,9 +3365,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
distances_to_pairs = HydrusData.BuildKeyToListDict( ( ( distance, ( smaller_media_id, larger_media_id ) ) for ( smaller_media_id, larger_media_id, distance ) in result ) )
|
||||
|
||||
distances = list( distances_to_pairs.keys() )
|
||||
|
||||
distances.sort()
|
||||
distances = sorted( distances_to_pairs.keys() )
|
||||
|
||||
# we want to preference pairs that have the smallest distance between them. deciding on more similar files first helps merge dupes before dealing with alts so reduces potentials more quickly
|
||||
for distance in distances:
|
||||
|
@ -4891,7 +4885,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NOT_LOCAL, min_current_count = num_not_local ) )
|
||||
|
||||
|
||||
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME ] ] )
|
||||
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME ] ] )
|
||||
|
||||
if have_ratings:
|
||||
|
||||
|
@ -5256,6 +5250,13 @@ class DB( HydrusDB.HydrusDB ):
|
|||
return hash_ids
|
||||
|
||||
|
||||
def _GetHashIdsFromNoteName( self, name: str, hash_ids_table_name: str ):
|
||||
|
||||
label_id = self._GetLabelId( name )
|
||||
|
||||
return self._STS( self._c.execute( 'SELECT hash_id FROM file_notes NATURAL JOIN {} WHERE name_id = ?;'.format( hash_ids_table_name ), ( label_id, ) ) )
|
||||
|
||||
|
||||
def _GetHashIdsFromNumNotes( self, min_num_notes: typing.Optional[ int ], max_num_notes: typing.Optional[ int ], hash_ids_table_name: str ):
|
||||
|
||||
has_notes = max_num_notes is None and min_num_notes == 1
|
||||
|
@ -6178,6 +6179,40 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
if 'has_note_names' in simple_preds:
|
||||
|
||||
inclusive_note_names = simple_preds[ 'has_note_names' ]
|
||||
|
||||
for note_name in inclusive_note_names:
|
||||
|
||||
with HydrusDB.TemporaryIntegerTable( self._c, query_hash_ids, 'hash_id' ) as temp_table_name:
|
||||
|
||||
self._AnalyzeTempTable( temp_table_name )
|
||||
|
||||
notes_hash_ids = self._GetHashIdsFromNoteName( note_name, temp_table_name )
|
||||
|
||||
query_hash_ids = intersection_update_qhi( query_hash_ids, notes_hash_ids )
|
||||
|
||||
|
||||
|
||||
|
||||
if 'not_has_note_names' in simple_preds:
|
||||
|
||||
exclusive_note_names = simple_preds[ 'not_has_note_names' ]
|
||||
|
||||
for note_name in exclusive_note_names:
|
||||
|
||||
with HydrusDB.TemporaryIntegerTable( self._c, query_hash_ids, 'hash_id' ) as temp_table_name:
|
||||
|
||||
self._AnalyzeTempTable( temp_table_name )
|
||||
|
||||
notes_hash_ids = self._GetHashIdsFromNoteName( note_name, temp_table_name )
|
||||
|
||||
query_hash_ids.difference_update( notes_hash_ids )
|
||||
|
||||
|
||||
|
||||
|
||||
for ( view_type, viewing_locations, operator, viewing_value ) in system_predicates.GetFileViewingStatsPredicates():
|
||||
|
||||
only_do_zero = ( operator in ( '=', '\u2248' ) and viewing_value == 0 ) or ( operator == '<' and viewing_value == 1 )
|
||||
|
@ -7000,9 +7035,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
possible_due_names.add( name )
|
||||
|
||||
|
||||
possible_due_names = list( possible_due_names )
|
||||
|
||||
possible_due_names.sort()
|
||||
possible_due_names = sorted( possible_due_names )
|
||||
|
||||
if len( possible_due_names ) > 0:
|
||||
|
||||
|
@ -7934,9 +7967,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
hash_ids_i_can_process = set()
|
||||
|
||||
update_indices = list( update_indices_to_unprocessed_hash_ids.keys() )
|
||||
|
||||
update_indices.sort()
|
||||
update_indices = sorted( update_indices_to_unprocessed_hash_ids.keys() )
|
||||
|
||||
for update_index in update_indices:
|
||||
|
||||
|
@ -8076,9 +8107,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
service_id = self._GetServiceId( service_key )
|
||||
hash_ids = self._GetHashIds( hashes )
|
||||
|
||||
result = [ filename for ( filename, ) in self._c.execute( 'SELECT filename FROM service_filenames WHERE service_id = ? AND hash_id IN ' + HydrusData.SplayListForDB( hash_ids ) + ';', ( service_id, ) ) ]
|
||||
|
||||
result.sort()
|
||||
result = sorted( ( filename for ( filename, ) in self._c.execute( 'SELECT filename FROM service_filenames WHERE service_id = ? AND hash_id IN ' + HydrusData.SplayListForDB( hash_ids ) + ';', ( service_id, ) ) ) )
|
||||
|
||||
return result
|
||||
|
||||
|
@ -9672,9 +9701,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
else:
|
||||
|
||||
children = [ ( HydrusData.Get64BitHammingDistance( phash, child_phash ), child_id, child_phash ) for ( child_id, child_phash ) in children ]
|
||||
|
||||
children.sort()
|
||||
children = sorted( ( ( HydrusData.Get64BitHammingDistance( phash, child_phash ), child_id, child_phash ) for ( child_id, child_phash ) in children ) )
|
||||
|
||||
median_index = len( children ) // 2
|
||||
|
||||
|
@ -9885,9 +9912,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
for ( v_id, v_phash ) in viewpoints:
|
||||
|
||||
views = [ HydrusData.Get64BitHammingDistance( v_phash, s_phash ) for ( s_id, s_phash ) in sample if v_id != s_id ]
|
||||
|
||||
views.sort()
|
||||
views = sorted( ( HydrusData.Get64BitHammingDistance( v_phash, s_phash ) for ( s_id, s_phash ) in sample if v_id != s_id ) )
|
||||
|
||||
# let's figure out the ratio of left_children to right_children, preferring 1:1, and convert it to a discrete integer score
|
||||
|
||||
|
@ -11768,14 +11793,10 @@ class DB( HydrusDB.HydrusDB ):
|
|||
main_mappings_tables.update( ( name.split( '.' )[1] for name in GenerateMappingsTableNames( service_id ) ) )
|
||||
|
||||
|
||||
missing_main_tables = main_mappings_tables.difference( existing_mapping_tables )
|
||||
missing_main_tables = sorted( main_mappings_tables.difference( existing_mapping_tables ) )
|
||||
|
||||
if len( missing_main_tables ) > 0:
|
||||
|
||||
missing_main_tables = list( missing_main_tables )
|
||||
|
||||
missing_main_tables.sort()
|
||||
|
||||
message = 'On boot, some important mappings tables were missing! This could be due to the entire \'mappings\' database file being missing or some other problem. The tags in these tables are lost. The exact missing tables were:'
|
||||
message += os.linesep * 2
|
||||
message += os.linesep.join( missing_main_tables )
|
||||
|
@ -11812,14 +11833,10 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
main_cache_tables.add( 'integer_subtags' )
|
||||
|
||||
missing_main_tables = main_cache_tables.difference( existing_cache_tables )
|
||||
missing_main_tables = sorted( main_cache_tables.difference( existing_cache_tables ) )
|
||||
|
||||
if len( missing_main_tables ) > 0:
|
||||
|
||||
missing_main_tables = list( missing_main_tables )
|
||||
|
||||
missing_main_tables.sort()
|
||||
|
||||
message = 'On boot, some important caches tables were missing! This could be due to the entire \'caches\' database file being missing or some other problem. Data related to duplicate file search may have been lost. The exact missing tables were:'
|
||||
message += os.linesep * 2
|
||||
message += os.linesep.join( missing_main_tables )
|
||||
|
@ -11847,14 +11864,10 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
mappings_cache_tables.add( 'local_tags_cache' )
|
||||
|
||||
missing_main_tables = mappings_cache_tables.difference( existing_cache_tables )
|
||||
missing_main_tables = sorted( mappings_cache_tables.difference( existing_cache_tables ) )
|
||||
|
||||
if len( missing_main_tables ) > 0:
|
||||
|
||||
missing_main_tables = list( missing_main_tables )
|
||||
|
||||
missing_main_tables.sort()
|
||||
|
||||
message = 'On boot, some mapping caches tables were missing! This could be due to the entire \'caches\' database file being missing or due to some other problem. All of this data can be regenerated. The exact missing tables were:'
|
||||
message += os.linesep * 2
|
||||
message += os.linesep.join( missing_main_tables )
|
||||
|
@ -12164,9 +12177,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
if store_backups:
|
||||
|
||||
existing_timestamps = self._STL( self._c.execute( 'SELECT timestamp FROM json_dumps_named WHERE dump_type = ? AND dump_name = ?;', ( dump_type, dump_name ) ) )
|
||||
|
||||
existing_timestamps.sort()
|
||||
existing_timestamps = sorted( self._STI( self._c.execute( 'SELECT timestamp FROM json_dumps_named WHERE dump_type = ? AND dump_name = ?;', ( dump_type, dump_name ) ) ) )
|
||||
|
||||
if len( existing_timestamps ) > 0:
|
||||
|
||||
|
|
|
@ -49,9 +49,7 @@ def GenerateExportFilename( destination_directory, media, terms, append_number =
|
|||
|
||||
tags = tags_manager.GetNamespaceSlice( ( term, ), ClientTags.TAG_DISPLAY_SIBLINGS_AND_PARENTS )
|
||||
|
||||
subtags = [ HydrusTags.SplitTag( tag )[1] for tag in tags ]
|
||||
|
||||
subtags.sort()
|
||||
subtags = sorted( ( HydrusTags.SplitTag( tag )[1] for tag in tags ) )
|
||||
|
||||
filename += clean_tag_text( ', '.join( subtags ) )
|
||||
|
||||
|
@ -62,7 +60,7 @@ def GenerateExportFilename( destination_directory, media, terms, append_number =
|
|||
current = tags_manager.GetCurrent( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_SIBLINGS_AND_PARENTS )
|
||||
pending = tags_manager.GetPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_SIBLINGS_AND_PARENTS )
|
||||
|
||||
tags = list( current.union( pending ) )
|
||||
tags = sorted( current.union( pending ) )
|
||||
|
||||
if term == 'nn tags':
|
||||
|
||||
|
@ -73,8 +71,6 @@ def GenerateExportFilename( destination_directory, media, terms, append_number =
|
|||
tags = [ HydrusTags.SplitTag( tag )[1] for tag in tags ]
|
||||
|
||||
|
||||
tags.sort()
|
||||
|
||||
filename += clean_tag_text( ', '.join( tags ) )
|
||||
|
||||
elif term == 'hash':
|
||||
|
|
|
@ -286,9 +286,7 @@ class ClientFilesManager( object ):
|
|||
|
||||
if len( correct_rows ) > 0:
|
||||
|
||||
summaries = [ '{} moved from {} to {}'.format( HydrusData.ToHumanInt( count ), missing_location, correct_location ) for ( ( missing_location, correct_location ), count ) in fixes_counter.items() ]
|
||||
|
||||
summaries.sort()
|
||||
summaries = sorted( ( '{} moved from {} to {}'.format( HydrusData.ToHumanInt( count ), missing_location, correct_location ) for ( ( missing_location, correct_location ), count ) in fixes_counter.items() ) )
|
||||
|
||||
summary_message = 'Some client file folders were missing, but they seem to be in other known locations! The folders are:'
|
||||
summary_message += os.linesep * 2
|
||||
|
@ -638,17 +636,13 @@ class ClientFilesManager( object ):
|
|||
|
||||
missing_dict = HydrusData.BuildKeyToListDict( self._missing_locations )
|
||||
|
||||
missing_locations = list( missing_dict.keys() )
|
||||
|
||||
missing_locations.sort()
|
||||
missing_locations = sorted( missing_dict.keys() )
|
||||
|
||||
missing_string = ''
|
||||
|
||||
for missing_location in missing_locations:
|
||||
|
||||
missing_prefixes = list( missing_dict[ missing_location ] )
|
||||
|
||||
missing_prefixes.sort()
|
||||
missing_prefixes = sorted( missing_dict[ missing_location ] )
|
||||
|
||||
missing_prefixes_string = ' ' + os.linesep.join( ( ', '.join( block ) for block in HydrusData.SplitListIntoChunks( missing_prefixes, 32 ) ) )
|
||||
|
||||
|
|
|
@ -1477,9 +1477,7 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
|
|||
metadata_row[ 'num_words' ] = file_info_manager.num_words
|
||||
metadata_row[ 'has_audio' ] = file_info_manager.has_audio
|
||||
|
||||
known_urls = list( media_result.GetLocationsManager().GetURLs() )
|
||||
|
||||
known_urls.sort()
|
||||
known_urls = sorted( media_result.GetLocationsManager().GetURLs() )
|
||||
|
||||
metadata_row[ 'known_urls' ] = known_urls
|
||||
|
||||
|
@ -1656,11 +1654,8 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceCl
|
|||
|
||||
if HG.client_controller.new_options.GetBoolean( 'notify_client_api_cookies' ) and len( domains_cleared ) + len( domains_set ) > 0:
|
||||
|
||||
domains_cleared = list( domains_cleared )
|
||||
domains_set = list( domains_set )
|
||||
|
||||
domains_cleared.sort()
|
||||
domains_set.sort()
|
||||
domains_cleared = sorted( domains_cleared )
|
||||
domains_set = sorted( domains_set )
|
||||
|
||||
message = 'Cookies sent from API:'
|
||||
|
||||
|
|
|
@ -101,9 +101,7 @@ def CollapseTagSiblingPairs( groups_of_pairs ):
|
|||
|
||||
for pairs in groups_of_pairs:
|
||||
|
||||
pairs = list( pairs )
|
||||
|
||||
pairs.sort()
|
||||
pairs = sorted( pairs )
|
||||
|
||||
for ( bad, good ) in pairs:
|
||||
|
||||
|
|
|
@ -89,6 +89,56 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
s_size = shown_media.GetSize()
|
||||
c_size = comparison_media.GetSize()
|
||||
|
||||
is_a_pixel_dupe = False
|
||||
|
||||
if shown_media.IsStaticImage() and comparison_media.IsStaticImage() and shown_media.GetResolution() == comparison_media.GetResolution():
|
||||
|
||||
global hashes_to_pixel_hashes
|
||||
|
||||
if s_hash not in hashes_to_pixel_hashes:
|
||||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
|
||||
|
||||
hashes_to_pixel_hashes[ s_hash ] = HydrusImageHandling.GetImagePixelHash( path, s_mime )
|
||||
|
||||
|
||||
if c_hash not in hashes_to_pixel_hashes:
|
||||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
|
||||
|
||||
hashes_to_pixel_hashes[ c_hash ] = HydrusImageHandling.GetImagePixelHash( path, c_mime )
|
||||
|
||||
|
||||
s_pixel_hash = hashes_to_pixel_hashes[ s_hash ]
|
||||
c_pixel_hash = hashes_to_pixel_hashes[ c_hash ]
|
||||
|
||||
if s_pixel_hash == c_pixel_hash:
|
||||
|
||||
is_a_pixel_dupe = True
|
||||
|
||||
if s_mime == HC.IMAGE_PNG and c_mime != HC.IMAGE_PNG:
|
||||
|
||||
statement = 'this is a pixel-for-pixel duplicate png!'
|
||||
|
||||
score = -100
|
||||
|
||||
elif s_mime != HC.IMAGE_PNG and c_mime == HC.IMAGE_PNG:
|
||||
|
||||
statement = 'other file is a pixel-for-pixel duplicate png!'
|
||||
|
||||
score = 100
|
||||
|
||||
else:
|
||||
|
||||
statement = 'images are pixel-for-pixel duplicates!'
|
||||
|
||||
score = 0
|
||||
|
||||
|
||||
statements_and_scores[ 'pixel_duplicates' ] = ( statement, score )
|
||||
|
||||
|
||||
|
||||
if s_size != c_size:
|
||||
|
||||
size_ratio = s_size / c_size
|
||||
|
@ -119,6 +169,11 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
score = 0
|
||||
|
||||
|
||||
if is_a_pixel_dupe:
|
||||
|
||||
score = 0
|
||||
|
||||
|
||||
statement = '{} {} {}'.format( HydrusData.ToHumanBytes( s_size ), operator, HydrusData.ToHumanBytes( c_size ) )
|
||||
|
||||
statements_and_scores[ 'filesize' ] = ( statement, score )
|
||||
|
@ -321,6 +376,11 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
score = -duplicate_comparison_score_older
|
||||
|
||||
|
||||
if is_a_pixel_dupe:
|
||||
|
||||
score = 0
|
||||
|
||||
|
||||
statement = '{} {} {}'.format( HydrusData.TimestampToPrettyTimeDelta( s_ts ), operator, HydrusData.TimestampToPrettyTimeDelta( c_ts ) )
|
||||
|
||||
statements_and_scores[ 'time_imported' ] = ( statement, score )
|
||||
|
@ -384,52 +444,6 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
|
||||
|
||||
|
||||
if shown_media.IsStaticImage() and comparison_media.IsStaticImage() and shown_media.GetResolution() == comparison_media.GetResolution():
|
||||
|
||||
global hashes_to_pixel_hashes
|
||||
|
||||
if s_hash not in hashes_to_pixel_hashes:
|
||||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
|
||||
|
||||
hashes_to_pixel_hashes[ s_hash ] = HydrusImageHandling.GetImagePixelHash( path, s_mime )
|
||||
|
||||
|
||||
if c_hash not in hashes_to_pixel_hashes:
|
||||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
|
||||
|
||||
hashes_to_pixel_hashes[ c_hash ] = HydrusImageHandling.GetImagePixelHash( path, c_mime )
|
||||
|
||||
|
||||
s_pixel_hash = hashes_to_pixel_hashes[ s_hash ]
|
||||
c_pixel_hash = hashes_to_pixel_hashes[ c_hash ]
|
||||
|
||||
if s_pixel_hash == c_pixel_hash:
|
||||
|
||||
if s_mime == HC.IMAGE_PNG and c_mime != HC.IMAGE_PNG:
|
||||
|
||||
statement = 'this is a pixel-for-pixel duplicate png!'
|
||||
|
||||
score = -100
|
||||
|
||||
elif s_mime != HC.IMAGE_PNG and c_mime == HC.IMAGE_PNG:
|
||||
|
||||
statement = 'other file is a pixel-for-pixel duplicate png!'
|
||||
|
||||
score = 100
|
||||
|
||||
else:
|
||||
|
||||
statement = 'images are pixel-for-pixel duplicates!'
|
||||
|
||||
score = 0
|
||||
|
||||
|
||||
statements_and_scores[ 'pixel_duplicates' ] = ( statement, score )
|
||||
|
||||
|
||||
|
||||
return statements_and_scores
|
||||
|
||||
def GetMediasTags( pool, tag_service_key, tag_display_type, content_statuses ):
|
||||
|
|
|
@ -442,9 +442,7 @@ class NotesManager( object ):
|
|||
|
||||
def GetNames( self ):
|
||||
|
||||
names = list( self._names_to_notes.keys() )
|
||||
|
||||
names.sort()
|
||||
names = sorted( self._names_to_notes.keys() )
|
||||
|
||||
return names
|
||||
|
||||
|
@ -792,9 +790,7 @@ class TagsManager( object ):
|
|||
|
||||
for desired_namespace in namespaces:
|
||||
|
||||
subtags = [ HydrusTags.ConvertTagToSortable( subtag ) for ( namespace, subtag ) in pairs if namespace == desired_namespace ]
|
||||
|
||||
subtags.sort()
|
||||
subtags = sorted( ( HydrusTags.ConvertTagToSortable( subtag ) for ( namespace, subtag ) in pairs if namespace == desired_namespace ) )
|
||||
|
||||
slice.append( tuple( subtags ) )
|
||||
|
||||
|
|
|
@ -144,6 +144,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
self._dictionary[ 'booleans' ][ 'use_system_ffmpeg' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'elide_page_tab_names' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'maintain_similar_files_duplicate_pairs_during_idle' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'show_namespaces' ] = True
|
||||
|
|
|
@ -2,6 +2,7 @@ import base64
|
|||
import bs4
|
||||
import calendar
|
||||
import codecs
|
||||
import typing
|
||||
from hydrus.client.networking import ClientNetworkingDomain
|
||||
from hydrus.client.networking import ClientNetworkingJobs
|
||||
import collections
|
||||
|
@ -173,9 +174,7 @@ def ConvertParsableContentToPrettyString( parsable_content, include_veto = False
|
|||
|
||||
else:
|
||||
|
||||
hash_types = list( additional_infos )
|
||||
|
||||
hash_types.sort()
|
||||
hash_types = sorted( additional_infos )
|
||||
|
||||
pretty_strings.append( 'hashes: ' + ', '.join( hash_types ) )
|
||||
|
||||
|
@ -1404,9 +1403,7 @@ class ParseFormulaJSON( ParseFormula ):
|
|||
|
||||
elif isinstance( root, dict ):
|
||||
|
||||
pairs = list( root.items() )
|
||||
|
||||
pairs.sort()
|
||||
pairs = sorted( root.items() )
|
||||
|
||||
for ( key, value ) in pairs:
|
||||
|
||||
|
@ -1443,9 +1440,7 @@ class ParseFormulaJSON( ParseFormula ):
|
|||
|
||||
string_match = parse_rule
|
||||
|
||||
pairs = list( root.items() )
|
||||
|
||||
pairs.sort()
|
||||
pairs = sorted( root.items() )
|
||||
|
||||
for ( key, value ) in pairs:
|
||||
|
||||
|
@ -1485,9 +1480,7 @@ class ParseFormulaJSON( ParseFormula ):
|
|||
|
||||
if isinstance( root, dict ):
|
||||
|
||||
pairs = list( root.items() )
|
||||
|
||||
pairs.sort()
|
||||
pairs = sorted( root.items() )
|
||||
|
||||
for ( key, value ) in pairs:
|
||||
|
||||
|
@ -2131,9 +2124,7 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
def GetSafeSummary( self ):
|
||||
|
||||
domains = list( { ClientNetworkingDomain.ConvertURLIntoDomain( url ) for url in self._example_urls } )
|
||||
|
||||
domains.sort()
|
||||
domains = sorted( { ClientNetworkingDomain.ConvertURLIntoDomain( url ) for url in self._example_urls } )
|
||||
|
||||
return 'Parser "' + self._name + '" - ' + ', '.join( domains )
|
||||
|
||||
|
@ -2821,7 +2812,14 @@ transformation_type_str_lookup[ STRING_TRANSFORMATION_DATE_DECODE ] = 'datestrin
|
|||
transformation_type_str_lookup[ STRING_TRANSFORMATION_INTEGER_ADDITION ] = 'integer addition'
|
||||
transformation_type_str_lookup[ STRING_TRANSFORMATION_DATE_ENCODE ] = 'timestamp to datestring'
|
||||
|
||||
class StringConverter( HydrusSerialisable.SerialisableBase ):
|
||||
class StringProcessingStep( HydrusSerialisable.SerialisableBase ):
|
||||
|
||||
def ToString( self, simple = False ) -> str:
|
||||
|
||||
raise NotImplementedError()
|
||||
|
||||
|
||||
class StringConverter( StringProcessingStep ):
|
||||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_STRING_CONVERTER
|
||||
SERIALISABLE_NAME = 'String Converter'
|
||||
|
@ -2839,7 +2837,7 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):
|
|||
example_string = 'example string'
|
||||
|
||||
|
||||
HydrusSerialisable.SerialisableBase.__init__( self )
|
||||
StringProcessingStep.__init__( self )
|
||||
|
||||
self.transformations = transformations
|
||||
|
||||
|
@ -3068,6 +3066,36 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):
|
|||
return len( self.transformations ) > 0
|
||||
|
||||
|
||||
def ToString( self, simple = False ) -> str:
|
||||
|
||||
num_rules = len( self.transformations )
|
||||
|
||||
if num_rules == 0:
|
||||
|
||||
if simple:
|
||||
|
||||
label = 'no changes'
|
||||
|
||||
else:
|
||||
|
||||
label = 'no string transformations'
|
||||
|
||||
|
||||
else:
|
||||
|
||||
if simple:
|
||||
|
||||
label = '{} changes'.format( HydrusData.ToHumanInt( num_rules ) )
|
||||
|
||||
else:
|
||||
|
||||
label = '{} string transformations'.format( HydrusData.ToHumanInt( num_rules ) )
|
||||
|
||||
|
||||
|
||||
return label
|
||||
|
||||
|
||||
@staticmethod
|
||||
def TransformationToString( transformation ):
|
||||
|
||||
|
@ -3142,7 +3170,7 @@ ALPHA = 0
|
|||
ALPHANUMERIC = 1
|
||||
NUMERIC = 2
|
||||
|
||||
class StringMatch( HydrusSerialisable.SerialisableBase ):
|
||||
class StringMatch( StringProcessingStep ):
|
||||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_STRING_MATCH
|
||||
SERIALISABLE_NAME = 'String Match'
|
||||
|
@ -3150,9 +3178,7 @@ class StringMatch( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
def __init__( self, match_type = STRING_MATCH_ANY, match_value = '', min_chars = None, max_chars = None, example_string = 'example string' ):
|
||||
|
||||
HydrusSerialisable.SerialisableBase.__init__( self )
|
||||
# make a gui control that accepts one of these. displays expected input on the right and colours red/green (and does isvalid) based on current input
|
||||
# think about replacing the veto stuff above with this.
|
||||
StringProcessingStep.__init__( self )
|
||||
|
||||
self._match_type = match_type
|
||||
self._match_value = match_value
|
||||
|
@ -3272,7 +3298,12 @@ class StringMatch( HydrusSerialisable.SerialisableBase ):
|
|||
return ( self._match_type, self._match_value, self._min_chars, self._max_chars, self._example_string )
|
||||
|
||||
|
||||
def ToString( self ):
|
||||
def ToString( self, simple = False ):
|
||||
|
||||
if simple:
|
||||
|
||||
return 'filter'
|
||||
|
||||
|
||||
result = ''
|
||||
|
||||
|
@ -3342,3 +3373,199 @@ class StringMatch( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_STRING_MATCH ] = StringMatch
|
||||
|
||||
class StringSplitter( StringProcessingStep ):
|
||||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_STRING_SPLITTER
|
||||
SERIALISABLE_NAME = 'String Splitter'
|
||||
SERIALISABLE_VERSION = 1
|
||||
|
||||
def __init__( self, separator: str = ',', max_splits: typing.Optional[ int ] = None ):
|
||||
|
||||
StringProcessingStep.__init__( self )
|
||||
|
||||
self._separator = separator
|
||||
self._max_splits = max_splits
|
||||
|
||||
|
||||
def _GetSerialisableInfo( self ):
|
||||
|
||||
return ( self._separator, self._max_splits )
|
||||
|
||||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
( self._separator, self._max_splits ) = serialisable_info
|
||||
|
||||
|
||||
def GetMaxSplits( self ):
|
||||
|
||||
return self._max_splits
|
||||
|
||||
|
||||
def GetSeparator( self ):
|
||||
|
||||
return self._separator
|
||||
|
||||
|
||||
def Split( self, text: str ) -> typing.List[ str ]:
|
||||
|
||||
if self._max_splits is None:
|
||||
|
||||
results = text.split( self._separator )
|
||||
|
||||
else:
|
||||
|
||||
results = text.split( self._separator, self._max_splits )
|
||||
|
||||
|
||||
return [ result for result in results if result != '' ]
|
||||
|
||||
|
||||
def ToString( self, simple = False ):
|
||||
|
||||
if simple:
|
||||
|
||||
return 'splitter'
|
||||
|
||||
|
||||
result = 'splitting by "{}"'.format( self._separator )
|
||||
|
||||
if self._max_splits is not None:
|
||||
|
||||
result = '{}, at most {} times'.format( result, HydrusData.ToHumanInt( self._max_splits ) )
|
||||
|
||||
|
||||
return result
|
||||
|
||||
|
||||
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_STRING_SPLITTER ] = StringSplitter
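# Example usage of the new StringSplitter (a hedged sketch; the input text is illustrative):
#
#   splitter = StringSplitter( separator = ', ', max_splits = None )
#   splitter.Split( 'blue eyes, blonde hair, skirt' ) # -> [ 'blue eyes', 'blonde hair', 'skirt' ]
#
# Empty results are dropped, so a trailing separator does not produce an empty string.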
|
||||
|
||||
class StringProcessor( HydrusSerialisable.SerialisableBase ):
|
||||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_STRING_PROCESSOR
|
||||
SERIALISABLE_NAME = 'String Processor'
|
||||
SERIALISABLE_VERSION = 1
|
||||
|
||||
def __init__( self ):
|
||||
|
||||
HydrusSerialisable.SerialisableBase.__init__( self )
|
||||
|
||||
self._processing_steps: typing.List[ StringProcessingStep ] = []
|
||||
|
||||
|
||||
def _GetSerialisableInfo( self ):
|
||||
|
||||
return HydrusSerialisable.SerialisableList( self._processing_steps ).GetSerialisableTuple()
|
||||
|
||||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
serialisable_processing_steps = serialisable_info
|
||||
|
||||
self._processing_steps = list( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_processing_steps ) )
|
||||
|
||||
|
||||
def GetProcessingSteps( self ):
|
||||
|
||||
return list( self._processing_steps )
|
||||
|
||||
|
||||
def ProcessStrings( self, starting_strings: typing.Iterable[ str ], max_steps_allowed = None ) -> typing.List[ str ]:
|
||||
|
||||
final_strings = []
|
||||
|
||||
for starting_string in starting_strings:
|
||||
|
||||
current_strings = [ starting_string ]
|
||||
|
||||
for ( i, processing_step ) in enumerate( self._processing_steps ):
|
||||
|
||||
if max_steps_allowed is not None and i >= max_steps_allowed:
|
||||
|
||||
break
|
||||
|
||||
|
||||
next_strings = []
|
||||
|
||||
for current_string in current_strings:
|
||||
|
||||
if isinstance( processing_step, StringConverter ):
|
||||
|
||||
try:
|
||||
|
||||
next_string = processing_step.Convert( current_string )
|
||||
|
||||
next_strings.append( next_string )
|
||||
|
||||
except HydrusExceptions.StringConvertException:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
elif isinstance( processing_step, StringMatch ):
|
||||
|
||||
try:
|
||||
|
||||
if processing_step.Matches( current_string ):
|
||||
|
||||
next_strings.append( current_string )
|
||||
|
||||
|
||||
except HydrusExceptions.StringMatchException:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
elif isinstance( processing_step, StringSplitter ):
|
||||
|
||||
split_strings = processing_step.Split( current_string )
|
||||
|
||||
next_strings.extend( split_strings )
|
||||
|
||||
|
||||
|
||||
current_strings = next_strings
|
||||
|
||||
|
||||
final_strings.extend( current_strings )
|
||||
|
||||
|
||||
return final_strings
|
||||
|
||||
|
||||
def SetProcessingSteps( self, processing_steps: typing.List[ StringProcessingStep ] ):
|
||||
|
||||
self._processing_steps = list( processing_steps )
|
||||
|
||||
|
||||
def ToString( self ) -> str:
|
||||
|
||||
if len( self._processing_steps ) == 0:
|
||||
|
||||
return 'no string processing'
|
||||
|
||||
else:
|
||||
|
||||
components = []
|
||||
|
||||
if True in ( isinstance( ps, StringConverter ) for ps in self._processing_steps ):
|
||||
|
||||
components.append( 'conversion' )
|
||||
|
||||
|
||||
if True in ( isinstance( ps, StringMatch ) for ps in self._processing_steps ):
|
||||
|
||||
components.append( 'filtering' )
|
||||
|
||||
|
||||
if True in ( isinstance( ps, StringSplitter ) for ps in self._processing_steps ):
|
||||
|
||||
components.append( 'splitting' )
|
||||
|
||||
|
||||
return 'some {}'.format( ', '.join( components ) )
|
||||
|
||||
|
||||
|
||||
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_STRING_PROCESSOR ] = StringProcessor
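# Example usage of the new StringProcessor, chaining the steps defined above
# (a hedged sketch; the parameters and expected output are illustrative):
#
#   processor = StringProcessor()
#   processor.SetProcessingSteps( [ StringSplitter( separator = ', ' ), StringMatch( min_chars = 3 ) ] )
#   processor.ProcessStrings( [ 'blue eyes, blonde hair, a' ] ) # -> [ 'blue eyes', 'blonde hair' ]
#
# Each step maps a list of strings to a new list, so converters, filters and splitters compose freely.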
|
||||
|
|
|
@ -54,42 +54,46 @@ PREDICATE_TYPE_SYSTEM_MODIFIED_TIME = 35
|
|||
PREDICATE_TYPE_SYSTEM_FRAMERATE = 36
|
||||
PREDICATE_TYPE_SYSTEM_NUM_FRAMES = 37
|
||||
PREDICATE_TYPE_SYSTEM_NUM_NOTES = 38
|
||||
PREDICATE_TYPE_SYSTEM_NOTES = 39
|
||||
PREDICATE_TYPE_SYSTEM_HAS_NOTE_NAME = 40
|
||||
|
||||
SYSTEM_PREDICATE_TYPES = set()
|
||||
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_EVERYTHING )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_INBOX )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_ARCHIVE )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_UNTAGGED )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NUM_TAGS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_LIMIT )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_SIZE )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_AGE )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_MODIFIED_TIME )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_HASH )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_WIDTH )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_HEIGHT )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_RATIO )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_DURATION )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FRAMERATE )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NUM_FRAMES )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_HAS_AUDIO )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_MIME )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_RATING )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_SIMILAR_TO )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_LOCAL )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NOT_LOCAL )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NUM_WORDS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NUM_NOTES )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FILE_SERVICE )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_NUM_PIXELS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_DIMENSIONS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_COUNT )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_KNOWN_URLS )
|
||||
SYSTEM_PREDICATE_TYPES.add( PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS )
|
||||
SYSTEM_PREDICATE_TYPES = {
|
||||
PREDICATE_TYPE_SYSTEM_EVERYTHING,
|
||||
PREDICATE_TYPE_SYSTEM_INBOX,
|
||||
PREDICATE_TYPE_SYSTEM_ARCHIVE,
|
||||
PREDICATE_TYPE_SYSTEM_UNTAGGED,
|
||||
PREDICATE_TYPE_SYSTEM_NUM_TAGS,
|
||||
PREDICATE_TYPE_SYSTEM_LIMIT,
|
||||
PREDICATE_TYPE_SYSTEM_SIZE,
|
||||
PREDICATE_TYPE_SYSTEM_AGE,
|
||||
PREDICATE_TYPE_SYSTEM_MODIFIED_TIME,
|
||||
PREDICATE_TYPE_SYSTEM_HASH,
|
||||
PREDICATE_TYPE_SYSTEM_WIDTH,
|
||||
PREDICATE_TYPE_SYSTEM_HEIGHT,
|
||||
PREDICATE_TYPE_SYSTEM_RATIO,
|
||||
PREDICATE_TYPE_SYSTEM_DURATION,
|
||||
PREDICATE_TYPE_SYSTEM_FRAMERATE,
|
||||
PREDICATE_TYPE_SYSTEM_NUM_FRAMES,
|
||||
PREDICATE_TYPE_SYSTEM_HAS_AUDIO,
|
||||
PREDICATE_TYPE_SYSTEM_MIME,
|
||||
PREDICATE_TYPE_SYSTEM_RATING,
|
||||
PREDICATE_TYPE_SYSTEM_SIMILAR_TO,
|
||||
PREDICATE_TYPE_SYSTEM_LOCAL,
|
||||
PREDICATE_TYPE_SYSTEM_NOT_LOCAL,
|
||||
PREDICATE_TYPE_SYSTEM_NUM_WORDS,
|
||||
PREDICATE_TYPE_SYSTEM_NUM_NOTES,
|
||||
PREDICATE_TYPE_SYSTEM_HAS_NOTE_NAME,
|
||||
PREDICATE_TYPE_SYSTEM_FILE_SERVICE,
|
||||
PREDICATE_TYPE_SYSTEM_NUM_PIXELS,
|
||||
PREDICATE_TYPE_SYSTEM_DIMENSIONS,
|
||||
PREDICATE_TYPE_SYSTEM_NOTES,
|
||||
PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER,
|
||||
PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS,
|
||||
PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_COUNT,
|
||||
PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING,
|
||||
PREDICATE_TYPE_SYSTEM_KNOWN_URLS,
|
||||
PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS
|
||||
}
|
||||
|
||||
IGNORED_TAG_SEARCH_CHARACTERS = '[](){}/\\"\'-_'
|
||||
IGNORED_TAG_SEARCH_CHARACTERS_UNICODE_TRANSLATE = { ord( char ) : ' ' for char in IGNORED_TAG_SEARCH_CHARACTERS }
|
||||
|
@ -974,6 +978,27 @@ class FileSystemPredicates( object ):
|
|||
elif operator == '=': self._common_info[ 'num_notes' ] = num_notes
|
||||
|
||||
|
||||
if predicate_type == PREDICATE_TYPE_SYSTEM_HAS_NOTE_NAME:
|
||||
|
||||
( operator, name ) = value
|
||||
|
||||
if operator:
|
||||
|
||||
label = 'has_note_names'
|
||||
|
||||
else:
|
||||
|
||||
label = 'not_has_note_names'
|
||||
|
||||
|
||||
if label not in self._common_info:
|
||||
|
||||
self._common_info[ label ] = set()
|
||||
|
||||
|
||||
self._common_info[ label ].add( name )
|
||||
|
||||
|
||||
if predicate_type == PREDICATE_TYPE_SYSTEM_NUM_WORDS:
|
||||
|
||||
( operator, num_words ) = value
|
||||
|
@ -1545,6 +1570,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
|
|||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_LOCAL: base = 'local'
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_NOT_LOCAL: base = 'not local'
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_DIMENSIONS: base = 'dimensions'
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_NOTES: base = 'notes'
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS: base = 'file relationships'
|
||||
elif self._predicate_type in ( PREDICATE_TYPE_SYSTEM_WIDTH, PREDICATE_TYPE_SYSTEM_HEIGHT, PREDICATE_TYPE_SYSTEM_NUM_NOTES, PREDICATE_TYPE_SYSTEM_NUM_WORDS, PREDICATE_TYPE_SYSTEM_NUM_FRAMES ):
|
||||
|
||||
|
@ -1629,6 +1655,24 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
|
|||
base += ' {} {}fps'.format( operator, HydrusData.ToHumanInt( value ) )
|
||||
|
||||
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_HAS_NOTE_NAME:
|
||||
|
||||
base = 'has note'
|
||||
|
||||
if self._value is not None:
|
||||
|
||||
( operator, name ) = self._value
|
||||
|
||||
if operator:
|
||||
|
||||
base = 'has note with name "{}"'.format( name )
|
||||
|
||||
else:
|
||||
|
||||
base = 'does not have note with name "{}"'.format( name )
|
||||
|
||||
|
||||
|
||||
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_NUM_TAGS:
|
||||
|
||||
base = 'number of tags'
|
||||
|
|
|
@ -168,7 +168,7 @@ def DumpToPng( width, payload_bytes, title, payload_description, text, path ):
|
|||
|
||||
cv2.imwrite( temp_path, finished_image, [ cv2.IMWRITE_PNG_COMPRESSION, 9 ] )
|
||||
|
||||
shutil.copy2( temp_path, path )
|
||||
HydrusPaths.MirrorFile( temp_path, path )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
|
@ -243,7 +243,7 @@ def LoadFromPng( path ):
|
|||
|
||||
try:
|
||||
|
||||
shutil.copy2( path, temp_path )
|
||||
HydrusPaths.MirrorFile( path, temp_path )
|
||||
|
||||
numpy_image = cv2.imread( temp_path, flags = IMREAD_UNCHANGED )
|
||||
|
||||
|
|
|
@ -2150,7 +2150,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
links_url = api_base_url + 'object/links/' + multihash
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', links_url )
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', links_url )
|
||||
|
||||
if job_key is not None:
|
||||
|
||||
|
@ -2159,8 +2159,6 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
try:
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
network_job.WaitUntilDone()
|
||||
|
@ -2234,10 +2232,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'config?arg=Experimental.FilestoreEnabled&arg={}&bool=true'.format( arg_value )
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OnlyTryConnectionOnce()
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2259,10 +2254,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'version'
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OnlyTryConnectionOnce()
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2292,10 +2284,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'config?arg=Experimental.FilestoreEnabled'
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OnlyTryConnectionOnce()
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2423,21 +2412,40 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
# check if it is pinned. if we try to unpin something not pinned, the daemon 500s
|
||||
|
||||
url = api_base_url + '/pin/ls?arg={}'.format( multihash )
|
||||
url = api_base_url + 'pin/ls?arg={}'.format( multihash )
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
network_job.WaitUntilDone()
|
||||
|
||||
parsing_text = network_job.GetContentText()
|
||||
|
||||
j = json.loads( parsing_text )
|
||||
|
||||
file_is_pinned = 'Keys' in j and multihash in j['Keys']
|
||||
try:
|
||||
|
||||
network_job.WaitUntilDone()
|
||||
|
||||
parsing_text = network_job.GetContentText()
|
||||
|
||||
j = json.loads( parsing_text )
|
||||
|
||||
file_is_pinned = False
|
||||
|
||||
if 'PinLsList' in j:
|
||||
|
||||
file_is_pinned = 'Keys' in j[ 'PinLsList' ] and multihash in j[ 'PinLsList' ]['Keys']
|
||||
|
||||
else:
|
||||
|
||||
file_is_pinned = 'Keys' in j and multihash in j['Keys']
|
||||
|
||||
|
||||
except HydrusExceptions.ServerException:
|
||||
|
||||
if 'not pinned' in network_job.GetContentText():
|
||||
|
||||
return False
|
||||
|
||||
|
||||
raise
|
||||
|
||||
|
||||
return file_is_pinned
|
||||
|
||||
|
@ -2454,9 +2462,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
file_info = []
|
||||
|
||||
hashes = list( hashes )
|
||||
|
||||
hashes.sort()
|
||||
hashes = sorted( hashes )
|
||||
|
||||
for ( i, hash ) in enumerate( hashes ):
|
||||
|
||||
|
@ -2506,9 +2512,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'object/new?arg=unixfs-dir'
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2538,9 +2542,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'object/patch/add-link?arg=' + object_multihash + '&arg=' + filename + '&arg=' + multihash
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2555,9 +2557,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'pin/add?arg=' + directory_multihash
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2650,12 +2650,10 @@ class ServiceIPFS( ServiceRemote ):
|
|||
files = { 'path' : ( hash.hex(), f, mime_string ) }
|
||||
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
network_job.SetFiles( files )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
network_job.WaitUntilDone()
|
||||
|
@ -2713,9 +2711,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'pin/rm/' + multihash
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
@ -2738,9 +2734,7 @@ class ServiceIPFS( ServiceRemote ):
|
|||
|
||||
url = api_base_url + 'pin/rm/' + multihash
|
||||
|
||||
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url )
|
||||
|
||||
network_job.OverrideBandwidth()
|
||||
network_job = ClientNetworkingJobs.NetworkJobIPFS( 'POST', url )
|
||||
|
||||
HG.client_controller.network_engine.AddJob( network_job )
|
||||
|
||||
|
|
|
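The hunks above switch every IPFS API call from NetworkJob( 'GET', ... ) to NetworkJobIPFS( 'POST', ... ), because the daemon's HTTP API now rejects GET. As a rough illustration only - hydrus routes this through its own network engine, and the daemon address, port and multihash below are placeholder assumptions - the same pin check could be made directly with the requests library:

import requests

# assumption: a local IPFS daemon on the default API port; the multihash is a placeholder
api_base_url = 'http://127.0.0.1:5001/api/v0/'
multihash = 'QmExampleMultihashOnly'

# the API now requires POST for every endpoint, so a GET here would 405
response = requests.post( api_base_url + 'pin/ls', params = { 'arg' : multihash } )

if response.ok:
    j = response.json()
    # mirror the parsing above: some daemons nest the result under 'PinLsList'
    keys = j.get( 'PinLsList', j ).get( 'Keys', {} )
    file_is_pinned = multihash in keys
elif 'not pinned' in response.text:
    # the daemon reports an unpinned path as an error rather than an empty result
    file_is_pinned = False
else:
    response.raise_for_status()

print( file_is_pinned )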
@@ -518,9 +518,18 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
library_versions = []
library_versions.append( ( 'FFMPEG', HydrusVideoHandling.GetFFMPEGVersion() ) )
library_versions.append( ( 'OpenCV', cv2.__version__ ) )
# 2.7.12 (v2.7.12:d33e0cf91556, Jun 27 2016, 15:24:40) [MSC v.1500 64 bit (AMD64)]
v = sys.version
if ' ' in v:
v = v.split( ' ' )[0]
library_versions.append( ( 'python', v ) )
library_versions.append( ( 'openssl', ssl.OPENSSL_VERSION ) )
library_versions.append( ( 'OpenCV', cv2.__version__ ) )
library_versions.append( ( 'Pillow', PIL.__version__ ) )
if ClientGUIMPV.MPV_IS_AVAILABLE:

@@ -535,18 +544,12 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
library_versions.append( ( 'mpv', 'not available' ) )
# 2.7.12 (v2.7.12:d33e0cf91556, Jun 27 2016, 15:24:40) [MSC v.1500 64 bit (AMD64)]
v = sys.version
if ' ' in v:
v = v.split( ' ' )[0]
library_versions.append( ( 'python', v ) )
library_versions.append( ( 'FFMPEG', HydrusVideoHandling.GetFFMPEGVersion() ) )
library_versions.append( ( 'sqlite', sqlite3.sqlite_version ) )
library_versions.append( ( 'Qt', QC.__version__ ) )
if qtpy.PYSIDE2:
import PySide2

@@ -562,15 +565,22 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
library_versions.append( ( 'PyQt5', PYQT_VERSION_STR ) )
library_versions.append( ( 'sip', SIP_VERSION_STR ) )
library_versions.append( ( 'Qt', QC.__version__ ) )
library_versions.append( ( 'html5lib present: ', str( ClientParsing.HTML5LIB_IS_OK ) ) )
library_versions.append( ( 'lxml present: ', str( ClientParsing.LXML_IS_OK ) ) )
from hydrus.client.networking import ClientNetworkingJobs
library_versions.append( ( 'cloudscraper present: ', str( ClientNetworkingJobs.CLOUDSCRAPER_OK ) ) )
if ClientNetworkingJobs.CLOUDSCRAPER_OK:
library_versions.append( ( 'cloudscraper', ClientNetworkingJobs.cloudscraper.__version__ ) )
else:
library_versions.append( ( 'cloudscraper present: ', 'False' ) )
library_versions.append( ( 'pyparsing present: ', str( ClientNetworkingJobs.PYPARSING_OK ) ) )
library_versions.append( ( 'html5lib present: ', str( ClientParsing.HTML5LIB_IS_OK ) ) )
library_versions.append( ( 'lxml present: ', str( ClientParsing.LXML_IS_OK ) ) )
library_versions.append( ( 'lz4 present: ', str( ClientRendering.LZ4_OK ) ) )
library_versions.append( ( 'install dir', HC.BASE_DIR ) )
library_versions.append( ( 'db dir', HG.client_controller.db_dir ) )

@@ -1712,23 +1722,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
if not self._have_shown_once:
self._have_shown_once = True
for page in self._notebook.GetPages():
if isinstance( page, ClientGUIPages.PagesNotebook ):
page.LayoutPages()
for page in self._notebook.GetMediaPages():
page.SetupSplits()
self._have_shown_once = True
page = self.GetCurrentPage()

@@ -4590,7 +4584,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
site = ClientGUIMenus.AppendMenuItem( links, 'Endchan board bunker', 'Open hydrus dev\'s Endchan board, the bunker for when 8kun is unavailable. Try .org if .net is unavailable.', ClientPaths.LaunchURLInWebBrowser, 'https://endchan.net/hydrus/index.html' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'twitter', 'Open hydrus dev\'s twitter, where he makes general progress updates and emergency notifications.', CC.global_pixmaps().twitter, ClientPaths.LaunchURLInWebBrowser, 'https://twitter.com/hydrusnetwork' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'tumblr', 'Open hydrus dev\'s tumblr, where he makes release posts and other status updates.', CC.global_pixmaps().tumblr, ClientPaths.LaunchURLInWebBrowser, 'http://hydrus.tumblr.com/' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'discord', 'Open a discord channel where many hydrus users congregate. Hydrus dev visits regularly.', CC.global_pixmaps().discord, ClientPaths.LaunchURLInWebBrowser, 'https://discord.gg/vy8CUB4' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'discord', 'Open a discord channel where many hydrus users congregate. Hydrus dev visits regularly.', CC.global_pixmaps().discord, ClientPaths.LaunchURLInWebBrowser, 'https://discord.gg/wPHPCUZ' )
site = ClientGUIMenus.AppendMenuBitmapItem( links, 'patreon', 'Open hydrus dev\'s patreon, which lets you support development.', CC.global_pixmaps().patreon, ClientPaths.LaunchURLInWebBrowser, 'https://www.patreon.com/hydrus_dev' )
ClientGUIMenus.AppendMenu( menu, links, 'links' )

@@ -4880,9 +4874,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
sessions = QW.QMenu( menu )
gui_session_names = list( gui_session_names )
gui_session_names.sort()
gui_session_names = sorted( gui_session_names )
if len( gui_session_names ) > 0:

@@ -4908,9 +4900,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
append_backup = QW.QMenu( sessions )
rows = list( gui_session_names_to_backup_timestamps.items() )
rows.sort()
rows = sorted( gui_session_names_to_backup_timestamps.items() )
for ( name, timestamps ) in rows:
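Many of the hunks in this commit make the same mechanical swap: a two-step x = list( ... ) / x.sort() becomes a single x = sorted( ... ). A quick note on why that is behaviour-preserving, with illustrative data only:

# sorted() takes any iterable (set, dict view, generator), copies it into a new
# list and sorts the copy, so the original is never mutated
gui_session_names = { 'last session', 'my favourites', 'downloads' }

# before: two statements and a temporary list
names = list( gui_session_names )
names.sort()

# after: one expression with the same result
assert names == sorted( gui_session_names )

# it also accepts generator expressions directly, as several hunks above do
shortcut_strings = sorted( name.upper() for name in gui_session_names )
print( shortcut_strings )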
@@ -519,7 +519,8 @@ class ListBoxTagsAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
skip_ors = True
skip_countless = HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' )
some_preds_have_count = True in ( predicate.GetCount() > 0 for predicate in predicates )
skip_countless = HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' ) and some_preds_have_count
for ( index, predicate ) in enumerate( predicates ):

@@ -1378,9 +1379,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
def RefreshFavouriteTags( self ):
favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
favourite_tags.sort()
favourite_tags = sorted( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in favourite_tags ]

@@ -1667,6 +1666,10 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
folder_names.insert( 0, None )
else:
folder_names.sort()
for folder_name in folder_names:

@@ -1681,9 +1684,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
ClientGUIMenus.AppendMenu( menu, menu_to_use, folder_name )
names = list( folders_to_names[ folder_name ] )
names.sort()
names = sorted( folders_to_names[ folder_name ] )
for name in names:

@@ -2384,9 +2385,7 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
def RefreshFavouriteTags( self ):
favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
favourite_tags.sort()
favourite_tags = sorted( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in favourite_tags ]
@@ -1,5 +1,6 @@
import os
import re
import typing
from qtpy import QtCore as QC
from qtpy import QtWidgets as QW

@@ -128,17 +129,13 @@ class ShortcutAwareToolTipMixin( object ):
if len( names_to_shortcuts ) > 0:
names = list( names_to_shortcuts.keys() )
names.sort()
names = sorted( names_to_shortcuts.keys() )
for name in names:
shortcuts = names_to_shortcuts[ name ]
shortcut_strings = [ shortcut.ToString() for shortcut in shortcuts ]
shortcut_strings.sort()
shortcut_strings = sorted( ( shortcut.ToString() for shortcut in shortcuts ) )
tt += os.linesep * 2

@@ -381,6 +378,18 @@ class BetterNotebook( QW.QTabWidget ):
def DeleteAllPages( self ):
while self.count() > 0:
page = self.widget( 0 )
self.removeTab( 0 )
page.deleteLater()
def GetPages( self ):
return [ self.widget( i ) for i in range( self.count() ) ]
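The new BetterNotebook.DeleteAllPages above pairs removeTab with deleteLater. In Qt, QTabWidget.removeTab only detaches the page from the tab bar and does not destroy the widget, so the page has to be disposed of explicitly; a minimal sketch of the same idea, assuming a qtpy environment like the one these files import:

from qtpy import QtWidgets as QW

app = QW.QApplication( [] )

notebook = QW.QTabWidget()
notebook.addTab( QW.QWidget(), 'page 1' )
notebook.addTab( QW.QWidget(), 'page 2' )

while notebook.count() > 0:
    page = notebook.widget( 0 )
    # removeTab only removes the tab entry; the widget object itself survives
    notebook.removeTab( 0 )
    # deleteLater queues the actual destruction for the next event loop pass,
    # which is safer than deleting a widget that may still have pending events
    page.deleteLater()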
File diff suppressed because it is too large
@@ -1121,7 +1121,16 @@ class DialogTextEntry( Dialog ):
#
self._text.setText( default )
if placeholder is not None: self._text.setPlaceholderText( placeholder )
if placeholder is not None:
self._text.setPlaceholderText( placeholder )
if len( default ) > 0:
self._text.setSelection( 0, len( default ) )
self._CheckText()
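This is the change behind the changelog line about the text entry dialog now pre-selecting its default text so typing overwrites it immediately. A minimal sketch of the Qt calls involved, with illustrative values for default and placeholder:

from qtpy import QtWidgets as QW

app = QW.QApplication( [] )

default = 'new page name'
placeholder = None

text = QW.QLineEdit()
text.setText( default )

if placeholder is not None:
    text.setPlaceholderText( placeholder )

if len( default ) > 0:
    # select the whole default, so the first keypress replaces it
    text.setSelection( 0, len( default ) )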
@@ -732,9 +732,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
tags.update( current_tags )
tags = list( tags )
tags.sort()
tags = sorted( tags )
txt_path = path + '.txt'

@@ -1578,9 +1578,7 @@ class EditLocalImportFilenameTaggingPanel( ClientGUIScrolledPanels.EditPanel ):
tags.update( self._filename_tagging_panel.GetTags( index, path ) )
tags = list( tags )
tags.sort()
tags = sorted( tags )
return tags
@@ -93,6 +93,8 @@ class AddEditDeleteListBox( QW.QWidget ):
self._AddData( data )
self.listBoxChanged.emit()
def _AddAllDefaults( self, defaults_callable ):

@@ -103,6 +105,8 @@ class AddEditDeleteListBox( QW.QWidget ):
self._AddData( default )
self.listBoxChanged.emit()
def _AddData( self, data ):

@@ -144,6 +148,8 @@ class AddEditDeleteListBox( QW.QWidget ):
self.listBoxChanged.emit()
def _Delete( self ):

@@ -401,6 +407,8 @@ class AddEditDeleteListBox( QW.QWidget ):
QW.QMessageBox.critical( self, 'Error', message )
self.listBoxChanged.emit()
def _SetNoneDupeName( self, obj ):

@@ -619,6 +627,8 @@ class QueueListBox( QW.QWidget ):
self._AddData( data )
self.listBoxChanged.emit()
def _AddData( self, data ):

@@ -653,6 +663,7 @@ class QueueListBox( QW.QWidget ):
self.listBoxChanged.emit()
def _Down( self ):

@@ -704,7 +715,7 @@ class QueueListBox( QW.QWidget ):
new_item.setData( QC.Qt.UserRole, new_data )
self._listbox.insertItem( i, new_item )
self.listBoxChanged.emit()

@@ -1814,9 +1825,7 @@ class ListBoxTags( ListBox ):
if len( predicates ) > 0:
s = [ predicate.ToString() for predicate in predicates ]
s.sort()
s = sorted( ( predicate.ToString() for predicate in predicates ) )
page_name = ', '.join( s )
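The AddEditDeleteListBox and QueueListBox hunks above all add a listBoxChanged.emit() after each mutating operation, so anything listening to the widget is told about every add, delete, edit and reorder. A rough sketch of that pattern under qtpy - the class and signal names here are illustrative, not hydrus's real ones:

from qtpy import QtCore as QC
from qtpy import QtWidgets as QW

class SimpleListBox( QW.QWidget ):

    # one 'something changed' signal, emitted after every mutation
    listBoxChanged = QC.Signal()

    def __init__( self, parent = None ):

        QW.QWidget.__init__( self, parent )

        self._data = []

    def AddData( self, data ):

        self._data.append( data )

        self.listBoxChanged.emit()

    def DeleteData( self, data ):

        if data in self._data:

            self._data.remove( data )

            self.listBoxChanged.emit()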
@@ -166,9 +166,7 @@ class BetterListCtrl( QW.QTreeWidget ):
def _RecalculateIndicesAfterDelete( self ):
indices_and_data_info = list( self._indices_to_data_info.items() )
indices_and_data_info.sort()
indices_and_data_info = sorted( self._indices_to_data_info.items() )
self._indices_to_data_info = {}
self._data_to_indices = {}

@@ -537,9 +535,7 @@ class BetterListCtrl( QW.QTreeWidget ):
# keep it sorted here, which is sometimes useful
indices_and_datas = [ ( index, data ) for ( data, index ) in self._data_to_indices.items() ]
indices_and_datas.sort()
indices_and_datas = sorted( ( ( index, data ) for ( data, index ) in self._data_to_indices.items() ) )
datas = [ data for ( index, data ) in indices_and_datas ]
@@ -22,6 +22,7 @@ from hydrus.client.gui import ClientGUIListBoxes
from hydrus.client.gui import ClientGUIListCtrl
from hydrus.client.gui import ClientGUIParsing
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUIStringControls
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.importing import ClientImporting

@@ -211,7 +212,7 @@ class EditLoginCredentialDefinitionPanel( ClientGUIScrolledPanels.EditPanel ):
string_match = credential_definition.GetStringMatch()
self._string_match = ClientGUIControls.StringMatchButton( self, string_match )
self._string_match = ClientGUIStringControls.StringMatchButton( self, string_match )
#

@@ -334,9 +335,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):
domains_in_use = { login_domain for ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason ) in self._domains_and_login_info.GetData() }
available_examples = list( example_domains.difference( domains_in_use ) )
available_examples.sort()
available_examples = sorted( example_domains.difference( domains_in_use ) )
if len( available_examples ) > 0:

@@ -1307,7 +1306,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
required_cookies_info_box_panel = ClientGUICommon.StaticBox( self, 'cookies required to consider session logged in' )
self._required_cookies_info = ClientGUIControls.StringMatchToStringMatchDictControl( required_cookies_info_box_panel, login_script.GetRequiredCookiesInfo(), min_height = 4, key_name = 'cookie name' )
self._required_cookies_info = ClientGUIStringControls.StringMatchToStringMatchDictControl( required_cookies_info_box_panel, login_script.GetRequiredCookiesInfo(), min_height = 4, key_name = 'cookie name' )
#

@@ -1664,9 +1663,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
if self._test_domain == '':
example_domains = list( login_script.GetExampleDomains() )
example_domains.sort()
example_domains = sorted( login_script.GetExampleDomains() )
if len( example_domains ) > 0:

@@ -1952,8 +1949,7 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
name = login_script.GetName()
example_domains = list( login_script.GetExampleDomains() )
example_domains.sort()
example_domains = sorted( login_script.GetExampleDomains() )
pretty_name = name
pretty_example_domains = ', '.join( example_domains )

@@ -2040,25 +2036,25 @@ class EditLoginStepPanel( ClientGUIScrolledPanels.EditPanel ):
required_credentials_panel = ClientGUICommon.StaticBox( self, 'credentials to send' )
self._required_credentials = ClientGUIControls.StringToStringDictControl( required_credentials_panel, required_credentials, min_height = 4, key_name = 'credential name', value_name = 'parameter name' )
self._required_credentials = ClientGUIStringControls.StringToStringDictControl( required_credentials_panel, required_credentials, min_height = 4, key_name = 'credential name', value_name = 'parameter name' )
#
static_args_panel = ClientGUICommon.StaticBox( self, 'static variables to send' )
self._static_args = ClientGUIControls.StringToStringDictControl( static_args_panel, static_args, min_height = 4, key_name = 'parameter name', value_name = 'value' )
self._static_args = ClientGUIStringControls.StringToStringDictControl( static_args_panel, static_args, min_height = 4, key_name = 'parameter name', value_name = 'value' )
#
temp_args_panel = ClientGUICommon.StaticBox( self, 'temporary variables to send' )
self._temp_args = ClientGUIControls.StringToStringDictControl( temp_args_panel, temp_args, min_height = 4, key_name = 'temp variable name', value_name = 'parameter name' )
self._temp_args = ClientGUIStringControls.StringToStringDictControl( temp_args_panel, temp_args, min_height = 4, key_name = 'temp variable name', value_name = 'parameter name' )
#
required_cookies_info_box_panel = ClientGUICommon.StaticBox( self, 'cookies required to consider step successful' )
self._required_cookies_info = ClientGUIControls.StringMatchToStringMatchDictControl( required_cookies_info_box_panel, required_cookies_info, min_height = 4, key_name = 'cookie name' )
self._required_cookies_info = ClientGUIStringControls.StringMatchToStringMatchDictControl( required_cookies_info_box_panel, required_cookies_info, min_height = 4, key_name = 'cookie name' )
#
@@ -29,9 +29,7 @@ def CopyMediaURLs( medias ):
urls.update( media_urls )
urls = list( urls )
urls.sort()
urls = sorted( urls )
urls_string = os.linesep.join( urls )

@@ -54,9 +52,7 @@ def CopyMediaURLClassURLs( medias, url_class ):
urls = list( urls )
urls.sort()
urls = sorted( urls )
urls_string = os.linesep.join( urls )

@@ -139,9 +135,7 @@ def OpenExternally( media ):
def OpenURLs( urls ):
urls = list( urls )
urls.sort()
urls = sorted( urls )
if len( urls ) > 1:
@@ -424,6 +424,8 @@ class Page( QW.QSplitter ):
self._search_preview_split = QW.QSplitter( self )
self._done_split_setups = False
self._management_panel = ClientGUIManagement.CreateManagementPanel( self._search_preview_split, self, self._controller, self._management_controller )
file_service_key = self._management_controller.GetKey( 'file_service' )

@@ -442,8 +444,6 @@ class Page( QW.QSplitter ):
self._preview_panel.setLayout( vbox )
self.SetupSplits()
self.widget( 0 ).setMinimumWidth( 120 )
self.widget( 1 ).setMinimumWidth( 120 )
self.setStretchFactor( 0, 0 )

@@ -740,6 +740,13 @@ class Page( QW.QSplitter ):
def PageShown( self ):
if self.isVisible() and not self._done_split_setups:
self.SetupSplits()
self._done_split_setups = True
self._management_panel.PageShown()
self._media_panel.PageShown()
self._preview_canvas.PageShown()

@@ -931,6 +938,11 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
QP.TabWidgetWithDnD.__init__( self, parent )
if HG.client_controller.new_options.GetBoolean( 'elide_page_tab_names' ):
self.tabBar().setElideMode( QC.Qt.ElideMiddle )
self._parent_notebook = parent
# this is disabled for now because it seems borked in Qt

@@ -963,7 +975,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
self._widget_event_filter.EVT_LEFT_DOWN( lambda ev: ev.accept() )
self._widget_event_filter.EVT_LEFT_DOWN( lambda ev: ev.accept() )
self.currentChanged.connect( self.EventPageChanged )
self.currentChanged.connect( self.pageJustChanged )
self.pageDragAndDropped.connect( self._RefreshPageNamesAfterDnD )
self._previous_page_index = -1

@@ -1373,11 +1385,13 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
safe_page_name = ClientGUIFunctions.EscapeMnemonics( page_name )
existing_page_name = self.tabText( index )
tab_bar = self.tabBar()
existing_page_name = tab_bar.tabText( index )
if existing_page_name not in ( safe_page_name, page_name ):
self.setTabText( index, safe_page_name )
tab_bar.setTabText( index, safe_page_name )

@@ -1890,28 +1904,6 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
notebook._ChooseNewPage()
def EventPageChanged( self, index ):
old_selection = self._previous_page_index
selection = index
if old_selection != -1 and old_selection < self.count():
self.widget( old_selection ).PageHidden()
if selection != -1:
self.widget( selection ).PageShown()
self._controller.gui.RefreshStatusBar()
self._previous_page_index = index
self._controller.pub( 'notify_page_change' )
def GetAPIInfoDict( self, simple ):
return {}

@@ -2296,6 +2288,8 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
for page_tuple in page_tuples:
select_page = not done_first_page
( page_type, page_data ) = page_tuple
if page_type == 'pages':

@@ -2304,7 +2298,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
try:
page = self.NewPagesNotebook( name, forced_insertion_index = forced_insertion_index, give_it_a_blank_page = False, select_page = False )
page = self.NewPagesNotebook( name, forced_insertion_index = forced_insertion_index, give_it_a_blank_page = False, select_page = select_page )
page.AppendSessionPageTuples( subpage_tuples )

@@ -2319,12 +2313,8 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
try:
select_page = not done_first_page
self.NewPage( management_controller, initial_hashes = initial_hashes, forced_insertion_index = forced_insertion_index, select_page = select_page )
done_first_page = True
except Exception as e:
HydrusData.ShowException( e )

@@ -2333,6 +2323,8 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
forced_insertion_index += 1
done_first_page = True
def IsMultipleWatcherPage( self ):

@@ -2552,10 +2544,6 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
self.setCurrentIndex( insertion_index )
self.LayoutPages()
page.SetupSplits()
self._controller.pub( 'refresh_page_name', page.GetPageKey() )
self._controller.pub( 'notify_new_pages' )

@@ -2699,9 +2687,11 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
page_name = page.GetName()
self.insertTab( insertion_index, page, page_name )
if select_page: self.setCurrentIndex( insertion_index )
self.LayoutPages()
if select_page:
self.setCurrentIndex( insertion_index )
self._controller.pub( 'refresh_page_name', page.GetPageKey() )

@@ -2749,6 +2739,30 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
def pageJustChanged( self, index ):
old_selection = self._previous_page_index
selection = index
if old_selection != -1 and old_selection < self.count():
self.widget( old_selection ).PageHidden()
if selection != -1:
new_page = self.widget( selection )
new_page.PageShown()
self._controller.gui.RefreshStatusBar()
self._previous_page_index = index
self._controller.pub( 'notify_page_change' )
def PageShown( self ):
result = self.currentWidget()
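The Page.PageShown change above is the 'hammer to a scalpel' rewrite from the changelog: instead of forcing splitter layout for every page at session load, each page runs SetupSplits the first time it is actually visible. A minimal sketch of that defer-until-first-shown pattern, with placeholder sizes and illustrative names:

from qtpy import QtWidgets as QW

class LazySplitPage( QW.QSplitter ):

    def __init__( self, parent = None ):

        QW.QSplitter.__init__( self, parent )

        self._done_split_setups = False

    def SetupSplits( self ):

        # placeholder sash positions; the real code derives these from options
        self.setSizes( [ 240, 760 ] )

    def PageShown( self ):

        # only pay the layout cost once, and only when the page is really visible,
        # so sessions that open minimised or in background tabs skip it at load
        if self.isVisible() and not self._done_split_setups:

            self.SetupSplits()

            self._done_split_setups = True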
@@ -32,6 +32,8 @@ from hydrus.client.gui import ClientGUIListCtrl
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUIScrolledPanelsEdit
from hydrus.client.gui import ClientGUISerialisable
from hydrus.client.gui import ClientGUIStringControls
from hydrus.client.gui import ClientGUIStringPanels
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.networking import ClientNetworkingContexts
from hydrus.client.networking import ClientNetworkingDomain

@@ -304,9 +306,7 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):
gug_names = list( gug_names )
gug_names.sort()
gug_names = sorted( gug_names )
num_gugs = len( gug_names )

@@ -392,9 +392,7 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):
domains = domains.difference( existing_domains )
domains = list( domains )
domains.sort()
domains = sorted( domains )
domain_metadatas = []

@@ -556,9 +554,9 @@ class EditCompoundFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
( formulae, sub_phrase, string_match, string_converter ) = formula.ToTuple()
self._string_match_button = ClientGUIControls.StringMatchButton( edit_panel, string_match )
self._string_match_button = ClientGUIStringControls.StringMatchButton( edit_panel, string_match )
self._string_converter_button = ClientGUIControls.StringConverterButton( edit_panel, string_converter )
self._string_converter_button = ClientGUIStringControls.StringConverterButton( edit_panel, string_converter )
#

@@ -777,9 +775,9 @@ class EditContextVariableFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
( variable_name, string_match, string_converter ) = formula.ToTuple()
self._string_match_button = ClientGUIControls.StringMatchButton( edit_panel, string_match )
self._string_match_button = ClientGUIStringControls.StringMatchButton( edit_panel, string_match )
self._string_converter_button = ClientGUIControls.StringConverterButton( edit_panel, string_converter )
self._string_converter_button = ClientGUIStringControls.StringConverterButton( edit_panel, string_converter )
#

@@ -1045,7 +1043,7 @@ class EditHTMLTagRulePanel( ClientGUIScrolledPanels.EditPanel ):
self._tag_name = QW.QLineEdit( self )
self._tag_attributes = ClientGUIControls.StringToStringDictControl( self, tag_attributes, min_height = 4 )
self._tag_attributes = ClientGUIStringControls.StringToStringDictControl( self, tag_attributes, min_height = 4 )
self._tag_index = ClientGUICommon.NoneableSpinCtrl( self, 'index to fetch', none_phrase = 'get all', min = 0, max = 255 )

@@ -1053,7 +1051,7 @@ class EditHTMLTagRulePanel( ClientGUIScrolledPanels.EditPanel ):
self._should_test_tag_string = QW.QCheckBox( self )
self._tag_string_string_match = ClientGUIControls.StringMatchButton( self, tag_string_string_match )
self._tag_string_string_match = ClientGUIStringControls.StringMatchButton( self, tag_string_string_match )
#

@@ -1250,9 +1248,9 @@ class EditHTMLFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
( tag_rules, content_to_fetch, attribute_to_fetch, string_match, string_converter ) = formula.ToTuple()
self._string_match_button = ClientGUIControls.StringMatchButton( edit_panel, string_match )
self._string_match_button = ClientGUIStringControls.StringMatchButton( edit_panel, string_match )
self._string_converter_button = ClientGUIControls.StringConverterButton( edit_panel, string_converter )
self._string_converter_button = ClientGUIStringControls.StringConverterButton( edit_panel, string_converter )
#

@@ -1478,7 +1476,7 @@ class EditJSONParsingRulePanel( ClientGUIScrolledPanels.EditPanel ):
self._parse_rule_type.addItem( 'all dictionary/list items', ClientParsing.JSON_PARSE_RULE_TYPE_ALL_ITEMS )
self._parse_rule_type.addItem( 'indexed list item', ClientParsing.JSON_PARSE_RULE_TYPE_INDEXED_ITEM )
self._string_match = ClientGUIControls.EditStringMatchPanel( self, string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'posts', example_string = 'posts' ) )
self._string_match = ClientGUIStringPanels.EditStringMatchPanel( self, string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'posts', example_string = 'posts' ) )
self._index = QP.MakeQSpinBox( self, min=0, max=65535 )

@@ -1601,9 +1599,9 @@ class EditJSONFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
( parse_rules, content_to_fetch, string_match, string_converter ) = formula.ToTuple()
self._string_match_button = ClientGUIControls.StringMatchButton( edit_panel, string_match )
self._string_match_button = ClientGUIStringControls.StringMatchButton( edit_panel, string_match )
self._string_converter_button = ClientGUIControls.StringConverterButton( edit_panel, string_converter )
self._string_converter_button = ClientGUIStringControls.StringConverterButton( edit_panel, string_converter )
#

@@ -1883,7 +1881,7 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
self._veto_panel = QW.QWidget( self._content_panel )
self._veto_if_matches_found = QW.QCheckBox( self._veto_panel )
self._string_match = ClientGUIControls.EditStringMatchPanel( self._veto_panel )
self._string_match = ClientGUIStringPanels.EditStringMatchPanel( self._veto_panel )
self._temp_variable_panel = QW.QWidget( self._content_panel )

@@ -2903,7 +2901,7 @@ class EditPageParserPanel( ClientGUIScrolledPanels.EditPanel ):
string_converter = parser.GetStringConverter()
self._string_converter = ClientGUIControls.StringConverterButton( conversion_panel, string_converter )
self._string_converter = ClientGUIStringControls.StringConverterButton( conversion_panel, string_converter )
#

@@ -3135,9 +3133,7 @@ class EditPageParserPanel( ClientGUIScrolledPanels.EditPanel ):
produces = page_parser.GetParsableContent()
produces = list( produces )
produces.sort()
produces = sorted( produces )
pretty_name = name
pretty_formula = formula.ToPrettyString()

@@ -3358,10 +3354,9 @@ class EditParsersPanel( ClientGUIScrolledPanels.EditPanel ):
name = parser.GetName()
example_urls = list( parser.GetExampleURLs() )
example_urls.sort()
example_urls = sorted( parser.GetExampleURLs() )
produces = list( parser.GetParsableContent() )
produces = sorted( parser.GetParsableContent() )
pretty_produces = ClientParsing.ConvertParsableContentToPrettyString( produces )

@@ -3456,13 +3451,13 @@ class EditParsingScriptFileLookupPanel( ClientGUIScrolledPanels.EditPanel ):
self._file_identifier_type.addItem( ClientParsing.file_identifier_string_lookup[ t], t )
self._file_identifier_string_converter = ClientGUIControls.StringConverterButton( query_panel, file_identifier_string_converter )
self._file_identifier_string_converter = ClientGUIStringControls.StringConverterButton( query_panel, file_identifier_string_converter )
self._file_identifier_arg_name = QW.QLineEdit( query_panel )
static_args_panel = ClientGUICommon.StaticBox( query_panel, 'static arguments' )
self._static_args = ClientGUIControls.StringToStringDictControl( static_args_panel, static_args, min_height = 4 )
self._static_args = ClientGUIStringControls.StringToStringDictControl( static_args_panel, static_args, min_height = 4 )
children_panel = ClientGUICommon.StaticBox( edit_panel, 'content parsing children' )

@@ -4250,7 +4245,7 @@ class TestPanel( QW.QWidget ):
self._object_callable = object_callable
self._example_parsing_context = ClientGUIControls.StringToStringDictButton( self, 'edit example parsing context' )
self._example_parsing_context = ClientGUIStringControls.StringToStringDictButton( self, 'edit example parsing context' )
self._data_preview_notebook = QW.QTabWidget( self )
@@ -148,20 +148,18 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
if self._focused_media is not None:
display_media = self._focused_media.GetDisplayMedia()
if display_media is None:
if self._HasFocusSingleton():
return
media = self._GetFocusSingleton()
if display_media.GetMime() in HC.IMAGES:
HG.client_controller.pub( 'clipboard', 'bmp', display_media )
else:
QW.QMessageBox.critical( self, 'Error', 'Sorry, cannot take bmps of anything but static images right now!' )
if media.GetMime() in HC.IMAGES:
HG.client_controller.pub( 'clipboard', 'bmp', media )
else:
QW.QMessageBox.critical( self, 'Error', 'Sorry, cannot take bmps of anything but static images right now!' )

@@ -184,41 +182,42 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
paths.append( path )
HG.client_controller.pub( 'clipboard', 'paths', paths )
if len( paths ) > 0:
HG.client_controller.pub( 'clipboard', 'paths', paths )
def _CopyHashToClipboard( self, hash_type ):
display_media = self._focused_media.GetDisplayMedia()
if display_media is None:
if self._HasFocusSingleton():
return
media = self._GetFocusSingleton()
sha256_hash = display_media.GetHash()
if hash_type == 'sha256':
sha256_hash = media.GetHash()
hex_hash = sha256_hash.hex()
else:
if display_media.GetLocationsManager().IsLocal():
if hash_type == 'sha256':
( other_hash, ) = HG.client_controller.Read( 'file_hashes', ( sha256_hash, ), 'sha256', hash_type )
hex_hash = other_hash.hex()
hex_hash = sha256_hash.hex()
else:
QW.QMessageBox.critical( self, 'Error', 'Unfortunately, you do not have that file in your database, so its non-sha256 hashes are unknown.' )
return
if media.GetLocationsManager().IsLocal():
( other_hash, ) = HG.client_controller.Read( 'file_hashes', ( sha256_hash, ), 'sha256', hash_type )
hex_hash = other_hash.hex()
else:
QW.QMessageBox.critical( self, 'Error', 'Unfortunately, you do not have that file in your database, so its non-sha256 hashes are unknown.' )
return
HG.client_controller.pub( 'clipboard', 'text', hex_hash )
HG.client_controller.pub( 'clipboard', 'text', hex_hash )
def _CopyHashesToClipboard( self, hash_type ):

@@ -250,18 +249,16 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _CopyPathToClipboard( self ):
display_media = self._focused_media.GetDisplayMedia()
if display_media is None:
if self._HasFocusSingleton():
return
media = self._GetFocusSingleton()
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( media.GetHash(), media.GetMime() )
HG.client_controller.pub( 'clipboard', 'text', path )
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( display_media.GetHash(), display_media.GetMime() )
HG.client_controller.pub( 'clipboard', 'text', path )
def _CopyPathsToClipboard( self ):

@@ -277,34 +274,35 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
paths.append( client_files_manager.GetFilePath( media_result.GetHash(), media_result.GetMime(), check_file_exists = False ) )
text = os.linesep.join( paths )
HG.client_controller.pub( 'clipboard', 'text', text )
if len( paths ) > 0:
text = os.linesep.join( paths )
HG.client_controller.pub( 'clipboard', 'text', text )
def _CopyServiceFilenameToClipboard( self, service_key ):
display_media = self._focused_media.GetDisplayMedia()
if display_media is None:
if self._HasFocusSingleton():
return
media = self._GetFocusSingleton()
hash = display_media.GetHash()
( filename, ) = HG.client_controller.Read( 'service_filenames', service_key, { hash } )
service = HG.client_controller.services_manager.GetService( service_key )
if service.GetServiceType() == HC.IPFS:
hash = media.GetHash()
multihash_prefix = service.GetMultihashPrefix()
( filename, ) = HG.client_controller.Read( 'service_filenames', service_key, { hash } )
filename = multihash_prefix + filename
service = HG.client_controller.services_manager.GetService( service_key )
if service.GetServiceType() == HC.IPFS:
multihash_prefix = service.GetMultihashPrefix()
filename = multihash_prefix + filename
HG.client_controller.pub( 'clipboard', 'text', filename )
HG.client_controller.pub( 'clipboard', 'text', filename )
def _CopyServiceFilenamesToClipboard( self, service_key ):

@@ -464,23 +462,18 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _LaunchMediaViewer( self, first_media = None ):
if self._focused_media is not None:
if self._HasFocusSingleton():
display_media = self._focused_media.GetDisplayMedia()
if display_media is None:
return
media = self._GetFocusSingleton()
new_options = HG.client_controller.new_options
( media_show_action, media_start_paused, media_start_with_embed ) = new_options.GetMediaShowAction( display_media.GetMime() )
( media_show_action, media_start_paused, media_start_with_embed ) = new_options.GetMediaShowAction( media.GetMime() )
if media_show_action == CC.MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY:
hash = display_media.GetHash()
mime = display_media.GetMime()
hash = media.GetHash()
mime = media.GetMime()
client_files_manager = HG.client_controller.client_files_manager

@@ -535,6 +528,21 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _GetFocusSingleton( self ) -> ClientMedia.MediaSingleton:
if self._focused_media is not None:
media_singleton = self._focused_media.GetDisplayMedia()
if media_singleton is not None:
return media_singleton
raise HydrusExceptions.DataMissing( 'No media singleton!' )
def _GetNumSelected( self ):
return sum( [ media.GetNumFiles() for media in self._selected_media ] )

@@ -784,6 +792,20 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
return ( sorted_mime_descriptor, selected_mime_descriptor )
def _HasFocusSingleton( self ) -> bool:
try:
media = self._GetFocusSingleton()
return True
except HydrusExceptions.DataMissing:
return False
def _HitMedia( self, media, ctrl, shift ):
if media is None:
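Everything below follows from the two helpers added above: _GetFocusSingleton raises HydrusExceptions.DataMissing when there is no single focused file, and _HasFocusSingleton wraps it as a boolean, so each caller shrinks from a fetch-then-None-check to a has-check plus fetch. A stripped-down sketch of how a call site uses the pair; the panel, media object and action here are stand-ins, not the real hydrus classes:

def _ManageNotesSketch( panel ):

    # guard first: bail out quietly when nothing is focused
    if not panel._HasFocusSingleton():

        return

    # for this caller the getter is now guaranteed not to raise
    media = panel._GetFocusSingleton()

    # stand-in for the real action, e.g. ClientGUIMediaActions.EditFileNotes
    print( 'editing notes for', media.GetHash().hex() )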
@@ -878,22 +900,15 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _ManageNotes( self ):
if self._focused_media is None:
if self._HasFocusSingleton():
return
media = self._GetFocusSingleton()
media = self._focused_media.GetDisplayMedia()
if media is None:
ClientGUIMediaActions.EditFileNotes( self, media )
return
self.setFocus( QC.Qt.OtherFocusReason )
ClientGUIMediaActions.EditFileNotes( self, media )
self.setFocus( QC.Qt.OtherFocusReason )
def _ManageRatings( self ):

@@ -983,21 +998,16 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _OpenExternally( self ):
if self._focused_media is not None:
if self._HasFocusSingleton():
open_externally_media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if open_externally_media is None:
return
if open_externally_media.GetLocationsManager().IsLocal():
if media.GetLocationsManager().IsLocal():
self.SetFocusedMedia( None )
hash = open_externally_media.GetHash()
mime = open_externally_media.GetMime()
hash = media.GetHash()
mime = media.GetMime()
client_files_manager = HG.client_controller.client_files_manager

@@ -1014,12 +1024,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _OpenFileInWebBrowser( self ):
if self._focused_media is not None:
if self._HasFocusSingleton():
if self._focused_media.GetLocationsManager().IsLocal():
focused_singleton = self._GetFocusSingleton()
if focused_singleton.GetLocationsManager().IsLocal():
hash = self._focused_media.GetHash()
mime = self._focused_media.GetMime()
hash = focused_singleton.GetHash()
mime = focused_singleton.GetMime()
client_files_manager = HG.client_controller.client_files_manager

@@ -1034,12 +1046,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _OpenFileLocation( self ):
if self._focused_media is not None:
if self._HasFocusSingleton():
if self._focused_media.GetLocationsManager().IsLocal():
focused_singleton = self._GetFocusSingleton()
if focused_singleton.GetLocationsManager().IsLocal():
hash = self._focused_media.GetHash()
mime = self._focused_media.GetMime()
hash = focused_singleton.GetHash()
mime = focused_singleton.GetMime()
client_files_manager = HG.client_controller.client_files_manager

@@ -1054,9 +1068,11 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _OpenKnownURL( self ):
if self._focused_media is not None:
if self._HasFocusSingleton():
ClientGUIMedia.DoOpenKnownURLFromShortcut( self, self._focused_media )
focused_singleton = self._GetFocusSingleton()
ClientGUIMedia.DoOpenKnownURLFromShortcut( self, focused_singleton )

@@ -1533,39 +1549,47 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
def _SetDuplicatesFocusedBetter( self, duplicate_action_options = None ):
flat_media = self._GetSelectedFlatMedia()
if self._focused_media is None or self._focused_media.GetDisplayMedia() is None:
if self._HasFocusSingleton():
focused_singleton = self._GetFocusSingleton()
focused_hash = focused_singleton.GetHash()
flat_media = self._GetSelectedFlatMedia()
( better_media, ) = [ media for media in flat_media if media.GetHash() == focused_hash ]
worse_flat_media = [ media for media in flat_media if media.GetHash() != focused_hash ]
media_pairs = [ ( better_media, worse_media ) for worse_media in worse_flat_media ]
self._SetDuplicates( HC.DUPLICATE_BETTER, media_pairs = media_pairs )
else:
QW.QMessageBox.warning( self, 'Warning', 'No file is focused, so cannot set the focused file as better!' )
return
focused_hash = self._focused_media.GetDisplayMedia().GetHash()
( better_media, ) = [ media for media in flat_media if media.GetHash() == focused_hash ]
worse_flat_media = [ media for media in flat_media if media.GetHash() != focused_hash ]
media_pairs = [ ( better_media, worse_media ) for worse_media in worse_flat_media ]
self._SetDuplicates( HC.DUPLICATE_BETTER, media_pairs = media_pairs )
def _SetDuplicatesFocusedKing( self ):
if self._focused_media is None or self._focused_media.GetDisplayMedia() is None:
if self._HasFocusSingleton():
media = self._GetFocusSingleton()
focused_hash = media.GetHash()
HG.client_controller.WriteSynchronous( 'duplicate_set_king', focused_hash )
else:
QW.QMessageBox.warning( self, 'Warning', 'No file is focused, so cannot set the focused file as king!' )
return
focused_hash = self._focused_media.GetDisplayMedia().GetHash()
HG.client_controller.WriteSynchronous( 'duplicate_set_king', focused_hash )
def _SetDuplicatesPotential( self ):

@@ -1842,16 +1866,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
elif action == 'duplicate_media_clear_focused_false_positives':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
elif action == 'duplicate_media_clear_false_positives':

@@ -1865,16 +1886,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
elif action == 'duplicate_media_dissolve_focused_alternate_group':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
elif action == 'duplicate_media_dissolve_alternate_group':

@@ -1888,16 +1906,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
elif action == 'duplicate_media_dissolve_focused_duplicate_group':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
elif action == 'duplicate_media_dissolve_duplicate_group':

@@ -1911,44 +1926,35 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
elif action == 'duplicate_media_remove_focused_from_alternate_group':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.RemoveFromAlternateGroup( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.RemoveFromAlternateGroup( self, ( hash, ) )
elif action == 'duplicate_media_remove_focused_from_duplicate_group':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.RemoveFromDuplicateGroup( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.RemoveFromDuplicateGroup( self, ( hash, ) )
elif action == 'duplicate_media_reset_focused_potential_search':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
elif action == 'duplicate_media_reset_potential_search':

@@ -1962,16 +1968,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
elif action == 'duplicate_media_remove_focused_potentials':
if self._focused_media is not None:
if self._HasFocusSingleton():
media = self._focused_media.GetDisplayMedia()
media = self._GetFocusSingleton()
if media is not None:
hash = media.GetHash()
ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
hash = media.GetHash()
ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
elif action == 'duplicate_media_remove_potentials':

@@ -2392,6 +2395,11 @@ class MediaPanelThumbnails( MediaPanel ):
display_media = thumbnail.GetDisplayMedia()
if display_media is None:
continue
hash = display_media.GetHash()
if hash in self._hashes_faded and thumbnail_cache.HasThumbnailCached( thumbnail ):

@@ -2438,6 +2446,13 @@ class MediaPanelThumbnails( MediaPanel ):
for thumbnail in thumbnails:
display_media = thumbnail.GetDisplayMedia()
if display_media is None:
continue
try:
thumbnail_index = self._sorted_media.index( thumbnail )

@@ -2454,22 +2469,17 @@ class MediaPanelThumbnails( MediaPanel ):
continue
display_media = thumbnail.GetDisplayMedia()
hash = display_media.GetHash()
if display_media is not None:
hash = display_media.GetHash()
self._hashes_faded.add( hash )
self._StopFading( hash )
bmp = thumbnail.GetQtImage()
alpha_bmp = QP.AdjustOpacity( bmp, 0.20 )
self._thumbnails_being_faded_in[ hash ] = ( bmp, alpha_bmp, thumbnail_index, thumbnail, now_precise, 0 )
self._hashes_faded.add( hash )
self._StopFading( hash )
bmp = thumbnail.GetQtImage()
alpha_bmp = QP.AdjustOpacity( bmp, 0.20 )
self._thumbnails_being_faded_in[ hash ] = ( bmp, alpha_bmp, thumbnail_index, thumbnail, now_precise, 0 )
HG.client_controller.gui.RegisterAnimationUpdateWindow( self )

@@ -3231,7 +3241,9 @@ class MediaPanelThumbnails( MediaPanel ):
menu = QW.QMenu( self.window() )
if self._focused_media is not None and self._focused_media.GetDisplayMedia() is not None:
if self._HasFocusSingleton():
focus_singleton = self._GetFocusSingleton()
# variables

@@ -3437,7 +3449,7 @@ class MediaPanelThumbnails( MediaPanel ):
else:
pretty_info_lines = self._focused_media.GetPrettyInfoLines()
pretty_info_lines = focus_singleton.GetPrettyInfoLines()
top_line = pretty_info_lines.pop( 0 )

@@ -3565,7 +3577,9 @@ class MediaPanelThumbnails( MediaPanel ):
if self._focused_media is not None and self._focused_media.GetDisplayMedia() is not None:
if self._HasFocusSingleton():
focus_singleton = self._GetFocusSingleton()
if selection_has_inbox:

@@ -3605,7 +3619,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( manage_menu, 'urls', 'Manage urls for the selected files.', self._ManageURLs )
num_notes = self._focused_media.GetDisplayMedia().GetNotesManager().GetNumNotes()
num_notes = focus_singleton.GetNotesManager().GetNumNotes()
notes_str = 'notes'

@@ -3707,7 +3721,7 @@ class MediaPanelThumbnails( MediaPanel ):
duplicates_menu = QW.QMenu( manage_menu )
focused_hash = self._focused_media.GetDisplayMedia().GetHash()
focused_hash = focus_singleton.GetHash()
if HG.client_controller.DBCurrentlyDoingJob():

@@ -3722,7 +3736,7 @@ class MediaPanelThumbnails( MediaPanel ):
focus_is_in_alternate_group = False
focus_has_fps = False
focus_has_potentials = False
focus_can_be_searched = self._focused_media.GetDisplayMedia().GetMime() in HC.MIMES_WE_CAN_PHASH
focus_can_be_searched = focus_singleton.GetMime() in HC.MIMES_WE_CAN_PHASH
if file_duplicate_info is None:

@@ -4004,7 +4018,7 @@ class MediaPanelThumbnails( MediaPanel ):
if advanced_mode:
hash_id_str = str( self._focused_media.GetDisplayMedia().GetHashId() )
hash_id_str = str( focus_singleton.GetHashId() )
ClientGUIMenus.AppendMenuItem( copy_menu, 'file_id ({})'.format( hash_id_str ), 'Copy this file\'s internal file/hash_id.', HG.client_controller.pub, 'clipboard', 'text', hash_id_str )
@ -35,6 +35,7 @@ from hydrus.client.gui import ClientGUIShortcuts
|
|||
from hydrus.client.gui import ClientGUIFileSeedCache
|
||||
from hydrus.client.gui import ClientGUIGallerySeedLog
|
||||
from hydrus.client.gui import ClientGUIMPV
|
||||
from hydrus.client.gui import ClientGUIStringControls
|
||||
from hydrus.client.gui import ClientGUITags
|
||||
from hydrus.client.gui import ClientGUITime
|
||||
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
|
||||
|
@ -1811,9 +1812,7 @@ class EditFileNotesPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
else:
|
||||
|
||||
names = list( names_to_notes.keys() )
|
||||
|
||||
names.sort()
|
||||
names = sorted( names_to_notes.keys() )
|
||||
|
||||
for name in names:
|
||||
|
||||
|
@@ -1979,9 +1978,7 @@ class EditFileNotesPanel( ClientGUIScrolledPanels.EditPanel ):
  ( names_to_notes, deletee_names ) = self.GetValue()
- empty_note_names = [ name for ( name, note ) in names_to_notes.items() if note == '' ]
- empty_note_names.sort()
+ empty_note_names = sorted( ( name for ( name, note ) in names_to_notes.items() if note == '' ) )
  if len( empty_note_names ) > 0:
@@ -4346,11 +4343,8 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
  current_query_texts = self._GetCurrentQueryTexts()
- already_existing_query_texts = list( current_query_texts.intersection( query_texts ) )
- new_query_texts = list( set( query_texts ).difference( current_query_texts ) )
- already_existing_query_texts.sort()
- new_query_texts.sort()
+ already_existing_query_texts = sorted( current_query_texts.intersection( query_texts ) )
+ new_query_texts = sorted( set( query_texts ).difference( current_query_texts ) )
  if len( already_existing_query_texts ) > 0:
@@ -5721,9 +5715,7 @@ class EditTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
  url_class_keys_to_url_classes = { url_class.GetMatchKey() : url_class for url_class in url_classes }
- url_class_names_and_default_tag_import_options = [ ( url_class_keys_to_url_classes[ url_class_key ].GetName(), url_class_keys_to_default_tag_import_options[ url_class_key ] ) for url_class_key in list( url_class_keys_to_default_tag_import_options.keys() ) if url_class_key in url_class_keys_to_url_classes ]
- url_class_names_and_default_tag_import_options.sort()
+ url_class_names_and_default_tag_import_options = sorted( ( ( url_class_keys_to_url_classes[ url_class_key ].GetName(), url_class_keys_to_default_tag_import_options[ url_class_key ] ) for url_class_key in list( url_class_keys_to_default_tag_import_options.keys() ) if url_class_key in url_class_keys_to_url_classes ) )
  choice_tuples.extend( url_class_names_and_default_tag_import_options )
@@ -6148,9 +6140,9 @@ class EditServiceTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
  return service_tag_import_options
- def SetValue( self, service_tag_import_options ):
+ def SetValue( self, service_tag_import_options: ClientImportOptions.ServiceTagImportOptions ):
-     ( get_tags, get_tags_filter, self._additional_tags, self._to_new_files, self._to_already_in_inbox, self._to_already_in_archive, self._only_add_existing_tags, self._only_add_existing_tags_filter ) = service_tag_import_options.ToTuple()
+     ( get_tags, get_tags_filter, self._additional_tags, self._to_new_files, self._to_already_in_inbox, self._to_already_in_archive, self._only_add_existing_tags, self._only_add_existing_tags_filter, self._get_tags_overwrite_deleted, self._additional_tags_overwrite_deleted ) = service_tag_import_options.ToTuple()
      self._get_tags_checkbox.setChecked( get_tags )
@@ -6505,7 +6497,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
  self._send_referral_url.setToolTip( tt )
- self._referral_url_converter = ClientGUIControls.StringConverterButton( self, referral_url_converter )
+ self._referral_url_converter = ClientGUIStringControls.StringConverterButton( self, referral_url_converter )
  tt = 'This will generate a referral URL from the original URL. If the URL needs a referral URL, and you can infer what that would be from just this URL, this will let hydrus download this URL without having to previously visit the referral URL (e.g. letting the user drag-and-drop import). It also lets you set up alternate referral URLs for perculiar situations.'
@@ -6514,7 +6506,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
  self._referral_url = QW.QLineEdit()
  self._referral_url.setReadOnly( True )
- self._api_lookup_converter = ClientGUIControls.StringConverterButton( self, api_lookup_converter )
+ self._api_lookup_converter = ClientGUIStringControls.StringConverterButton( self, api_lookup_converter )
  tt = 'This will let you generate an alternate URL for the client to use for the actual download whenever it encounters a URL in this class. You must have a separate URL class to match the API type (which will link to parsers).'
@@ -6675,7 +6667,9 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
  with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit value' ) as dlg:
-     panel = ClientGUIControls.EditStringMatchPanel( dlg, string_match )
+     from hydrus.client.gui import ClientGUIStringPanels
+     panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
      dlg.SetPanel( panel )
@@ -6805,7 +6799,9 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
  with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit value' ) as dlg:
-     panel = ClientGUIControls.EditStringMatchPanel( dlg, original_string_match )
+     from hydrus.client.gui import ClientGUIStringPanels
+     panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, original_string_match )
      dlg.SetPanel( panel )
@@ -6865,7 +6861,9 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
  with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit path component' ) as dlg:
-     panel = ClientGUIControls.EditStringMatchPanel( dlg, string_match )
+     from hydrus.client.gui import ClientGUIStringPanels
+     panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
      dlg.SetPanel( panel )
@@ -38,6 +38,7 @@ from hydrus.client.gui import ClientGUISearch
  from hydrus.client.gui import ClientGUIScrolledPanels
  from hydrus.client.gui import ClientGUIScrolledPanelsEdit
  from hydrus.client.gui import ClientGUIShortcuts
+ from hydrus.client.gui import ClientGUIStringControls
  from hydrus.client.gui import ClientGUIStyle
  from hydrus.client.gui import ClientGUITime
  from hydrus.client.gui import ClientGUITopLevelWindowsPanels
@@ -539,9 +540,7 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
  hta = HydrusTagArchive.HydrusTagArchive( hta_path )
- archive_namespaces = list( hta.GetNamespaces() )
- archive_namespaces.sort()
+ archive_namespaces = sorted( hta.GetNamespaces() )
  choice_tuples = [ ( HydrusData.ConvertUglyNamespaceToPrettyString( namespace ), namespace, False ) for namespace in archive_namespaces ]
@@ -587,9 +586,7 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
  hta = HydrusTagArchive.HydrusTagArchive( hta_path )
- archive_namespaces = list( hta.GetNamespaces() )
- archive_namespaces.sort()
+ archive_namespaces = sorted( hta.GetNamespaces() )
  choice_tuples = [ ( HydrusData.ConvertUglyNamespaceToPrettyString( namespace ), namespace, namespace in existing_namespaces ) for namespace in archive_namespaces ]
@@ -1405,7 +1402,7 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
  help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this path remapping control -->', QG.QColor( 0, 0, 255 ) )
- self._nocopy_abs_path_translations = ClientGUIControls.StringToStringDictControl( self, abs_initial_dict, key_name = 'hydrus path', value_name = 'ipfs path', allow_add_delete = False, edit_keys = False )
+ self._nocopy_abs_path_translations = ClientGUIStringControls.StringToStringDictControl( self, abs_initial_dict, key_name = 'hydrus path', value_name = 'ipfs path', allow_add_delete = False, edit_keys = False )
  self._multihash_prefix = QW.QLineEdit( self )
@@ -2862,17 +2859,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  self._notebook_tabs_on_left = QW.QCheckBox( self )
- self._max_page_name_chars = QP.MakeQSpinBox( self, min=1, max=256 )
- self._page_file_count_display = ClientGUICommon.BetterChoice( self )
- for display_type in ( CC.PAGE_FILE_COUNT_DISPLAY_ALL, CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS, CC.PAGE_FILE_COUNT_DISPLAY_NONE ):
-     self._page_file_count_display.addItem( CC.page_file_count_display_string_lookup[ display_type], display_type )
- self._import_page_progress_display = QW.QCheckBox( self )
  self._total_pages_warning = QP.MakeQSpinBox( self, min=5, max=200 )
  self._reverse_page_shift_drag_behaviour = QW.QCheckBox( self )
@@ -2882,6 +2868,22 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  #
+ self._page_names_panel = ClientGUICommon.StaticBox( self, 'page tab names' )
+ self._max_page_name_chars = QP.MakeQSpinBox( self._page_names_panel, min=1, max=256 )
+ self._elide_page_tab_names = QW.QCheckBox( self._page_names_panel )
+ self._page_file_count_display = ClientGUICommon.BetterChoice( self._page_names_panel )
+ for display_type in ( CC.PAGE_FILE_COUNT_DISPLAY_ALL, CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS, CC.PAGE_FILE_COUNT_DISPLAY_NONE ):
+     self._page_file_count_display.addItem( CC.page_file_count_display_string_lookup[ display_type], display_type )
+ self._import_page_progress_display = QW.QCheckBox( self._page_names_panel )
+ #
  gui_session_names = HG.client_controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION )
  if 'last session' not in gui_session_names:
@@ -2919,6 +2921,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  self._max_page_name_chars.setValue( self._new_options.GetInteger( 'max_page_name_chars' ) )
+ self._elide_page_tab_names.setChecked( self._new_options.GetBoolean( 'elide_page_tab_names' ) )
  self._page_file_count_display.SetValue( self._new_options.GetInteger( 'page_file_count_display' ) )
  self._import_page_progress_display.setChecked( self._new_options.GetBoolean( 'import_page_progress_display' ) )
@@ -2941,17 +2945,41 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  rows.append( ( 'When switching to a page, focus its input field (if any): ', self._set_search_focus_on_page_change ) )
  rows.append( ( 'Switch to main window when opening tag search page from media viewer: ', self._activate_window_on_tag_search_page_activation ) )
  rows.append( ( 'Line notebook tabs down the left: ', self._notebook_tabs_on_left ) )
- rows.append( ( 'Max characters to display in a page name: ', self._max_page_name_chars ) )
- rows.append( ( 'Show page file count after its name: ', self._page_file_count_display ) )
- rows.append( ( 'Show import page x/y progress after its name: ', self._import_page_progress_display ) )
  rows.append( ( 'Warn at this many total pages: ', self._total_pages_warning ) )
  rows.append( ( 'Reverse page tab shift-drag behaviour: ', self._reverse_page_shift_drag_behaviour ) )
  gridbox = ClientGUICommon.WrapInGrid( self, rows )
+ rows = []
+ rows.append( ( 'Max characters to display in a page name: ', self._max_page_name_chars ) )
+ rows.append( ( 'When there are too many tabs to fit, \'...\' elide their names so they fit: ', self._elide_page_tab_names ) )
+ rows.append( ( 'Show page file count after its name: ', self._page_file_count_display ) )
+ rows.append( ( 'Show import page x/y progress after its name: ', self._import_page_progress_display ) )
+ page_names_gridbox = ClientGUICommon.WrapInGrid( self, rows )
+ label = 'If you have enough pages in a row, left/right arrows will appear to navigate them back and forth.'
+ label += os.linesep
+ label += 'Due to an unfortunate Qt issue, the tab bar will scroll so the current tab is right-most visible whenever a page is renamed.'
+ label += os.linesep
+ label += 'Therefore, if you set pages to have current file count or import progress in their name (which will update from time to time), do not put import pages in a long row of tabs, as it will reset scroll position on every progress update.'
+ label += os.linesep
+ label += 'Just make some nested \'page of pages\' so they are not all in the same row.'
+ st = ClientGUICommon.BetterStaticText( self._page_names_panel, label )
+ st.setWordWrap( True )
+ self._page_names_panel.Add( st, CC.FLAGS_EXPAND_PERPENDICULAR )
+ self._page_names_panel.Add( page_names_gridbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
  vbox = QP.VBoxLayout()
  QP.AddToLayout( vbox, gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+ QP.AddToLayout( vbox, self._page_names_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
  QP.AddToLayout( vbox, QW.QWidget( self ), CC.FLAGS_EXPAND_BOTH_WAYS )
  self.setLayout( vbox )
@@ -2974,6 +3002,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  self._new_options.SetInteger( 'max_page_name_chars', self._max_page_name_chars.value() )
+ self._new_options.SetBoolean( 'elide_page_tab_names', self._elide_page_tab_names.isChecked() )
  self._new_options.SetInteger( 'page_file_count_display', self._page_file_count_display.GetValue() )
  self._new_options.SetBoolean( 'import_page_progress_display', self._import_page_progress_display.isChecked() )
@@ -4661,9 +4691,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  self._favourite_file_lookup_script = ClientGUICommon.BetterChoice( suggested_tags_file_lookup_script_panel )
- script_names = list( HG.client_controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_PARSE_ROOT_FILE_LOOKUP ) )
- script_names.sort()
+ script_names = sorted( HG.client_controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_PARSE_ROOT_FILE_LOOKUP ) )
  for name in script_names:
@@ -5265,9 +5293,7 @@ class ManageURLsPanel( ClientGUIScrolledPanels.ManagePanel ):
  def _Copy( self ):
-     urls = list( self._current_urls_count.keys() )
-     urls.sort()
+     urls = sorted( self._current_urls_count.keys() )
      text = os.linesep.join( urls )
@@ -5294,7 +5320,7 @@ class ManageURLsPanel( ClientGUIScrolledPanels.ManagePanel ):
  addee_hashes = { m.GetHash() for m in addee_media }
- content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( ( url, ), addee_hashes ) )
+ content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( ( normalised_url, ), addee_hashes ) )
  for m in addee_media:
@@ -3213,9 +3213,7 @@ class ReviewLocalFileImports( ClientGUIScrolledPanels.ReviewPanel ):
  del self._current_path_data[ path ]
- flat_path_data = [ ( index, path, mime, size ) for ( path, ( index, mime, size ) ) in self._current_path_data.items() ]
- flat_path_data.sort()
+ flat_path_data = sorted( ( ( index, path, mime, size ) for ( path, ( index, mime, size ) ) in self._current_path_data.items() ) )
  new_index = 1
@@ -24,6 +24,28 @@ from hydrus.client.gui import ClientGUIShortcuts
  from hydrus.client.gui import ClientGUITime
  from hydrus.client.gui import QtPorting as QP
+ FLESH_OUT_SYSTEM_PRED_TYPES = {
+     ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_AGE,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_HASH,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_MIME,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_RATING,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_SERVICE,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_NOTES,
+     ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS
+ }
  def FleshOutPredicates( widget: QW.QWidget, predicates: typing.Iterable[ ClientSearch.Predicate ] ) -> typing.List[ ClientSearch.Predicate ]:
      window = widget.window()
@@ -36,7 +58,7 @@ def FleshOutPredicates( widget: QW.QWidget, predicates: typing.Iterable[ ClientS
  ( predicate_type, value, inclusive ) = predicate.GetInfo()
- if value is None and predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ClientSearch.PREDICATE_TYPE_SYSTEM_RATING, ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ]:
+ if value is None and predicate_type in FLESH_OUT_SYSTEM_PRED_TYPES:
  from hydrus.client.gui import ClientGUITopLevelWindowsPanels
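Hoisting the long inline list into the module-level FLESH_OUT_SYSTEM_PRED_TYPES set keeps the membership test above short and makes the lookup constant-time. A stand-in illustration with plain integers (the real constants live in ClientSearch; these values are made up):

# sketch only: the membership-test shape is what matters
PREDICATE_TYPE_SYSTEM_LIMIT = 1
PREDICATE_TYPE_SYSTEM_MIME = 2
PREDICATE_TYPE_SYSTEM_NOTES = 3

FLESH_OUT_SYSTEM_PRED_TYPES = { PREDICATE_TYPE_SYSTEM_LIMIT, PREDICATE_TYPE_SYSTEM_MIME, PREDICATE_TYPE_SYSTEM_NOTES }

predicate_type = PREDICATE_TYPE_SYSTEM_NOTES
value = None

if value is None and predicate_type in FLESH_OUT_SYSTEM_PRED_TYPES:
    
    print( 'this predicate needs a dialog to fill in its value' )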
@@ -625,12 +647,13 @@ class InputFileSystemPredicate( ClientGUIScrolledPanels.EditPanel ):
  editable_pred_panel_classes.append( PanelPredicateSystemNumTags )
- elif predicate_type == ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES:
+ elif predicate_type == ClientSearch.PREDICATE_TYPE_SYSTEM_NOTES:
      static_pred_buttons.append( StaticSystemPredicateButton( self, ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES, ( '>', 0 ) ), ) ) )
      static_pred_buttons.append( StaticSystemPredicateButton( self, ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES, ( '=', 0 ) ), ) ) )
      editable_pred_panel_classes.append( PanelPredicateSystemNumNotes )
+     editable_pred_panel_classes.append( PanelPredicateSystemHasNoteName )
  elif predicate_type == ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS:
@@ -1394,6 +1417,47 @@ class PanelPredicateSystemHash( PanelPredicateSystem ):
  return ( hashes, hash_type )
+ class PanelPredicateSystemHasNoteName( PanelPredicateSystem ):
+     
+     PREDICATE_TYPE = ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_NOTE_NAME
+     
+     def __init__( self, parent ):
+         
+         PanelPredicateSystem.__init__( self, parent )
+         
+         self._operator = ClientGUICommon.BetterChoice( self )
+         
+         self._operator.addItem( 'has note with name ', True )
+         self._operator.addItem( 'does not have note with name', False )
+         
+         self._name = QW.QLineEdit( self )
+         self._name.setFixedWidth( 250 )
+         
+         hbox = QP.HBoxLayout()
+         
+         QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText(self,'system:note name'), CC.FLAGS_VCENTER )
+         QP.AddToLayout( hbox, self._operator, CC.FLAGS_VCENTER )
+         QP.AddToLayout( hbox, self._name, CC.FLAGS_VCENTER )
+         
+         hbox.addStretch( 1 )
+         
+         self.setLayout( hbox )
+     
+     def GetInfo( self ):
+         
+         name = self._name.text()
+         
+         if name == '':
+             
+             name = 'notes'
+             
+         
+         info = ( self._operator.GetValue(), name )
+         
+         return info
  class PanelPredicateSystemHeight( PanelPredicateSystem ):
      PREDICATE_TYPE = ClientSearch.PREDICATE_TYPE_SYSTEM_HEIGHT
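Per the GetInfo body above, an empty name box falls back to searching the generic 'notes' name. A small illustration of the tuple the new panel hands back under that default (the surrounding variables are stand-ins for the widget values):

# stand-in for the widget state; the fallback mirrors GetInfo above
operator_includes = True  # 'has note with name'
name = ''                 # user left the box blank

if name == '':
    
    name = 'notes'
    

info = ( operator_includes, name )

print( info )  # ( True, 'notes' )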
@@ -88,9 +88,7 @@ class ApplicationCommandWidget( ClientGUIScrolledPanels.EditPanel ):
  choices = ClientGUIShortcuts.simple_shortcut_name_to_action_lookup[ 'custom' ]
- choices = list( choices )
- choices.sort()
+ choices = sorted( choices )
  self._simple_actions = QW.QComboBox( self )
  self._simple_actions.addItems( choices )
@@ -534,7 +534,7 @@ class Shortcut( HydrusSerialisable.SerialisableBase ):
  shortcut_key += 32 # convert A to a
- modifiers.sort()
+ modifiers = sorted( modifiers )
  HydrusSerialisable.SerialisableBase.__init__( self )
@@ -1173,9 +1173,7 @@ class ShortcutsHandler( QC.QObject ):
  def GetCustomShortcutNames( self ):
-     custom_names = [ name for name in self._shortcuts_names if name not in SHORTCUTS_RESERVED_NAMES ]
-     custom_names.sort()
+     custom_names = sorted( ( name for name in self._shortcuts_names if name not in SHORTCUTS_RESERVED_NAMES ) )
      return custom_names
@@ -0,0 +1,606 @@ (new file, all lines added)
import typing

from qtpy import QtCore as QC
from qtpy import QtWidgets as QW

from hydrus.core import HydrusData
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientParsing
from hydrus.client.gui import ClientGUICommon
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIListCtrl
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUIStringPanels
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP

class StringConverterButton( ClientGUICommon.BetterButton ):
    
    stringConverterUpdate = QC.Signal()
    
    def __init__( self, parent, string_converter: ClientParsing.StringConverter ):
        
        ClientGUICommon.BetterButton.__init__( self, parent, 'edit string converter', self._Edit )
        
        self._string_converter = string_converter
        self._example_string_override = None
        
        self._UpdateLabel()
    
    def _Edit( self ):
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit string converter', frame_key = 'deeply_nested_dialog' ) as dlg:
            
            panel = ClientGUIStringPanels.EditStringConverterPanel( dlg, self._string_converter, example_string_override = self._example_string_override )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                self._string_converter = panel.GetValue()
                
                self._UpdateLabel()
                
                self.stringConverterUpdate.emit()
    
    def _UpdateLabel( self ):
        
        label = self._string_converter.ToString()
        
        self.setText( label )
    
    def GetValue( self ):
        
        return self._string_converter
    
    def SetExampleString( self, example_string ):
        
        self._example_string_override = example_string
    
    def SetValue( self, string_converter ):
        
        self._string_converter = string_converter
        
        self._UpdateLabel()

class StringMatchButton( ClientGUICommon.BetterButton ):
    
    def __init__( self, parent, string_match: ClientParsing.StringMatch ):
        
        ClientGUICommon.BetterButton.__init__( self, parent, 'edit string match', self._Edit )
        
        self._string_match = string_match
        
        self._UpdateLabel()
    
    def _Edit( self ):
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit string match', frame_key = 'deeply_nested_dialog' ) as dlg:
            
            panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, self._string_match )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                self._string_match = panel.GetValue()
                
                self._UpdateLabel()
    
    def _UpdateLabel( self ):
        
        label = self._string_match.ToString()
        
        self.setText( label )
    
    def GetValue( self ):
        
        return self._string_match
    
    def SetValue( self, string_match ):
        
        self._string_match = string_match
        
        self._UpdateLabel()

class StringMatchToStringMatchDictControl( QW.QWidget ):
    
    def __init__( self, parent, initial_dict: typing.Dict[ ClientParsing.StringMatch, ClientParsing.StringMatch ], min_height = 10, key_name = 'key' ):
        
        QW.QWidget.__init__( self, parent )
        
        self._key_name = key_name
        
        listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
        
        columns = [ ( self._key_name, 20 ), ( 'matching', -1 ) ]
        
        self._listctrl = ClientGUIListCtrl.BetterListCtrl( listctrl_panel, 'key_to_string_match', min_height, 36, columns, self._ConvertDataToListCtrlTuples, use_simple_delete = True, activation_callback = self._Edit )
        
        listctrl_panel.SetListCtrl( self._listctrl )
        
        listctrl_panel.AddButton( 'add', self._Add )
        listctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
        listctrl_panel.AddDeleteButton()
        
        #
        
        self._listctrl.AddDatas( list(initial_dict.items()) )
        
        self._listctrl.Sort( 0 )
        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, listctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
        
        self.setLayout( vbox )
    
    def _ConvertDataToListCtrlTuples( self, data ):
        
        ( key_string_match, value_string_match ) = data
        
        pretty_key = key_string_match.ToString()
        pretty_value = value_string_match.ToString()
        
        display_tuple = ( pretty_key, pretty_value )
        sort_tuple = ( pretty_key, pretty_value )
        
        return ( display_tuple, sort_tuple )
    
    def _Add( self ):
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit ' + self._key_name ) as dlg:
            
            string_match = ClientParsing.StringMatch()
            
            panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                key_string_match = panel.GetValue()
                
            else:
                
                return
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit match' ) as dlg:
            
            string_match = ClientParsing.StringMatch()
            
            panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                value_string_match = panel.GetValue()
                
                data = ( key_string_match, value_string_match )
                
                self._listctrl.AddDatas( ( data, ) )
    
    def _Edit( self ):
        
        for data in self._listctrl.GetData( only_selected = True ):
            
            ( key_string_match, value_string_match ) = data
            
            with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit ' + self._key_name ) as dlg:
                
                panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, key_string_match )
                
                dlg.SetPanel( panel )
                
                if dlg.exec() == QW.QDialog.Accepted:
                    
                    key_string_match = panel.GetValue()
                    
                else:
                    
                    break
            
            with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit match' ) as dlg:
                
                panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, value_string_match )
                
                dlg.SetPanel( panel )
                
                if dlg.exec() == QW.QDialog.Accepted:
                    
                    value_string_match = panel.GetValue()
                    
                else:
                    
                    break
            
            self._listctrl.DeleteDatas( ( data, ) )
            
            edited_data = ( key_string_match, value_string_match )
            
            self._listctrl.AddDatas( ( edited_data, ) )
        
        self._listctrl.Sort()
    
    def GetValue( self ) -> typing.Dict[ str, ClientParsing.StringMatch ]:
        
        value_dict = dict( self._listctrl.GetData() )
        
        return value_dict

class StringToStringDictButton( ClientGUICommon.BetterButton ):
    
    def __init__( self, parent, label ):
        
        ClientGUICommon.BetterButton.__init__( self, parent, label, self._Edit )
        
        self._value: typing.Dict[ str, str ] = {}
    
    def _Edit( self ):
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit string dictionary' ) as dlg:
            
            panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
            
            control = StringToStringDictControl( panel, self._value )
            
            panel.SetControl( control )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                self._value = control.GetValue()
    
    def GetValue( self ) -> typing.Dict[ str, str ]:
        
        return self._value
    
    def SetValue( self, value ):
        
        self._value = value

class StringToStringDictControl( QW.QWidget ):
    
    listCtrlChanged = QC.Signal()
    
    def __init__( self, parent, initial_dict: typing.Dict[ str, str ], min_height = 10, key_name = 'key', value_name = 'value', allow_add_delete = True, edit_keys = True ):
        
        QW.QWidget.__init__( self, parent )
        
        self._key_name = key_name
        self._value_name = value_name
        
        self._edit_keys = edit_keys
        
        listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
        
        columns = [ ( self._key_name, 20 ), ( self._value_name, -1 ) ]
        
        use_simple_delete = allow_add_delete
        
        self._listctrl = ClientGUIListCtrl.BetterListCtrl( listctrl_panel, 'key_to_value', min_height, 36, columns, self._ConvertDataToListCtrlTuples, use_simple_delete = use_simple_delete, activation_callback = self._Edit )
        self._listctrl.listCtrlChanged.connect( self.listCtrlChanged )
        
        listctrl_panel.SetListCtrl( self._listctrl )
        
        if allow_add_delete:
            
            listctrl_panel.AddButton( 'add', self._Add )
        
        listctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
        
        if allow_add_delete:
            
            listctrl_panel.AddDeleteButton()
        
        #
        
        self._listctrl.AddDatas( list(initial_dict.items()) )
        
        self._listctrl.Sort( 0 )
        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, listctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
        
        self.setLayout( vbox )
    
    def _ConvertDataToListCtrlTuples( self, data ):
        
        ( key, value ) = data
        
        display_tuple = ( key, value )
        sort_tuple = ( key, value )
        
        return ( display_tuple, sort_tuple )
    
    def _Add( self ):
        
        with ClientGUIDialogs.DialogTextEntry( self, 'enter the ' + self._key_name, allow_blank = False ) as dlg:
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                key = dlg.GetValue()
                
                if key in self._GetExistingKeys():
                    
                    QW.QMessageBox.warning( self, 'Warning', 'That {} already exists!'.format( self._key_name ) )
                    
                    return
                
                with ClientGUIDialogs.DialogTextEntry( self, 'enter the ' + self._value_name, allow_blank = True ) as dlg:
                    
                    if dlg.exec() == QW.QDialog.Accepted:
                        
                        value = dlg.GetValue()
                        
                        data = ( key, value )
                        
                        self._listctrl.AddDatas( ( data, ) )
    
    def _Edit( self ):
        
        for data in self._listctrl.GetData( only_selected = True ):
            
            ( key, value ) = data
            
            if self._edit_keys:
                
                with ClientGUIDialogs.DialogTextEntry( self, 'edit the ' + self._key_name, default = key, allow_blank = False ) as dlg:
                    
                    if dlg.exec() == QW.QDialog.Accepted:
                        
                        edited_key = dlg.GetValue()
                        
                        if edited_key != key and edited_key in self._GetExistingKeys():
                            
                            QW.QMessageBox.warning( self, 'Warning', 'That {} already exists!'.format( self._key_name ) )
                            
                            break
                        
                    else:
                        
                        break
                
            else:
                
                edited_key = key
            
            with ClientGUIDialogs.DialogTextEntry( self, 'edit the ' + self._value_name, default = value, allow_blank = True ) as dlg:
                
                if dlg.exec() == QW.QDialog.Accepted:
                    
                    edited_value = dlg.GetValue()
                    
                else:
                    
                    break
            
            self._listctrl.DeleteDatas( ( data, ) )
            
            edited_data = ( edited_key, edited_value )
            
            self._listctrl.AddDatas( ( edited_data, ) )
        
        self._listctrl.Sort()
    
    def _GetExistingKeys( self ):
        
        return { key for ( key, value ) in self._listctrl.GetData() }
    
    def GetValue( self ) -> typing.Dict[ str, str ]:
        
        value_dict = dict( self._listctrl.GetData() )
        
        return value_dict

class StringToStringMatchDictControl( QW.QWidget ):
    
    def __init__( self, parent, initial_dict: typing.Dict[ str, ClientParsing.StringMatch ], min_height = 10, key_name = 'key' ):
        
        QW.QWidget.__init__( self, parent )
        
        self._key_name = key_name
        
        listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
        
        columns = [ ( self._key_name, 20 ), ( 'matching', -1 ) ]
        
        self._listctrl = ClientGUIListCtrl.BetterListCtrl( listctrl_panel, 'key_to_string_match', min_height, 36, columns, self._ConvertDataToListCtrlTuples, use_simple_delete = True, activation_callback = self._Edit )
        
        listctrl_panel.SetListCtrl( self._listctrl )
        
        listctrl_panel.AddButton( 'add', self._Add )
        listctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
        listctrl_panel.AddDeleteButton()
        
        #
        
        self._listctrl.AddDatas( initial_dict.items() )
        
        self._listctrl.Sort( 0 )
        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, listctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
        
        self.setLayout( vbox )
    
    def _ConvertDataToListCtrlTuples( self, data ):
        
        ( key, string_match ) = data
        
        pretty_string_match = string_match.ToString()
        
        display_tuple = ( key, pretty_string_match )
        sort_tuple = ( key, pretty_string_match )
        
        return ( display_tuple, sort_tuple )
    
    def _Add( self ):
        
        with ClientGUIDialogs.DialogTextEntry( self, 'enter the ' + self._key_name, allow_blank = False ) as dlg:
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                key = dlg.GetValue()
                
                if key in self._GetExistingKeys():
                    
                    QW.QMessageBox.warning( self, 'Warning', 'That {} already exists!'.format( self._key_name ) )
                    
                    return
                
                with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit match' ) as dlg:
                    
                    string_match = ClientParsing.StringMatch()
                    
                    panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
                    
                    dlg.SetPanel( panel )
                    
                    if dlg.exec() == QW.QDialog.Accepted:
                        
                        string_match = panel.GetValue()
                        
                        data = ( key, string_match )
                        
                        self._listctrl.AddDatas( ( data, ) )
    
    def _Edit( self ):
        
        for data in self._listctrl.GetData( only_selected = True ):
            
            ( key, string_match ) = data
            
            with ClientGUIDialogs.DialogTextEntry( self, 'edit the ' + self._key_name, default = key, allow_blank = False ) as dlg:
                
                if dlg.exec() == QW.QDialog.Accepted:
                    
                    edited_key = dlg.GetValue()
                    
                    if edited_key != key and edited_key in self._GetExistingKeys():
                        
                        QW.QMessageBox.warning( self, 'Warning', 'That {} already exists!'.format( self._key_name ) )
                        
                        break
                    
                else:
                    
                    break
            
            with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit match' ) as dlg:
                
                string_match = ClientParsing.StringMatch()
                
                panel = ClientGUIStringPanels.EditStringMatchPanel( dlg, string_match )
                
                dlg.SetPanel( panel )
                
                if dlg.exec() == QW.QDialog.Accepted:
                    
                    edited_string_match = panel.GetValue()
                    
                else:
                    
                    break
            
            self._listctrl.DeleteDatas( ( data, ) )
            
            edited_data = ( edited_key, edited_string_match )
            
            self._listctrl.AddDatas( ( edited_data, ) )
        
        self._listctrl.Sort()
    
    def _GetExistingKeys( self ):
        
        return { key for ( key, value ) in self._listctrl.GetData() }
    
    def GetValue( self ) -> typing.Dict[ str, ClientParsing.StringMatch ]:
        
        value_dict = dict( self._listctrl.GetData() )
        
        return value_dict
File diff suppressed because it is too large
@@ -162,7 +162,9 @@ class FavouritesTagsPanel( QW.QWidget ):
  def _UpdateTagDisplay( self ):
-     favourites = HG.client_controller.new_options.GetSuggestedTagsFavourites( self._service_key )
+     favourites = list( HG.client_controller.new_options.GetSuggestedTagsFavourites( self._service_key ) )
+     ClientTags.SortTags( HC.options[ 'default_tag_sort' ], favourites )
      tags = FilterSuggestedTagsForMedia( favourites, self._media, self._service_key )
@@ -383,6 +383,11 @@ class TabBar( QW.QTabBar ):
  self._last_clicked_global_pos = None
+ def event( self, event ):
+     return QW.QTabBar.event( self, event )
  def mouseMoveEvent( self, e ):
      e.ignore()
@@ -465,7 +470,7 @@ class TabWidgetWithDnD( QW.QTabWidget ):
  def _LayoutPagesHelper( self ):
      current_index = self.currentIndex()
      for i in range( self.count() ):
@@ -482,7 +487,10 @@ class TabWidgetWithDnD( QW.QTabWidget ):
  def LayoutPages( self ):
+     # hydev adds: I no longer call this, as I moved splitter setting to a thing called per page when page is first visibly shown
+     # leaving it here for now in case I need it again
      # Momentarily switch to each page, then back, forcing a layout update.
      # If this is not done, the splitters on the hidden pages won't resize their widgets properly when we restore
      # splitter sizes after this, since they would never became visible.
@@ -2550,10 +2558,8 @@ class CollectComboCtrl( QW.QComboBox ):
  text_and_data_tuples.update( namespaces )
- text_and_data_tuples = list( [ ( namespace, ( 'namespace', namespace ) ) for namespace in text_and_data_tuples ] )
- text_and_data_tuples.sort()
+ text_and_data_tuples = sorted( ( ( namespace, ( 'namespace', namespace ) ) for namespace in text_and_data_tuples ) )
  ratings_services = HG.client_controller.services_manager.GetServices( ( HC.LOCAL_RATING_LIKE, HC.LOCAL_RATING_NUMERICAL ) )
  for ratings_service in ratings_services:
@@ -1622,9 +1622,7 @@ class ServiceTagImportOptions( HydrusSerialisable.SerialisableBase ):
  if len( self._additional_tags ) > 0:
-     pretty_additional_tags = list( self._additional_tags )
-     pretty_additional_tags.sort()
+     pretty_additional_tags = sorted( self._additional_tags )
      statements.append( 'additional tags: ' + ', '.join( pretty_additional_tags ) )
@@ -1,3 +1,4 @@
+ import gc
  import os
  import random
  import threading
@@ -1504,6 +1505,8 @@ class SubscriptionsManager( object ):
  def _ClearFinishedSubscriptions( self ):
+     done_some = False
      for ( name, ( thread, job, subscription ) ) in list( self._running_subscriptions.items() ):
          if job.IsDone():
@@ -1512,8 +1515,12 @@ class SubscriptionsManager( object ):
  del self._running_subscriptions[ name ]
+ done_some = True
+ return done_some
  def _GetNameReadyToGo( self ):
@@ -1725,7 +1732,12 @@ class SubscriptionsManager( object ):
  with self._lock:
-     self._ClearFinishedSubscriptions()
+     some_cleared = self._ClearFinishedSubscriptions()
+     if some_cleared:
+         gc.collect()
  wait_time = self._GetMainLoopWaitTime()
@@ -1782,17 +1794,13 @@ class SubscriptionsManager( object ):
  with self._lock:
- subs = list( self._current_subscription_names )
- subs.sort()
+ subs = sorted( self._current_subscription_names )
- running = list( self._running_subscriptions.keys() )
- running.sort()
+ running = sorted( self._running_subscriptions.keys() )
- cannot_run = list( self._names_that_cannot_run )
- cannot_run.sort()
+ cannot_run = sorted( self._names_that_cannot_run )
- next_times = list( self._names_to_next_work_time.items() )
- next_times.sort( key = lambda n, nwt: nwt )
+ next_times = sorted( self._names_to_next_work_time.items(), key = lambda n, nwt: nwt )
  message = '{} subs: {}'.format( HydrusData.ToHumanInt( len( self._current_subscription_names ) ), ', '.join( subs ) )
  message += os.linesep * 2
@@ -14,6 +14,7 @@ import os
  import re
  import threading
  import time
+ import unicodedata
  import urllib.parse
  def AddCookieToSession( session, name, value, domain, path, expires, secure = False, rest = None ):
@@ -123,9 +124,7 @@ def ConvertQueryDictToText( query_dict, param_order = None ):
  if param_order is None:
-     param_pairs = list( query_dict.items() )
-     param_pairs.sort()
+     param_pairs = sorted( query_dict.items() )
  else:
@@ -147,7 +146,7 @@ def ConvertQueryTextToDict( query_text ):
  # so if there are a mix of encoded and non-encoded, we won't touch it here m8
  # except these chars, which screw with GET arg syntax when unquoted
- bad_chars = [ '&', '=', '/', '?' ]
+ bad_chars = [ '&', '=', '/', '?', '#' ]
  param_order = []
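Adding '#' to bad_chars matters because an unquoted hash starts the URL fragment, which silently truncates the query on a later parse; percent-encoding it keeps the parameter whole:

import urllib.parse

# example.com is illustrative; the behaviour is standard urllib parsing
p = urllib.parse.urlparse( 'https://example.com/search?tags=blue#skirt' )
print( p.query )     # 'tags=blue' -- the rest became the fragment
print( p.fragment )  # 'skirt'

p = urllib.parse.urlparse( 'https://example.com/search?tags=blue%23skirt' )
print( p.query )     # 'tags=blue%23skirt' -- the encoded hash survives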
@@ -248,7 +247,7 @@ def ConvertURLClassesIntoAPIPairs( url_classes ):
  def ConvertURLIntoDomain( url ):
-     parser_result = urllib.parse.urlparse( url )
+     parser_result = ParseURL( url )
      if parser_result.scheme == '':
@@ -335,7 +334,7 @@ def GetSearchURLs( url ):
  for url in list( search_urls ):
-     p = urllib.parse.urlparse( url )
+     p = ParseURL( url )
      scheme = p.scheme
      netloc = p.netloc
@@ -379,6 +378,62 @@ def GetSearchURLs( url ):
  return search_urls
+ def ParseURL( url: str ) -> urllib.parse.ParseResult:
+     
+     url = url.strip()
+     
+     url = UnicodeNormaliseURL( url )
+     
+     return urllib.parse.urlparse( url )
+ 
+ OH_NO_NO_NETLOC_CHARACTERS = '?#'
+ OH_NO_NO_NETLOC_CHARACTERS_UNICODE_TRANSLATE = { ord( char ) : '_' for char in OH_NO_NO_NETLOC_CHARACTERS }
+ 
+ def UnicodeNormaliseURL( url: str ):
+     
+     if url.startswith( 'file:' ):
+         
+         return url
+     
+     # the issue is netloc, blah.com, cannot have certain unicode characters that look like others, or double ( e + accent ) characters that can be one accented-e, so we normalise
+     # urllib.urlparse throws a valueerror if these are in, so let's switch out
+     
+     scheme_splitter = '://'
+     netloc_splitter = '/'
+     
+     if scheme_splitter in url:
+         
+         ( scheme, netloc_and_path_and_rest ) = url.split( scheme_splitter, 1 )
+         
+         if netloc_splitter in netloc_and_path_and_rest:
+             
+             ( netloc, path_and_rest ) = netloc_and_path_and_rest.split( netloc_splitter, 1 )
+             
+         else:
+             
+             netloc = netloc_and_path_and_rest
+             path_and_rest = None
+         
+         netloc = unicodedata.normalize( 'NFKC', netloc )
+         
+         netloc = netloc.translate( OH_NO_NO_NETLOC_CHARACTERS_UNICODE_TRANSLATE )
+         
+         scheme_and_netlock = scheme_splitter.join( ( scheme, netloc ) )
+         
+         if path_and_rest is None:
+             
+             url = scheme_and_netlock
+             
+         else:
+             
+             url = netloc_splitter.join( ( scheme_and_netlock, path_and_rest ) )
+     
+     return url
  VALID_DENIED = 0
  VALID_APPROVED = 1
  VALID_UNKNOWN = 2
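UnicodeNormaliseURL only touches the netloc: NFKC folds decomposed accent pairs (and similar compatibility forms) that can trip urllib.parse.urlparse's hostname handling, and the '?'/'#' translation keeps stray delimiters out of the host. A quick demonstration of the normalisation step on a made-up domain:

import unicodedata

# 'é' arriving as 'e' plus a combining acute accent, as pasted URLs sometimes do
decomposed = 'caf\u0065\u0301.example'
normalised = unicodedata.normalize( 'NFKC', decomposed )

print( decomposed == 'café.example' )   # False: the accent is still two codepoints
print( normalised == 'café.example' )   # True: NFKC folds it into the single character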
@@ -692,9 +747,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
  namespaces.update( parser.GetNamespaces() )
- self._parser_namespaces = list( namespaces )
- self._parser_namespaces.sort()
+ self._parser_namespaces = sorted( namespaces )
  def _SetDirty( self ):
@@ -1602,7 +1655,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
  if url_class is None:
-     p = urllib.parse.urlparse( url )
+     p = ParseURL( url )
      scheme = p.scheme
      netloc = p.netloc
@@ -2410,7 +2463,7 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):
  # when the tags separator is '+' but the tags include '6+girls', we run into fun internet land
- bad_chars = [ self._search_terms_separator, '&', '=', '/', '?' ]
+ bad_chars = [ self._search_terms_separator, '&', '=', '/', '?', '#' ]
  if True in ( bad_char in search_term for bad_char in bad_chars ):
@@ -3078,7 +3131,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
  url = self.Normalise( url )
- p = urllib.parse.urlparse( url )
+ p = ParseURL( url )
  scheme = p.scheme
  netloc = p.netloc
@@ -3254,7 +3307,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
  def Normalise( self, url ):
-     p = urllib.parse.urlparse( url )
+     p = ParseURL( url )
      scheme = self._preferred_scheme
      params = ''
@@ -3318,7 +3371,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
  def Test( self, url ):
-     p = urllib.parse.urlparse( url )
+     p = ParseURL( url )
      if self._match_subdomains:
@@ -22,9 +22,22 @@ try:
  CLOUDSCRAPER_OK = True
+ try:
+     
+     # help pyinstaller
+     import pyparsing
+     
+     PYPARSING_OK = True
+     
+ except:
+     
+     PYPARSING_OK = False
  except:
      CLOUDSCRAPER_OK = False
+     PYPARSING_OK = False
  def ConvertStatusCodeAndDataIntoExceptionInfo( status_code, data, is_hydrus_service = False ):
@@ -121,6 +134,7 @@ class NetworkJob( object ):
  WILLING_TO_WAIT_ON_INVALID_LOGIN = True
  IS_HYDRUS_SERVICE = False
+ IS_IPFS_SERVICE = False
  def __init__( self, method, url, body = None, referral_url = None, temp_path = None ):
@@ -256,7 +270,7 @@ class NetworkJob( object ):
  data = self._body
  files = self._files
- if self.IS_HYDRUS_SERVICE:
+ if self.IS_HYDRUS_SERVICE or self.IS_IPFS_SERVICE:
      headers[ 'User-Agent' ] = 'hydrus client/' + str( HC.NETWORK_VERSION )
@@ -1588,6 +1602,18 @@ class NetworkJobHydrus( NetworkJob ):
+ class NetworkJobIPFS( NetworkJob ):
+     
+     def __init__( self, method, url, body = None, referral_url = None, temp_path = None ):
+         
+         NetworkJob.__init__( self, method, url, body = body, referral_url = referral_url, temp_path = temp_path )
+         
+         self.OnlyTryConnectionOnce()
+         self.OverrideBandwidth()
+     
+     IS_IPFS_SERVICE = True
  class NetworkJobWatcherPage( NetworkJob ):
      def __init__( self, watcher_key, method, url, body = None, referral_url = None, temp_path = None ):
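With IS_IPFS_SERVICE = True on the new subclass, the header branch earlier in this file now sends the hydrus User-Agent for IPFS API calls as well, which is what stops the updated IPFS daemon's User-Agent filtering from rejecting them. A condensed sketch of how the class flag feeds that decision (stand-in class; the constant value is copied from this codebase's NETWORK_VERSION):

# illustrative only: the flag-to-header wiring shown in the hunk above
NETWORK_VERSION = 18

class DemoIPFSJob:
    
    IS_HYDRUS_SERVICE = False
    IS_IPFS_SERVICE = True
    

job = DemoIPFSJob()
headers = {}

if job.IS_HYDRUS_SERVICE or job.IS_IPFS_SERVICE:
    
    headers[ 'User-Agent' ] = 'hydrus client/' + str( NETWORK_VERSION )
    

print( headers )  # {'User-Agent': 'hydrus client/18'}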
@@ -1170,8 +1170,7 @@ class LoginScriptDomain( HydrusSerialisable.SerialisableBaseNamed ):
  if len( missing_givens ) > 0:
-     missing_givens = list( missing_givens )
-     missing_givens.sort()
+     missing_givens = sorted( missing_givens )
      raise HydrusExceptions.ValidationException( 'Missing required credentials: ' + ', '.join( missing_givens ) )
@@ -1202,8 +1201,7 @@ class LoginScriptDomain( HydrusSerialisable.SerialisableBaseNamed ):
  if len( missing_definitions ) > 0:
-     missing_definitions = list( missing_definitions )
-     missing_definitions.sort()
+     missing_definitions = sorted( missing_definitions )
      raise HydrusExceptions.ValidationException( 'Missing required credential definitions: ' + ', '.join( missing_definitions ) )
@@ -1220,8 +1218,7 @@ class LoginScriptDomain( HydrusSerialisable.SerialisableBaseNamed ):
  if len( missing_vars ) > 0:
-     missing_vars = list( missing_vars )
-     missing_vars.sort()
+     missing_vars = sorted( missing_vars )
      raise HydrusExceptions.ValidationException( 'Missing temp variables for login step "' + login_step.GetName() + '": ' + ', '.join( missing_vars ) )
@@ -1673,7 +1670,7 @@ class LoginStep( HydrusSerialisable.SerialisableBaseNamed ):
  if self._method == 'POST' and referral_url is not None:
-     p = urllib.parse.urlparse( referral_url )
+     p = ClientNetworkingDomain.ParseURL( url )
      r = urllib.parse.ParseResult( p.scheme, p.netloc, '', '', '', '' )
@@ -73,7 +73,7 @@ options = {}
  # Misc
  NETWORK_VERSION = 18
- SOFTWARE_VERSION = 396
+ SOFTWARE_VERSION = 397
  CLIENT_API_VERSION = 11
  SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -432,9 +432,7 @@ def ConvertUglyNamespaceToPrettyString( namespace ):
  def ConvertUglyNamespacesToPrettyStrings( namespaces ):
-     namespaces = list( namespaces )
-     namespaces.sort()
+     namespaces = sorted( namespaces )
      result = [ ConvertUglyNamespaceToPrettyString( namespace ) for namespace in namespaces ]
@@ -1,5 +1,6 @@
  import os
  import traceback
+ import collections.abc
  class HydrusException( Exception ):
@@ -7,17 +8,24 @@ class HydrusException( Exception ):
  s = []
- for arg in self.args:
-     try:
-         s.append( str( arg ) )
-     except:
-         s.append( repr( arg ) )
+ if isinstance( self.args, collections.abc.Iterable ):
+     
+     for arg in self.args:
+         
+         try:
+             
+             s.append( str( arg ) )
+             
+         except:
+             
+             s.append( repr( arg ) )
+     
+ else:
+     
+     s = [ repr( self.args ) ]
  return os.linesep.join( s )
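The per-argument try/str/except/repr fallback above is what keeps __str__ from raising when one of the exception's arguments cannot be stringified, and the new isinstance guard extends the same defensiveness to an args attribute that is not iterable at all. A small, self-contained reproduction of the fallback path (DemoUnprintable is a contrived stand-in):

import os

class DemoUnprintable:
    
    def __str__( self ):
        
        raise RuntimeError( 'no str for you' )
        
    
    def __repr__( self ):
        
        return 'DemoUnprintable()'
        
    

s = []

for arg in ( 'ok', DemoUnprintable() ):
    
    try:
        
        s.append( str( arg ) )
        
    except:
        
        s.append( repr( arg ) )
        
    

print( os.linesep.join( s ) )  # 'ok' then 'DemoUnprintable()'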
@@ -8,9 +8,11 @@ import numpy.core.multiarray # important this comes before cv!
  try:
- import numpy.random.common # more hidden imports for pyinstaller
- import numpy.random.bounded_integers # more hidden imports for pyinstaller
- import numpy.random.entropy # more hidden imports for pyinstaller
+ # more hidden imports for pyinstaller
+ import numpy.random.common # pylint: disable=E0401
+ import numpy.random.bounded_integers # pylint: disable=E0401
+ import numpy.random.entropy # pylint: disable=E0401
  except:
@@ -1808,9 +1808,7 @@ class Metadata( HydrusSerialisable.SerialisableBase ):
  with self._lock:
-     data = list( self._metadata.items() )
-     data.sort()
+     data = sorted( self._metadata.items() )
      for ( update_index, ( update_hashes, begin, end ) ) in data:
@@ -12,7 +12,6 @@ import shlex
  import shutil
  import stat
  import subprocess
  import sys
  import tempfile
  import threading
  import traceback
@@ -727,8 +726,28 @@ def MirrorFile( source, dest ):
  MakeFileWritable( dest )
- # this overwrites on conflict without hassle
- shutil.copy2( source, dest )
+ copy_metadata = True
+ 
+ if HC.PLATFORM_WINDOWS:
+     
+     mtime = os.path.getmtime( source )
+     
+     # this is 1980-01-01 UTC, before which Windows can have trouble copying lmaoooooo
+     if mtime < 315532800:
+         
+         copy_metadata = False
+ 
+ if copy_metadata:
+     
+     # this overwrites on conflict without hassle
+     shutil.copy2( source, dest )
+     
+ else:
+     
+     shutil.copy( source, dest )
  except Exception as e:
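The magic number in the Windows branch is just 1980-01-01 UTC written as a Unix timestamp, the cutoff below which the code comment says timestamp-preserving copies can fail; it is easy to verify:

import calendar
import time

print( calendar.timegm( ( 1980, 1, 1, 0, 0, 0 ) ) )                      # 315532800
print( time.strftime( '%Y-%m-%d %H:%M:%S', time.gmtime( 315532800 ) ) )  # 1980-01-01 00:00:00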
@@ -98,6 +98,8 @@ SERIALISABLE_TYPE_TAG_DISPLAY_MANAGER = 79
  SERIALISABLE_TYPE_TAG_SEARCH_CONTEXT = 80
  SERIALISABLE_TYPE_FAVOURITE_SEARCH_MANAGER = 81
  SERIALISABLE_TYPE_NOTE_IMPORT_OPTIONS = 82
+ SERIALISABLE_TYPE_STRING_SPLITTER = 83
+ SERIALISABLE_TYPE_STRING_PROCESSOR = 84
  SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
@@ -673,11 +673,8 @@ class TestClientAPI( unittest.TestCase ):
expected_content_updates = expected_service_keys_to_content_updates[ service_key ]
c_u_tuples = [ c_u.ToTuple() for c_u in content_updates ]
e_c_u_tuples = [ e_c_u.ToTuple() for e_c_u in expected_content_updates ]
c_u_tuples.sort()
e_c_u_tuples.sort()
c_u_tuples = sorted( ( c_u.ToTuple() for c_u in content_updates ) )
e_c_u_tuples = sorted( ( e_c_u.ToTuple() for e_c_u in expected_content_updates ) )
self.assertEqual( c_u_tuples, e_c_u_tuples )

@@ -712,11 +709,8 @@ class TestClientAPI( unittest.TestCase ):
expected_content_updates = expected_service_keys_to_content_updates[ service_key ]
c_u_tuples = [ c_u.ToTuple() for c_u in content_updates ]
e_c_u_tuples = [ e_c_u.ToTuple() for e_c_u in expected_content_updates ]
c_u_tuples.sort()
e_c_u_tuples.sort()
c_u_tuples = sorted( ( c_u.ToTuple() for c_u in content_updates ) )
e_c_u_tuples = sorted( ( e_c_u.ToTuple() for e_c_u in expected_content_updates ) )
self.assertEqual( c_u_tuples, e_c_u_tuples )

@@ -781,11 +775,8 @@ class TestClientAPI( unittest.TestCase ):
expected_content_updates = expected_service_keys_to_content_updates[ service_key ]
c_u_tuples = [ c_u.ToTuple() for c_u in content_updates ]
e_c_u_tuples = [ e_c_u.ToTuple() for e_c_u in expected_content_updates ]
c_u_tuples.sort()
e_c_u_tuples.sort()
c_u_tuples = sorted( ( c_u.ToTuple() for c_u in content_updates ) )
e_c_u_tuples = sorted( ( e_c_u.ToTuple() for e_c_u in expected_content_updates ) )
self.assertEqual( c_u_tuples, e_c_u_tuples )

@@ -820,11 +811,8 @@ class TestClientAPI( unittest.TestCase ):
expected_content_updates = expected_service_keys_to_content_updates[ service_key ]
c_u_tuples = [ c_u.ToTuple() for c_u in content_updates ]
e_c_u_tuples = [ e_c_u.ToTuple() for e_c_u in expected_content_updates ]
c_u_tuples.sort()
e_c_u_tuples.sort()
c_u_tuples = sorted( ( c_u.ToTuple() for c_u in content_updates ) )
e_c_u_tuples = sorted( ( e_c_u.ToTuple() for e_c_u in expected_content_updates ) )
self.assertEqual( c_u_tuples, e_c_u_tuples )
@@ -1584,8 +1572,7 @@ class TestClientAPI( unittest.TestCase ):
urls = { "https://gelbooru.com/index.php?page=post&s=view&id=4841557", "https://img2.gelbooru.com//images/80/c8/80c8646b4a49395fb36c805f316c49a9.jpg" }
sorted_urls = list( urls )
sorted_urls.sort()
sorted_urls = sorted( urls )
for ( file_id, hash ) in file_ids_to_hashes.items():
@@ -1,37 +1,23 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientDB
from hydrus.client import ClientDefaults
from hydrus.client import ClientDownloading
from hydrus.client import ClientExporting
from hydrus.client import ClientFiles
from hydrus.client.gui import ClientGUIManagement
from hydrus.client.gui import ClientGUIPages
from hydrus.client.importing import ClientImporting
from hydrus.client.importing import ClientImportLocal
from hydrus.client.importing import ClientImportOptions
from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client import ClientRatings
from hydrus.client import ClientSearch
from hydrus.client import ClientServices
from hydrus.client import ClientTags
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusVideoHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusSerialisable
import itertools
import os
from hydrus.server import ServerDB
import shutil
import sqlite3
import stat
from hydrus.test import TestController
import time
import threading
import unittest

class TestClientDB( unittest.TestCase ):
@@ -646,7 +632,7 @@ class TestClientDB( unittest.TestCase ):
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING, min_current_count = 1 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_INBOX, min_current_count = 1 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVE, min_current_count = 0 ) )
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ] ] )
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ] ] )
self.assertEqual( set( result ), set( predicates ) )
@@ -1,37 +1,14 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientDB
from hydrus.client import ClientDefaults
from hydrus.client import ClientDownloading
from hydrus.client import ClientExporting
from hydrus.client import ClientFiles
from hydrus.client.gui import ClientGUIManagement
from hydrus.client.gui import ClientGUIPages
from hydrus.client.importing import ClientImporting
from hydrus.client.importing import ClientImportLocal
from hydrus.client.importing import ClientImportOptions
from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client import ClientRatings
from hydrus.client import ClientSearch
from hydrus.client import ClientServices
from hydrus.client import ClientTags
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusVideoHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusSerialisable
import itertools
import os
from hydrus.server import ServerDB
import shutil
import sqlite3
import stat
from hydrus.test import TestController
import time
import threading
import unittest

class TestClientDBDuplicates( unittest.TestCase ):
@@ -1,6 +1,5 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientImageHandling
import collections
from hydrus.core import HydrusConstants as HC
import os
import unittest
@@ -8,7 +8,6 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.client import ClientRatings
import collections
import os
import random
@@ -1,24 +1,13 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
from hydrus.client.importing import ClientImportSubscriptions
from hydrus.client.networking import ClientNetworking
from hydrus.client.networking import ClientNetworkingBandwidth
from hydrus.client.networking import ClientNetworkingDomain
from hydrus.client.networking import ClientNetworkingLogin
from hydrus.client.networking import ClientNetworkingSessions
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusNetworking
import os
from hydrus.test import TestController
import threading
import time
import unittest
from hydrus.core import HydrusGlobals as HG
from httmock import all_requests, urlmatch, HTTMock, response
from mock import patch
from httmock import all_requests

MISSING_RESPONSE = '404, bad result'
ERROR_RESPONSE = '500, it done broke'
@@ -1,12 +1,6 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
from hydrus.client.gui import ClientGUIListBoxes
import collections
from hydrus.core import HydrusConstants as HC
import os
import random
from hydrus.test import TestController
import time
import unittest
from hydrus.core import HydrusGlobals as HG
from qtpy import QtCore as QC
@@ -8,13 +8,10 @@ from hydrus.client import ClientMigration
from hydrus.client import ClientServices
from hydrus.client import ClientTags
import collections
import hashlib
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusTagArchive
import os
import random
import shutil
import time
import unittest
from hydrus.core import HydrusData
@@ -8,14 +8,11 @@ from hydrus.client.networking import ClientNetworkingLogin
from hydrus.client.networking import ClientNetworkingSessions
from hydrus.client import ClientParsing
from hydrus.client import ClientServices
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusNetworking
import os
from hydrus.test import TestController
import threading
import time
import unittest
from hydrus.core import HydrusGlobals as HG
@@ -0,0 +1,233 @@
from hydrus.core import HydrusConstants as HC
from hydrus.client import ClientParsing
import unittest

class TestStringConverter( unittest.TestCase ):
    
    def test_basics( self ):
        
        transformations = []
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_REMOVE_TEXT_FROM_BEGINNING, 1 ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), '123456789' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_REMOVE_TEXT_FROM_END, 1 ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), '12345678' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_CLIP_TEXT_FROM_BEGINNING, 7 ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), '1234567' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_CLIP_TEXT_FROM_END, 6 ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), '234567' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_PREPEND_TEXT, 'abc' ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'abc234567' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_APPEND_TEXT, 'x z' ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'abc234567x z' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_ENCODE, 'url percent encoding' ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'abc234567x%20z' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_DECODE, 'url percent encoding' ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'abc234567x z' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_REVERSE, None ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'z x765432cba' )
        
        #
        
        transformations.append( ( ClientParsing.STRING_TRANSFORMATION_REGEX_SUB, ( '\\d', 'd' ) ) )
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '0123456789' ), 'z xddddddcba' )
        
        #
        
        transformations = [ ( ClientParsing.STRING_TRANSFORMATION_DATE_DECODE, ( '%Y-%m-%d %H:%M:%S', HC.TIMEZONE_GMT, 0 ) ) ]
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '1970-01-02 00:00:00' ), '86400' )
        
        #
        
        transformations = [ ( ClientParsing.STRING_TRANSFORMATION_DATE_ENCODE, ( '%Y-%m-%d %H:%M:%S', 0 ) ) ]
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '86400' ), '1970-01-02 00:00:00' )
        
        #
        
        transformations = [ ( ClientParsing.STRING_TRANSFORMATION_INTEGER_ADDITION, 5 ) ]
        string_converter = ClientParsing.StringConverter( transformations = transformations )
        self.assertEqual( string_converter.Convert( '4' ), '9' )
        
    
class TestStringMatch( unittest.TestCase ):
    
    def test_basics( self ):
        
        all_string_match = ClientParsing.StringMatch()
        
        self.assertTrue( all_string_match.Matches( '123' ) )
        self.assertTrue( all_string_match.Matches( 'abc' ) )
        self.assertTrue( all_string_match.Matches( 'abc123' ) )
        
        #
        
        min_string_match = ClientParsing.StringMatch( min_chars = 4 )
        
        self.assertFalse( min_string_match.Matches( '123' ) )
        self.assertFalse( min_string_match.Matches( 'abc' ) )
        self.assertTrue( min_string_match.Matches( 'abc123' ) )
        
        #
        
        max_string_match = ClientParsing.StringMatch( max_chars = 4 )
        
        self.assertTrue( max_string_match.Matches( '123' ) )
        self.assertTrue( max_string_match.Matches( 'abc' ) )
        self.assertFalse( max_string_match.Matches( 'abc123' ) )
        
        #
        
        min_max_string_match = ClientParsing.StringMatch( min_chars = 4, max_chars = 10 )
        
        self.assertFalse( min_max_string_match.Matches( '123' ) )
        self.assertFalse( min_max_string_match.Matches( 'abc' ) )
        self.assertTrue( min_max_string_match.Matches( 'abc123' ) )
        
        #
        
        alpha_string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.ALPHA )
        
        self.assertFalse( alpha_string_match.Matches( '123' ) )
        self.assertTrue( alpha_string_match.Matches( 'abc' ) )
        self.assertFalse( alpha_string_match.Matches( 'abc123' ) )
        
        #
        
        alphanum_string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.ALPHANUMERIC )
        
        self.assertTrue( alphanum_string_match.Matches( '123' ) )
        self.assertTrue( alphanum_string_match.Matches( 'abc' ) )
        self.assertTrue( alphanum_string_match.Matches( 'abc123' ) )
        
        #
        
        num_string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC )
        
        self.assertTrue( num_string_match.Matches( '123' ) )
        self.assertFalse( num_string_match.Matches( 'abc' ) )
        self.assertFalse( num_string_match.Matches( 'abc123' ) )
        
        #
        
        fixed_string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = '123' )
        
        self.assertTrue( fixed_string_match.Matches( '123' ) )
        self.assertFalse( fixed_string_match.Matches( 'abc' ) )
        self.assertFalse( fixed_string_match.Matches( 'abc123' ) )
        
        #
        
        re_string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_REGEX, match_value = '\\d' )
        
        self.assertTrue( re_string_match.Matches( '123' ) )
        self.assertFalse( re_string_match.Matches( 'abc' ) )
        self.assertTrue( re_string_match.Matches( 'abc123' ) )
        
    
class TestStringSplitter( unittest.TestCase ):
    
    def test_basics( self ):
        
        splitter = ClientParsing.StringSplitter( separator = ', ' )
        
        self.assertTrue( splitter.Split( '123' ), [ '123' ] )
        self.assertTrue( splitter.Split( '1,2,3' ), [ '1,2,3' ] )
        self.assertTrue( splitter.Split( '1, 2, 3' ), [ '1', '2', '3' ] )
        
        splitter = ClientParsing.StringSplitter( separator = ', ', max_splits = 2 )
        
        self.assertTrue( splitter.Split( '123' ), [ '123' ] )
        self.assertTrue( splitter.Split( '1,2,3' ), [ '1,2,3' ] )
        self.assertTrue( splitter.Split( '1, 2, 3, 4' ), [ '1', '2', '3,4' ] )
        
    
class TestStringProcessor( unittest.TestCase ):
    
    def test_basics( self ):
        
        processor = ClientParsing.StringProcessor()
        
        self.assertEqual( processor.ProcessStrings( [] ), [] )
        self.assertEqual( processor.ProcessStrings( [ 'test' ] ), [ 'test' ] )
        self.assertEqual( processor.ProcessStrings( [ 'test', 'test', '', 'test2' ] ), [ 'test', 'test', '', 'test2' ] )
        
        processing_steps = []
        
        processing_steps.append( ClientParsing.StringSplitter( separator = ',', max_splits = 2 ) )
        processing_steps.append( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC ) )
        
        transformations = [ ( ClientParsing.STRING_TRANSFORMATION_APPEND_TEXT, 'abc' ) ]
        
        processing_steps.append( ClientParsing.StringConverter( transformations = transformations ) )
        
        processor.SetProcessingSteps( processing_steps )
        
        expected_result = [ '1abc', '123abc' ]
        
        self.assertEqual( processor.ProcessStrings( [ '1,a,2,3', 'test', '123' ] ), expected_result )
        
    
@@ -1,16 +1,12 @@
import collections
from hydrus.client import ClientCaches
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientManagers
from hydrus.client import ClientMedia
from hydrus.client import ClientMediaManagers
from hydrus.client import ClientSearch
from hydrus.client import ClientTags
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
import os
import unittest

class TestMergeTagsManagers( unittest.TestCase ):
@@ -1,10 +1,5 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientThreading
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
import os
import threading
import time
import unittest
@@ -1,10 +1,7 @@
import collections
import os
import random
import threading
import collections
import shutil
import sys
import tempfile
import time
import traceback
@@ -24,11 +21,9 @@ from hydrus.client.networking import ClientNetworkingSessions
from hydrus.client import ClientServices
from hydrus.client import ClientTags
from hydrus.client import ClientThreading
from hydrus.core import HydrusDB
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusPubSub
from hydrus.core import HydrusSessions
from hydrus.core import HydrusTags
from hydrus.core import HydrusThreading
from hydrus.test import TestClientAPI
from hydrus.test import TestClientConstants
@@ -42,19 +37,17 @@ from hydrus.test import TestClientImportSubscriptions
from hydrus.test import TestClientListBoxes
from hydrus.test import TestClientMigration
from hydrus.test import TestClientNetworking
from hydrus.test import TestClientParsing
from hydrus.test import TestClientTags
from hydrus.test import TestClientThreading
from hydrus.test import TestDialogs
from hydrus.test import TestFunctions
from hydrus.test import TestHydrusNATPunch
from hydrus.test import TestHydrusNetworking
from hydrus.test import TestHydrusSerialisable
from hydrus.test import TestHydrusServer
from hydrus.test import TestHydrusSessions
from hydrus.test import TestServerDB
from twisted.internet import reactor
from hydrus.client import ClientCaches
from hydrus.client import ClientData
from hydrus.client import ClientOptions
from hydrus.core import HydrusData
from hydrus.core import HydrusPaths
@@ -671,6 +664,7 @@ class Controller( object ):
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientConstants ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientData ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientImportOptions ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientParsing ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientTags ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestClientThreading ) )
suites.append( unittest.TestLoader().loadTestsFromModule( TestFunctions ) )
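
For context, a minimal sketch of how per-module suites like the ones appended above are combined and executed with the standard library (generic unittest usage, not hydrus's own test controller):

    import unittest
    
    from hydrus.test import TestClientParsing # one of the modules registered above
    
    suites = []
    
    suites.append( unittest.TestLoader().loadTestsFromModule( TestClientParsing ) )
    
    # merge every module suite into one and run it with the stock text runner
    runner = unittest.TextTestRunner( verbosity = 1 )
    
    runner.run( unittest.TestSuite( suites ) )
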
@@ -1,13 +1,6 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIScrolledPanelsEdit
from hydrus.client.gui import ClientGUIScrolledPanelsManagement
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client import ClientThreading
import collections
from hydrus.core import HydrusConstants as HC
import os
import unittest
from hydrus.core import HydrusGlobals as HG
from qtpy import QtCore as QC
@@ -1,9 +1,7 @@
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusGlobals as HG
from hydrus.client import ClientData
from hydrus.client import ClientTags
import os
import unittest
from hydrus.core import HydrusData
from hydrus.client import ClientConstants as CC
@@ -1,9 +1,5 @@
from hydrus.client import ClientConstants as CC
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusNATPunch
import os
import random
import time
import unittest

class TestNATPunch( unittest.TestCase ):
@@ -1,11 +1,7 @@
import collections
from hydrus.core import HydrusConstants as HC
import os
import random
import time
import unittest
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNetworking
from mock import patch
@@ -1,28 +1,20 @@
from hydrus.client import ClientCaches
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientDefaults
from hydrus.client import ClientDownloading
from hydrus.client import ClientDuplicates
from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.importing import ClientImporting
from hydrus.client.importing import ClientImportOptions
from hydrus.client.importing import ClientImportSubscriptions
from hydrus.client.importing import ClientImportSubscriptionQuery
from hydrus.client import ClientMedia
from hydrus.client import ClientMediaManagers
from hydrus.client.networking import ClientNetworkingDomain
from hydrus.client import ClientRatings
from hydrus.client import ClientSearch
from hydrus.client import ClientTags
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusSerialisable
from hydrus.test import TestController as TC
import os
import unittest
from qtpy import QtCore as QC

class TestSerialisables( unittest.TestCase ):
@@ -1,21 +1,14 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientAPI
from hydrus.client import ClientLocalServer
from hydrus.client import ClientMedia
from hydrus.client import ClientMediaManagers
from hydrus.client import ClientRatings
from hydrus.client import ClientServices
from hydrus.client import ClientTags
import hashlib
import http.client
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusEncryption
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusPaths
from hydrus.core import HydrusServer
from hydrus.core import HydrusServerResources
from hydrus.core import HydrusText
import json
import os
import random
from hydrus.server import ServerFiles
@@ -24,10 +17,7 @@ import ssl
from hydrus.test import TestController
import time
import unittest
import urllib
from twisted.internet import reactor
#from twisted.internet.endpoints import TCP4ClientEndpoint, connectProtocol
from twisted.internet.defer import deferredGenerator, waitForDeferred
import twisted.internet.ssl
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
@@ -1,11 +1,8 @@
from hydrus.client import ClientConstants as CC
import collections
import hashlib
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusSessions
import os
import unittest
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
@@ -1,37 +1,10 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientDB
from hydrus.client import ClientDefaults
from hydrus.client import ClientDownloading
from hydrus.client import ClientExporting
from hydrus.client import ClientFiles
from hydrus.client.gui import ClientGUIManagement
from hydrus.client.gui import ClientGUIPages
from hydrus.client.importing import ClientImporting
from hydrus.client.importing import ClientImportLocal
from hydrus.client.importing import ClientImportOptions
from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client import ClientRatings
from hydrus.client import ClientSearch
from hydrus.client import ClientServices
from hydrus.client import ClientTags
import collections
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusVideoHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusNetwork
from hydrus.core import HydrusSerialisable
import itertools
import os
from hydrus.server import ServerDB
import shutil
import sqlite3
import stat
from hydrus.test import TestController
import time
import threading
import unittest

class TestServerDB( unittest.TestCase ):