Version 297
This commit is contained in:
parent a8399eb618
commit 24fa015c89

@@ -8,9 +8,50 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 297</h3></li>
<ul>
<li>finished a prototype 'file notes' system. thumbnails and media viewer canvas now support 'manage->file notes' in their right-click menus. this launches a simple text box which will save its contents to db</li>
<li>added 'manage_file_notes' shortcut to the 'media' shortcut set</li>
<li>tag summary generators now have a simple show/hide checkbox and (for thumbnails) custom colours for background and text, including alpha channel!</li>
<li>fixed a variety of timing and display logic related to subscription query DEAD vs next check time calculation</li>
<li>all currently dead subscription queries will be revived on update, just in case they were formerly set dead by accident</li>
<li>the 'fetch tags even if url known and file already in db' option is moved from the download/subscription panel's cog icon to tag import options</li>
<li>cleaned up tag import options layout, controls, internal workflow, and help button</li>
<li>added 'select all/none' buttons to tag import options panels with multiple namespaces</li>
<li>if a subscription is blocked by bandwidth, the manage subscriptions dialog will display that in its 'recent error/delay' column</li>
<li>the edit subscription dialog will show similar bandwidth blocking info on a per-query basis, under a new 'recent delays' column</li>
<li>the review bandwidth usage panel will no longer show, by default, some unusual results that you can still see by hitting 'show all'</li>
<li>the review bandwidth usage panel will show the usage at the current search distance in a new column</li>
<li>the review bandwidth usage panel will show number of requests after data usage. this might be info-overload, so I might alter the syntax or roll it back entirely</li>
<li>fixed an issue with the hentai foundry parser pulling images placed in the image description area instead of the main image. this particularly affected the artist 'teku'</li>
<li>tags for deviant art, tumblr, and thread watchers, which were formerly stored in volatile session memory--meaning half-completed import queues were losing their tags through a program restart--are now saved to the new import object directly</li>
<li>removed all the old volatile session memory patch code</li>
<li>added the new import object through a larger part of the parsing pipeline</li>
<li>deleted the old remains of the giphy parser--if it comes back, it'll all be rewritten in the new system</li>
<li>harmonised some other import pipeline code to the new system</li>
<li>added a new 'management and preview panels' submenu to the 'pages' menu</li>
<li>added an option to control 'save sash positions on close' to this menu</li>
<li>added an entry to force-save the current sash positions to this menu</li>
<li>added an entry to 'restore' the currently saved sash positions to all pages to this menu (this is useful if your window resizes real small and all your pages get crushed up)</li>
<li>rejiggered how URL Classes are matched with URLs to make sure some Post URLs are not lost (this was affecting Hentai Foundry Post URLs, which were sometimes not displaying in the media viewer despite matching)</li>
<li>fixed an issue where the duplicate filter page's jobs would not trigger an update after a job finished</li>
<li>fixed an outside chance of a crash after running a duplicate filter page job</li>
<li>improved how strings are coerced to unicode--now the preferred system encoding will be tried before utf-16, which should improve support for é-type characters in various non-unicode sources (like neighbouring .txt files)</li>
<li>fixed an issue with the client's local booru and flash files (and made some other file fetching and mime reporting a bit faster and neater overall)</li>
<li>the options should be more reliable about redrawing all thumbnail banner summaries on an options ok now</li>
<li>the options->media->media zooms option will now remove any <=0.0 values when it saves</li>
<li>fixed up some old test code</li>
<li>improved how some thread-to-gui update reporting code works</li>
<li>deleted some old network object code</li>
<li>converted the manage subscriptions panel to an edit panel--a decoupling refactor I will likely ultimately make across the program</li>
<li>wrote a help page for content parsers</li>
<li>did the first half of a help page for page parsers</li>
<li>misc refactoring</li>
<li>misc cleanup</li>
</ul>
<li><h3>version 296</h3></li>
<ul>
<li>the 'allow decompression bombs' option is now moved to 'file import options'. it defaults to False</li>
<li>file import options now allow max size and max resolution rules. they default to None</li>
<li>file import options now allow a max gif size rule to deal with THE SFM COMMUNITY. it defaults to 32MB</li>
<li>file imports will give better quality errors if they fail due to file import option exclusion rules</li>
@@ -15,7 +15,7 @@
<ul>
<li><a href="downloader_parsers_formulae.html"><b>Formulae:</b></a> Take parsable data, search it in some manner, and return 0 to n strings.</li>
<li><a href="downloader_parsers_content_parsers.html"><b>Content Parsers:</b></a> Take parsable data, apply a formula to it to get some strings, and apply a single metadata 'type' and perhaps some additional modifiers.</li>
<li><a href="downloader_parsers_page_parsers.html"><b>Page Parsers:</b></a> Take parsable data, apply content parsers to it, and return all the metadata.</li>
<li><a href="downloader_parsers_page_parsers.html"><b>Page Parsers:</b></a> Take parsable data, apply content parsers to it, and return all the metadata in an appropriate structure.</li>
</ul>
<p>Once you are comfortable with these objects, you might like to check out these walkthroughs, which create full parsers from nothing:</p>
<ul>
@@ -8,9 +8,59 @@
<div class="content">
<p><a href="downloader_parsers.html"><---- Back to main parsers page</a></p>
<h3 id="content_parsers">content parsers</h3>
<p>different types and what they mean</p>
<p>hash needs conversion to bytes</p>
<p>vetos</p>
<p>So, we can now generate some strings from a document. Content Parsers will let us apply a single metadata type to those strings to inform hydrus what they are.</p>
<p><img src="edit_content_parser_panel_tags.png" /></p>
<p>A content parser has a name, a content type, and a formula. This example fetches the character tags from a danbooru post.</p>
<p>The name is just decorative, but it is generally a good idea so you can find things again when you next revisit them.</p>
<p>The current content types are:</p>
<ul>
<li>
<h3>urls</h3>
<p>This should be applied to relative ('/image/smile.jpg') and absolute ('https://mysite.com/content/image/smile.jpg') URLs. If the URL is relative, the client will attempt to generate an absolute URL based on the original URL used to fetch the current data being parsed.</p>
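<p>That relative-to-absolute conversion is essentially standard URL joining. Here is a minimal sketch of the idea (the exact client logic may differ, and these URLs are just examples):</p>
<p><pre>try:
    
    from urllib.parse import urljoin # python 3
    
except ImportError:
    
    from urlparse import urljoin # python 2

fetched_url = 'https://mysite.com/posts/12345' # the URL the parsed data came from
parsed_url = '/image/smile.jpg' # a relative URL found in that data

absolute_url = urljoin( fetched_url, parsed_url )

print( absolute_url ) # https://mysite.com/image/smile.jpg</pre></p>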
<p>You can set several types of URL:</p>
<ul>
<li><b>actual file</b> means a File URL in our URL Classes system. An actual raw file like a jpg or webm. The client will typically be downloading and attempting to import these URLs, so make sure you are not accidentally linking to an html wrapper page.</li>
<li><b>post page</b> means a Post URL. You will typically find these URLs as linked from thumbnails on a gallery page.</li>
<li><b>next gallery page</b> means the next Gallery URL on from the current one. This will aid the downloader engine in finding a next page if that is otherwise difficult to guess (some sites have a nice page=1, page=2, page=3 system that we can predict elsewhere in the system, but others are not so simple).</li>
</ul>
<p>The 'quality precedence' allows the client to select the best of several possible URLs. Given multiple content parsers producing URLs at the same 'level' of parsing, it will select the one with the highest value. Consider these two posts:</p>
<ul>
<li><a href="https://danbooru.donmai.us/posts/3016415">https://danbooru.donmai.us/posts/3016415</a></li>
<li><a href="https://danbooru.donmai.us/posts/3040603">https://danbooru.donmai.us/posts/3040603</a></li>
</ul>
<p>The Garnet image fits into a regular page and so Danbooru embed the whole original file in the main media canvas. One easy way to find the full File URL in this case would be to select the "src" attribute of the "img" tag with id="image".</p>
<p>The Cirno one, however, is much larger and has been scaled down. The src of the main canvas tag points to a resized 'sample' link. The full link can be found at the 'view original' link up top, which is an "a" tag with id="image-resize-link".</p>
<p>The Garnet post does not have the 'view original' link, so to cover both situations we might want two content parsers--one fetching the 'canvas' "src" and the other finding the 'view original' "href". If we set the canvas one with a quality of 40 and the view original 60, then the parsing system would know to select the 60 when it was available but to fall back to the 40 if not.</p>
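<p>To illustrate the selection rule itself (just a sketch of the 'highest value wins' idea--the real engine's data structures differ, and these file URLs are made up):</p>
<p><pre># hypothetical parse results for the Cirno post
candidate_urls = []

candidate_urls.append( ( 40, 'https://danbooru.donmai.us/data/sample/sample-cirno.jpg' ) ) # canvas "src", always present
candidate_urls.append( ( 60, 'https://danbooru.donmai.us/data/cirno.jpg' ) ) # 'view original' "href", only on scaled-down posts

( best_quality, best_url ) = max( candidate_urls ) # tuples compare on the quality value first

print( best_url ) # the quality 60 link wins whenever it was parsed</pre></p>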
<p>As it happens, Danbooru (afaik, always) gives a link to the original file under the 'Size:' metadata to the left. This is the same 'best link' for both posts above, but it isn't so easy to identify. It is a quiet "a" tag without an "id" and it isn't always in the same location, but if you could pin it down reliably, it might be nice to circumvent the whole issue.</p>
<p>Sites can change suddenly, so it is nice to have a bit of redundancy here if it is easy.</p>
</li>
<li>
<h3>tags</h3>
<p>These are simple--they tell the client that the given strings are tags. You set the namespace here as well. I recommend you parse 'splashbrush' and set the namespace 'creator' here rather than trying to mess around with 'append prefix "creator:"' string conversions at the formula level--it is simpler up here, and it lets hydrus handle any edge case logic for you.</p>
<p>Leave the namespace field blank for unnamespaced tags.</p>
</li>
<li>
<h3>file hash</h3>
<p>This says 'this is the hash for the file otherwise referenced in this parser'. So, if you have another content parser finding a File or Post URL, this lets the client know early that that destination happens to have a particular MD5, for instance. The client will look for that hash in its own database, and if it finds a match, it can predetermine whether it already has the file (or has previously deleted it) without ever having to download it. Furthermore, if it does find the file for this URL but has never seen the URL before, it will still associate it with that file's 'known urls' as if it <i>had</i> downloaded it!</p>
<p>If you understand this concept, it is great to include. It saves time and bandwidth for everyone. Many site APIs include a hash for this exact reason--they want you to be able to skip a needless download just as much as you do.</p>
<p><img src="edit_content_parser_panel_hash.png" /></p>
<p>The usual suite of hash types is supported: MD5, SHA1, SHA256, and SHA512. <b>This expects the hash as raw bytes</b>, so if your source provides it as hex or base64 (as above), make sure to decode it! In the area for test results, it will present the hash in hex for your convenience.</p>
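<p>The decoding step itself is simple. A sketch, using an example MD5 given in both encodings:</p>
<p><pre>import base64
import binascii

hex_md5 = 'fcb673ed89241a7b8d87a5dcb3a08af7' # an example hash, as hex
base64_md5 = '/LZz7YkkGnuNh6Xcs6CK9w==' # the same hash, as base64

raw_from_hex = binascii.unhexlify( hex_md5 )
raw_from_base64 = base64.b64decode( base64_md5 )

assert raw_from_hex == raw_from_base64 # both decode to the same 16 raw bytes</pre></p>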
</li>
<li>
<h3>timestamp</h3>
<p>This lets you say that a given number refers to a particular time for a file. At the moment, I only support 'source time', which represents a 'post' time for the file and is useful for thread and subscription check time calculations. It takes a Unix time integer, like 1520203484, which many APIs will provide. If you are feeling very clever, you can decode a 'MM/DD/YYYY hh:mm:ss' style string to a Unix time integer using string converters, but I may need to put more time into that UI to make it more user friendly!</p>
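<p>For reference, the decode that string converter would be doing amounts to this (a sketch that assumes the source gives UTC times):</p>
<p><pre>import calendar
import time

source_string = '03/04/2018 22:44:44' # a made-up 'MM/DD/YYYY hh:mm:ss' post time

time_struct = time.strptime( source_string, '%m/%d/%Y %H:%M:%S' )

unix_timestamp = calendar.timegm( time_struct ) # treats the struct as UTC

print( unix_timestamp ) # 1520203484</pre></p>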
</li>
<li>
<h3>thread watcher page title</h3>
<p>This lets the thread watcher know a good name for its page tab. The subject of a thread is obviously ideal here, but failing that you can try to fetch the first part of the first post's comment. It has precedence, like for URLs, so you can tell the parser which to prefer if you have multiple options. Just for neatness and ease of testing, you probably want to use a string converter here to cut it down to the first 64 characters or so.</p>
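<p>The cutting itself is nothing clever--something like this, with a made-up comment:</p>
<p><pre>first_comment = 'blessed thread, post your rarest Cirnos and explain why they are the strongest...'

page_title = first_comment[ : 64 ] # just enough for a page tab</pre></p>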
</li>
<li>
<h3>veto</h3>
<p>This is a special content type--it tells the next highest stage of parsing that this 'post' of parsing is invalid and to cancel and not return any data. For instance, if a thread post's file was deleted, the site might provide a default '404' stock File URL using the same markup structure as it would for normal images. You don't want to give the user the same 404 image ten times over (with fifteen kinds of tag and source time metadata attached), so you can add a little rule here that says "If the image link is 'https://somesite.com/404.png', raise a veto: File 404" or "If the page has 'No results found' in its main content div, raise a veto: No results found" or "If the expected download tag does not have 'download link' as its text, raise a veto: No Download Link found--possibly Ugoira?" and so on.</p>
<p><img src="edit_content_parser_panel_veto.png" /></p>
<p>They will associate their name with the veto being raised, so it is useful to give these a decent descriptive name so you can see what might be going right or wrong during testing. If it is an appropriate and serious enough veto, it may also rise up to the user level and will be useful if they need to report an error to you (like "After five pages of parsing, it gives 'veto: no next page link'").</p>
</li>
</ul>
</div>
</body>
</html>
@@ -8,10 +8,55 @@
<div class="content">
<p><a href="downloader_parsers.html"><---- Back to main parsers page</a></p>
<h3 id="page_parsers">page parsers</h3>
<p>pre-parsing conversion example for tumblr</p>
<p>example urls are helpful</p>
<p>We can now produce individual rows of rich metadata. To arrange them all into a useful structure, we will use Page Parsers.</p>
<p>The Page Parser is the top level parsing object. It takes a single document and produces a list--or a list of lists--of metadata. Here's the main UI:</p>
<p><img src="edit_page_parser_panel_e621_main.png" /></p>
<p>Notice that the edit panel has three sub-pages.</p>
<h3>main</h3>
<ul>
<li><b>Name</b>: Like for content parsers, I recommend you add good names for your parsers.</li>
<li><b>Pre-parsing conversion</b>: If your API source encodes or wraps the data you want to parse, you can do some string transformations here. You won't need to use this very often, but if your source gives the JSON wrapped in javascript (like the old tumblr API), it can be invaluable--there is a sketch of that case just below.</li>
<li><b>Example URLs</b>: Here you should add a list of example URLs the parser works for. This lets the client automatically link this parser up with URL classes for you and any users you share the parser with.</li>
</ul>
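<p>As a sketch of that old tumblr case (the wrapper text here is from memory, so treat its exact form as an assumption):</p>
<p><pre>import json

# the old tumblr API returned javascript like this rather than bare JSON
raw_response = 'var tumblr_api_read = {"posts":[{"id":123}]};'

# a pre-parsing conversion can clip the wrapper off either end
prefix = 'var tumblr_api_read = '

bare_json = raw_response[ len( prefix ) : ].rstrip( ';' )

data = json.loads( bare_json )

print( data[ 'posts' ][ 0 ][ 'id' ] ) # 123</pre></p>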
<h3>content parsers</h3>
<p>This page is just a simple list:</p>
<p><img src="edit_page_parser_panel_e621_content_parsers.png" /></p>
<p>Each content parser here will be applied to the document and returned in this page parser's results list. Like most boorus, e621's File Pages only ever present one file, and they have simple markup, so the solution here was simple. The full contents of that test window are:</p>
<p><pre>*** 1 RESULTS BEGIN ***

tag: character:krystal
tag: creator:s mino930
file url: https://static1.e621.net/data/fc/b6/fcb673ed89241a7b8d87a5dcb3a08af7.jpg
tag: anthro
tag: black nose
tag: blue fur
tag: blue hair
tag: clothing
tag: female
tag: fur
tag: green eyes
tag: hair
tag: hair ornament
tag: jewelry
tag: short hair
tag: solo
tag: video games
tag: white fur
tag: series:nintendo
tag: series:star fox
tag: species:canine
tag: species:fox
tag: species:mammal

*** RESULTS END ***</pre></p>
<p>When the client sees this in a downloader context, it will know where to download the file and which tags to associate with it based on what the user has chosen in their 'tag import options'.</p>
<h3>subsidiary page parsers</h3>
<p>But what if you have multiple files per page? What sort of <i>shape</i> of parsing would you need? You might be able to get away with a single content parser for a simple gallery page, where you could just find n thumbnail links, but what if you want to pull some other metadata for each thumbnail (like a title or ratings tag) at the same time? What if you are parsing an imageboard thread, where each file's links and filename tags and source times are all on just the one page? It is easy to find such a page's tags and URLs, but not so easy to group the correct rows together.</p>
<p>The solution is to split the document up into smaller 'posts' and to parse each in turn with a subsidiary page parser:</p>
<p><img src="edit_page_parser_panel_4chan_subsidiary_page_parsers.png" /></p>
<p>talk about get html/json in the formula, show the new fourth tab and delve into that. show how splitting the html of an example html thread parser goes</p>
<p>Each subsidiary page parser will produce 0-n file results (typically for a Gallery or Watchable URL).</p>
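<p>To give a flavour of the splitting idea, here is a rough sketch over some made-up imageboard-style markup (the client's actual splitting runs through its formula system, not raw BeautifulSoup):</p>
<p><pre>from bs4 import BeautifulSoup

html = '''
&lt;div class="post"&gt;&lt;a class="file" href="/img/1.jpg"&gt;cute_smile.jpg&lt;/a&gt;&lt;/div&gt;
&lt;div class="post"&gt;&lt;a class="file" href="/img/2.jpg"&gt;blue_fur.png&lt;/a&gt;&lt;/div&gt;
'''

soup = BeautifulSoup( html, 'html.parser' )

all_results = []

# split the document into posts, then parse each post on its own
# so the right file stays grouped with the right filename tag
for post in soup.find_all( 'div', class_ = 'post' ):
    
    file_link = post.find( 'a', class_ = 'file' )
    
    file_url = file_link[ 'href' ]
    filename_tag = 'filename:' + file_link.string.rsplit( '.', 1 )[ 0 ]
    
    all_results.append( ( file_url, filename_tag ) )

print( all_results )</pre></p>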
<p>mention vetos again</p>
<p>subsidiary page parsers and what that is for</p>
</div>
</body>
</html>
Binary file not shown.
After Width: | Height: | Size: 43 KiB |
Binary file not shown.
After Width: | Height: | Size: 51 KiB |
Binary file not shown.
After Width: | Height: | Size: 42 KiB |
Binary file not shown.
After Width: | Height: | Size: 58 KiB |
Binary file not shown.
After Width: | Height: | Size: 60 KiB |
@@ -336,7 +336,7 @@ SHORTCUTS_RESERVED_NAMES = [ 'archive_delete_filter', 'duplicate_filter', 'media

# shortcut commands

SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'archive_file', 'inbox_file', 'delete_file', 'remove_file_from_view', 'open_file_in_external_program', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative' ]
SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'manage_file_notes', 'archive_file', 'inbox_file', 'delete_file', 'remove_file_from_view', 'open_file_in_external_program', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative' ]
SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ 'move_animation_to_previous_frame', 'move_animation_to_next_frame', 'switch_between_fullscreen_borderless_and_regular_framed_window', 'pan_up', 'pan_down', 'pan_left', 'pan_right', 'zoom_in', 'zoom_out', 'switch_between_100_percent_and_canvas_zoom', 'flip_darkmode' ]
SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ 'view_next', 'view_first', 'view_last', 'view_previous' ]
SHORTCUTS_MAIN_GUI_ACTIONS = [ 'refresh', 'new_page', 'synchronised_wait_switch', 'set_media_focus', 'show_hide_splitters', 'set_search_focus', 'unclose_page', 'close_page', 'redo', 'undo', 'flip_darkmode', 'check_all_import_folders' ]
@@ -582,6 +582,8 @@ LOCAL_UPDATE_SERVICE_KEY = 'repository updates'

LOCAL_BOORU_SERVICE_KEY = 'local booru'

LOCAL_NOTES_SERVICE_KEY = 'local notes'

TRASH_SERVICE_KEY = 'trash'

COMBINED_LOCAL_FILE_SERVICE_KEY = 'all local files'
@@ -2792,6 +2792,8 @@ class DB( HydrusDB.HydrusDB ):

self._CreateIndex( 'files_info', [ 'duration' ] )
self._CreateIndex( 'files_info', [ 'num_frames' ] )

self._c.execute( 'CREATE TABLE file_notes ( hash_id INTEGER PRIMARY KEY, notes TEXT );' )

self._c.execute( 'CREATE TABLE file_transfers ( service_id INTEGER REFERENCES services ON DELETE CASCADE, hash_id INTEGER, PRIMARY KEY ( service_id, hash_id ) );' )
self._CreateIndex( 'file_transfers', [ 'hash_id' ] )
@@ -2908,8 +2910,7 @@ class DB( HydrusDB.HydrusDB ):

init_service_info.append( ( CC.COMBINED_FILE_SERVICE_KEY, HC.COMBINED_FILE, CC.COMBINED_FILE_SERVICE_KEY ) )
init_service_info.append( ( CC.COMBINED_TAG_SERVICE_KEY, HC.COMBINED_TAG, CC.COMBINED_TAG_SERVICE_KEY ) )
init_service_info.append( ( CC.LOCAL_BOORU_SERVICE_KEY, HC.LOCAL_BOORU, CC.LOCAL_BOORU_SERVICE_KEY ) )

self._combined_files_ac_caches = {}
init_service_info.append( ( CC.LOCAL_NOTES_SERVICE_KEY, HC.LOCAL_NOTES, CC.LOCAL_NOTES_SERVICE_KEY ) )

for ( service_key, service_type, name ) in init_service_info:
@@ -3796,6 +3797,24 @@ class DB( HydrusDB.HydrusDB ):

return desired_hashes


def _GetFileNotes( self, hash ):
    
    hash_id = self._GetHashId( hash )
    
    result = self._c.execute( 'SELECT notes FROM file_notes WHERE hash_id = ?;', ( hash_id, ) ).fetchone()
    
    if result is None:
        
        return ''
        
    else:
        
        ( notes, ) = result
        
        return notes
        
    

def _GetFileSystemPredicates( self, service_key ):
    
    service_id = self._GetServiceId( service_key )
@@ -7595,6 +7614,21 @@ class DB( HydrusDB.HydrusDB ):

elif service_type == HC.LOCAL_NOTES:
    
    if action == HC.CONTENT_UPDATE_SET:
        
        ( notes, hash ) = row
        
        hash_id = self._GetHashId( hash )
        
        self._c.execute( 'DELETE FROM file_notes WHERE hash_id = ?;', ( hash_id, ) )
        
        if len( notes ) > 0:
            
            self._c.execute( 'INSERT OR IGNORE INTO file_notes ( hash_id, notes ) VALUES ( ?, ? );', ( hash_id, notes ) )
            
        
    

if len( ultimate_mappings_ids ) + len( ultimate_deleted_mappings_ids ) + len( ultimate_pending_mappings_ids ) + len( ultimate_pending_rescinded_mappings_ids ) + len( ultimate_petitioned_mappings_ids ) + len( ultimate_petitioned_rescinded_mappings_ids ) > 0:
@@ -8187,6 +8221,7 @@ class DB( HydrusDB.HydrusDB ):

elif action == 'duplicate_types_to_counts': result = self._CacheSimilarFilesGetDupeStatusesToCounts( *args, **kwargs )
elif action == 'unique_duplicate_pairs': result = self._CacheSimilarFilesGetUniqueDuplicatePairs( *args, **kwargs )
elif action == 'file_hashes': result = self._GetFileHashes( *args, **kwargs )
elif action == 'file_notes': result = self._GetFileNotes( *args, **kwargs )
elif action == 'file_query_ids': result = self._GetHashIdsFromQuery( *args, **kwargs )
elif action == 'file_system_predicates': result = self._GetFileSystemPredicates( *args, **kwargs )
elif action == 'filter_hashes': result = self._FilterHashes( *args, **kwargs )
|
@ -10573,6 +10608,34 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
if version == 296:
|
||||
|
||||
try:
|
||||
|
||||
subscriptions = self._GetJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )
|
||||
|
||||
for subscription in subscriptions:
|
||||
|
||||
subscription.ReviveDead()
|
||||
|
||||
self._SetJSONDump( subscription )
|
||||
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.Print( 'While attempting to revive dead subscription queries, I had this problem:' )
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
|
||||
#
|
||||
|
||||
self._c.execute( 'CREATE TABLE file_notes ( hash_id INTEGER PRIMARY KEY, notes TEXT );' )
|
||||
|
||||
dictionary = ClientServices.GenerateDefaultServiceDictionary( HC.LOCAL_NOTES )
|
||||
|
||||
self._AddService( CC.LOCAL_NOTES_SERVICE_KEY, HC.LOCAL_NOTES, CC.LOCAL_NOTES_SERVICE_KEY, dictionary )
|
||||
|
||||
|
||||
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
|
||||
|
||||
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
|
||||
|
|
|
@@ -305,6 +305,12 @@ def DeletePath( path ):

HydrusPaths.DeletePath( path )


def GetAlphaOfColour( colour, alpha ):
    
    ( r, g, b, a ) = colour.Get()
    
    return wx.Colour( r, g, b, alpha )
    

def GetDifferentLighterDarkerColour( colour, intensity = 3 ):
    
    ( r, g, b, a ) = colour.Get()
@@ -791,6 +797,168 @@ class Booru( HydrusData.HydrusYAMLBase ):

sqlite3.register_adapter( Booru, yaml.safe_dump )

class CheckerOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CHECKER_OPTIONS
    SERIALISABLE_NAME = 'Checker Timing Options'
    SERIALISABLE_VERSION = 1
    
    def __init__( self, intended_files_per_check = 8, never_faster_than = 300, never_slower_than = 86400, death_file_velocity = ( 1, 86400 ) ):
        
        HydrusSerialisable.SerialisableBase.__init__( self )
        
        self._intended_files_per_check = intended_files_per_check
        self._never_faster_than = never_faster_than
        self._never_slower_than = never_slower_than
        self._death_file_velocity = death_file_velocity
        
    
    def _GetCurrentFilesVelocity( self, seed_cache, last_check_time ):
        
        ( death_files_found, death_time_delta ) = self._death_file_velocity
        
        since = last_check_time - death_time_delta
        
        current_files_found = seed_cache.GetNumNewFilesSince( since )
        
        # when a thread is only 30mins old (i.e. first file was posted 30 mins ago), we don't want to calculate based on a longer delete time delta
        # we want next check to be like 30mins from now, not 12 hours
        # so we'll say "5 files in 30 mins" rather than "5 files in 24 hours"
        
        earliest_source_time = seed_cache.GetEarliestSourceTime()
        
        if earliest_source_time is None:
            
            current_time_delta = death_time_delta
            
        else:
            
            early_time_delta = max( last_check_time - earliest_source_time, 30 )
            
            current_time_delta = min( early_time_delta, death_time_delta )
            
        
        return ( current_files_found, current_time_delta )
        
    
    def _GetSerialisableInfo( self ):
        
        return ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
        ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity ) = serialisable_info
        
    
    def GetNextCheckTime( self, seed_cache, last_check_time ):
        
        if len( seed_cache ) == 0:
            
            if last_check_time == 0:
                
                return 0 # haven't checked yet, so should check immediately
                
            else:
                
                return HydrusData.GetNow() + self._never_slower_than
                
            
        else:
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            if current_files_found == 0:
                
                # this shouldn't typically matter, since a dead checker won't care about next check time
                # so let's just have a nice safe value in case this is ever asked legit
                check_period = self._never_slower_than
                
            else:
                
                approx_time_per_file = current_time_delta / current_files_found
                
                ideal_check_period = self._intended_files_per_check * approx_time_per_file
                
                # if a thread produced lots of files and then stopped completely for whatever reason, we don't want to keep checking fast
                # so, we set a lower limit of time since last file upload, neatly doubling our check period in these situations
                
                latest_source_time = seed_cache.GetLatestSourceTime()
                
                time_since_latest_file = max( last_check_time - latest_source_time, 30 )
                
                never_faster_than = max( self._never_faster_than, time_since_latest_file )
                
                check_period = min( max( never_faster_than, ideal_check_period ), self._never_slower_than )
                
            
            return last_check_time + check_period
            
        
    
    def GetRawCurrentVelocity( self, seed_cache, last_check_time ):
        
        return self._GetCurrentFilesVelocity( seed_cache, last_check_time )
        
    
    def GetPrettyCurrentVelocity( self, seed_cache, last_check_time, no_prefix = False ):
        
        if len( seed_cache ) == 0:
            
            if last_check_time == 0:
                
                pretty_current_velocity = 'no files yet'
                
            else:
                
                pretty_current_velocity = 'no files, unable to determine velocity'
                
            
        else:
            
            if no_prefix:
                
                pretty_current_velocity = ''
                
            else:
                
                pretty_current_velocity = 'at last check, found '
                
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            pretty_current_velocity += HydrusData.ConvertIntToPrettyString( current_files_found ) + ' files in previous ' + HydrusData.ConvertTimeDeltaToPrettyString( current_time_delta )
            
        
        return pretty_current_velocity
        
    
    def IsDead( self, seed_cache, last_check_time ):
        
        if len( seed_cache ) == 0 and last_check_time == 0:
            
            return False
            
        else:
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            ( death_files_found, deleted_time_delta ) = self._death_file_velocity
            
            current_file_velocity_float = current_files_found / float( current_time_delta )
            death_file_velocity_float = death_files_found / float( deleted_time_delta )
            
            return current_file_velocity_float < death_file_velocity_float
            
        
    
    def ToTuple( self ):
        
        return ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity )
        
    

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_CHECKER_OPTIONS ] = CheckerOptions

class ClientOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CLIENT_OPTIONS
@@ -842,8 +1010,6 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

self._dictionary[ 'booleans' ][ 'add_parents_on_manage_tags' ] = True
self._dictionary[ 'booleans' ][ 'replace_siblings_on_manage_tags' ] = True

self._dictionary[ 'booleans' ][ 'get_tags_if_url_known_and_file_redundant' ] = False

self._dictionary[ 'booleans' ][ 'permit_watchers_to_name_their_pages' ] = True

self._dictionary[ 'booleans' ][ 'show_related_tags' ] = False
@@ -871,6 +1037,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

self._dictionary[ 'booleans' ][ 'ac_select_first_with_count' ] = False

self._dictionary[ 'booleans' ][ 'saving_sash_positions_on_exit' ] = True

#

self._dictionary[ 'colours' ] = HydrusSerialisable.SerialisableDictionary()
@@ -1008,7 +1176,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

example_tags = HydrusTags.CleanTags( [ 'creator:creator', 'series:series', 'title:title' ] )

tsg = ClientTags.TagSummaryGenerator( namespace_info, separator, example_tags )
tsg = ClientTags.TagSummaryGenerator( namespace_info = namespace_info, separator = separator, example_tags = example_tags )

self._dictionary[ 'tag_summary_generators' ][ 'thumbnail_top' ] = tsg

@@ -1022,7 +1190,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

example_tags = HydrusTags.CleanTags( [ 'volume:3', 'chapter:10', 'page:330', 'page:331' ] )

tsg = ClientTags.TagSummaryGenerator( namespace_info, separator, example_tags )
tsg = ClientTags.TagSummaryGenerator( namespace_info = namespace_info, separator = separator, example_tags = example_tags )

self._dictionary[ 'tag_summary_generators' ][ 'thumbnail_bottom_right' ] = tsg

@@ -1039,7 +1207,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

example_tags = HydrusTags.CleanTags( [ 'creator:creator', 'series:series', 'title:title', 'volume:1', 'chapter:1', 'page:1' ] )

tsg = ClientTags.TagSummaryGenerator( namespace_info, separator, example_tags )
tsg = ClientTags.TagSummaryGenerator( namespace_info = namespace_info, separator = separator, example_tags = example_tags )

self._dictionary[ 'tag_summary_generators' ][ 'media_viewer_top' ] = tsg
@@ -1331,6 +1499,14 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):


def FlipBoolean( self, name ):
    
    with self._lock:
        
        self._dictionary[ 'booleans' ][ name ] = not self._dictionary[ 'booleans' ][ name ]
        
    

def GetBoolean( self, name ):
    
    with self._lock:
@@ -1902,11 +2078,6 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):


if name == 'current_colourset' and it_changed:
    
    HG.client_controller.pub( 'notify_new_colourset' )
    

def SetStringList( self, name, value ):
@@ -2898,165 +3069,3 @@ class TagCensor( HydrusSerialisable.SerialisableBase ):

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_TAG_CENSOR ] = TagCensor

class CheckerOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CHECKER_OPTIONS
    SERIALISABLE_NAME = 'Checker Timing Options'
    SERIALISABLE_VERSION = 1
    
    def __init__( self, intended_files_per_check = 8, never_faster_than = 300, never_slower_than = 86400, death_file_velocity = ( 1, 86400 ) ):
        
        HydrusSerialisable.SerialisableBase.__init__( self )
        
        self._intended_files_per_check = intended_files_per_check
        self._never_faster_than = never_faster_than
        self._never_slower_than = never_slower_than
        self._death_file_velocity = death_file_velocity
        
    
    def _GetCurrentFilesVelocity( self, seed_cache, last_check_time ):
        
        ( death_files_found, death_time_delta ) = self._death_file_velocity
        
        since = last_check_time - death_time_delta
        
        current_files_found = seed_cache.GetNumNewFilesSince( since )
        
        # when a thread is only 30mins old (i.e. first file was posted 30 mins ago), we don't want to calculate based on a longer delete time delta
        # we want next check to be like 30mins from now, not 12 hours
        # so we'll say "5 files in 30 mins" rather than "5 files in 24 hours"
        
        earliest_source_time = seed_cache.GetEarliestSourceTime()
        
        if earliest_source_time is None:
            
            current_time_delta = death_time_delta
            
        else:
            
            early_time_delta = max( last_check_time - earliest_source_time, 30 )
            
            current_time_delta = min( early_time_delta, death_time_delta )
            
        
        return ( current_files_found, current_time_delta )
        
    
    def _GetSerialisableInfo( self ):
        
        return ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
        ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity ) = serialisable_info
        
    
    def GetNextCheckTime( self, seed_cache, last_check_time ):
        
        if len( seed_cache ) == 0:
            
            if last_check_time == 0:
                
                return 0 # haven't checked yet, so should check immediately
                
            else:
                
                return HydrusData.GetNow() + self._never_slower_than
                
            
        else:
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            if current_files_found == 0:
                
                # this shouldn't typically matter, since a dead checker won't care about next check time
                # so let's just have a nice safe value in case this is ever asked legit
                check_period = self._never_slower_than
                
            else:
                
                approx_time_per_file = current_time_delta / current_files_found
                
                ideal_check_period = self._intended_files_per_check * approx_time_per_file
                
                # if a thread produced lots of files and then stopped completely for whatever reason, we don't want to keep checking fast
                # so, we set a lower limit of time since last file upload, neatly doubling our check period in these situations
                
                latest_source_time = seed_cache.GetLatestSourceTime()
                
                time_since_latest_file = max( last_check_time - latest_source_time, 30 )
                
                never_faster_than = max( self._never_faster_than, time_since_latest_file )
                
                check_period = min( max( never_faster_than, ideal_check_period ), self._never_slower_than )
                
            
            return last_check_time + check_period
            
        
    
    def GetRawCurrentVelocity( self, seed_cache, last_check_time ):
        
        return self._GetCurrentFilesVelocity( seed_cache, last_check_time )
        
    
    def GetPrettyCurrentVelocity( self, seed_cache, last_check_time, no_prefix = False ):
        
        if len( seed_cache ) == 0:
            
            if last_check_time == 0:
                
                pretty_current_velocity = 'no files yet'
                
            else:
                
                pretty_current_velocity = 'no files, unable to determine velocity'
                
            
        else:
            
            if no_prefix:
                
                pretty_current_velocity = ''
                
            else:
                
                pretty_current_velocity = 'at last check, found '
                
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            pretty_current_velocity += HydrusData.ConvertIntToPrettyString( current_files_found ) + ' files in previous ' + HydrusData.ConvertTimeDeltaToPrettyString( current_time_delta )
            
        
        return pretty_current_velocity
        
    
    def IsDead( self, seed_cache, last_check_time ):
        
        if len( seed_cache ) == 0 and last_check_time == 0:
            
            return False
            
        else:
            
            ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( seed_cache, last_check_time )
            
            ( death_files_found, deleted_time_delta ) = self._death_file_velocity
            
            current_file_velocity_float = current_files_found / float( current_time_delta )
            death_file_velocity_float = death_files_found / float( deleted_time_delta )
            
            return current_file_velocity_float < death_file_velocity_float
            
        
    
    def ToTuple( self ):
        
        return ( self._intended_files_per_check, self._never_faster_than, self._never_slower_than, self._death_file_velocity )
        
    

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_CHECKER_OPTIONS ] = CheckerOptions
@@ -24,20 +24,6 @@ import HydrusGlobals as HG

URL_EXTRA_INFO = {}
URL_EXTRA_INFO_LOCK = threading.Lock()

def GetExtraURLInfo( url ):
    
    with URL_EXTRA_INFO_LOCK:
        
        if url in URL_EXTRA_INFO:
            
            return URL_EXTRA_INFO[ url ]
            
        else:
            
            return None
            
        
    

def GetGalleryStreamIdentifiers( gallery_identifier ):
    
    site_type = gallery_identifier.GetSiteType()

@@ -57,13 +43,6 @@ def GetGalleryStreamIdentifiers( gallery_identifier ):

return gallery_stream_identifiers

def SetExtraURLInfo( url, info ):
    
    with URL_EXTRA_INFO_LOCK:
        
        URL_EXTRA_INFO[ url ] = info
        
    

def GetGallery( gallery_identifier ):
    
    site_type = gallery_identifier.GetSiteType()
@@ -78,10 +57,6 @@ def GetGallery( gallery_identifier ):

return GalleryDeviantArt()

elif site_type == HC.SITE_TYPE_GIPHY:
    
    return GalleryGiphy()
    
elif site_type in ( HC.SITE_TYPE_HENTAI_FOUNDRY, HC.SITE_TYPE_HENTAI_FOUNDRY_ARTIST ):
    
    return GalleryHentaiFoundry()
@@ -606,9 +581,22 @@ class Gallery( object ):

data = self._FetchData( gallery_url )

( page_of_urls, definitely_no_more_pages ) = self._ParseGalleryPage( data, gallery_url )
( page_of_urls_and_tags, definitely_no_more_pages ) = self._ParseGalleryPage( data, gallery_url )

return ( page_of_urls, definitely_no_more_pages )
import ClientImporting

page_of_seeds = []

for ( url, tags ) in page_of_urls_and_tags:
    
    seed = ClientImporting.Seed( ClientImporting.SEED_TYPE_URL, url )
    
    seed.AddTags( tags )
    
    page_of_seeds.append( seed )
    

return ( page_of_seeds, definitely_no_more_pages )

def GetTags( self, url ):
@@ -814,7 +802,9 @@ class GalleryBooru( Gallery ):

urls = [ ClientData.ConvertHTTPToHTTPS( url ) for url in urls ]

return ( urls, definitely_no_more_pages )
urls_and_tags = [ ( url, set() ) for url in urls ]

return ( urls_and_tags, definitely_no_more_pages )

def _ParseImagePage( self, html, url_base ):
@@ -1039,7 +1029,7 @@ class GalleryDeviantArt( Gallery ):

definitely_no_more_pages = False

urls = []
urls_and_tags = []

soup = GetSoup( html )

@@ -1053,8 +1043,6 @@ class GalleryDeviantArt( Gallery ):

url = thumb[ 'href' ] # something in the form of blah.da.com/art/blah-123456

urls.append( url )

tags = []

tags.append( 'creator:' + artist )

@@ -1071,10 +1059,10 @@ class GalleryDeviantArt( Gallery ):

SetExtraURLInfo( url, tags )
urls_and_tags.append( ( url, tags ) )

return ( urls, definitely_no_more_pages )
return ( urls_and_tags, definitely_no_more_pages )

def _ParseImagePage( self, html, referral_url ):
@@ -1158,82 +1146,7 @@ class GalleryDeviantArt( Gallery ):

def GetTags( self, url ):
    
    result = GetExtraURLInfo( url )
    
    if result is None:
        
        return []
        
    else:
        
        return result
        
    

class GalleryGiphy( Gallery ):
    
    def _GetGalleryPageURL( self, query, page_index ):
        
        tag = query
        
        return 'http://giphy.com/api/gifs?tag=' + urllib.quote( HydrusData.ToByteString( tag ).replace( ' ', '+' ), '' ) + '&page=' + str( page_index + 1 )
        
    
    def _ParseGalleryPage( self, data, url_base ):
        
        definitely_no_more_pages = False
        
        json_dict = json.loads( data )
        
        urls = []
        
        if 'data' in json_dict:
            
            json_data = json_dict[ 'data' ]
            
            for d in json_data:
                
                url = d[ 'image_original_url' ]
                id = d[ 'id' ]
                
                SetExtraURLInfo( url, id )
                
                urls.append( url )
                
            
        
        return ( urls, definitely_no_more_pages )
        
    
    def GetTags( self, url ):
        
        id = GetExtraURLInfo( url )
        
        if id is None:
            
            return []
            
        else:
            
            url = 'http://giphy.com/api/gifs/' + str( id )
            
            try:
                
                raw_json = self._FetchData( url )
                
                json_dict = json.loads( raw_json )
                
                tags_data = json_dict[ 'data' ][ 'tags' ]
                
                return [ tag_data[ 'name' ] for tag_data in tags_data ]
                
            except Exception as e:
                
                HydrusData.ShowException( e )
                
                return []
                
            
        
    return set()
    

class GalleryHentaiFoundry( Gallery ):
@@ -1303,7 +1216,9 @@ class GalleryHentaiFoundry( Gallery ):

definitely_no_more_pages = True

return ( urls, definitely_no_more_pages )
urls_and_tags = [ ( url, set() ) for url in urls ]

return ( urls_and_tags, definitely_no_more_pages )

def _ParseImagePage( self, html, url_base ):
@@ -1312,14 +1227,36 @@ class GalleryHentaiFoundry( Gallery ):

# find http://pictures.hentai-foundry.com//
# then extend it to http://pictures.hentai-foundry.com//k/KABOS/172144/image.jpg
# the .jpg bit is what we really need, but whatever

# an example of this:
# http://www.hentai-foundry.com/pictures/user/Sparrow/440257/Meroulix-LeBeau

# addendum:
# some users put pictures.hentai-foundry.com links in their profile images, which then gets repeated up above in some <meta> tag
# so, lets limit this search to a smaller bit of html

# example of this:
# http://www.hentai-foundry.com/pictures/user/teku/572881/Special-Gang-Bang

try:
    
    index = html.index( 'pictures.hentai-foundry.com' )
    image_soup = GetSoup( html )
    
    image_url = html[ index : index + 256 ]
    image_html = unicode( image_soup.find( 'section', id = 'picBox' ) )
    
    if '"' in image_url: ( image_url, gumpf ) = image_url.split( '"', 1 )
    if '&#39;' in image_url: ( image_url, gumpf ) = image_url.split( '&#39;', 1 )
    index = image_html.index( 'pictures.hentai-foundry.com' )
    
    image_url = image_html[ index : index + 256 ]
    
    if '"' in image_url:
        
        ( image_url, gumpf ) = image_url.split( '"', 1 )
        
    
    if '&#39;' in image_url:
        
        ( image_url, gumpf ) = image_url.split( '&#39;', 1 )
        
    
    image_url = 'http://' + image_url
@@ -1453,7 +1390,9 @@ class GalleryNewgrounds( Gallery ):

definitely_no_more_pages = True

return ( urls, definitely_no_more_pages )
urls_and_tags = [ ( url, set() ) for url in urls ]

return ( urls_and_tags, definitely_no_more_pages )

def _ParseImagePage( self, html, url_base ):
@@ -1599,7 +1538,9 @@ class GalleryPixiv( Gallery ):


return ( urls, definitely_no_more_pages )
urls_and_tags = [ ( url, set() ) for url in urls ]

return ( urls_and_tags, definitely_no_more_pages )

def _ParseImagePage( self, html, page_url ):
@@ -1777,7 +1718,7 @@ class GalleryTumblr( Gallery ):

json_object = json.loads( processed_raw_json )

urls = []
urls_and_tags = []

if 'posts' in json_object:

@@ -1790,8 +1731,14 @@ class GalleryTumblr( Gallery ):

raw_url_available = date_struct.tm_year > 2012

if 'tags' in post: tags = post[ 'tags' ]
else: tags = []
if 'tags' in post:
    
    tags = post[ 'tags' ]
    
else:
    
    tags = []
    

post_type = post[ 'type' ]

@@ -1833,9 +1780,7 @@ class GalleryTumblr( Gallery ):

url = ClientData.ConvertHTTPToHTTPS( url )

SetExtraURLInfo( url, tags )

urls.append( url )
urls_and_tags.append( ( url, tags ) )

except:

@@ -1857,7 +1802,7 @@ class GalleryTumblr( Gallery ):

url = vp_source[ 'src' ]

urls.append( url )
urls_and_tags.append( ( url, tags ) )

except:

@@ -1868,20 +1813,11 @@ class GalleryTumblr( Gallery ):


return ( urls, definitely_no_more_pages )
return ( urls_and_tags, definitely_no_more_pages )

def GetTags( self, url ):
    
    result = GetExtraURLInfo( url )
    
    if result is None:
        
        return []
        
    else:
        
        return result
        
    
    return set()
    
@@ -1122,7 +1122,16 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

ClientGUIMenus.AppendMenuItem( self, menu, 'refresh', 'If the current page has a search, refresh it.', self._Refresh )
ClientGUIMenus.AppendMenuItem( self, menu, 'show/hide management and preview panels', 'Show or hide the panels on the left.', self._ShowHideSplitters )

splitter_menu = wx.Menu()

ClientGUIMenus.AppendMenuItem( self, splitter_menu, 'show/hide', 'Show or hide the panels on the left.', self._ShowHideSplitters )
ClientGUIMenus.AppendSeparator( splitter_menu )
ClientGUIMenus.AppendMenuCheckItem( self, splitter_menu, 'save current page\'s sash positions on client exit', 'Set whether sash position should be saved over on client exit.', self._new_options.GetBoolean( 'saving_sash_positions_on_exit' ), self._new_options.FlipBoolean, 'saving_sash_positions_on_exit' )
ClientGUIMenus.AppendMenuItem( self, splitter_menu, 'save current page\'s sash positions now', 'Save the current page\'s sash positions.', self._SaveSplitterPositions )
ClientGUIMenus.AppendMenuItem( self, splitter_menu, 'restore all pages\' sash positions to saved value', 'Restore the current sash positions for all pages to the values that are saved.', self._RestoreSplitterPositions )

ClientGUIMenus.AppendMenu( menu, splitter_menu, 'management and preview panels' )

ClientGUIMenus.AppendSeparator( menu )
@@ -2037,6 +2046,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

self._controller.pub( 'wake_daemons' )
self._controller.gui.SetStatusBarDirty()
self._controller.pub( 'refresh_page_name' )
self._controller.pub( 'notify_new_colourset' )

def _ManageParsers( self ):
@@ -2134,7 +2144,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

def _ManageSubscriptions( self ):
    
    def wx_do_it():
    def wx_do_it( subscriptions ):
        
        if not self:

@@ -2144,13 +2154,18 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

title = 'manage subscriptions'
frame_key = 'manage_subscriptions_dialog'

with ClientGUITopLevelWindows.DialogManage( self, title, frame_key ) as dlg:
with ClientGUITopLevelWindows.DialogEdit( self, title, frame_key ) as dlg:
    
    panel = ClientGUIScrolledPanelsManagement.ManageSubscriptionsPanel( dlg )
    panel = ClientGUIScrolledPanelsEdit.EditSubscriptionsPanel( dlg, subscriptions )
    
    dlg.SetPanel( panel )
    
    dlg.ShowModal()
    if dlg.ShowModal() == wx.ID_OK:
        
        subscriptions = panel.GetValue()
        
        HG.client_controller.Write( 'serialisables_overwrite', [ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION ], subscriptions )
        
    

@@ -2188,7 +2203,9 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

controller.CallBlockingToWx( wx_do_it )
subscriptions = HG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )

controller.CallBlockingToWx( wx_do_it, subscriptions )

finally:
@@ -2588,6 +2605,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):


def _RestoreSplitterPositions( self ):
    
    self._controller.pub( 'set_splitter_positions', HC.options[ 'hpos' ], HC.options[ 'vpos' ] )
    

def _ReviewBandwidth( self ):
    
    frame = ClientGUITopLevelWindows.FrameThatTakesScrollablePanel( self, 'review bandwidth' )

@@ -2606,6 +2628,16 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

frame.SetPanel( panel )

def _SaveSplitterPositions( self ):
    
    page = self._notebook.GetCurrentMediaPage()
    
    if page is not None:
        
        ( HC.options[ 'hpos' ], HC.options[ 'vpos' ] ) = page.GetSashPositions()
        
    

def _SetPassword( self ):
    
    message = '''You can set a password to be asked for whenever the client starts.
@@ -3550,11 +3582,9 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p

self._notebook.CleanBeforeDestroy()

page = self._notebook.GetCurrentMediaPage()

if page is not None:
if self._new_options.GetBoolean( 'saving_sash_positions_on_exit' ):
    
    ( HC.options[ 'hpos' ], HC.options[ 'vpos' ] ) = page.GetSashPositions()
    self._SaveSplitterPositions()
    

ClientGUITopLevelWindows.SaveTLWSizeAndPosition( self, self._frame_key )
@@ -3608,6 +3638,8 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p

self._new_options.SetString( 'current_colourset', new_colourset )

HG.client_controller.pub( 'notify_new_colourset' )

def FlushOutPredicates( self, predicates ):
@@ -10,6 +10,7 @@ import ClientGUIDialogs
import ClientGUIDialogsManage
import ClientGUIHoverFrames
import ClientGUIMenus
import ClientGUIScrolledPanels
import ClientGUIScrolledPanelsEdit
import ClientGUIScrolledPanelsManagement
import ClientGUIShortcuts
@@ -1546,6 +1547,67 @@ class Canvas( wx.Window ):


def _ManageNotes( self ):
    
    def wx_do_it( media, notes ):
        
        if not self:
            
            return
            
        
        title = 'manage notes'
        
        with ClientGUITopLevelWindows.DialogEdit( self, title ) as dlg:
            
            panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
            
            control = wx.TextCtrl( panel, style = wx.TE_MULTILINE )
            
            size = ClientData.ConvertTextToPixels( control, ( 80, 14 ) )
            
            control.SetInitialSize( size )
            
            control.SetValue( notes )
            
            panel.SetControl( control )
            
            dlg.SetPanel( panel )
            
            wx.CallAfter( control.SetFocus )
            
            if dlg.ShowModal() == wx.ID_OK:
                
                notes = control.GetValue()
                
                hash = media.GetHash()
                
                content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( notes, hash ) ) ]
                
                service_keys_to_content_updates = { CC.LOCAL_NOTES_SERVICE_KEY : content_updates }
                
                HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
                
            
        
    
    def thread_wait( media ):
        
        # if it ultimately makes sense, I can load/cache notes in the media result
        
        notes = HG.client_controller.Read( 'file_notes', media.GetHash() )
        
        wx.CallAfter( wx_do_it, media, notes )
        
    
    if self._current_media is None:
        
        return
        
    
    HG.client_controller.CallToThread( thread_wait, self._current_media )
    

def _ManageRatings( self ):
    
    if self._current_media is None:
@ -1691,6 +1753,10 @@ class Canvas( wx.Window ):
|
|||
|
||||
self._ManageURLs()
|
||||
|
||||
elif action == 'manage_file_notes':
|
||||
|
||||
self._ManageNotes()
|
||||
|
||||
elif action == 'archive_file':
|
||||
|
||||
self._Archive()
|
||||
|
@ -2364,6 +2430,7 @@ class CanvasPanel( Canvas ):
|
|||
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'known urls', 'Manage this file\'s known URLs.', self._ManageURLs )
|
||||
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'notes', 'Manage this file\'s notes.', self._ManageNotes )
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, manage_menu, 'manage' )
|
||||
|
||||
|
@ -4659,6 +4726,7 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
|
|||
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'known urls', 'Manage this file\'s known urls.', self._ManageURLs )
|
||||
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'notes', 'Manage this file\'s notes.', self._ManageNotes )
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, manage_menu, 'manage' )
|
||||
|
||||
|
|
|
@ -1107,6 +1107,49 @@ class ChoiceSort( wx.Panel ):
|
|||
self._UpdateAscLabels()
|
||||
|
||||
|
||||
class AlphaColourControl( wx.Panel ):
|
||||
|
||||
def __init__( self, parent ):
|
||||
|
||||
wx.Panel.__init__( self, parent )
|
||||
|
||||
self._colour_picker = wx.ColourPickerCtrl( self )
|
||||
|
||||
self._alpha_selector = wx.SpinCtrl( self, min = 0, max = 255 )
|
||||
|
||||
hbox = wx.BoxSizer( wx.HORIZONTAL )
|
||||
|
||||
hbox.Add( self._colour_picker, CC.FLAGS_VCENTER )
|
||||
hbox.Add( BetterStaticText( self, 'alpha: ' ), CC.FLAGS_VCENTER )
|
||||
hbox.Add( self._alpha_selector, CC.FLAGS_VCENTER )
|
||||
|
||||
self.SetSizer( hbox )
|
||||
|
||||
|
||||
def GetValue( self ):
|
||||
|
||||
colour = self._colour_picker.GetColour()
|
||||
|
||||
( r, g, b, a ) = colour.Get() # no alpha support here, so it'll be 255
|
||||
|
||||
a = self._alpha_selector.GetValue()
|
||||
|
||||
colour = wx.Colour( r, g, b, a )
|
||||
|
||||
return colour
|
||||
|
||||
|
||||
def SetValue( self, colour ):
|
||||
|
||||
( r, g, b, a ) = colour.Get()
|
||||
|
||||
picker_colour = wx.Colour( r, g, b )
|
||||
|
||||
self._colour_picker.SetColour( picker_colour )
|
||||
|
||||
self._alpha_selector.SetValue( a )
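
GetValue is the interesting half of this control: wx.ColourPickerCtrl cannot carry alpha, so the picked colour always comes back with a=255 and the control substitutes the spinner's value before handing out a full RGBA colour. The same recombination as a plain function, with tuples standing in for wx.Colour:

def combine_rgb_and_alpha( picked_rgba, alpha_from_spinner ):
    # the picker has no alpha support, so its fourth component (always 255)
    # is discarded in favour of the separate spinner value
    ( r, g, b, _ ) = picked_rgba
    return ( r, g, b, alpha_from_spinner )

assert combine_rgb_and_alpha( ( 223, 227, 230, 255 ), 64 ) == ( 223, 227, 230, 64 )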

class ExportPatternButton( BetterButton ):
def __init__( self, parent ):

@@ -3029,8 +3072,6 @@ class TextAndGauge( wx.Panel ):
self._gauge.SetValue( value )
( DirtyEvent, EVT_DIRTY ) = wx.lib.newevent.NewEvent()
class TextAndPasteCtrl( wx.Panel ):
def __init__( self, parent, add_callable ):

@@ -3107,23 +3148,31 @@ class TextAndPasteCtrl( wx.Panel ):
class ThreadToGUIUpdater( object ):
def __init__( self, event_handler, func ):
def __init__( self, win, func ):
self._event_handler = event_handler
self._win = win
self._func = func
self._lock = threading.Lock()
self._dirty_count = 0
self._args = None
self._kwargs = None
event_handler.Bind( EVT_DIRTY, self.EventDirty )
self._doing_it = False
def EventDirty( self, event ):
def WXDoIt( self ):
with self._lock:
if not self._win:
self._win = None
return
try:
self._func( *self._args, **self._kwargs )

@@ -3134,10 +3183,11 @@ class ThreadToGUIUpdater( object ):
self._dirty_count = 0
self._doing_it = False
# the point here is that we can spam this a hundred times a second and wx will catch up to it when the single event gets processed
# the point here is that we can spam this a hundred times a second, updating the args and kwargs, and wx will catch up to it when it can
# if wx feels like running fast, it'll update at 60fps
# if not, we won't get bungled up with 10,000+ pubsub events in the event queue
def Update( self, *args, **kwargs ):

@@ -3147,19 +3197,11 @@ class ThreadToGUIUpdater( object ):
self._args = args
self._kwargs = kwargs
if self._dirty_count == 0 and not HG.view_shutdown:
if not self._doing_it and not HG.view_shutdown:
def wx_code():
if not self._event_handler:
return
wx.QueueEvent( self._event_handler, DirtyEvent() )
wx.CallAfter( self.WXDoIt )
wx.CallAfter( wx_code )
self._doing_it = True
self._dirty_count += 1
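
The rewrite above swaps the EVT_DIRTY event plumbing for a simple 'already scheduled' flag: worker threads overwrite the pending args under a lock, and only the first caller in a burst schedules a wx.CallAfter, so thousands of rapid updates collapse into one GUI call that runs with the latest args. A rough framework-free sketch of the same coalescing idea, with a callback list standing in for wx.CallAfter:

import threading

class CoalescingUpdater:

    def __init__( self, schedule, func ):
        self._schedule = schedule  # stand-in for wx.CallAfter
        self._func = func
        self._lock = threading.Lock()
        self._args = None
        self._doing_it = False

    def _do_it( self ):
        with self._lock:
            args = self._args
            self._doing_it = False
        self._func( *args )

    def Update( self, *args ):
        with self._lock:
            self._args = args  # later calls in a burst just overwrite these
            if not self._doing_it:
                self._schedule( self._do_it )
                self._doing_it = True

scheduled = []
results = []
updater = CoalescingUpdater( scheduled.append, results.append )
for i in range( 100 ):
    updater.Update( i )
scheduled.pop( 0 )()  # the single scheduled call runs with the latest args
assert results == [ 99 ]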

@@ -2466,7 +2466,7 @@ class DialogSetupExport( Dialog ):
def wx_update_label( text ):
if not self:
if not self or not self._export:
return

@@ -2476,7 +2476,7 @@ class DialogSetupExport( Dialog ):
def wx_done():
if not self:
if not self or not self._export:
return

@@ -2473,7 +2473,7 @@ class DialogManageImportFoldersEdit( ClientGUIDialogs.Dialog ):
self._tag_box = ClientGUICommon.StaticBox( self._panel, 'tag options' )
self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._tag_box, [], tag_import_options )
self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._tag_box, [], tag_import_options, show_url_options = False )
filename_tagging_options_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._tag_box )

@@ -1010,13 +1010,14 @@ class EditFilenameTaggingOptionPanel( ClientGUIScrolledPanels.EditPanel ):
class TagImportOptionsButton( ClientGUICommon.BetterButton ):
def __init__( self, parent, namespaces, tag_import_options, update_callable = None ):
def __init__( self, parent, namespaces, tag_import_options, update_callable = None, show_url_options = True ):
ClientGUICommon.BetterButton.__init__( self, parent, 'tag import options', self._EditOptions )
self._namespaces = namespaces
self._tag_import_options = tag_import_options
self._update_callable = update_callable
self._show_url_options = show_url_options
self._SetToolTip()

@@ -1025,7 +1026,7 @@ class TagImportOptionsButton( ClientGUICommon.BetterButton ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptions( dlg, self._namespaces, self._tag_import_options )
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptionsPanel( dlg, self._namespaces, self._tag_import_options, show_url_options = self._show_url_options )
dlg.SetPanel( panel )

@@ -1040,7 +1041,7 @@ class TagImportOptionsButton( ClientGUICommon.BetterButton ):
def _SetToolTip( self ):
self.SetToolTip( self._tag_import_options.GetSummary() )
self.SetToolTip( self._tag_import_options.GetSummary( self._show_url_options ) )
def _SetValue( self, tag_import_options ):

@@ -1206,9 +1206,19 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
def _THREADWaitOnJob( self, job_key ):
def wx_done():
if not self:
return
self._RefreshAndUpdateStatus()
while not job_key.IsDone():
if HydrusThreading.IsThreadShuttingDown:
if HydrusThreading.IsThreadShuttingDown():
return

@@ -1216,7 +1226,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
time.sleep( 0.25 )
wx.CallAfter( self._RefreshAndUpdateStatus )
wx.CallAfter( wx_done )
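
This hunk fixes a real bug: the old guard tested the function object HydrusThreading.IsThreadShuttingDown rather than calling it, and a function object is always truthy, so the watcher thread bailed out immediately and the refresh never fired. The wrapped wx_done also gains the usual 'is the window still alive' check. The pattern, reduced to its bones:

import threading, time

shutdown_requested = threading.Event()  # stand-in for HydrusThreading.IsThreadShuttingDown()

def wait_on_job( job_is_done, schedule_on_gui, on_done ):
    # poll cheaply off the GUI thread, then marshal the finish back over;
    # note the parentheses on the shutdown check, the whole point of the fix
    while not job_is_done():
        if shutdown_requested.is_set():
            return
        time.sleep( 0.25 )
    schedule_on_gui( on_done )  # stand-in for wx.CallAfter( wx_done )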

def EventSearchDistanceChanged( self, event ):

@@ -1316,23 +1326,6 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
self._query_input = ClientGUICommon.TextAndPasteCtrl( self._pending_queries_panel, self._PendQueries )
menu_items = []
invert_call = self._gallery_import.InvertGetTagsIfURLKnownAndFileRedundant
value_call = self._gallery_import.GetTagsIfURLKnownAndFileRedundant
check_manager = ClientGUICommon.CheckboxManagerCalls( invert_call, value_call )
menu_items.append( ( 'check', 'get tags even if url is known and file is already in db (this downloader)', 'If this is selected, the client will fetch the tags from a file\'s page even if it has the file and already previously downloaded it from that location.', check_manager ) )
menu_items.append( ( 'separator', 0, 0, 0 ) )
check_manager = ClientGUICommon.CheckboxManagerOptions( 'get_tags_if_url_known_and_file_redundant' )
menu_items.append( ( 'check', 'get tags even if url is known and file is already in db (default)', 'Set the default for this value.', check_manager ) )
self._cog_button = ClientGUICommon.MenuBitmapButton( self._gallery_downloader_panel, CC.GlobalBMPs.cog, menu_items )
self._file_limit = ClientGUICommon.NoneableSpinCtrl( self._gallery_downloader_panel, 'stop after this many files', min = 1, none_phrase = 'no limit' )
self._file_limit.Bind( wx.EVT_SPINCTRL, self.EventFileLimit )
self._file_limit.SetToolTip( 'per query, stop searching the gallery once this many files has been reached' )

@@ -1385,7 +1378,6 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
self._gallery_downloader_panel.Add( self._import_queue_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
self._gallery_downloader_panel.Add( self._gallery_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
self._gallery_downloader_panel.Add( self._pending_queries_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
self._gallery_downloader_panel.Add( self._cog_button, CC.FLAGS_LONE_BUTTON )
self._gallery_downloader_panel.Add( self._file_limit, CC.FLAGS_EXPAND_PERPENDICULAR )
self._gallery_downloader_panel.Add( self._file_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
self._gallery_downloader_panel.Add( self._tag_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )

@@ -8,6 +8,7 @@ import ClientGUICommon
import ClientGUIDialogs
import ClientGUIDialogsManage
import ClientGUIMenus
import ClientGUIScrolledPanels
import ClientGUIScrolledPanelsEdit
import ClientGUIScrolledPanelsManagement
import ClientGUIShortcuts

@@ -839,6 +840,69 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
def _ManageNotes( self ):
def wx_do_it( media, notes ):
if not self:
return
title = 'manage notes'
with ClientGUITopLevelWindows.DialogEdit( self, title ) as dlg:
panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
control = wx.TextCtrl( panel, style = wx.TE_MULTILINE )
size = ClientData.ConvertTextToPixels( control, ( 80, 14 ) )
control.SetInitialSize( size )
control.SetValue( notes )
panel.SetControl( control )
dlg.SetPanel( panel )
wx.CallAfter( control.SetFocus )
if dlg.ShowModal() == wx.ID_OK:
notes = control.GetValue()
hash = media.GetHash()
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( notes, hash ) ) ]
service_keys_to_content_updates = { CC.LOCAL_NOTES_SERVICE_KEY : content_updates }
HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
self.SetFocus()
def thread_wait( media ):
# if it ultimately makes sense, I can load/cache notes in the media result
notes = HG.client_controller.Read( 'file_notes', media.GetHash() )
wx.CallAfter( wx_do_it, media, notes )
if self._focussed_media is None:
return
HG.client_controller.CallToThread( thread_wait, self._focussed_media.GetDisplayMedia() )
def _ManageRatings( self ):
if len( self._selected_media ) > 0:

@@ -1051,6 +1115,10 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
self._ManageURLs()
elif action == 'manage_file_notes':
self._ManageNotes()
elif action == 'archive_file':
self._Archive()

@@ -3211,6 +3279,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'file\'s known urls', 'Manage urls for the focused file.', self._ManageURLs )
ClientGUIMenus.AppendMenuItem( self, manage_menu, 'file\'s notes', 'Manage notes for the focused file.', self._ManageNotes )
ClientGUIMenus.AppendMenu( menu, manage_menu, 'manage' )

@@ -4007,52 +4076,63 @@ class Thumbnail( Selectable ):
tags = siblings_manager.CollapseTags( CC.COMBINED_TAG_SERVICE_KEY, tags )
tags_summary_generator = new_options.GetTagSummaryGenerator( 'thumbnail_top' )
upper_tag_summary_generator = new_options.GetTagSummaryGenerator( 'thumbnail_top' )
upper_summary = tags_summary_generator.GenerateSummary( tags )
upper_summary = upper_tag_summary_generator.GenerateSummary( tags )
if len( upper_summary ) > 0:
dc.SetFont( wx.SystemSettings.GetFont( wx.SYS_DEFAULT_GUI_FONT ) )
( text_x, text_y ) = dc.GetTextExtent( upper_summary )
top_left_x = int( ( width - text_x ) / 2 )
top_left_y = CC.THUMBNAIL_BORDER
dc.SetBrush( wx.Brush( CC.COLOUR_UNSELECTED ) )
dc.SetTextForeground( CC.COLOUR_SELECTED_DARK )
dc.SetPen( wx.TRANSPARENT_PEN )
dc.DrawRectangle( 0, top_left_y - 1, width, text_y + 2 )
dc.DrawText( upper_summary, top_left_x, top_left_y )
lower_tag_summary_generator = new_options.GetTagSummaryGenerator( 'thumbnail_bottom_right' )
tags_summary_generator = new_options.GetTagSummaryGenerator( 'thumbnail_bottom_right' )
lower_summary = lower_tag_summary_generator.GenerateSummary( tags )
lower_summary = tags_summary_generator.GenerateSummary( tags )
if len( lower_summary ) > 0:
if len( upper_summary ) > 0 or len( lower_summary ) > 0:
dc.SetFont( wx.SystemSettings.GetFont( wx.SYS_DEFAULT_GUI_FONT ) )
gc = wx.GraphicsContext.Create( dc )
( text_x, text_y ) = dc.GetTextExtent( lower_summary )
if len( upper_summary ) > 0:
text_colour_with_alpha = upper_tag_summary_generator.GetTextColour()
gc.SetFont( wx.SystemSettings.GetFont( wx.SYS_DEFAULT_GUI_FONT ), text_colour_with_alpha )
background_colour_with_alpha = upper_tag_summary_generator.GetBackgroundColour()
gc.SetBrush( wx.Brush( background_colour_with_alpha ) )
gc.SetPen( wx.TRANSPARENT_PEN )
( text_x, text_y ) = gc.GetTextExtent( upper_summary )
top_left_x = int( ( width - text_x ) / 2 )
top_left_y = CC.THUMBNAIL_BORDER
gc.DrawRectangle( 0, top_left_y - 1, width, text_y + 2 )
gc.DrawText( upper_summary, top_left_x, top_left_y )
top_left_x = width - text_x - CC.THUMBNAIL_BORDER
top_left_y = height - text_y - CC.THUMBNAIL_BORDER
if len( lower_summary ) > 0:
text_colour_with_alpha = lower_tag_summary_generator.GetTextColour()
gc.SetFont( wx.SystemSettings.GetFont( wx.SYS_DEFAULT_GUI_FONT ), text_colour_with_alpha )
background_colour_with_alpha = lower_tag_summary_generator.GetBackgroundColour()
gc.SetBrush( wx.Brush( background_colour_with_alpha ) )
gc.SetPen( wx.TRANSPARENT_PEN )
( text_x, text_y ) = gc.GetTextExtent( lower_summary )
top_left_x = width - text_x - CC.THUMBNAIL_BORDER
top_left_y = height - text_y - CC.THUMBNAIL_BORDER
gc.DrawRectangle( top_left_x - 1, top_left_y - 1, text_x + 2, text_y + 2 )
gc.DrawText( lower_summary, top_left_x, top_left_y )
dc.SetBrush( wx.Brush( CC.COLOUR_UNSELECTED ) )
dc.SetTextForeground( CC.COLOUR_SELECTED_DARK )
dc.SetPen( wx.TRANSPARENT_PEN )
dc.DrawRectangle( top_left_x - 1, top_left_y - 1, text_x + 2, text_y + 2 )
dc.DrawText( lower_summary, top_left_x, top_left_y )
del gc
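
Both banners now render through a single wx.GraphicsContext because plain wx.DC brushes ignore the alpha channel; the RGBA colours the summary generators now carry only actually blend when drawn via a graphics context. The repeated draw steps could be folded into a helper along these lines (a hypothetical refactor, mirroring the two-value GetTextExtent usage seen in this diff):

import wx

def draw_summary_banner( gc, text, generator, x, y ):
    # a wx.DC brush would drop the alpha; the GraphicsContext blends it
    font = wx.SystemSettings.GetFont( wx.SYS_DEFAULT_GUI_FONT )
    gc.SetFont( font, generator.GetTextColour() )
    gc.SetBrush( wx.Brush( generator.GetBackgroundColour() ) )
    gc.SetPen( wx.TRANSPARENT_PEN )
    ( text_x, text_y ) = gc.GetTextExtent( text )
    gc.DrawRectangle( x - 1, y - 1, text_x + 2, text_y + 2 )
    gc.DrawText( text, x, y )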

@@ -434,6 +434,7 @@ class Page( wx.SplitterWindow ):
self._controller.sub( self, 'SetPrettyStatus', 'new_page_status' )
self._controller.sub( self, 'SetSplitterPositions', 'set_splitter_positions' )
def _SetPrettyStatus( self, status ):

@@ -695,6 +696,32 @@ class Page( wx.SplitterWindow ):
self._controller.pub( 'set_search_focus', self._page_key )
def SetSplitterPositions( self, hpos, vpos ):
if self._search_preview_split.IsSplit():
self._search_preview_split.SetSashPosition( vpos )
else:
self._search_preview_split.SplitHorizontally( self._management_panel, self._preview_panel, vpos )
if self.IsSplit():
self.SetSashPosition( hpos )
else:
self.SplitVertically( self._search_preview_split, self._media_panel, hpos )
if HC.options[ 'hide_preview' ]:
wx.CallAfter( self._search_preview_split.Unsplit, self._preview_panel )
def SetSynchronisedWait( self ):
self._controller.pub( 'synchronised_wait_switch', self._page_key )

@@ -4482,9 +4482,9 @@ class TestPanel( wx.Panel ):
if len( example_data ) > 0:
parse_phrase = 'did not parse'
parse_phrase = 'uncertain data type'
# can't just throw this at bs4, as that'll wrap any unparsable string in some bare <html><body><p> tags
# can't just throw this at bs4 to see if it 'works', as it'll just wrap any unparsable string in some bare <html><body><p> tags
if '<html' in example_data:
parse_phrase = 'looks like HTML'

File diff suppressed because it is too large

@@ -1529,8 +1529,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
HC.options[ 'namespace_colours' ] = self._namespace_colours.GetNamespaceColours()
HG.client_controller.pub( 'notify_new_colourset' )
class _ConnectionPanel( wx.Panel ):

@@ -1914,7 +1912,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptions( dlg, namespaces, tag_import_options )
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptionsPanel( dlg, namespaces, tag_import_options )
dlg.SetPanel( panel )

@@ -1961,7 +1959,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptions( dlg, namespaces, tag_import_options )
panel = ClientGUIScrolledPanelsEdit.EditTagImportOptionsPanel( dlg, namespaces, tag_import_options )
dlg.SetPanel( panel )

@@ -2884,6 +2882,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
media_zooms = [ float( media_zoom ) for media_zoom in self._media_zooms.GetValue().split( ',' ) ]
media_zooms = [ media_zoom for media_zoom in media_zooms if media_zoom > 0.0 ]
if len( media_zooms ) > 0:
self._new_options.SetMediaZooms( media_zooms )

@@ -4866,693 +4866,6 @@ class ManageShortcutsPanel( ClientGUIScrolledPanels.ManagePanel ):
class ManageSubscriptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
def __init__( self, parent ):
ClientGUIScrolledPanels.ManagePanel.__init__( self, parent )
subscriptions = HG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )
#
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/getting_started_subscriptions.html' )
menu_items.append( ( 'normal', 'open the html subscriptions help', 'Open the help page for subscriptions in your web browser.', page_func ) )
help_button = ClientGUICommon.MenuBitmapButton( self, CC.GlobalBMPs.help, menu_items )
help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', wx.Colour( 0, 0, 255 ) )
subscriptions_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
columns = [ ( 'name', -1 ), ( 'site', 20 ), ( 'query status', 25 ), ( 'last new file time', 20 ), ( 'last checked', 20 ), ( 'recent error/delay?', 20 ), ( 'urls', 8 ), ( 'failures', 8 ), ( 'paused', 8 ) ]
self._subscriptions = ClientGUIListCtrl.BetterListCtrl( subscriptions_panel, 'subscriptions', 25, 20, columns, self._ConvertSubscriptionToListCtrlTuples, delete_key_callback = self.Delete, activation_callback = self.Edit )
subscriptions_panel.SetListCtrl( self._subscriptions )
subscriptions_panel.AddButton( 'add', self.Add )
menu_items = []
menu_items.append( ( 'normal', 'to clipboard', 'Serialise the script and put it on your clipboard.', self.ExportToClipboard ) )
menu_items.append( ( 'normal', 'to png', 'Serialise the script and encode it to an image file you can easily share with other hydrus users.', self.ExportToPng ) )
subscriptions_panel.AddMenuButton( 'export', menu_items, enabled_only_on_selection = True )
menu_items = []
menu_items.append( ( 'normal', 'from clipboard', 'Load a script from text in your clipboard.', self.ImportFromClipboard ) )
menu_items.append( ( 'normal', 'from png', 'Load a script from an encoded png.', self.ImportFromPng ) )
subscriptions_panel.AddMenuButton( 'import', menu_items )
subscriptions_panel.AddButton( 'duplicate', self.Duplicate, enabled_only_on_selection = True )
subscriptions_panel.AddButton( 'edit', self.Edit, enabled_only_on_selection = True )
subscriptions_panel.AddButton( 'delete', self.Delete, enabled_only_on_selection = True )
subscriptions_panel.NewButtonRow()
subscriptions_panel.AddButton( 'merge', self.Merge, enabled_check_func = self._CanMerge )
subscriptions_panel.AddButton( 'separate', self.Separate, enabled_check_func = self._CanSeparate )
subscriptions_panel.AddSeparator()
subscriptions_panel.AddButton( 'pause/resume', self.PauseResume, enabled_only_on_selection = True )
subscriptions_panel.AddButton( 'retry failures', self.RetryFailures, enabled_check_func = self._CanRetryFailures )
subscriptions_panel.AddButton( 'scrub delays', self.ScrubDelays, enabled_check_func = self._CanScrubDelays )
subscriptions_panel.AddButton( 'check queries now', self.CheckNow, enabled_check_func = self._CanCheckNow )
subscriptions_panel.AddButton( 'reset', self.Reset, enabled_check_func = self._CanReset )
subscriptions_panel.NewButtonRow()
subscriptions_panel.AddButton( 'select subscriptions', self.SelectSubscriptions )
subscriptions_panel.AddButton( 'overwrite checker timings', self.SetCheckerOptions, enabled_only_on_selection = True )
#
self._subscriptions.AddDatas( subscriptions )
#
vbox = wx.BoxSizer( wx.VERTICAL )
vbox.Add( help_hbox, CC.FLAGS_BUTTON_SIZER )
vbox.Add( subscriptions_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
self.SetSizer( vbox )
def _CanCheckNow( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
return True in ( subscription.CanCheckNow() for subscription in subscriptions )
def _CanMerge( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
# only subs with queries can be merged
subscriptions = [ subscription for subscription in subscriptions if len( subscription.GetQueries() ) > 0 ]
gallery_identifiers = { subscription.GetGalleryIdentifier() for subscription in subscriptions }
# if there are fewer, there must be dupes, so we must be able to merge
return len( gallery_identifiers ) < len( subscriptions )
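
That closing comparison is a compact duplicate test: collapsing the gallery identifiers into a set can only produce fewer entries than subscriptions if at least two subscriptions share one. The same trick on plain values:

def has_duplicates( values ):
    # a set drops repeats, so it is strictly smaller than its input iff dupes exist
    values = list( values )
    return len( set( values ) ) < len( values )

assert has_duplicates( [ 'gelbooru', 'gelbooru', 'tumblr' ] )
assert not has_duplicates( [ 'gelbooru', 'tumblr' ] )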

def _CanReset( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
return True in ( subscription.CanReset() for subscription in subscriptions )
def _CanRetryFailures( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
return True in ( subscription.CanRetryFailures() for subscription in subscriptions )
def _CanScrubDelays( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
return True in ( subscription.CanScrubDelay() for subscription in subscriptions )
def _CanSeparate( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
if len( subscription.GetQueries() ) > 1:
return True
return False
def _ConvertSubscriptionToListCtrlTuples( self, subscription ):
( name, gallery_identifier, gallery_stream_identifiers, queries, checker_options, get_tags_if_url_known_and_file_redundant, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until, no_work_until_reason ) = subscription.ToTuple()
pretty_site = gallery_identifier.ToString()
period = 100
pretty_period = 'fix this'
if len( queries ) > 0:
last_new_file_time = max( ( query.GetLatestAddedTime() for query in queries ) )
pretty_last_new_file_time = HydrusData.ConvertTimestampToPrettyAgo( last_new_file_time )
last_checked = max( ( query.GetLastChecked() for query in queries ) )
pretty_last_checked = HydrusData.ConvertTimestampToPrettyAgo( last_checked )
else:
last_new_file_time = 0
pretty_last_new_file_time = 'n/a'
last_checked = 0
pretty_last_checked = 'n/a'
#
num_queries = len( queries )
num_dead = 0
num_paused = 0
for query in queries:
if query.IsDead():
num_dead += 1
elif query.IsPaused():
num_paused += 1
num_ok = num_queries - ( num_dead + num_paused )
status = ( num_queries, num_paused, num_dead )
if num_queries == 0:
pretty_status = 'no queries'
else:
status_components = [ HydrusData.ConvertIntToPrettyString( num_ok ) + ' working' ]
if num_paused > 0:
status_components.append( HydrusData.ConvertIntToPrettyString( num_paused ) + ' paused' )
if num_dead > 0:
status_components.append( HydrusData.ConvertIntToPrettyString( num_dead ) + ' dead' )
pretty_status = ', '.join( status_components )
#
if HydrusData.TimeHasPassed( no_work_until ):
pretty_delay = ''
delay = 0
else:
pretty_delay = 'delaying ' + HydrusData.ConvertTimestampToPrettyPending( no_work_until, prefix = 'for' ) + ' - ' + no_work_until_reason
delay = HydrusData.GetTimeDeltaUntilTime( no_work_until )
num_urls_done = 0
num_urls = 0
num_failed = 0
for query in queries:
( query_num_urls_unknown, query_num_urls, query_num_failed ) = query.GetNumURLsAndFailed()
num_urls_done += query_num_urls - query_num_urls_unknown
num_urls += query_num_urls
num_failed += query_num_failed
if num_urls_done == num_urls:
pretty_urls = HydrusData.ConvertIntToPrettyString( num_urls )
else:
pretty_urls = HydrusData.ConvertValueRangeToPrettyString( num_urls_done, num_urls )
if num_urls > 0:
sort_float = float( num_urls_done ) / num_urls
else:
sort_float = 0.0
num_urls_sortable = ( sort_float, num_urls, num_urls_done )
pretty_failures = HydrusData.ConvertIntToPrettyString( num_failed )
if paused:
pretty_paused = 'yes'
else:
pretty_paused = ''
display_tuple = ( name, pretty_site, pretty_status, pretty_last_new_file_time, pretty_last_checked, pretty_delay, pretty_urls, pretty_failures, pretty_paused )
sort_tuple = ( name, pretty_site, status, last_new_file_time, last_checked, delay, num_urls_sortable, num_failed, paused )
return ( display_tuple, sort_tuple )
def _GetExistingNames( self ):
subscriptions = self._subscriptions.GetData()
names = { subscription.GetName() for subscription in subscriptions }
return names
def _GetExportObject( self ):
to_export = HydrusSerialisable.SerialisableList()
for subscription in self._subscriptions.GetData( only_selected = True ):
to_export.append( subscription )
if len( to_export ) == 0:
return None
elif len( to_export ) == 1:
return to_export[0]
else:
return to_export
def _ImportObject( self, obj ):
if isinstance( obj, HydrusSerialisable.SerialisableList ):
for sub_obj in obj:
self._ImportObject( sub_obj )
else:
if isinstance( obj, ClientImporting.Subscription ):
subscription = obj
subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( subscription, ) )
else:
wx.MessageBox( 'That was not a subscription--it was a: ' + type( obj ).__name__ )
def Add( self ):
empty_subscription = ClientImporting.Subscription( 'new subscription' )
with ClientGUITopLevelWindows.DialogEdit( self, 'edit subscription' ) as dlg_edit:
panel = ClientGUIScrolledPanelsEdit.EditSubscriptionPanel( dlg_edit, empty_subscription )
dlg_edit.SetPanel( panel )
if dlg_edit.ShowModal() == wx.ID_OK:
new_subscription = panel.GetValue()
new_subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( new_subscription, ) )
def CheckNow( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.CheckNow()
self._subscriptions.UpdateDatas( subscriptions )
def CommitChanges( self ):
subscriptions = self._subscriptions.GetData()
HG.client_controller.Write( 'serialisables_overwrite', [ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION ], subscriptions )
# we pubsub daemon wake outside, so not needed here--it happens even on cancel
def Delete( self ):
with ClientGUIDialogs.DialogYesNo( self, 'Remove all selected?' ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
self._subscriptions.DeleteSelected()
def Duplicate( self ):
subs_to_dupe = self._subscriptions.GetData( only_selected = True )
for subscription in subs_to_dupe:
dupe_subscription = subscription.Duplicate()
dupe_subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( dupe_subscription, ) )
def Edit( self ):
subs_to_edit = self._subscriptions.GetData( only_selected = True )
for subscription in subs_to_edit:
with ClientGUITopLevelWindows.DialogEdit( self, 'edit subscription' ) as dlg:
original_name = subscription.GetName()
panel = ClientGUIScrolledPanelsEdit.EditSubscriptionPanel( dlg, subscription )
dlg.SetPanel( panel )
result = dlg.ShowModal()
if result == wx.ID_OK:
self._subscriptions.DeleteDatas( ( subscription, ) )
edited_subscription = panel.GetValue()
edited_subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( edited_subscription, ) )
elif result == wx.ID_CANCEL:
break
self._subscriptions.Sort()
def ExportToClipboard( self ):
export_object = self._GetExportObject()
if export_object is not None:
json = export_object.DumpToString()
HG.client_controller.pub( 'clipboard', 'text', json )
def ExportToPng( self ):
export_object = self._GetExportObject()
if export_object is not None:
with ClientGUITopLevelWindows.DialogNullipotent( self, 'export to png' ) as dlg:
panel = ClientGUISerialisable.PngExportPanel( dlg, export_object )
dlg.SetPanel( panel )
dlg.ShowModal()
def ImportFromClipboard( self ):
raw_text = HG.client_controller.GetClipboardText()
try:
obj = HydrusSerialisable.CreateFromString( raw_text )
self._ImportObject( obj )
except Exception as e:
wx.MessageBox( 'I could not understand what was in the clipboard' )
def ImportFromPng( self ):
with wx.FileDialog( self, 'select the png with the encoded script', wildcard = 'PNG (*.png)|*.png' ) as dlg:
if dlg.ShowModal() == wx.ID_OK:
path = HydrusData.ToUnicode( dlg.GetPath() )
try:
payload = ClientSerialisable.LoadFromPng( path )
except Exception as e:
wx.MessageBox( HydrusData.ToUnicode( e ) )
return
try:
obj = HydrusSerialisable.CreateFromNetworkString( payload )
self._ImportObject( obj )
except:
wx.MessageBox( 'I could not understand what was encoded in the png!' )
def Merge( self ):
message = 'Are you sure you want to merge the selected subscriptions? This will combine all selected subscriptions that share the same downloader, wrapping all their different queries into one subscription.'
message += os.linesep * 2
message += 'This is a big operation, so if it does not do what you expect, hit cancel afterwards!'
message += os.linesep * 2
message += 'Please note that all other subscription settings (like name and paused status and file limits and tag options) will be merged as well, so double-check your merged subs\' settings after the merge.'
with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
to_be_merged_subs = self._subscriptions.GetData( only_selected = True )
self._subscriptions.DeleteDatas( to_be_merged_subs )
to_be_merged_subs = list( to_be_merged_subs )
merged_subs = []
while len( to_be_merged_subs ) > 1:
primary_sub = to_be_merged_subs.pop()
unmerged_subs = primary_sub.Merge( to_be_merged_subs )
merged_subs.append( primary_sub )
to_be_merged_subs = unmerged_subs
self._subscriptions.AddDatas( merged_subs )
self._subscriptions.AddDatas( to_be_merged_subs )
self._subscriptions.Sort()
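
The merge loop pops one subscription as the primary, asks it to absorb every compatible remainder, and loops again on whatever its Merge call hands back, so each pass yields one combined subscription per downloader. The loop shape in isolation, with a key function standing in for the gallery identifier check that the real Merge performs:

def merge_all( items, key ):
    # pop a primary, absorb everything sharing its key, carry the rest forward
    items = list( items )
    merged = []
    while len( items ) > 1:
        primary = items.pop()
        absorbed = [ item for item in items if key( item ) == key( primary ) ]
        items = [ item for item in items if key( item ) != key( primary ) ]
        merged.append( ( primary, absorbed ) )
    return merged + [ ( item, [] ) for item in items ]

# the two 'a' subs merge; the 'b' sub passes through untouched
print( merge_all( [ 'a1', 'a2', 'b1' ], key = lambda s: s[ 0 ] ) )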

def PauseResume( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.PauseResume()
self._subscriptions.UpdateDatas( subscriptions )
def Reset( self ):
message = '''Resetting these subscriptions will delete all their remembered urls, meaning when they next run, they will try to download them all over again. This may be expensive in time and data. Only do it if you are willing to wait. Do you want to do it?'''
with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.Reset()
self._subscriptions.UpdateDatas( subscriptions )
def RetryFailures( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.RetryFailures()
self._subscriptions.UpdateDatas( subscriptions )
def ScrubDelays( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.ScrubDelay()
self._subscriptions.UpdateDatas( subscriptions )
def SelectSubscriptions( self ):
message = 'This selects subscriptions based on query text. Please enter some search text, and any subscription that has a query that includes that text will be selected.'
with ClientGUIDialogs.DialogTextEntry( self, message ) as dlg:
if dlg.ShowModal() == wx.ID_OK:
search_text = dlg.GetValue()
self._subscriptions.SelectNone()
selectee_subscriptions = []
for subscription in self._subscriptions.GetData():
if subscription.HasQuerySearchText( search_text ):
selectee_subscriptions.append( subscription )
self._subscriptions.SelectDatas( selectee_subscriptions )
def Separate( self ):
message = 'Are you sure you want to separate the selected subscriptions? This will cause all the subscriptions with multiple queries to be split into duplicates that each only have one query.'
with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
to_be_separate_subs = self._subscriptions.GetData( only_selected = True )
self._subscriptions.DeleteDatas( to_be_separate_subs )
for subscription in to_be_separate_subs:
separate_subs = subscription.Separate()
for separate_subscription in separate_subs:
separate_subscription.SetNonDupeName( self._GetExistingNames() )
self._subscriptions.AddDatas( ( separate_subscription, ) )
self._subscriptions.Sort()
def SetCheckerOptions( self ):
checker_options = ClientData.CheckerOptions( intended_files_per_check = 5, never_faster_than = 86400, never_slower_than = 90 * 86400, death_file_velocity = ( 1, 90 * 86400 ) )
with ClientGUITopLevelWindows.DialogEdit( self, 'edit check timings' ) as dlg:
panel = ClientGUITime.EditCheckerOptions( dlg, checker_options )
dlg.SetPanel( panel )
if dlg.ShowModal() == wx.ID_OK:
checker_options = panel.GetValue()
subscriptions = self._subscriptions.GetData( only_selected = True )
for subscription in subscriptions:
subscription.SetCheckerOptions( checker_options )
self._subscriptions.UpdateDatas( subscriptions )
class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
def __init__( self, parent, file_service_key, media, immediate_commit = False, canvas_key = None ):

@@ -315,7 +315,7 @@ class ReviewAllBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._history_time_delta_none = wx.CheckBox( self, label = 'show all' )
self._history_time_delta_none.Bind( wx.EVT_CHECKBOX, self.EventTimeDeltaChanged )
self._bandwidths = ClientGUIListCtrl.BetterListCtrl( self, 'bandwidth review', 20, 30, [ ( 'name', -1 ), ( 'type', 14 ), ( 'current usage', 14 ), ( 'past 24 hours', 15 ), ( 'this month', 12 ), ( 'has specific rules', 18 ), ( 'blocked?', 10 ) ], self._ConvertNetworkContextsToListCtrlTuples, activation_callback = self.ShowNetworkContext )
self._bandwidths = ClientGUIListCtrl.BetterListCtrl( self, 'bandwidth review', 20, 30, [ ( 'name', -1 ), ( 'type', 14 ), ( 'current usage', 14 ), ( 'past 24 hours', 15 ), ( 'search distance', 17 ), ( 'this month', 12 ), ( 'has specific rules', 18 ), ( 'blocked?', 10 ) ], self._ConvertNetworkContextsToListCtrlTuples, activation_callback = self.ShowNetworkContext )
self._edit_default_bandwidth_rules_button = ClientGUICommon.BetterButton( self, 'edit default bandwidth rules', self._EditDefaultBandwidthRules )

@@ -328,7 +328,7 @@ class ReviewAllBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
#
self._history_time_delta_threshold.SetValue( 86400 * 30 )
self._history_time_delta_threshold.SetValue( 86400 * 7 )
self._bandwidths.Sort( 0 )

@@ -367,8 +367,33 @@ class ReviewAllBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
sortable_network_context = ( network_context.context_type, network_context.context_data )
sortable_context_type = CC.network_context_type_string_lookup[ network_context.context_type ]
current_usage = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, 1, for_user = True )
day_usage = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, 86400 )
month_usage = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, None )
day_usage_requests = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_REQUESTS, 86400 )
day_usage_data = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, 86400 )
day_usage = ( day_usage_data, day_usage_requests )
month_usage_requests = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_REQUESTS, None )
month_usage_data = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, None )
month_usage = ( month_usage_data, month_usage_requests )
if self._history_time_delta_none.GetValue():
search_usage = 0
pretty_search_usage = ''
else:
search_delta = self._history_time_delta_threshold.GetValue()
search_usage_requests = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_REQUESTS, search_delta )
search_usage_data = bandwidth_tracker.GetUsage( HC.BANDWIDTH_TYPE_DATA, search_delta )
search_usage = ( search_usage_data, search_usage_requests )
pretty_search_usage = HydrusData.ConvertIntToBytes( search_usage_data ) + ' in ' + HydrusData.ConvertIntToPrettyString( search_usage_requests ) + ' requests'
pretty_network_context = network_context.ToUnicode()
pretty_context_type = CC.network_context_type_string_lookup[ network_context.context_type ]

@@ -382,8 +407,8 @@ class ReviewAllBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
pretty_current_usage = HydrusData.ConvertIntToBytes( current_usage ) + '/s'
pretty_day_usage = HydrusData.ConvertIntToBytes( day_usage )
pretty_month_usage = HydrusData.ConvertIntToBytes( month_usage )
pretty_day_usage = HydrusData.ConvertIntToBytes( day_usage_data ) + ' in ' + HydrusData.ConvertIntToPrettyString( day_usage_requests ) + ' requests'
pretty_month_usage = HydrusData.ConvertIntToBytes( month_usage_data ) + ' in ' + HydrusData.ConvertIntToPrettyString( month_usage_requests ) + ' requests'
if has_rules:

@@ -405,7 +430,10 @@ class ReviewAllBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
pretty_blocked = ''
return ( ( pretty_network_context, pretty_context_type, pretty_current_usage, pretty_day_usage, pretty_month_usage, pretty_has_rules, pretty_blocked ), ( sortable_network_context, sortable_context_type, current_usage, day_usage, month_usage, has_rules, blocked ) )
display_tuple = ( pretty_network_context, pretty_context_type, pretty_current_usage, pretty_day_usage, pretty_search_usage, pretty_month_usage, pretty_has_rules, pretty_blocked )
sort_tuple = ( sortable_network_context, sortable_context_type, current_usage, day_usage, search_usage, month_usage, has_rules, blocked )
return ( display_tuple, sort_tuple )
def _DeleteNetworkContexts( self ):

File diff suppressed because it is too large

@@ -41,11 +41,15 @@ class HydrusResourceBooruFile( HydrusResourceBooru ):
HG.client_controller.local_booru_manager.CheckFileAuthorised( share_key, hash )
media_result = HG.client_controller.local_booru_manager.GetMediaResult( share_key, hash )
mime = media_result.GetMime()
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( hash )
path = client_files_manager.GetFilePath( hash, mime )
response_context = HydrusServerResources.ResponseContext( 200, path = path )
response_context = HydrusServerResources.ResponseContext( 200, mime = mime, path = path )
return response_context

@@ -248,9 +252,18 @@ class HydrusResourceBooruThumbnail( HydrusResourceBooru ):
response_context_mime = HC.APPLICATION_UNKNOWN
elif mime in HC.AUDIO: path = os.path.join( HC.STATIC_DIR, 'audio.png' )
elif mime == HC.APPLICATION_PDF: path = os.path.join( HC.STATIC_DIR, 'pdf.png' )
else: path = os.path.join( HC.STATIC_DIR, 'hydrus.png' )
elif mime in HC.AUDIO:
path = os.path.join( HC.STATIC_DIR, 'audio.png' )
elif mime == HC.APPLICATION_PDF:
path = os.path.join( HC.STATIC_DIR, 'pdf.png' )
else:
path = os.path.join( HC.STATIC_DIR, 'hydrus.png' )
response_context = HydrusServerResources.ResponseContext( 200, mime = response_context_mime, path = path )

@@ -1689,7 +1689,7 @@ class MediaSingleton( Media ):
new_options = HG.client_controller.new_options
tags_summary_generator = new_options.GetTagSummaryGenerator( 'media_viewer_top' )
tag_summary_generator = new_options.GetTagSummaryGenerator( 'media_viewer_top' )
tm = self.GetTagsManager()

@@ -1704,7 +1704,7 @@ class MediaSingleton( Media ):
tags = siblings_manager.CollapseTags( CC.COMBINED_TAG_SERVICE_KEY, tags )
summary = tags_summary_generator.GenerateSummary( tags )
summary = tag_summary_generator.GenerateSummary( tags )
return summary

@@ -338,22 +338,6 @@ class NetworkBandwidthManager( HydrusSerialisable.SerialisableBase ):
result = set()
for ( network_context, bandwidth_rules ) in self._network_contexts_to_bandwidth_rules.items():
if network_context.IsDefault() or network_context.IsEphemeral():
continue
# if a context has rules but no activity, list it so the user can edit the rules if needed
# in case they set too restrictive rules on an old context and now can't get it up again with activity because of the rules!
if network_context not in self._network_contexts_to_bandwidth_trackers or self._network_contexts_to_bandwidth_trackers[ network_context ].GetUsage( HC.BANDWIDTH_TYPE_REQUESTS, None ) == 0:
result.add( network_context )
for ( network_context, bandwidth_tracker ) in self._network_contexts_to_bandwidth_trackers.items():
if network_context.IsDefault() or network_context.IsEphemeral():

@@ -247,11 +247,33 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
# is before
# post url
# also, put more 'precise' URL types above more typically permissive, in the order:
# file
# post
# gallery
# watchable
# sorting in reverse, so higher number means more precise
def key( u_m ):
u_t = u_m.GetURLType()
if u_t == HC.URL_TYPE_FILE:
u_t_precision_value = 2
elif u_t == HC.URL_TYPE_POST:
u_t_precision_value = 1
else:
u_t_precision_value = 0
u_e = u_m.GetExampleURL()
return ( u_e.count( '/' ), u_e.count( '=' ) )
return ( u_t_precision_value, u_e.count( '/' ), u_e.count( '=' ) )
for url_matches in self._domains_to_url_matches.values():
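
The new key makes the URL Class ordering two-stage: rank by URL type precision first (file beats post beats everything else), then by how specific the example URL looks, counting path segments and query parameters; sorting in reverse puts the most precise class first, which is what stops a permissive gallery class from swallowing a Post URL. A toy version of the key over (url_type, example_url) pairs, with local constants standing in for HC.URL_TYPE_*:

URL_TYPE_FILE = 0
URL_TYPE_POST = 1
URL_TYPE_GALLERY = 2

def precision_key( url_class ):
    ( url_type, example_url ) = url_class
    if url_type == URL_TYPE_FILE:
        type_value = 2
    elif url_type == URL_TYPE_POST:
        type_value = 1
    else:
        type_value = 0
    # more path components and more query parameters = more specific
    return ( type_value, example_url.count( '/' ), example_url.count( '=' ) )

url_classes = [
    ( URL_TYPE_GALLERY, 'https://site.com/gallery?tags=x' ),
    ( URL_TYPE_POST, 'https://site.com/post/show/123' ),
]

url_classes.sort( key = precision_key, reverse = True )  # most precise first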

@@ -245,9 +245,19 @@ class TagSummaryGenerator( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_TAG_SUMMARY_GENERATOR
SERIALISABLE_NAME = 'Tag Summary Generator'
SERIALISABLE_VERSION = 1
SERIALISABLE_VERSION = 2
def __init__( self, namespace_info = None, separator = None, example_tags = None ):
def __init__( self, background_colour = None, text_colour = None, namespace_info = None, separator = None, example_tags = None, show = True ):
if background_colour is None:
background_colour = wx.Colour( 223, 227, 230, 255 )
if text_colour is None:
text_colour = wx.Colour( 1, 17, 26, 255 )
if namespace_info is None:

@@ -268,21 +278,32 @@ class TagSummaryGenerator( HydrusSerialisable.SerialisableBase ):
example_tags = []
self._background_colour = background_colour
self._text_colour = text_colour
self._namespace_info = namespace_info
self._separator = separator
self._example_tags = list( example_tags )
self._show = show
self._UpdateNamespaceLookup()
def _GetSerialisableInfo( self ):
return ( self._namespace_info, self._separator, self._example_tags )
return ( list( self._background_colour.Get() ), list( self._text_colour.Get() ), self._namespace_info, self._separator, self._example_tags, self._show )
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( self._namespace_info, self._separator, self._example_tags ) = serialisable_info
( background_rgba, text_rgba, self._namespace_info, self._separator, self._example_tags, self._show ) = serialisable_info
( r, g, b, a ) = background_rgba
self._background_colour = wx.Colour( r, g, b, a )
( r, g, b, a ) = text_rgba
self._text_colour = wx.Colour( r, g, b, a )
self._namespace_info = [ tuple( row ) for row in self._namespace_info ]

@@ -294,13 +315,41 @@ class TagSummaryGenerator( HydrusSerialisable.SerialisableBase ):
self._interesting_namespaces = { namespace for ( namespace, prefix, separator ) in self._namespace_info }
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
if version == 1:
( namespace_info, separator, example_tags ) = old_serialisable_info
background_rgba = ( 223, 227, 230, 255 )
text_rgba = ( 1, 17, 26, 255 )
show = True
new_serialisable_info = ( background_rgba, text_rgba, namespace_info, separator, example_tags, show )
return ( 2, new_serialisable_info )
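
Bumping SERIALISABLE_VERSION and filling old defaults in _UpdateSerialisableInfo is the standard hydrus migration pattern: each step upgrades exactly one version and returns the new tuple, so a loader can walk any stored object forward step by step. A stripped-down sketch of that chaining, reusing the defaults from this hunk:

def update_serialisable_info( version, info ):
    # one version per step; unknown versions are a hard error
    if version == 1:
        ( namespace_info, separator, example_tags ) = info
        new_info = ( ( 223, 227, 230, 255 ), ( 1, 17, 26, 255 ), namespace_info, separator, example_tags, True )
        return ( 2, new_info )
    raise ValueError( 'no update path from version ' + str( version ) )

def load( version, info, current_version = 2 ):
    while version < current_version:
        ( version, info ) = update_serialisable_info( version, info )
    return info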
|
||||
|
||||
|
||||
|
||||
def GenerateExampleSummary( self ):
|
||||
|
||||
return self.GenerateSummary( self._example_tags )
|
||||
if not self._show:
|
||||
|
||||
return 'not showing'
|
||||
|
||||
else:
|
||||
|
||||
return self.GenerateSummary( self._example_tags )
|
||||
|
||||
|
||||
|
||||
def GenerateSummary( self, tags, max_length = None ):
|
||||
|
||||
if not self._show:
|
||||
|
||||
return ''
|
||||
|
||||
|
||||
namespaces_to_subtags = collections.defaultdict( list )
|
||||
|
||||
for tag in tags:
|
||||
|
@ -346,9 +395,19 @@ class TagSummaryGenerator( HydrusSerialisable.SerialisableBase ):
|
|||
return summary
|
||||
|
||||
|
||||
def GetBackgroundColour( self ):
|
||||
|
||||
return self._background_colour
|
||||
|
||||
|
||||
def GetTextColour( self ):
|
||||
|
||||
return self._text_colour
|
||||
|
||||
|
||||
def ToTuple( self ):
|
||||
|
||||
return ( self._namespace_info, self._separator, self._example_tags )
|
||||
return ( self._background_colour, self._text_colour, self._namespace_info, self._separator, self._example_tags, self._show )
|
||||
|
||||
|
||||
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_TAG_SUMMARY_GENERATOR ] = TagSummaryGenerator
|
||||
|
|
|
@@ -49,7 +49,7 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 18
-SOFTWARE_VERSION = 296
+SOFTWARE_VERSION = 297
 
 UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
 
@@ -110,6 +110,7 @@ CONTENT_TYPE_VARIABLE = 14
 CONTENT_TYPE_HASH = 15
 CONTENT_TYPE_TIMESTAMP = 16
 CONTENT_TYPE_TITLE = 17
+CONTENT_TYPE_NOTES = 18
 
 content_type_string_lookup = {}
 
@@ -131,6 +132,7 @@ content_type_string_lookup[ CONTENT_TYPE_VARIABLE ] = 'variable'
 content_type_string_lookup[ CONTENT_TYPE_HASH ] = 'hash'
 content_type_string_lookup[ CONTENT_TYPE_TIMESTAMP ] = 'timestamp'
 content_type_string_lookup[ CONTENT_TYPE_TITLE ] = 'title'
+content_type_string_lookup[ CONTENT_TYPE_NOTES ] = 'notes'
 
 REPOSITORY_CONTENT_TYPES = [ CONTENT_TYPE_FILES, CONTENT_TYPE_MAPPINGS, CONTENT_TYPE_TAG_PARENTS, CONTENT_TYPE_TAG_SIBLINGS ]
 
@@ -301,6 +303,7 @@ IPFS = 13
 LOCAL_FILE_TRASH_DOMAIN = 14
 COMBINED_LOCAL_FILE = 15
 TEST_SERVICE = 16
+LOCAL_NOTES = 17
 SERVER_ADMIN = 99
 NULL_SERVICE = 100
 
@@ -322,13 +325,14 @@ service_string_lookup[ COMBINED_FILE ] = 'virtual combined file service'
 service_string_lookup[ LOCAL_BOORU ] = 'hydrus local booru'
 service_string_lookup[ IPFS ] = 'ipfs daemon'
 service_string_lookup[ TEST_SERVICE ] = 'test service'
+service_string_lookup[ LOCAL_NOTES ] = 'local file notes service'
 service_string_lookup[ SERVER_ADMIN ] = 'hydrus server administration service'
 service_string_lookup[ NULL_SERVICE ] = 'null service'
 
 LOCAL_FILE_SERVICES = ( LOCAL_FILE_DOMAIN, LOCAL_FILE_TRASH_DOMAIN, COMBINED_LOCAL_FILE )
 LOCAL_TAG_SERVICES = ( LOCAL_TAG, )
 
-LOCAL_SERVICES = LOCAL_FILE_SERVICES + LOCAL_TAG_SERVICES + ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, LOCAL_BOORU )
+LOCAL_SERVICES = LOCAL_FILE_SERVICES + LOCAL_TAG_SERVICES + ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, LOCAL_BOORU, LOCAL_NOTES )
 
 RATINGS_SERVICES = ( LOCAL_RATING_LIKE, LOCAL_RATING_NUMERICAL, RATING_LIKE_REPOSITORY, RATING_NUMERICAL_REPOSITORY )
 REPOSITORIES = ( TAG_REPOSITORY, FILE_REPOSITORY, RATING_LIKE_REPOSITORY, RATING_NUMERICAL_REPOSITORY )
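A quick self-contained check of the new registrations, with the constants copied from the hunks above (in the client these live in HydrusConstants):

    CONTENT_TYPE_NOTES = 18
    LOCAL_NOTES = 17
    
    content_type_string_lookup = { CONTENT_TYPE_NOTES : 'notes' }
    service_string_lookup = { LOCAL_NOTES : 'local file notes service' }
    
    print( service_string_lookup[ LOCAL_NOTES ] ) # -> local file notes service

Note that membership in LOCAL_SERVICES is what makes the client bootstrap the service, which is presumably why the default service count in the client db test below goes from 8 to 9.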
@@ -1292,13 +1292,13 @@ def ToUnicode( text_producing_object ):
         
         try:
             
-            text = text.decode( 'utf-16' )
+            text = text.decode( locale.getpreferredencoding() )
             
         except:
             
             try:
                 
-                text = text.decode( locale.getpreferredencoding() )
+                text = text.decode( 'utf-16' )
                 
             except:
                 
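utf-16 will 'successfully' decode almost any even-length byte string into mojibake, so trying it before the system encoding could silently mangle ordinary text; this swap makes it a last resort. A minimal sketch of the corrected fallback order (Python 2, as in the codebase; the real function uses bare excepts and more scaffolding around this):

    import locale
    
    def decode_with_fallback( text ):
        
        try:
            
            return text.decode( locale.getpreferredencoding() )
            
        except UnicodeDecodeError:
            
            return text.decode( 'utf-16' ) # last resort only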
@@ -1331,171 +1331,6 @@ class HydrusYAMLBase( yaml.YAMLObject ):
     yaml_loader = yaml.SafeLoader
     yaml_dumper = yaml.SafeDumper
 
-class Account( HydrusYAMLBase ):
-    
-    yaml_tag = u'!Account'
-    
-    def __init__( self, account_key, account_type, created, expires, used_bytes, used_requests, banned_info = None ):
-        
-        HydrusYAMLBase.__init__( self )
-        
-        self._info = {}
-        
-        self._info[ 'account_key' ] = account_key
-        self._info[ 'account_type' ] = account_type
-        self._info[ 'created' ] = created
-        self._info[ 'expires' ] = expires
-        self._info[ 'used_bytes' ] = used_bytes
-        self._info[ 'used_requests' ] = used_requests
-        if banned_info is not None: self._info[ 'banned_info' ] = banned_info
-        
-        self._info[ 'fresh_timestamp' ] = GetNow()
-        
-    
-    def __repr__( self ): return self.ConvertToString()
-    
-    def __str__( self ): return self.ConvertToString()
-    
-    def _IsBanned( self ):
-        
-        if 'banned_info' not in self._info: return False
-        else:
-            
-            ( reason, created, expires ) = self._info[ 'banned_info' ]
-            
-            if expires is None: return True
-            else: return not TimeHasPassed( expires )
-            
-        
-    
-    def _IsBytesExceeded( self ):
-        
-        account_type = self._info[ 'account_type' ]
-        
-        max_num_bytes = account_type.GetMaxBytes()
-        
-        used_bytes = self._info[ 'used_bytes' ]
-        
-        return max_num_bytes is not None and used_bytes > max_num_bytes
-        
-    
-    def _IsExpired( self ):
-        
-        if self._info[ 'expires' ] is None: return False
-        else: return TimeHasPassed( self._info[ 'expires' ] )
-        
-    
-    def _IsRequestsExceeded( self ):
-        
-        account_type = self._info[ 'account_type' ]
-        
-        max_num_requests = account_type.GetMaxRequests()
-        
-        used_requests = self._info[ 'used_requests' ]
-        
-        return max_num_requests is not None and used_requests > max_num_requests
-        
-    
-    def CheckPermission( self, permission ):
-        
-        if self._IsBanned(): raise HydrusExceptions.PermissionException( 'This account is banned!' )
-        
-        if self._IsExpired(): raise HydrusExceptions.PermissionException( 'This account is expired.' )
-        
-        if self._IsBytesExceeded(): raise HydrusExceptions.PermissionException( 'You have hit your data transfer limit, and cannot make any more requests for the month.' )
-        
-        if self._IsRequestsExceeded(): raise HydrusExceptions.PermissionException( 'You have hit your requests limit, and cannot make any more requests for the month.' )
-        
-        if not self._info[ 'account_type' ].HasPermission( permission ): raise HydrusExceptions.PermissionException( 'You do not have permission to do that.' )
-        
-    
-    def ConvertToString( self ): return ConvertTimestampToPrettyAge( self._info[ 'created' ] ) + os.linesep + self._info[ 'account_type' ].ConvertToString() + os.linesep + 'which '+ ConvertTimestampToPrettyExpires( self._info[ 'expires' ] )
-    
-    def GetAccountKey( self ): return self._info[ 'account_key' ]
-    
-    def GetAccountType( self ): return self._info[ 'account_type' ]
-    
-    def GetCreated( self ): return self._info[ 'created' ]
-    
-    def GetExpires( self ): return self._info[ 'expires' ]
-    
-    def GetExpiresString( self ):
-        
-        if self._IsBanned():
-            
-            ( reason, created, expires ) = self._info[ 'banned_info' ]
-            
-            return 'banned ' + ConvertTimestampToPrettyAge( created ) + ', ' + ConvertTimestampToPrettyExpires( expires ) + ' because: ' + reason
-            
-        else: return ConvertTimestampToPrettyAge( self._info[ 'created' ] ) + ' and ' + ConvertTimestampToPrettyExpires( self._info[ 'expires' ] )
-        
-    
-    def GetUsedBytesString( self ):
-        
-        max_num_bytes = self._info[ 'account_type' ].GetMaxBytes()
-        
-        used_bytes = self._info[ 'used_bytes' ]
-        
-        if max_num_bytes is None: return ConvertIntToBytes( used_bytes ) + ' used this month'
-        else: return ConvertIntToBytes( used_bytes ) + '/' + ConvertIntToBytes( max_num_bytes ) + ' used this month'
-        
-    
-    def GetUsedRequestsString( self ):
-        
-        max_num_requests = self._info[ 'account_type' ].GetMaxRequests()
-        
-        used_requests = self._info[ 'used_requests' ]
-        
-        if max_num_requests is None: return ConvertIntToPrettyString( used_requests ) + ' requests used this month'
-        else: return ConvertValueRangeToPrettyString( used_requests, max_num_requests ) + ' requests used this month'
-        
-    
-    def GetUsedBytes( self ): return self._info[ 'used_bytes' ]
-    
-    def GetUsedRequests( self ): return self._info[ 'used_bytes' ]
-    
-    def HasAccountKey( self ):
-        
-        if 'account_key' in self._info and self._info[ 'account_key' ] is not None: return True
-        
-        return False
-        
-    
-    def HasPermission( self, permission ):
-        
-        if self._IsBanned(): return False
-        
-        if self._IsExpired(): return False
-        
-        if self._IsBytesExceeded(): return False
-        
-        if self._IsRequestsExceeded(): return False
-        
-        return self._info[ 'account_type' ].HasPermission( permission )
-        
-    
-    def IsBanned( self ): return self._IsBanned()
-    
-    def IsStale( self ): return self._info[ 'fresh_timestamp' ] + HC.UPDATE_DURATION * 5 < GetNow()
-    
-    def IsUnknownAccount( self ): return self._info[ 'account_type' ].IsUnknownAccountType()
-    
-    def MakeFresh( self ): self._info[ 'fresh_timestamp' ] = GetNow()
-    
-    def MakeStale( self ): self._info[ 'fresh_timestamp' ] = 0
-    
-    def ReportDataUsed( self, num_bytes ):
-        
-        self._info[ 'used_bytes' ] += num_bytes
-        
-    
-    def ReportRequestUsed( self ):
-        
-        self._info[ 'used_requests' ] += 1
-        
-    
-sqlite3.register_adapter( Account, yaml.safe_dump )
 
 class AccountIdentifier( HydrusSerialisable.SerialisableBase ):
     
     SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_ACCOUNT_IDENTIFIER
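With the YAML-era Account class gone from HydrusData, the updated tests further down construct accounts through HydrusNetwork instead; the visible signature drops the per-account usage counters. The replacement calls, exactly as they appear in those tests:

    account_type = HydrusNetwork.AccountType.GenerateAdminAccountType( HC.SERVER_ADMIN )
    
    account = HydrusNetwork.Account( account_key, account_type, created, expires )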
@@ -1768,6 +1603,15 @@ class ContentUpdate( object ):
             ( rating, hashes ) = self._row
             
         
+        elif self._data_type == HC.CONTENT_TYPE_NOTES:
+            
+            if self._action == HC.CONTENT_UPDATE_SET:
+                
+                ( notes, hash ) = self._row
+                
+                hashes = { hash }
+                
+            
+        
         if not isinstance( hashes, set ):
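A sketch of issuing a notes update through this new branch; the ( data_type, action, row ) constructor shape follows the class's other content types, and the ( notes, hash ) row layout is taken from the branch above. The hash and note text here are placeholders:

    hash = some_file_hash # a single file hash, assumed available
    notes = 'my note text'
    
    content_update = ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( notes, hash ) )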
@@ -1148,7 +1148,7 @@ class TestClientDB( unittest.TestCase ):
         
         #
         
-        NUM_DEFAULT_SERVICES = 8
+        NUM_DEFAULT_SERVICES = 9
         
         services = self._read( 'services' )
@@ -1,6 +1,7 @@
 import ClientConstants as CC
 import ClientDefaults
 import ClientGUIDialogs
+import ClientGUIScrolledPanelsEdit
 import ClientGUIScrolledPanelsManagement
 import ClientGUITopLevelWindows
 import ClientThreading
@@ -74,13 +75,11 @@ class TestDBDialogs( unittest.TestCase ):
     
     def test_dialog_manage_subs( self ):
         
         HG.test_controller.SetRead( 'serialisable_named', [] )
         
         title = 'subs test'
         
-        with ClientGUITopLevelWindows.DialogManage( None, title ) as dlg:
+        with ClientGUITopLevelWindows.DialogEdit( None, title ) as dlg:
             
-            panel = ClientGUIScrolledPanelsManagement.ManageSubscriptionsPanel( dlg )
+            panel = ClientGUIScrolledPanelsEdit.EditSubscriptionsPanel( dlg, [] )
             
             dlg.SetPanel( panel )
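The test now drives the newer edit-dialog pattern rather than the old manage-dialog one. The general shape of that pattern, with the ShowModal/GetValue round trip assumed from the surrounding wx-based code:

    with ClientGUITopLevelWindows.DialogEdit( None, 'edit subscriptions' ) as dlg:
        
        panel = ClientGUIScrolledPanelsEdit.EditSubscriptionsPanel( dlg, [] ) # [] = no existing subscriptions
        
        dlg.SetPanel( panel )
        
        if dlg.ShowModal() == wx.ID_OK:
            
            subscriptions = panel.GetValue()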
@@ -471,7 +471,6 @@ class TestSerialisables( unittest.TestCase ):
         self.assertEqual( obj._gallery_identifier, dupe_obj._gallery_identifier )
         self.assertEqual( obj._gallery_stream_identifiers, dupe_obj._gallery_stream_identifiers )
         self.assertEqual( len( obj._queries ), len( dupe_obj._queries ) )
-        self.assertEqual( obj._get_tags_if_url_known_and_file_redundant, dupe_obj._get_tags_if_url_known_and_file_redundant )
         self.assertEqual( obj._initial_file_limit, dupe_obj._initial_file_limit )
         self.assertEqual( obj._periodic_file_limit, dupe_obj._periodic_file_limit )
         self.assertEqual( obj._paused, dupe_obj._paused )
@@ -490,7 +489,6 @@ class TestSerialisables( unittest.TestCase ):
         gallery_stream_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( gallery_identifier )
         queries = [ ClientImporting.SubscriptionQuery( 'test query' ), ClientImporting.SubscriptionQuery( 'test query 2' ) ]
         checker_options = ClientData.CheckerOptions()
-        get_tags_if_url_known_and_file_redundant = True
         initial_file_limit = 100
         periodic_file_limit = 50
         paused = False
@@ -500,7 +498,7 @@ class TestSerialisables( unittest.TestCase ):
         
         no_work_until = HydrusData.GetNow() - 86400 * 20
         
-        sub.SetTuple( gallery_identifier, gallery_stream_identifiers, queries, checker_options, get_tags_if_url_known_and_file_redundant, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until )
+        sub.SetTuple( gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until )
         
         self.assertEqual( sub.GetGalleryIdentifier(), gallery_identifier )
         self.assertEqual( sub.GetTagImportOptions(), tag_import_options )
@@ -46,6 +46,8 @@ class TestServer( unittest.TestCase ):
     @classmethod
     def setUpClass( cls ):
         
+        cls._access_key = HydrusData.GenerateKey()
+        
         services = []
         
         cls._serverside_file_service = HydrusNetwork.GenerateService( HydrusData.GenerateKey(), HC.FILE_REPOSITORY, 'file repo', HC.DEFAULT_SERVICE_PORT + 1 )
@@ -56,9 +58,9 @@ class TestServer( unittest.TestCase ):
         cls._clientside_tag_service = ClientServices.GenerateService( HydrusData.GenerateKey(), HC.TAG_REPOSITORY, 'tag repo' )
         cls._clientside_admin_service = ClientServices.GenerateService( HydrusData.GenerateKey(), HC.SERVER_ADMIN, 'server admin' )
         
-        cls._clientside_file_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVICE_PORT + 1 ) )
-        cls._clientside_tag_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVICE_PORT ) )
-        cls._clientside_admin_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVER_ADMIN_PORT ) )
+        cls._clientside_file_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVICE_PORT + 1, cls._access_key ) )
+        cls._clientside_tag_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVICE_PORT, cls._access_key ) )
+        cls._clientside_admin_service.SetCredentials( HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVER_ADMIN_PORT, cls._access_key ) )
         
         cls._local_booru = ClientServices.GenerateService( HydrusData.GenerateKey(), HC.LOCAL_BOORU, 'local booru' )
         
@@ -71,15 +73,12 @@ class TestServer( unittest.TestCase ):
         permissions = [ HC.GET_DATA, HC.POST_DATA, HC.POST_PETITIONS, HC.RESOLVE_PETITIONS, HC.MANAGE_USERS, HC.GENERAL_ADMIN, HC.EDIT_SERVICES ]
         
         account_key = HydrusData.GenerateKey()
-        account_type = HydrusData.AccountType( 'account', permissions, ( None, None ) )
+        account_type = HydrusNetwork.AccountType.GenerateAdminAccountType( HC.SERVER_ADMIN )
         created = HydrusData.GetNow() - 100000
         expires = None
-        used_bytes = 0
-        used_requests = 0
         
-        cls._account = HydrusData.Account( account_key, account_type, created, expires, used_bytes, used_requests )
+        cls._account = HydrusNetwork.Account( account_key, account_type, created, expires )
         
-        cls._access_key = HydrusData.GenerateKey()
         cls._file_hash = HydrusData.GenerateKey()
         
     def TWISTEDSetup():
@@ -147,10 +146,6 @@ class TestServer( unittest.TestCase ):
     
     def _test_file_repo( self, service ):
         
-        info = service.GetInfo()
-        
-        info[ 'access_key' ] = self._access_key
-        
         # file
         
         path = ServerFiles.GetExpectedFilePath( self._file_hash )
@@ -442,10 +437,6 @@ class TestServer( unittest.TestCase ):
         
         self.assertEqual( response[ 'access_key' ], self._access_key )
         
-        info = service.GetInfo()
-        
-        info[ 'access_key' ] = self._access_key
-        
         # set up session
         
         last_error = 0
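Credentials now carry the access key alongside host and port, which is why the tests no longer patch it into the service info by hand after the fact. The pattern, as used in setUpClass above:

    credentials = HydrusNetwork.Credentials( '127.0.0.1', HC.DEFAULT_SERVICE_PORT, access_key )
    
    service.SetCredentials( credentials )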
@@ -2,6 +2,7 @@ import ClientConstants as CC
 import collections
 import HydrusConstants as HC
 import HydrusExceptions
+import HydrusNetwork
 import HydrusSessions
 import os
 import TestConstants
@@ -22,13 +23,11 @@ class TestSessions( unittest.TestCase ):
         
         access_key = HydrusData.GenerateKey()
         account_key = HydrusData.GenerateKey()
-        account_type = HydrusData.AccountType( 'account', permissions, ( None, None ) )
+        account_type = HydrusNetwork.AccountType.GenerateAdminAccountType( HC.SERVER_ADMIN )
         created = HydrusData.GetNow() - 100000
         expires = HydrusData.GetNow() + 300
-        used_bytes = 0
-        used_requests = 0
         
-        account = HydrusData.Account( account_key, account_type, created, expires, used_bytes, used_requests )
+        account = HydrusNetwork.Account( account_key, account_type, created, expires )
         
         expires = HydrusData.GetNow() - 10
         
@@ -59,7 +58,7 @@ class TestSessions( unittest.TestCase ):
         
         account_key_2 = HydrusData.GenerateKey()
         
-        account_2 = HydrusData.Account( account_key_2, account_type, created, expires, used_bytes, used_requests )
+        account_2 = HydrusNetwork.Account( account_key_2, account_type, created, expires )
         
         HG.test_controller.SetRead( 'account_key_from_access_key', account_key_2 )
         HG.test_controller.SetRead( 'account', account_2 )
@@ -101,7 +100,7 @@ class TestSessions( unittest.TestCase ):
         
         expires = HydrusData.GetNow() + 300
         
-        updated_account = HydrusData.Account( account_key, account_type, created, expires, 1, 1 )
+        updated_account = HydrusNetwork.Account( account_key, account_type, created, expires )
         
         HG.test_controller.SetRead( 'account', updated_account )
         
@@ -119,7 +118,7 @@ class TestSessions( unittest.TestCase ):
         
         expires = HydrusData.GetNow() + 300
         
-        updated_account_2 = HydrusData.Account( account_key, account_type, created, expires, 2, 2 )
+        updated_account_2 = HydrusNetwork.Account( account_key, account_type, created, expires )
         
         HG.test_controller.SetRead( 'sessions', [ ( session_key_1, service_key, updated_account_2, expires ), ( session_key_2, service_key, account_2, expires ), ( session_key_3, service_key, updated_account_2, expires ) ] )
test.py
@@ -10,6 +10,8 @@ from include import ClientConstants as CC
 from include import HydrusGlobals as HG
 from include import ClientDefaults
 from include import ClientNetworking
+from include import ClientNetworkingDomain
+from include import ClientNetworkingLogin
 from include import ClientServices
 from include import ClientThreading
 from include import HydrusExceptions
@@ -139,6 +141,15 @@ class Controller( object ):
         self.services_manager = ClientCaches.ServicesManager( self )
         self.client_files_manager = ClientCaches.ClientFilesManager( self )
         
+        bandwidth_manager = ClientNetworking.NetworkBandwidthManager()
+        session_manager = ClientNetworking.NetworkSessionManager()
+        domain_manager = ClientNetworkingDomain.NetworkDomainManager()
+        login_manager = ClientNetworkingLogin.NetworkLoginManager()
+        
+        self.network_engine = ClientNetworking.NetworkEngine( self, bandwidth_manager, session_manager, domain_manager, login_manager )
+        
+        self.CallToThreadLongRunning( self.network_engine.MainLoop )
+        
         self._managers[ 'tag_censorship' ] = ClientCaches.TagCensorshipManager( self )
         self._managers[ 'tag_siblings' ] = ClientCaches.TagSiblingsManager( self )
         self._managers[ 'tag_parents' ] = ClientCaches.TagParentsManager( self )
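Once the engine's main loop is running on its worker thread, requests flow through it as jobs. A sketch of that flow, assuming the usual NetworkJob entry points (AddJob, WaitUntilDone, GetContent) are unchanged in this version:

    job = ClientNetworking.NetworkJob( 'GET', 'https://example.com/some_file' )
    
    HG.test_controller.network_engine.AddJob( job ) # queues the job with the engine
    
    job.WaitUntilDone() # blocks until bandwidth rules, login, and transfer complete
    
    data = job.GetContent()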