parent 1c70b00e4a
commit 8f20b37432

@ -7,6 +7,48 @@ title: Changelog

!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 579](https://github.com/hydrusnetwork/hydrus/releases/tag/v579)

### some url-checking logic

* the 'during URL check, check for neighbour-spam?' checkbox in _file import options_ has some sophisticated new logic. check the issue for a longer explanation, but the long story short is: if you have two different booru URLs that share the same source URL (with one or both simply being incorrect, e.g. both point to the same 'clean' source even though one is 'messy'), then that bad source URL will no longer cause the second booru import job to get 'already in db'. it now recognises this is an untrustworthy mapping and goes ahead with the download, just as you actually want. once the file is imported, it is still able, as normal, to quickly recognise the true positive 'already in db' result, so I believe I have successfully plugged a logical hole here without affecting normal good operation! (issue #1563)
* the 'associate source urls' option in file import options is more careful about the above logic. source urls are now definitely not included in the pre-import file url checks if this option is off

### some regex quality of life

* regex input text boxes have been given a pass. the regex 'help' links are folded into the button, the links are updated to something newer (one of the older ones seems to have died), the button now sits beside the input and is labelled `.*`, the menu is a little neater, and the input has placeholder text and now shows green/red (valid/invalid in the stylesheet) depending on whether the current regex text compiles ok. just a nicer widget overall
* this widget is now in filename tagging, the String Match panel regex match, the String Converter panel regex step, and the 'regex favourites' options panel, which I was surprised to learn existed
* the regex menu for the String Converter regex step also now shows how to do grouping in python regex. I hadn't experimented with this properly in python, but I learned this past week that it can handle `(...) -> \1` group-replace fine and can do named groups with `(?P<name>...) -> \g<name>` too!
* for convenience, when editing a String Match, if you flick from 'any' to 'fixed' or 'regex', it now puts whatever was in your example text beforehand as the new value for the fixed text or regex
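As a quick illustration of the grouping syntax the menu now points to (a minimal sketch, not hydrus code; the example strings are made up):

```python
import re

# numbered group: capture with (...) and replace back in with \1, \2, ...
swapped = re.sub( r'(\w+)=(\w+)', r'\2=\1', 'width=1280' )

# named group: capture with (?P<name>...) and replace back in with \g<name>
tagged = re.sub( r'(?P<artist>\w+) drew this', r'by:\g<artist>', 'hibiki drew this' )
```

Both replacement templates must be raw strings (or have their backslashes doubled), which is the same `r`-prefix habit mentioned in the cleanup section below.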

### list selecting and scrolling

* I added some new scroll-to tech to my multi-column lists
* pasting a URL into the 'edit URL Classes' dialog's test input now selects and scrolls to the matching URL Class
* the following lists should all have better list sort/select preservation, and will now scroll to and maintain visibility on various edit/add events: edit url classes, edit gugs, edit parsers, edit shortcut sets, edit shortcut set, the options dialog frame locations, the options dialog media viewer options, manage services, manage account types, manage logins, manage login scripts, edit login script, and some weird legacy stuff. lots more to do in future
* when you 'add from defaults' for many lists, it will now try to scroll to what was just added. may not be perfect!
* same deal with 'import' buttons. it will now try to scroll to what you import!
* I am also moving to 'when you edit, you only edit one row at a time'. in general, when I have written list edit functions, I have written them to edit each row of a multi-selection in turn with a new dialog, but this is not used very much, can be confusing/annoying to the user, and increases code complexity, so I am undoing it. as I continue to work here, if you have a multi-selection, an edit call will increasingly just edit the top selected row. maybe in this case I'll reduce the selection, maybe I'll add some different way to do multi-edit again, let me know what you think

### misc

* import folders now work in a far more efficient way. previously, the client loaded every import folder every three minutes to see which were ready to run; now, it loads them once on startup or change and then consults each folder to determine how long to wait before loading it again. it isn't perfect yet, but this ancient, terrible code from back when 100 files was a lot is now much leaner. users with large import folders may notice less background lag, let me know how you get on. thanks to the users who spotted this--there's doubtless more out there
* to help muscle memory, the 'undo' menu is now disabled when there is nothing for it to hold, not invisible. same deal for the 'pending' menu, although this will still hide if you have no services to pend to (ipfs, hydrus repositories). see how this feels, maybe I'll add options for it
* the new 'is this webp animated?' check is now a little faster
* if your similar file search tree is missing a branch (this can happen after db damage or crash desync during a file import) and a new file import (wanting to add a new leaf) runs into this gap, the database now imports the file successfully and the user gets a popup message telling them to regen their similar files search tree when convenient (rather than raising an error and failing the import)
* added a FAQ question, 'I just imported files from my hard drive collection. How can I get their tags from the boorus?', to talk about my feelings on this technical question and to link to the user guide here: https://wiki.hydrus.network/books/hydrus-manual/page/file-look-up
* the default bandwidth rules for a hydrus repository are boosted from 512MB a day to 2GB. my worries about a database syncing 'too fast' for maintenance timers to kick in are less critical these days
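The import-folder change above boils down to replacing a fixed poll with 'ask each folder when it is next due, then sleep until the soonest one'. A minimal sketch of that scheduling idea, with entirely hypothetical names (the real manager lives in `ClientImportLocal.ImportFoldersManager`):

```python
class ImportFolderStub:
    """Hypothetical stand-in: a real import folder knows its own check period."""

    def __init__( self, name, period ):

        self.name = name
        self.period = period
        self._last_checked = 0.0

    def seconds_until_due( self, now ):

        return max( 0.0, ( self._last_checked + self.period ) - now )

def get_wait_time( folders, now ):

    # instead of waking every three minutes to poll every folder, sleep
    # exactly until the soonest folder is actually due
    return min( folder.seconds_until_due( now ) for folder in folders )

folders = [ ImportFolderStub( 'scans', 600 ), ImportFolderStub( 'camera', 120 ) ]
```

With this shape, an idle client with week-long folder periods does essentially no background work between runs.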

### build and cleanup

* since the recent test 'future build' went without any problems, I am folding its library updates into the normal build. Qt (PySide6) goes from 6.6.0 to 6.6.3.1 for Linux and Windows, there's a newer SQLite dll on Windows, and there's a newer mpv dll on Windows
* updated all the requirements.txts to specify not to use the brand new numpy 2.0.0, which seems to have been released just this week and breaks anything that was compiled to work with 1.x.x. if you tried to set up a new venv in the past few days and got weird numpy errors, please rebuild your venv in v579, it should work again
* thanks to a user, the Docker build's `requests` 'no_proxy' patch is fixed for python >3.10
* cleaned up a ton of `SyntaxWarnings` boot logspam on python >=3.12 due to un-`r`-texted escape sequences like `\s`. thanks to the user who submitted all this, let me know if I missed any
* cleaned up some regex ui code
* cleaned up some garbage in the string panel ui code
* fixed some weird vertical stretch in some single-control dialogs
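For anyone unfamiliar with the `SyntaxWarning` item: `'\s'` is not a recognised python string escape, so on python >=3.12 the bare literal produces a compile-time warning. A small sketch of the two warning-free spellings:

```python
import re

# doubling the backslash works but is noisy
collapsed_doubled = re.sub( '\\s\\s+', ' ', 'a   b' )

# the idiomatic fix, and what this cleanup applied: a raw string
collapsed_raw = re.sub( r'\s\s+', ' ', 'a   b' )
```

Both calls behave identically; only the raw-string form keeps the pattern readable.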

## [Version 578](https://github.com/hydrusnetwork/hydrus/releases/tag/v578)

### animated webp

@ -332,43 +374,3 @@ title: Changelog

* cleaned up how some text and exceptions are split by newlines to handle different sorts of newline, and cleaned up how I fetch the first 'summary' line of text in all cases across the program
* replaced `os.linesep` with `\n` across the program. Qt only wants `\n` anyway, most logging wants `\n` (and sometimes converts on the fly behind the scenes), and this helps KISS otherwise. I might bring back `os.linesep` for sidecars and stuff if it proves a problem, but most text editors and scripting languages are very happy with `\n`, so we'll see
* multi-column lists now show multiline tooltips if the underlying text in the cell was originally multiline (although tbh this is rare)

## [Version 569](https://github.com/hydrusnetwork/hydrus/releases/tag/v569)

### user contributions

* thanks to a user, fixed a problem with the recent URL changes that caused downloaders examining multi-file posts to only grab the first file
* thanks to a user, all the menubar commands that launch a modal dialog are now suffixed with an ellipsis
* thanks to a user, fixed an issue regarding KDE 6 quitting the program as soon as the pre-boot 'your database is missing a location, let's find it' repair dialog was ok'd
* thanks to a user, the application icon is fixed in KDE Plasma Wayland (and anything else that pulls the icon from a .desktop file). if you have been using a hydrus.desktop file and don't see a program icon, you should rename it to `/usr/share/applications/io.github.hydrusnetwork.hydrus.desktop`. more importantly, if you manage a package for hydrus--please output to this file path instead of `hydrus.desktop` if you make one
* thanks to a user, updated the `hydrus_client.sh` file to include `"$@"`, which passes parameters given to the .sh file to the .py call

### more on last week's URL work

* fixed the 'show the Request URL under "additional urls" submenu' thing on the file log list menu. I screwed up the logic and was effectively testing for when `1 != 1`
* the converter that generates a Referral URL now operates on the API/redirect conversion principle too--it normalises the Source URL to its 'Request URL' state--keeping defined ephemeral params and filling in defaults but dropping any extra gubbins not asked for--before applying the conversion
* fixed the 'manage url class' dialog to correctly display an example API/redirect-converted URL based on the new _request url_, not the _normalised url_ (so the api/redirect example will now show the new ephemeral params properly). this was working correctly in requests behind the scenes, it was just the example text box in the dialog that was showing wrong
* improved the 'is this query text pre-encoded?' test to check for `%hh`, where `h` is a hexadecimal character, instead of the hackier 'is % in it while not followed by whitespace or end of string?'
* improved/simplified/optimised the overall procedure that figures out if an entered URL is pre-encoded or not. this routine now only runs at the stage where a URL is ingested, and it obeys the `%hh` rule. these ingestion points are currently: the text boxes in a urls downloader/simple downloader page; the 'import new sources' function of file log menus; a URL `ContentParser` in the parsing system; the test box in `manage url classes`; and the main gui's 'import url' landing pad, which is used by the drag and drop system, the clipboard watcher, and the client api's 'import url' command. note that this does not occur on 'manage known urls' editing, where you can do what you want with whatever, and I won't coerce it to anything
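The `%hh` rule described above can be sketched like this (a hypothetical simplification, not the actual hydrus routine; the function name is made up):

```python
import re

HH_ESCAPE = re.compile( r'%[0-9A-Fa-f]{2}' )

def looks_pre_encoded( query_text ):

    # text counts as pre-encoded if it contains at least one valid %hh
    # escape and no stray '%' that is not part of one
    has_escape = HH_ESCAPE.search( query_text ) is not None

    stray_percent = '%' in HH_ESCAPE.sub( '', query_text )

    return has_escape and not stray_percent
```

Under this rule, `blue%20eyes` reads as pre-encoded while `100% cotton` does not, which the older 'is % followed by whitespace?' heuristic could not always distinguish.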

### misc

* fixed a variety of logical cases around >0, =0, !=0, <0 for the `NumberTest` objects I recently applied to system:duration and elsewhere. when it comes to file searching, files that have 'None' duration are now considered equivalent to files that have an explicit 0 duration in all cases. previously, I was trying to thread a needle where '=0' would find null results but '<x' would not, and it was a mess. now it all works the same way. if you want to search for 'duration < x' and want to exclude still images, either add a filetype pred or slap on 'has duration'
* improved the stability of the manual file exporter process. it was consulting an object in a thread that it shouldn't have
* improved the ability of the manual file exporter process to report errors on a very large export that encounters errors after the dialog has closed
* fixed the 'remember last used default tag service in manage tag dialogs' option and its accompanying dropdown not saving their current value on options dialog ok. sorry for the trouble!
* fixed the system that truncates very long filenames (for export folders and drag and drop exports) on Linux when the exporter is also outputting a sidecar that has a long extra suffix
* the 'find potential duplicate pairs' routine that runs in idle time now properly obeys the work/rest times in `options->maintenance and processing`. previously, it was just the 'run now' routine that was resting that way, and the idle job was doing a hardcoded 'work for 60 seconds every 10 mins or so'. thanks to the reporting user who cleverly noticed this
* the `options->connection` page now mentions your proxy needs to be `http://`

### boring stuff

* updated the windows setup_venv.bat to allow for custom python or venv locations using parameters. this was so I could set up a multi-python testing situation more easily
* added some unit tests for the new URL encoding gubbins
* improved un-encoded URL parsing in the downloader when the URL is relative and needs to be joined to the source url
* improved some URL parsing and ingestion to better handle urls with non-ascii characters in the domain
* replaced several 'does it start with "http"?' areas with a better and unified scheme/netloc test
* wrote a routine to split URL paths into path components, and spammed it everywhere so this code is now unified. I expect we'll get a `PathComponent` class at some point, too. there will be a future question about what to do with double slashes, `//`, in paths--it turns out the logic has been mixed here, and I think I will probably collapse them to `/` in all cases
* rewrote an unhealthy call that indirectly caused the above multi-file post parsing problem
* fixed some None/0 `NumberTest` stuff if you manage to enter '<0' or '>-5' and similar
* I figured out the problems with PyInstaller 6.x and some other stuff; there should be a 'Future Build' alongside this release on github for advanced users to test with
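The path-component routine with `//` collapsed to `/`, as discussed above, can be sketched in a couple of lines (a hypothetical illustration; the real unified routine may differ):

```python
def split_path_components( path ):

    # split a URL path on '/'; dropping empty components collapses any
    # run of slashes ('//') down to a single separator
    return [ component for component in path.split( '/' ) if component != '' ]
```

So `'/post//show/123/'` and `'/post/show/123'` yield the same component list, which is the behaviour the bullet leans towards.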

14
docs/faq.md

@ -129,6 +129,20 @@ Not really. Unless your situation involves millions of richly locally tagged fil

Yes. I am working on updating the database infrastructure to allow a full purge, but the structure is complicated, so it will take some time. If you are afraid of someone stealing your hard drive and matriculating your sordid MLP collection (or, in this case, the historical log of horrors that you rejected), do some research into drive encryption. Hydrus runs fine off an encrypted disk.

## I just imported files from my hard drive collection. How can I get their tags from the boorus?

The problem of 'what tags should these files have?' is technically difficult to solve, and there isn't a fast and easy way to query a booru and say 'hey, what are your tags for this?', particularly _en masse_. It is even more difficult to keep up with updates (e.g. someone adding a tag to a file some months or years after it was uploaded). This is the main problem I designed the PTR to solve.

If you cannot or do not want to devote the local resources to sync with the PTR, there are a few hacky ways to perform tag lookups, mostly with manual hash-based lookups. The big boorus support file search based on 'md5' hash, so there are ways to build a workflow where you can 'search' a booru or iqdb for one file at a time to see if there is a hit, and then get tags as if you were downloading it. An old system in the client called 'file lookup scripts' works like this, in the _manage tags_ dialog, and some users have figured out ways to make it work with some clever downloaders.
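The hash side of that workflow is straightforward to sketch. A minimal example of computing a file's md5 hex digest (the query format in the comment is hypothetical; each booru has its own search syntax):

```python
import hashlib

def get_md5_hex( path ):

    # stream the file in blocks so large files are not loaded into memory at once
    h = hashlib.md5()

    with open( path, 'rb' ) as f:

        for block in iter( lambda: f.read( 65536 ), b'' ):

            h.update( block )

    return h.hexdigest()

# many boorus accept a hash search term along the lines of (hypothetical):
#   ...?tags=md5:<hexdigest>
```

Remember the caveat below: this only matches if the file is byte-identical to the booru's copy.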

Be careful with these systems. They tend to be slow and use a lot of resources serverside, so you will be rude if you hit them too hard. They work for a handful of files every now and then, but please do not set up jobs of many thousands of files, and absolutely do not repeat the job for the same files regularly--you will just waste a lot of CPU and network time for everyone, and only gain a couple of tags in the process. Note that the hash-based lookups only work if your files have not changed since being downloaded; if you have scaled them, stripped metadata, or optimised quality, then they will count as new files and the hashes will have changed, and you will need to think about services like iqdb or saucenao, or ultimately the hydrus duplicate resolution system.

That said, here is [a user guide on how to perform various kinds of file lookups](https://wiki.hydrus.network/books/hydrus-manual/page/file-look-up).

If you are feeling adventurous, you can also explore the newer [AI-tagging tools](client_api.html#auto-taggers) that users are working on.

Ultimately, though, a good and simple way to backfill your files' tags is just to rely on normal downloading workflows. Try downloading your favourite artists (and later set up subscriptions) and you will naturally get files you like, with tags, and if, by (expected) serendipity, a file on the site is the same as one you already imported, hydrus will add the tags to it retroactively.

## Does Hydrus run ok off an encrypted drive partition? { id="encryption" }

Yes! Both the database and your files should be fine on any of the popular software solutions. These programs give your OS a virtual drive that, on my end, looks and operates like any other. I have yet to encounter one that SQLite has a problem with. Make sure you don't have auto-dismount set--or at least be hawkish that it will never trigger while hydrus is running--or you could damage your database.

@ -34,6 +34,41 @@

<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_579"><a href="#version_579">version 579</a></h2>
<ul>
<li><h3>some url-checking logic</h3></li>
<li>the 'during URL check, check for neighbour-spam?' checkbox in _file import options_ has some sophisticated new logic. check the issue for a longer explanation, but the long story short is: if you have two different booru URLs that share the same source URL (with one or both simply being incorrect, e.g. both point to the same 'clean' source even though one is 'messy'), then that bad source URL will no longer cause the second booru import job to get 'already in db'. it now recognises this is an untrustworthy mapping and goes ahead with the download, just as you actually want. once the file is imported, it is still able, as normal, to quickly recognise the true positive 'already in db' result, so I believe I have successfully plugged a logical hole here without affecting normal good operation! (issue #1563)</li>
<li>the 'associate source urls' option in file import options is more careful about the above logic. source urls are now definitely not included in the pre-import file url checks if this option is off</li>
<li><h3>some regex quality of life</h3></li>
<li>regex input text boxes have been given a pass. the regex 'help' links are folded into the button, the links are updated to something newer (one of the older ones seems to have died), the button now sits beside the input and is labelled `.*`, the menu is a little neater, and the input has placeholder text and now shows green/red (valid/invalid in the stylesheet) depending on whether the current regex text compiles ok. just a nicer widget overall</li>
<li>this widget is now in filename tagging, the String Match panel regex match, the String Converter panel regex step, and the 'regex favourites' options panel, which I was surprised to learn existed</li>
<li>the regex menu for the String Converter regex step also now shows how to do grouping in python regex. I hadn't experimented with this properly in python, but I learned this past week that it can handle `(...) -> \1` group-replace fine and can do named groups with `(?P<name>...) -> \g<name>` too!</li>
<li>for convenience, when editing a String Match, if you flick from 'any' to 'fixed' or 'regex', it now puts whatever was in your example text beforehand as the new value for the fixed text or regex</li>
<li><h3>list selecting and scrolling</h3></li>
<li>I added some new scroll-to tech to my multi-column lists</li>
<li>pasting a URL into the 'edit URL Classes' dialog's test input now selects and scrolls to the matching URL Class</li>
<li>the following lists should all have better list sort/select preservation, and will now scroll to and maintain visibility on various edit/add events: edit url classes, edit gugs, edit parsers, edit shortcut sets, edit shortcut set, the options dialog frame locations, the options dialog media viewer options, manage services, manage account types, manage logins, manage login scripts, edit login script, and some weird legacy stuff. lots more to do in future</li>
<li>when you 'add from defaults' for many lists, it will now try to scroll to what was just added. may not be perfect!</li>
<li>same deal with 'import' buttons. it will now try to scroll to what you import!</li>
<li>I am also moving to 'when you edit, you only edit one row at a time'. in general, when I have written list edit functions, I have written them to edit each row of a multi-selection in turn with a new dialog, but this is not used very much, can be confusing/annoying to the user, and increases code complexity, so I am undoing it. as I continue to work here, if you have a multi-selection, an edit call will increasingly just edit the top selected row. maybe in this case I'll reduce the selection, maybe I'll add some different way to do multi-edit again, let me know what you think</li>
<li><h3>misc</h3></li>
<li>import folders now work in a far more efficient way. previously, the client loaded every import folder every three minutes to see which were ready to run; now, it loads them once on startup or change and then consults each folder to determine how long to wait before loading it again. it isn't perfect yet, but this ancient, terrible code from back when 100 files was a lot is now much leaner. users with large import folders may notice less background lag, let me know how you get on. thanks to the users who spotted this--there's doubtless more out there</li>
<li>to help muscle memory, the 'undo' menu is now disabled when there is nothing for it to hold, not invisible. same deal for the 'pending' menu, although this will still hide if you have no services to pend to (ipfs, hydrus repositories). see how this feels, maybe I'll add options for it</li>
<li>the new 'is this webp animated?' check is now a little faster</li>
<li>if your similar file search tree is missing a branch (this can happen after db damage or crash desync during a file import) and a new file import (wanting to add a new leaf) runs into this gap, the database now imports the file successfully and the user gets a popup message telling them to regen their similar files search tree when convenient (rather than raising an error and failing the import)</li>
<li>added a FAQ question, 'I just imported files from my hard drive collection. How can I get their tags from the boorus?', to talk about my feelings on this technical question and to link to the user guide here: https://wiki.hydrus.network/books/hydrus-manual/page/file-look-up</li>
<li>the default bandwidth rules for a hydrus repository are boosted from 512MB a day to 2GB. my worries about a database syncing 'too fast' for maintenance timers to kick in are less critical these days</li>
<li><h3>build and cleanup</h3></li>
<li>since the recent test 'future build' went without any problems, I am folding its library updates into the normal build. Qt (PySide6) goes from 6.6.0 to 6.6.3.1 for Linux and Windows, there's a newer SQLite dll on Windows, and there's a newer mpv dll on Windows</li>
<li>updated all the requirements.txts to specify not to use the brand new numpy 2.0.0, which seems to have been released just this week and breaks anything that was compiled to work with 1.x.x. if you tried to set up a new venv in the past few days and got weird numpy errors, please rebuild your venv in v579, it should work again</li>
<li>thanks to a user, the Docker build's `requests` 'no_proxy' patch is fixed for python >3.10</li>
<li>cleaned up a ton of `SyntaxWarnings` boot logspam on python >=3.12 due to un-`r`-texted escape sequences like `\s`. thanks to the user who submitted all this, let me know if I missed any</li>
<li>cleaned up some regex ui code</li>
<li>cleaned up some garbage in the string panel ui code</li>
<li>fixed some weird vertical stretch in some single-control dialogs</li>
</ul>
</li>
<li>
<h2 id="version_578"><a href="#version_578">version 578</a></h2>
<ul>
@ -345,8 +345,8 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

    def _ShutdownManagers( self ):

        self.database_maintenance_manager.Shutdown()
        self.import_folders_manager.Shutdown()
        self.files_maintenance_manager.Shutdown()

        self.quick_download_manager.Shutdown()

        managers = [ self.subscriptions_manager, self.tag_display_maintenance_manager ]
@ -1291,16 +1291,20 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

        self.frame_splash_status.SetTitleText( 'booting gui' + HC.UNICODE_ELLIPSIS )

        subscriptions = CG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )

        self.files_maintenance_manager = ClientFiles.FilesMaintenanceManager( self )

        from hydrus.client import ClientDBMaintenanceManager

        self.database_maintenance_manager = ClientDBMaintenanceManager.DatabaseMaintenanceManager( self )

        from hydrus.client.importing import ClientImportLocal

        self.import_folders_manager = ClientImportLocal.ImportFoldersManager( self )

        from hydrus.client.importing import ClientImportSubscriptions

        subscriptions = CG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )

        self.subscriptions_manager = ClientImportSubscriptions.SubscriptionsManager( self, subscriptions )

    def qt_code_style():
@ -1638,13 +1642,13 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

        job.WakeOnPubSub( 'wake_idle_workers' )
        job.WakeOnPubSub( 'notify_network_traffic_unpaused' )

        self._daemon_jobs[ 'synchronise_repositories' ] = job

        '''
        job = self.CallRepeating( 5.0, 180.0, ClientDaemons.DAEMONCheckImportFolders )
        job.WakeOnPubSub( 'notify_restart_import_folders_daemon' )
        job.WakeOnPubSub( 'notify_new_import_folders' )
        job.ShouldDelayOnWakeup( True )

        self._daemon_jobs[ 'import_folders' ] = job
        '''

        job = self.CallRepeating( 5.0, 180.0, ClientDaemons.DAEMONCheckExportFolders )
        job.WakeOnPubSub( 'notify_restart_export_folders_daemon' )
        job.WakeOnPubSub( 'notify_new_export_folders' )

@ -1673,6 +1677,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

        self.files_maintenance_manager.Start()
        self.database_maintenance_manager.Start()
        self.import_folders_manager.Start()
        self.subscriptions_manager.Start()
@ -41,35 +41,6 @@ def DAEMONCheckExportFolders():

-def DAEMONCheckImportFolders():
-
-    controller = CG.client_controller
-
-    if not controller.new_options.GetBoolean( 'pause_import_folders_sync' ):
-
-        HG.import_folders_running = True
-
-        try:
-
-            import_folder_names = controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER )
-
-            for name in import_folder_names:
-
-                import_folder = controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER, name )
-
-                if controller.new_options.GetBoolean( 'pause_import_folders_sync' ) or HydrusThreading.IsThreadShuttingDown():
-
-                    break
-
-                import_folder.DoWork()
-
-        finally:
-
-            HG.import_folders_running = False

def DAEMONMaintainTrash():
@ -817,7 +817,7 @@ def SetDefaultBandwidthManagerRules( bandwidth_manager ):

    rules = HydrusNetworking.BandwidthRules()

-    rules.AddRule( HC.BANDWIDTH_TYPE_DATA, 86400, 512 * MB ) # don't sync a giant db in one day
+    rules.AddRule( HC.BANDWIDTH_TYPE_DATA, 86400, 2 * GB ) # don't sync a giant db in one day, but we can push it more

    bandwidth_manager.SetRules( ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_HYDRUS ), rules )
@ -317,7 +317,7 @@ class QuickDownloadManager( object ):

    exclude_deleted = False # this is the important part here
    preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
    preimport_url_check_type = FileImportOptions.DO_CHECK
-    preimport_url_check_looks_for_neighbours = True
+    preimport_url_check_looks_for_neighbour_spam = True
    allow_decompression_bombs = True
    min_size = None
    max_size = None

@ -331,7 +331,7 @@ class QuickDownloadManager( object ):

    file_import_options = FileImportOptions.FileImportOptions()

    file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-    file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
+    file_import_options.SetPreImportURLCheckLooksForNeighbourSpam( preimport_url_check_looks_for_neighbour_spam )
    file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

    file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = f'Downloaded File - {hash.hex()}' )
@ -660,7 +660,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
exclude_deleted = True
|
||||
preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
|
||||
preimport_url_check_type = FileImportOptions.DO_CHECK
|
||||
preimport_url_check_looks_for_neighbours = True
|
||||
preimport_url_check_looks_for_neighbour_spam = True
|
||||
allow_decompression_bombs = True
|
||||
min_size = None
|
||||
max_size = None
|
||||
|
@@ -681,7 +681,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

quiet_file_import_options = FileImportOptions.FileImportOptions()

quiet_file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
- quiet_file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
+ quiet_file_import_options.SetPreImportURLCheckLooksForNeighbourSpam( preimport_url_check_looks_for_neighbour_spam )
quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
quiet_file_import_options.SetPresentationImportOptions( presentation_import_options )
quiet_file_import_options.SetDestinationLocationContext( ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY ) )
@@ -691,7 +691,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

loud_file_import_options = FileImportOptions.FileImportOptions()

loud_file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
- loud_file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
+ loud_file_import_options.SetPreImportURLCheckLooksForNeighbourSpam( preimport_url_check_looks_for_neighbour_spam )
loud_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
loud_file_import_options.SetDestinationLocationContext( ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY ) )
@@ -191,7 +191,7 @@ def GetPDFInfo( path: str ):

depunctuated_text = re.sub( r'[^\w\s]', ' ', text )

- despaced_text = re.sub( '\s\s+', ' ', depunctuated_text )
+ despaced_text = re.sub( r'\s\s+', ' ', depunctuated_text )

if despaced_text not in ( '', ' ' ):
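As an aside on the hunk above: the old pattern `'\s\s+'` only works because `\s` is not a recognised Python string escape; newer Pythons warn about such sequences, and the raw-string spelling is the correct, behaviour-identical form. A minimal sketch of the cleanup step (the function name here is illustrative, not from the codebase):

```python
import re

def collapse_whitespace( text: str ) -> str:
    
    # replace punctuation with spaces, as in GetPDFInfo
    depunctuated_text = re.sub( r'[^\w\s]', ' ', text )
    
    # collapse any run of two or more whitespace characters into one space
    despaced_text = re.sub( r'\s\s+', ' ', depunctuated_text )
    
    return despaced_text

```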
@@ -22,6 +22,8 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):

self.modules_services = modules_services
self.modules_files_storage = modules_files_storage

+ self._reported_on_a_broken_branch = False
+
ClientDBModule.ClientDBModule.__init__( self, 'client similar files', cursor )

self._perceptual_hash_id_to_vp_tree_node_cache = {}

@@ -50,7 +52,32 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):

ancestor_id = next_ancestor_id

- ( ancestor_perceptual_hash, ancestor_radius, ancestor_inner_id, ancestor_inner_population, ancestor_outer_id, ancestor_outer_population ) = self._Execute( 'SELECT phash, radius, inner_id, inner_population, outer_id, outer_population FROM shape_perceptual_hashes NATURAL JOIN shape_vptree WHERE phash_id = ?;', ( ancestor_id, ) ).fetchone()
+ result = self._Execute( 'SELECT phash, radius, inner_id, inner_population, outer_id, outer_population FROM shape_perceptual_hashes NATURAL JOIN shape_vptree WHERE phash_id = ?;', ( ancestor_id, ) ).fetchone()
+
+ if result is None:
+
+     if not self._reported_on_a_broken_branch:
+
+         message = 'Hey, while trying to import a file, hydrus discovered a hole in the similar files search tree. Please run _database->regenerate->similar files search tree_ when it is convenient!'
+         message += '\n' * 2
+         message += 'You will not see this message again this boot.'
+
+         HydrusData.ShowText( message )
+
+         self._reported_on_a_broken_branch = True
+
+     # ok so there is a missing branch. typically from an import crash desync, is my best bet
+     # we still want to add our leaf because we need to add the file to the tree population, but we will add it to the ghost of the branch. no worries, the regen code will sort it all out
+     parent_id = ancestor_id
+
+     # TODO: there's a secondary issue that we should add the ancestor_id's files to the file maintenance queue to check for presence in the similar files search system, I think
+     # but we are too low level to talk to the maintenance queue here, so it'll have to be a more complicated answer
+
+     break
+
+ ( ancestor_perceptual_hash, ancestor_radius, ancestor_inner_id, ancestor_inner_population, ancestor_outer_id, ancestor_outer_population ) = result

distance_to_ancestor = HydrusData.Get64BitHammingDistance( perceptual_hash, ancestor_perceptual_hash )
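The VP-tree walk above keys its inner/outer branch choice on `HydrusData.Get64BitHammingDistance`. For context, a minimal equivalent of that metric for two 64-bit perceptual hashes (a sketch, not hydrus's actual implementation) is just a popcount of the XOR:

```python
def hamming_distance_64( a: int, b: int ) -> int:
    
    # the Hamming distance between two 64-bit perceptual hashes is the
    # number of bit positions in which they differ: popcount( a XOR b )
    return bin( ( a ^ b ) & 0xFFFFFFFFFFFFFFFF ).count( '1' )

```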
@@ -2904,7 +2904,11 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

ClientGUIMenus.SetMenuTitle( self._menubar_pending_submenu, 'pending ({})'.format( HydrusData.ToHumanInt( total_num_pending ) ) )

- self._menubar_pending_submenu.menuAction().setVisible( total_num_pending > 0 )
+ self._menubar_pending_submenu.menuAction().setEnabled( total_num_pending > 0 )
+
+ has_pending_services = len( self._controller.services_manager.GetServiceKeys( ( HC.TAG_REPOSITORY, HC.FILE_REPOSITORY, HC.IPFS ) ) ) > 0
+
+ self._menubar_pending_submenu.menuAction().setVisible( has_pending_services )

return ClientGUIAsync.AsyncQtUpdater( self, loading_callable, work_callable, publish_callable )
@@ -3112,7 +3116,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

- self._menubar_undo_submenu.menuAction().setVisible( have_closed_pages or have_undo_stuff )
+ self._menubar_undo_submenu.menuAction().setEnabled( have_closed_pages or have_undo_stuff )

return ClientGUIAsync.AsyncQtUpdater( self, loading_callable, work_callable, publish_callable )
@@ -4290,8 +4294,6 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

controller.new_options.SetBoolean( 'pause_import_folders_sync', original_pause_status )

controller.pub( 'notify_new_import_folders' )
@@ -4534,7 +4536,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

control.SetValue( nullification_period )

- panel.SetControl( control )
+ panel.SetControl( control, perpendicular = True )

dlg.SetPanel( panel )
@@ -4659,7 +4661,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

control.SetValue( update_period )

- panel.SetControl( control )
+ panel.SetControl( control, perpendicular = True )

dlg.SetPanel( panel )
@@ -5129,7 +5131,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

self._controller.new_options.FlipBoolean( 'pause_import_folders_sync' )

- self._controller.pub( 'notify_restart_import_folders_daemon' )
+ self._controller.import_folders_manager.Wake()

self._controller.Write( 'save_options', HC.options )
@@ -20,6 +20,7 @@ from hydrus.client.gui import QtPorting as QP

from hydrus.client.gui.lists import ClientGUIListBoxes
from hydrus.client.gui.search import ClientGUIACDropdown
from hydrus.client.gui.widgets import ClientGUICommon
+ from hydrus.client.gui.widgets import ClientGUIRegex

class Dialog( QP.Dialog ):
@@ -263,12 +264,7 @@ class DialogInputNamespaceRegex( Dialog ):

self._namespace = QW.QLineEdit( self )

- self._regex = QW.QLineEdit( self )
-
- self._shortcuts = ClientGUICommon.RegexButton( self )
-
- self._regex_intro_link = ClientGUICommon.BetterHyperLink( self, 'a good regex introduction', 'https://www.aivosto.com/vbtips/regex.html' )
- self._regex_practise_link = ClientGUICommon.BetterHyperLink( self, 'regex practice', 'https://regexr.com/3cvmf' )
+ self._regex = ClientGUIRegex.RegexInput( self )

self._ok = QW.QPushButton( 'OK', self )
self._ok.clicked.connect( self.EventOK )
@@ -281,7 +277,7 @@ class DialogInputNamespaceRegex( Dialog ):

#

self._namespace.setText( namespace )
- self._regex.setText( regex )
+ self._regex.SetValue( regex )

#
@@ -301,9 +297,6 @@ class DialogInputNamespaceRegex( Dialog ):

QP.AddToLayout( vbox, ClientGUICommon.BetterStaticText(self,intro), CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, control_box, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
- QP.AddToLayout( vbox, self._shortcuts, CC.FLAGS_ON_RIGHT )
- QP.AddToLayout( vbox, self._regex_intro_link, CC.FLAGS_ON_RIGHT )
- QP.AddToLayout( vbox, self._regex_practise_link, CC.FLAGS_ON_RIGHT )
QP.AddToLayout( vbox, b_box, CC.FLAGS_ON_RIGHT )

self.setLayout( vbox )
@@ -348,11 +341,12 @@ class DialogInputNamespaceRegex( Dialog ):

namespace = self._namespace.text()

- regex = self._regex.text()
+ regex = self._regex.GetValue()

return ( namespace, regex )


class DialogInputTags( Dialog ):

def __init__( self, parent, service_key, tag_display_type, tags, message = '' ):
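The new `RegexInput` widget replacing the plain `QLineEdit` here is the one the changelog describes as turning green/red depending on whether the current text compiles. The validity check it presumably performs boils down to a sketch like this (hypothetical helper, not hydrus's actual code):

```python
import re

def regex_compiles( pattern: str ) -> bool:
    
    # mirrors the widget's green/red state: green iff the text is a valid regex
    try:
        
        re.compile( pattern )
        
        return True
        
    except re.error:
        
        return False
    

```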
@@ -657,9 +657,7 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):

gug = panel.GetValue()

- self._AddGUG( gug )
-
- self._gug_list_ctrl.Sort()
+ self._AddGUG( gug, select_sort_and_scroll = True )
@@ -680,29 +678,27 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):

ngug = panel.GetValue()

- self._AddNGUG( ngug )
-
- self._ngug_list_ctrl.Sort()
+ self._AddNGUG( ngug, select_sort_and_scroll = True )


- def _AddGUG( self, gug ):
+ def _AddGUG( self, gug, select_sort_and_scroll = False ):

HydrusSerialisable.SetNonDupeName( gug, self._GetExistingNames() )

gug.RegenerateGUGKey()

- self._gug_list_ctrl.AddDatas( ( gug, ) )
+ self._gug_list_ctrl.AddDatas( ( gug, ), select_sort_and_scroll = select_sort_and_scroll )


- def _AddNGUG( self, ngug ):
+ def _AddNGUG( self, ngug, select_sort_and_scroll = False ):

HydrusSerialisable.SetNonDupeName( ngug, self._GetExistingNames() )

ngug.RegenerateGUGKey()

- self._ngug_list_ctrl.AddDatas( ( ngug, ) )
+ self._ngug_list_ctrl.AddDatas( ( ngug, ), select_sort_and_scroll = select_sort_and_scroll )


def _ConvertGUGToListCtrlTuples( self, gug ):
@@ -819,9 +815,14 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):

def _EditGUG( self ):

- edited_datas = []
+ data = self._gug_list_ctrl.GetTopSelectedData()

- for gug in self._gug_list_ctrl.GetData( only_selected = True ):
+ if data is None:
+
+     return
+
+ gug: ClientNetworkingGUG.GalleryURLGenerator = data

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit gallery url generator' ) as dlg:
@@ -831,36 +832,32 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):

if dlg.exec() == QW.QDialog.Accepted:

-     self._gug_list_ctrl.DeleteDatas( ( gug, ) )
+     existing_names = self._GetExistingNames()
+     existing_names.discard( gug.GetName() )

-     gug = panel.GetValue()
+     edited_gug = panel.GetValue()

-     HydrusSerialisable.SetNonDupeName( gug, self._GetExistingNames() )
+     HydrusSerialisable.SetNonDupeName( edited_gug, existing_names )

-     self._gug_list_ctrl.AddDatas( ( gug, ) )
-
-     edited_datas.append( gug )
-
- else:
-
-     break
+     self._gug_list_ctrl.ReplaceData( gug, edited_gug, sort_and_scroll = True )

- self._gug_list_ctrl.SelectDatas( edited_datas )
-
- self._gug_list_ctrl.Sort()


def _EditNGUG( self ):

+ data = self._ngug_list_ctrl.GetTopSelectedData()
+
+ if data is None:
+
+     return
+
+ ngug: ClientNetworkingGUG.NestedGalleryURLGenerator = data

available_gugs = self._gug_list_ctrl.GetData()

- edited_datas = []
-
- for ngug in self._ngug_list_ctrl.GetData( only_selected = True ):

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit nested gallery url generator' ) as dlg:

panel = EditNGUGPanel( dlg, ngug, available_gugs )
@@ -869,28 +866,18 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):

if dlg.exec() == QW.QDialog.Accepted:

-     self._ngug_list_ctrl.DeleteDatas( ( ngug, ) )
+     existing_names = self._GetExistingNames()
+     existing_names.discard( ngug.GetName() )

-     ngug = panel.GetValue()
+     edited_ngug = panel.GetValue()

-     HydrusSerialisable.SetNonDupeName( ngug, self._GetExistingNames() )
+     HydrusSerialisable.SetNonDupeName( edited_ngug, existing_names )

-     self._ngug_list_ctrl.AddDatas( ( ngug, ) )
-
-     edited_datas.append( ngug )
-
- else:
-
-     break
+     self._ngug_list_ctrl.ReplaceData( ngug, edited_ngug, sort_and_scroll = True )

- self._ngug_list_ctrl.SelectDatas( edited_datas )
-
- self._ngug_list_ctrl.Sort()


def _GetExistingNames( self ):

gugs = self._gug_list_ctrl.GetData()
@@ -2368,20 +2355,20 @@ class EditURLClassesPanel( ClientGUIScrolledPanels.EditPanel ):

url_class = panel.GetValue()

- self._AddURLClass( url_class )
+ self._AddURLClass( url_class, select_sort_and_scroll = True )

- self._list_ctrl.Sort()


- def _AddURLClass( self, url_class ):
+ def _AddURLClass( self, url_class, select_sort_and_scroll = False ):

HydrusSerialisable.SetNonDupeName( url_class, self._GetExistingNames() )

url_class.RegenerateClassKey()

- self._list_ctrl.AddDatas( ( url_class, ) )
+ self._list_ctrl.AddDatas( ( url_class, ), select_sort_and_scroll = select_sort_and_scroll )

self._changes_made = True
@@ -2412,9 +2399,14 @@ class EditURLClassesPanel( ClientGUIScrolledPanels.EditPanel ):

def _Edit( self ):

- edited_datas = []
+ data = self._list_ctrl.GetTopSelectedData()

- for url_class in self._list_ctrl.GetData( only_selected = True ):
+ if data is None:
+
+     return
+
+ url_class = data

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit url class' ) as dlg:
@@ -2424,28 +2416,18 @@ class EditURLClassesPanel( ClientGUIScrolledPanels.EditPanel ):

if dlg.exec() == QW.QDialog.Accepted:

-     self._list_ctrl.DeleteDatas( ( url_class, ) )
+     existing_names = self._GetExistingNames()
+     existing_names.discard( url_class.GetName() )

-     url_class = panel.GetValue()
+     edited_url_class = panel.GetValue()

-     HydrusSerialisable.SetNonDupeName( url_class, self._GetExistingNames() )
+     HydrusSerialisable.SetNonDupeName( edited_url_class, existing_names )

-     self._list_ctrl.AddDatas( ( url_class, ) )
-
-     edited_datas.append( url_class )
+     self._list_ctrl.ReplaceData( url_class, edited_url_class, sort_and_scroll = True )

     self._changes_made = True

- else:
-
-     break

- self._list_ctrl.SelectDatas( edited_datas )
-
- self._list_ctrl.Sort()


def _GetExistingNames( self ):
@@ -2489,6 +2471,9 @@ class EditURLClassesPanel( ClientGUIScrolledPanels.EditPanel ):

text = 'Matches "' + url_class.GetName() + '"'

+ self._list_ctrl.SelectDatas( ( url_class, ), deselect_others = True )
+ self._list_ctrl.ScrollToData( url_class )

except HydrusExceptions.URLClassException as e:
@@ -80,7 +80,13 @@ def AppendMenuItem( menu, label, description, callable, *args, role: QW.QAction.

return menu_item


- def AppendMenuLabel( menu, label, description = '', copy_text = '' ):
+ def AppendMenuLabel( menu, label, description = '', copy_text = '', no_copy = False ):

+ if no_copy:
+
+     description = ''
+
+ else:

    if description == label:

@@ -97,14 +103,18 @@ def AppendMenuLabel( menu, label, description = '', copy_text = '' ):

        description = f'copy "{copy_text}" to clipboard'

menu_item = QW.QAction( menu )

SetMenuTexts( menu_item, label, description )

menu.addAction( menu_item )

+ if not no_copy:
+
+     BindMenuItem( menu_item, CG.client_controller.pub, 'clipboard', 'text', copy_text )

return menu_item

def AppendMenuOrItem( menu, submenu_name, menu_tuples, sort_tuples = True ):
@@ -361,7 +361,7 @@ class RatingIncDec( QW.QWidget ):

control = ClientGUICommon.BetterSpinBox( self, initial = self._rating, min = 0, max = 1000000 )

- panel.SetControl( control )
+ panel.SetControl( control, perpendicular = True )

dlg.SetPanel( panel )
@@ -279,21 +279,26 @@ class EditShortcutSetPanel( ClientGUIScrolledPanels.EditPanel ):

data = ( shortcut, command )

- self._shortcuts.AddDatas( ( data, ) )
+ self._shortcuts.AddDatas( ( data, ), select_sort_and_scroll = True )


def EditShortcuts( self ):

- name = self._name.text()
+ data = self._shortcuts.GetTopSelectedData()
+
+ if data is None:
+
+     return

- for data in self._shortcuts.GetData( only_selected = True ):

( shortcut, command ) = data

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit shortcut command' ) as dlg:

+     name = self._name.text()

    panel = EditShortcutAndCommandPanel( dlg, shortcut, command, name )

    dlg.SetPanel( panel )
@@ -304,11 +309,7 @@ class EditShortcutSetPanel( ClientGUIScrolledPanels.EditPanel ):

new_data = ( new_shortcut, new_command )

- self._shortcuts.ReplaceData( data, new_data )
-
- else:
-
-     break
+ self._shortcuts.ReplaceData( data, new_data, sort_and_scroll = True )
@@ -490,7 +491,7 @@ class EditShortcutsPanel( ClientGUIScrolledPanels.EditPanel ):

new_shortcuts = panel.GetValue()

- self._custom_shortcuts.AddDatas( ( new_shortcuts, ) )
+ self._custom_shortcuts.AddDatas( ( new_shortcuts, ), select_sort_and_scroll = True )
@@ -507,9 +508,14 @@ class EditShortcutsPanel( ClientGUIScrolledPanels.EditPanel ):

def _EditCustom( self ):

- all_selected = self._custom_shortcuts.GetData( only_selected = True )
+ data = self._custom_shortcuts.GetTopSelectedData()

- for shortcuts in all_selected:
+ if data is None:
+
+     return
+
+ shortcuts = data

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit shortcuts' ) as dlg:
@@ -521,21 +527,21 @@ class EditShortcutsPanel( ClientGUIScrolledPanels.EditPanel ):

edited_shortcuts = panel.GetValue()

- self._custom_shortcuts.ReplaceData( shortcuts, edited_shortcuts )
-
- else:
-
-     break
+ self._custom_shortcuts.ReplaceData( shortcuts, edited_shortcuts, sort_and_scroll = True )


def _EditReserved( self ):

- all_selected = self._reserved_shortcuts.GetData( only_selected = True )
+ data = self._reserved_shortcuts.GetTopSelectedData()

- for shortcuts in all_selected:
+ if data is None:
+
+     return
+
+ shortcuts = data

with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit shortcuts' ) as dlg:
@@ -547,11 +553,7 @@ class EditShortcutsPanel( ClientGUIScrolledPanels.EditPanel ):

edited_shortcuts = panel.GetValue()

- self._reserved_shortcuts.ReplaceData( shortcuts, edited_shortcuts )
-
- else:
-
-     break
+ self._reserved_shortcuts.ReplaceData( shortcuts, edited_shortcuts, sort_and_scroll = True )
@@ -627,7 +629,7 @@ class EditShortcutsPanel( ClientGUIScrolledPanels.EditPanel ):

if result == QW.QDialog.Accepted:

- self._reserved_shortcuts.ReplaceData( existing_data, new_data )
+ self._reserved_shortcuts.ReplaceData( existing_data, new_data, sort_and_scroll = True )
@@ -1,4 +1,3 @@

import os
import re
import typing

@@ -23,6 +22,7 @@ from hydrus.client.gui.lists import ClientGUIListConstants as CGLC

from hydrus.client.gui.lists import ClientGUIListCtrl
from hydrus.client.gui.panels import ClientGUIScrolledPanels
from hydrus.client.gui.widgets import ClientGUICommon
+ from hydrus.client.gui.widgets import ClientGUIRegex

NO_RESULTS_TEXT = 'no results'
@@ -721,12 +721,16 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_number = ClientGUICommon.BetterSpinBox( self._control_panel, min=0, max=65535 )
self._data_encoding = ClientGUICommon.BetterChoice( self._control_panel )
self._data_decoding = ClientGUICommon.BetterChoice( self._control_panel )
+ self._data_regex_pattern = ClientGUIRegex.RegexInput( self._control_panel, show_group_menu = True )
+ self._data_regex_repl = QW.QLineEdit( self._control_panel )
self._data_date_link = ClientGUICommon.BetterHyperLink( self._control_panel, 'link to date info', 'https://docs.python.org/3/library/datetime.html#strftime-strptime-behavior' )
self._data_timezone_decode = ClientGUICommon.BetterChoice( self._control_panel )
self._data_timezone_encode = ClientGUICommon.BetterChoice( self._control_panel )
self._data_timezone_offset = ClientGUICommon.BetterSpinBox( self._control_panel, min=-86400, max=86400 )

+ self._data_regex_pattern.setToolTip( f'Whatever this matches{HC.UNICODE_ELLIPSIS}' )
+ self._data_regex_repl.setToolTip( f'{HC.UNICODE_ELLIPSIS}will be replaced with this.' )

self._data_hash_function = ClientGUICommon.BetterChoice( self._control_panel )
tt = 'This hashes the string\'s UTF-8-decoded bytes to hexadecimal.'
self._data_hash_function.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
@@ -799,7 +803,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

( pattern, repl ) = data

- self._data_text.setText( pattern )
+ self._data_regex_pattern.SetValue( pattern )
self._data_regex_repl.setText( repl )

elif conversion_type == ClientStrings.STRING_CONVERSION_DATE_DECODE:
@@ -854,6 +858,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_number_label = ClientGUICommon.BetterStaticText( self, 'number data: ' )
self._data_encoding_label = ClientGUICommon.BetterStaticText( self, 'encoding type: ' )
self._data_decoding_label = ClientGUICommon.BetterStaticText( self, 'decoding type: ' )
+ self._data_regex_pattern_label = ClientGUICommon.BetterStaticText( self, 'regex pattern: ' )
self._data_regex_repl_label = ClientGUICommon.BetterStaticText( self, 'regex replacement: ' )
self._data_date_link_label = ClientGUICommon.BetterStaticText( self, 'date info: ' )
self._data_timezone_decode_label = ClientGUICommon.BetterStaticText( self, 'date decode timezone: ' )

@@ -868,6 +873,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

rows.append( ( self._data_number_label, self._data_number ) )
rows.append( ( self._data_encoding_label, self._data_encoding ) )
rows.append( ( self._data_decoding_label, self._data_decoding ) )
+ rows.append( ( self._data_regex_pattern_label, self._data_regex_pattern ) )
rows.append( ( self._data_regex_repl_label, self._data_regex_repl ) )
rows.append( ( self._data_date_link_label, self._data_date_link ) )
rows.append( ( self._data_timezone_decode_label, self._data_timezone_decode ) )
@@ -908,11 +914,12 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._conversion_type.currentIndexChanged.connect( self._UpdateDataControls )
self._conversion_type.currentIndexChanged.connect( self._UpdateExampleText )

- self._data_text.textEdited.connect( self._UpdateExampleText )
+ self._data_text.textChanged.connect( self._UpdateExampleText )
self._data_number.valueChanged.connect( self._UpdateExampleText )
self._data_encoding.currentIndexChanged.connect( self._UpdateExampleText )
self._data_decoding.currentIndexChanged.connect( self._UpdateExampleText )
- self._data_regex_repl.textEdited.connect( self._UpdateExampleText )
+ self._data_regex_pattern.textChanged.connect( self._UpdateExampleText )
+ self._data_regex_repl.textChanged.connect( self._UpdateExampleText )
self._data_timezone_decode.currentIndexChanged.connect( self._UpdateExampleText )
self._data_timezone_offset.valueChanged.connect( self._UpdateExampleText )
self._data_timezone_encode.currentIndexChanged.connect( self._UpdateExampleText )
@@ -930,6 +937,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_number_label.setVisible( False )
self._data_encoding_label.setVisible( False )
self._data_decoding_label.setVisible( False )
+ self._data_regex_pattern_label.setVisible( False )
self._data_regex_repl_label.setVisible( False )
self._data_date_link_label.setVisible( False )
self._data_timezone_decode_label.setVisible( False )

@@ -942,6 +950,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_number.setVisible( False )
self._data_encoding.setVisible( False )
self._data_decoding.setVisible( False )
+ self._data_regex_pattern.setVisible( False )
self._data_regex_repl.setVisible( False )
self._data_date_link.setVisible( False )
self._data_timezone_decode.setVisible( False )
@@ -983,7 +992,15 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_number.setMinimum( 1 )

- elif conversion_type in ( ClientStrings.STRING_CONVERSION_PREPEND_TEXT, ClientStrings.STRING_CONVERSION_APPEND_TEXT, ClientStrings.STRING_CONVERSION_DATE_DECODE, ClientStrings.STRING_CONVERSION_DATE_ENCODE, ClientStrings.STRING_CONVERSION_REGEX_SUB ):
+ elif conversion_type == ClientStrings.STRING_CONVERSION_REGEX_SUB:
+
+     self._data_regex_pattern_label.setVisible( True )
+     self._data_regex_pattern.setVisible( True )
+
+     self._data_regex_repl_label.setVisible( True )
+     self._data_regex_repl.setVisible( True )
+
+ elif conversion_type in ( ClientStrings.STRING_CONVERSION_PREPEND_TEXT, ClientStrings.STRING_CONVERSION_APPEND_TEXT, ClientStrings.STRING_CONVERSION_DATE_DECODE, ClientStrings.STRING_CONVERSION_DATE_ENCODE ):

self._data_text_label.setVisible( True )
self._data_text.setVisible( True )
@@ -1024,13 +1041,6 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

self._data_timezone_encode.setVisible( True )

- elif conversion_type == ClientStrings.STRING_CONVERSION_REGEX_SUB:
-
-     data_text_label = 'regex pattern: '
-
-     self._data_regex_repl_label.setVisible( True )
-     self._data_regex_repl.setVisible( True )

self._data_text_label.setText( data_text_label )
@@ -1129,7 +1139,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):

elif conversion_type == ClientStrings.STRING_CONVERSION_REGEX_SUB:

- pattern = self._data_text.text()
+ pattern = self._data_regex_pattern.GetValue()
repl = self._data_regex_repl.text()

data = ( pattern, repl )
@@ -1310,7 +1320,7 @@ class EditStringMatchPanel( ClientGUIScrolledPanels.EditPanel ):

self._match_type.addItem( 'regex', ClientStrings.STRING_MATCH_REGEX )

self._match_value_fixed_input = QW.QLineEdit( self )
- self._match_value_regex_input = QW.QLineEdit( self )
+ self._match_value_regex_input = ClientGUIRegex.RegexInput( self )

self._match_value_flexible_input = ClientGUICommon.BetterChoice( self )
@@ -1386,7 +1396,7 @@ class EditStringMatchPanel( ClientGUIScrolledPanels.EditPanel ):

elif match_type == ClientStrings.STRING_MATCH_REGEX:

- match_value = self._match_value_regex_input.text()
+ match_value = self._match_value_regex_input.GetValue()

if match_type == ClientStrings.STRING_MATCH_FIXED:
@@ -1430,6 +1440,11 @@ class EditStringMatchPanel( ClientGUIScrolledPanels.EditPanel ):

self._match_value_fixed_input_label.setVisible( True )
self._match_value_fixed_input.setVisible( True )

+ if self._match_value_fixed_input.text() == '':
+
+     self._match_value_fixed_input.setText( self._example_string.text() )

else:

self._min_chars_label.setVisible( True )
@@ -1450,6 +1465,11 @@ class EditStringMatchPanel( ClientGUIScrolledPanels.EditPanel ):

self._match_value_regex_input_label.setVisible( True )
self._match_value_regex_input.setVisible( True )

+ if self._match_value_regex_input.GetValue() == '':
+
+     self._match_value_regex_input.SetValue( self._example_string.text() )

self._UpdateTestResult()
@@ -1514,14 +1534,14 @@ class EditStringMatchPanel( ClientGUIScrolledPanels.EditPanel ):

self._match_value_fixed_input.setText( match_value )

+ elif match_type == ClientStrings.STRING_MATCH_REGEX:
+
+     self._match_value_regex_input.SetValue( match_value )

elif match_type == ClientStrings.STRING_MATCH_FLEXIBLE:

self._match_value_flexible_input.SetValue( match_value )

- elif match_type == ClientStrings.STRING_MATCH_REGEX:
-
-     self._match_value_regex_input.setText( match_value )

self._min_chars.SetValue( min_chars )
self._max_chars.SetValue( max_chars )
@@ -2127,34 +2147,6 @@ class EditStringTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):

return string_match


- def SetValue( self, string_match: ClientStrings.StringMatch ):
-
-     ( match_type, match_value, min_chars, max_chars, example_string ) = string_match.ToTuple()
-
-     self._match_type.SetValue( match_type )
-
-     self._match_value_flexible_input.SetValue( ClientStrings.ALPHA )
-
-     if match_type == ClientStrings.STRING_MATCH_FIXED:
-
-         self._match_value_fixed_input.setText( match_value )
-
-     elif match_type == ClientStrings.STRING_MATCH_FLEXIBLE:
-
-         self._match_value_flexible_input.SetValue( match_value )
-
-     elif match_type == ClientStrings.STRING_MATCH_REGEX:
-
-         self._match_value_regex_input.setText( match_value )
-
-     self._min_chars.SetValue( min_chars )
-     self._max_chars.SetValue( max_chars )
-
-     self._example_string.setText( example_string )
-
-     self._UpdateControlVisibility()

class EditStringProcessorPanel( ClientGUIScrolledPanels.EditPanel ):
@@ -2009,7 +2009,7 @@ class CanvasHoverFrameRightDuplicates( CanvasHoverFrame ):

control.setToolTip( ClientGUIFunctions.WrapToolTip( tooltip ) )
control.SetValue( value )

- panel.SetControl( control )
+ panel.SetControl( control, perpendicular = True )

dlg.SetPanel( panel )
@@ -34,6 +34,7 @@ from hydrus.client.gui.networking import ClientGUINetworkJobControl

 from hydrus.client.gui.panels import ClientGUIScrolledPanels
 from hydrus.client.gui.search import ClientGUIACDropdown
 from hydrus.client.gui.widgets import ClientGUICommon
+from hydrus.client.gui.widgets import ClientGUIRegex
 from hydrus.client.importing import ClientImporting
 from hydrus.client.importing.options import ClientImportOptions
 from hydrus.client.importing.options import FileImportOptions

@@ -208,13 +209,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

        self._regexes = ClientGUIListBoxes.BetterQListWidget( self._regexes_panel )
        self._regexes.itemDoubleClicked.connect( self.EventRemoveRegex )
        
-       self._regex_box = QW.QLineEdit()
-       self._regex_box.installEventFilter( ClientGUICommon.TextCatchEnterEventFilter( self._regexes, self.AddRegex ) )
-       
-       self._regex_shortcuts = ClientGUICommon.RegexButton( self._regexes_panel )
-       
-       self._regex_intro_link = ClientGUICommon.BetterHyperLink( self._regexes_panel, 'a good regex introduction', 'https://www.aivosto.com/vbtips/regex.html' )
-       self._regex_practise_link = ClientGUICommon.BetterHyperLink( self._regexes_panel, 'regex practice', 'https://regexr.com/3cvmf' )
+       self._regex_input = ClientGUIRegex.RegexInput( self )
        
        #

@@ -257,10 +252,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

        #
        
        self._regexes_panel.Add( self._regexes, CC.FLAGS_EXPAND_BOTH_WAYS )
-       self._regexes_panel.Add( self._regex_box, CC.FLAGS_EXPAND_PERPENDICULAR )
-       self._regexes_panel.Add( self._regex_shortcuts, CC.FLAGS_ON_RIGHT )
-       self._regexes_panel.Add( self._regex_intro_link, CC.FLAGS_ON_RIGHT )
-       self._regexes_panel.Add( self._regex_practise_link, CC.FLAGS_ON_RIGHT )
+       self._regexes_panel.Add( self._regex_input, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        #

@@ -295,6 +287,8 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

        self._quick_namespaces_list.columnListContentsChanged.connect( self.tagsChanged )
        
+       self._regex_input.userHitEnter.connect( self.AddRegex )
        
    def _ConvertQuickRegexDataToListCtrlTuples( self, data ):

@@ -356,7 +350,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

    def AddRegex( self ):
        
-       regex = self._regex_box.text()
+       regex = self._regex_input.GetValue()
        
        if regex != '':

@@ -377,7 +371,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

        self._regexes.Append( regex, regex )
        
-       self._regex_box.clear()
+       self._regex_input.SetValue( '' )
        
        self.tagsChanged.emit()

@@ -389,7 +383,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):

        selected = list( self._regexes.GetData( only_selected = True ) )
        
-       self._regex_box.setText( selected[0] )
+       self._regex_input.SetValue( selected[0] )
        
        self._regexes.DeleteSelected()
@@ -121,15 +121,15 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

        self._preimport_hash_check_type.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
        self._preimport_url_check_type.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
        
-       self._preimport_url_check_looks_for_neighbours = QW.QCheckBox( pre_import_panel )
+       self._preimport_url_check_looks_for_neighbour_spam = QW.QCheckBox( pre_import_panel )
        
-       tt = 'When a file-url mapping is found, and additional check can be performed to see if it is trustworthy.'
+       tt = 'When a file-url mapping is found, an additional check can be performed to see if it is trustworthy.'
        tt += '\n' * 2
-       tt += 'If the URL has a Post URL Class, and the file has multiple other URLs with the same domain & URL Class (basically the file has multiple URLs on the same site), then the mapping is assumed to be some parse spam and not trustworthy (leading to more "this file looks new" results in the pre-check).'
+       tt += 'If the URL we are checking is recognised as a Post URL, and the file it appears to refer to has other URLs with the same domain & URL Class as what we parsed for the current job (basically the file has or would get multiple URLs on the same site), then this discovered mapping is assumed to be some parse spam and not trustworthy (leading to a "this file looks new" result in the pre-check).'
        tt += '\n' * 2
        tt += 'This test is best left on unless you are doing a single job that is messed up by the logic.'
        
-       self._preimport_url_check_looks_for_neighbours.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
+       self._preimport_url_check_looks_for_neighbour_spam.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
        
        #

@@ -199,7 +199,7 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

        self._associate_primary_urls.setToolTip( ClientGUIFunctions.WrapToolTip( tt ) )
        
-       tt = 'If the parser discovers and additional source URL for another site (e.g. "This file on wewbooru was originally posted to Bixiv [here]."), should that URL be associated with the final URL? Should it be trusted to make \'already in db/previously deleted\' determinations?'
+       tt = 'If the parser discovers an additional source URL for another site (e.g. "This file on wewbooru was originally posted to Bixiv [here]."), should that URL be associated with the final URL? Should it be trusted to make \'already in db/previously deleted\' determinations?'
        tt += '\n' * 2
        tt += 'You should turn this off if the site supplies bad (incorrect or imprecise or malformed) source urls.'

@@ -242,13 +242,13 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

            rows.append( ( 'check hashes to determine "already in db/previously deleted"?: ', self._preimport_hash_check_type ) )
            rows.append( ( 'check URLs to determine "already in db/previously deleted"?: ', self._preimport_url_check_type ) )
-           rows.append( ( 'during URL check, check for neighbour-spam?: ', self._preimport_url_check_looks_for_neighbours ) )
+           rows.append( ( 'during URL check, check for neighbour-spam?: ', self._preimport_url_check_looks_for_neighbour_spam ) )
            
        else:
            
            self._preimport_hash_check_type.setVisible( False )
            self._preimport_url_check_type.setVisible( False )
-           self._preimport_url_check_looks_for_neighbours.setVisible( False )
+           self._preimport_url_check_looks_for_neighbour_spam.setVisible( False )
            
        
        rows.append( ( 'allow decompression bombs: ', self._allow_decompression_bombs ) )

@@ -362,7 +362,7 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

        ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = file_import_options.GetPreImportOptions()
        
-       preimport_url_check_looks_for_neighbours = file_import_options.PreImportURLCheckLooksForNeighbours()
+       preimport_url_check_looks_for_neighbour_spam = file_import_options.PreImportURLCheckLooksForNeighbourSpam()
        
        mimes = file_import_options.GetAllowedSpecificFiletypes()

@@ -371,7 +371,7 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

        self._exclude_deleted.setChecked( exclude_deleted )
        self._preimport_hash_check_type.SetValue( preimport_hash_check_type )
        self._preimport_url_check_type.SetValue( preimport_url_check_type )
-       self._preimport_url_check_looks_for_neighbours.setChecked( preimport_url_check_looks_for_neighbours )
+       self._preimport_url_check_looks_for_neighbour_spam.setChecked( preimport_url_check_looks_for_neighbour_spam )
        self._allow_decompression_bombs.setChecked( allow_decompression_bombs )
        self._min_size.SetValue( min_size )
        self._max_size.SetValue( max_size )

@@ -456,7 +456,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or

        self._preimport_hash_check_type.SetValue( FileImportOptions.DO_CHECK )
        
-       self._preimport_url_check_looks_for_neighbours.setEnabled( preimport_url_check_type != FileImportOptions.DO_NOT_CHECK )
+       self._preimport_url_check_looks_for_neighbour_spam.setEnabled( preimport_url_check_type != FileImportOptions.DO_NOT_CHECK )
        
    def _UpdateIsDefault( self ):

@@ -512,7 +512,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or

        exclude_deleted = self._exclude_deleted.isChecked()
        preimport_hash_check_type = self._preimport_hash_check_type.GetValue()
        preimport_url_check_type = self._preimport_url_check_type.GetValue()
-       preimport_url_check_looks_for_neighbours = self._preimport_url_check_looks_for_neighbours.isChecked()
+       preimport_url_check_looks_for_neighbour_spam = self._preimport_url_check_looks_for_neighbour_spam.isChecked()
        allow_decompression_bombs = self._allow_decompression_bombs.isChecked()
        min_size = self._min_size.GetValue()
        max_size = self._max_size.GetValue()

@@ -529,7 +529,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or

        destination_location_context = self._destination_location_context.GetValue()
        
        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-       file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
+       file_import_options.SetPreImportURLCheckLooksForNeighbourSpam( preimport_url_check_looks_for_neighbour_spam )
        file_import_options.SetAllowedSpecificFiletypes( self._mimes.GetValue() )
        file_import_options.SetDestinationLocationContext( destination_location_context )
        file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
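The neighbour-spam pre-check described in the tooltips above can be sketched without any hydrus internals. This is an illustrative reading of the heuristic only; `looks_like_neighbour_spam` and its parameters are hypothetical names, not the real implementation.

```python
# Hypothetical sketch of the neighbour-spam heuristic: if the file a Post URL
# maps to already has other URLs with the same domain and URL Class, the
# file-url mapping is treated as untrustworthy and the pre-check reports
# "this file looks new" so the download proceeds.

def looks_like_neighbour_spam( candidate_domain, candidate_url_class, existing_url_infos ):
    
    # existing_url_infos: ( domain, url_class ) pairs for URLs already on the file
    neighbours = [
        ( domain, url_class )
        for ( domain, url_class ) in existing_url_infos
        if domain == candidate_domain and url_class == candidate_url_class
    ]
    
    return len( neighbours ) > 0
```

A mapping flagged this way is simply ignored for the 'already in db' pre-check; once the file is actually imported, the normal post-import URL association still applies.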
@@ -383,6 +383,14 @@ class BetterListCtrl( QW.QTreeWidget ):

        return indices
        
+   def _IterateTopLevelItems( self ) -> typing.Iterator[ QW.QTreeWidgetItem ]:
+       
+       for i in range( self.topLevelItemCount() ):
+           
+           yield self.topLevelItem( i )
+           
+       
    def _RecalculateIndicesAfterDelete( self ):
        
        indices_and_data_info = sorted( self._indices_to_data_info.items() )

@@ -533,7 +541,14 @@ class BetterListCtrl( QW.QTreeWidget ):

-   def AddDatas( self, datas: typing.Iterable[ object ] ):
+   def AddDatas( self, datas: typing.Iterable[ object ], select_sort_and_scroll = False ):
        
+       datas = list( datas )
+       
+       if len( datas ) == 0:
+           
+           return
+           
+       
        for data in datas:

@@ -542,6 +557,17 @@ class BetterListCtrl( QW.QTreeWidget ):

            self._AddDataInfo( ( data, display_tuple, sort_tuple ) )
            
+       if select_sort_and_scroll:
+           
+           self.SelectDatas( datas )
+           
+           self.Sort()
+           
+           first_data = sorted( ( ( self._data_to_indices[ data ], data ) for data in datas ) )[0][1]
+           
+           self.ScrollToData( first_data )
+           
+       
        self.columnListContentsChanged.emit()
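The `select_sort_and_scroll` branch above scrolls to whichever of the just-added rows ends up highest after the sort. That pick can be shown in a Qt-free sketch, where `data_to_indices` stands in for the control's post-sort index map (names are illustrative):

```python
def pick_first_data( data_to_indices, datas ):
    
    # pair each added datum with its post-sort row index, take the lowest row
    return sorted( ( ( data_to_indices[ data ], data ) for data in datas ) )[0][1]
```

Scrolling to this datum means the view lands on the topmost of the new rows rather than an arbitrary one.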
@@ -726,6 +752,24 @@ class BetterListCtrl( QW.QTreeWidget ):

        return result
        
+   def GetTopSelectedData( self ) -> typing.Optional[ object ]:
+       
+       indices = self._GetSelectedIndices()
+       
+       if len( indices ) > 0:
+           
+           top_index = min( indices )
+           
+           ( data, display_tuple, sort_tuple ) = self._indices_to_data_info[ top_index ]
+           
+           return data
+           
+       else:
+           
+           return None
+           
+       
    def HasData( self, data: object ):
        
        return data in self._data_to_indices
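The new `GetTopSelectedData` reduces a multi-selection to the single topmost row. The same logic in a standalone sketch, with plain containers standing in for the control's internal maps (illustrative names):

```python
def get_top_selected_data( selected_indices, indices_to_data ):
    
    # the 'top' selected row is simply the one with the smallest index
    if len( selected_indices ) > 0:
        
        return indices_to_data[ min( selected_indices ) ]
        
    else:
        
        return None
```

Edit dialogs that previously looped over every selected row can call this and act on one well-defined row instead.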
@@ -832,17 +876,38 @@ class BetterListCtrl( QW.QTreeWidget ):

-   def SelectDatas( self, datas: typing.Iterable[ object ] ):
-       
-       self.clearFocus()
-       
-       for data in datas:
-           
-           if data in self._data_to_indices:
-               
-               index = self._data_to_indices[ data ]
-               
-               self.topLevelItem( index ).setSelected( True )
+   def ScrollToData( self, data: object ):
+       
+       if data in self._data_to_indices:
+           
+           index = self._data_to_indices[ data ]
+           
+           item = self.topLevelItem( index )
+           
+           self.scrollToItem( item, hint = QW.QAbstractItemView.ScrollHint.PositionAtCenter )
+           
+       
+   def SelectDatas( self, datas: typing.Iterable[ object ], deselect_others = False ):
+       
+       self.clearFocus()
+       
+       selectee_indices = { self._data_to_indices[ data ] for data in datas if data in self._data_to_indices }
+       
+       if deselect_others:
+           
+           for ( index, item ) in enumerate( self._IterateTopLevelItems() ):
+               
+               item.setSelected( index in selectee_indices )
+               
+           
+       else:
+           
+           for index in selectee_indices:
+               
+               item = self.topLevelItem( index )
+               
+               item.setSelected( True )
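The new `deselect_others` flag turns `SelectDatas` into a 'make the selection exactly these rows' call instead of a purely additive one. A Qt-free sketch of the two branches, with a list of booleans standing in for per-item selection state (illustrative names):

```python
def select_datas( selected_flags, data_to_indices, datas, deselect_others = False ):
    
    selectee_indices = { data_to_indices[ d ] for d in datas if d in data_to_indices }
    
    if deselect_others:
        
        # selection becomes exactly the requested rows
        for index in range( len( selected_flags ) ):
            
            selected_flags[ index ] = index in selectee_indices
            
        
    else:
        
        # requested rows are added to whatever is already selected
        for index in selectee_indices:
            
            selected_flags[ index ] = True
            
        
    
    return selected_flags
```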
@@ -1095,7 +1160,21 @@ class BetterListCtrl( QW.QTreeWidget ):

        HydrusSerialisable.SetNonDupeName( obj, current_names )
        
-   def ReplaceData( self, old_data: object, new_data: object ):
+   def ReplaceData( self, old_data: object, new_data: object, sort_and_scroll = False ):
+       
+       self.ReplaceDatas( [ ( old_data, new_data ) ], sort_and_scroll = sort_and_scroll )
+       
+   def ReplaceDatas( self, replacement_tuples, sort_and_scroll = False ):
+       
+       first_new_data = None
+       
+       for ( old_data, new_data ) in replacement_tuples:
+           
+           if first_new_data is None:
+               
+               first_new_data = new_data
+               
+           
            new_data = QP.ListsToTuples( new_data )

@@ -1114,6 +1193,15 @@ class BetterListCtrl( QW.QTreeWidget ):

            self._UpdateRow( data_index, display_tuple )
            
+       if sort_and_scroll and first_new_data is not None:
+           
+           self.Sort()
+           
+           self.ScrollToData( first_new_data )
+           
+       
class BetterListCtrlPanel( QW.QWidget ):
    
    def __init__( self, parent ):

@@ -1124,7 +1212,7 @@ class BetterListCtrlPanel( QW.QWidget ):

        self._buttonbox = QP.HBoxLayout()
        
-       self._listctrl = None
+       self._listctrl: typing.Optional[ BetterListCtrl ] = None
        
        self._permitted_object_types = []
        self._import_add_callable = lambda x: None
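`ReplaceDatas` batches several old-to-new swaps and remembers the first new datum so the post-edit scroll has a sensible target. A dict-based sketch of the bookkeeping, not the real control (illustrative names):

```python
def replace_datas( data_to_indices, replacement_tuples ):
    
    first_new_data = None
    
    for ( old_data, new_data ) in replacement_tuples:
        
        if first_new_data is None:
            
            first_new_data = new_data
            
        
        # the new datum inherits the old datum's row
        index = data_to_indices.pop( old_data )
        data_to_indices[ new_data ] = index
        
    
    # the caller would now sort and scroll to first_new_data
    return first_new_data
```

Compared with the old delete-then-add pattern used by the login panels below, this keeps row identity stable and avoids a flicker of empty selection.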
@@ -1137,12 +1225,20 @@ class BetterListCtrlPanel( QW.QWidget ):

        defaults = defaults_callable()
        
+       if len( defaults ) == 0:
+           
+           return
+           
+       
        for default in defaults:
            
            add_callable( default )
            
        
+       # try it, it might not work, if what is actually added differs, but it may!
+       self._listctrl.SelectDatas( defaults )
+       self._listctrl.Sort()
+       self._listctrl.ScrollToData( list( defaults )[0] )
        
    def _AddButton( self, button, enabled_only_on_selection = False, enabled_only_on_single_selection = False, enabled_check_func = None ):

@@ -1184,12 +1280,20 @@ class BetterListCtrlPanel( QW.QWidget ):

            return
            
        
+       if len( defaults_to_add ) == 0:
+           
+           return
+           
+       
        for default in defaults_to_add:
            
            add_callable( default )
            
        
+       # try it, it might not work, if what is actually added differs, but it may!
+       self._listctrl.SelectDatas( defaults_to_add )
+       self._listctrl.Sort()
+       self._listctrl.ScrollToData( list( defaults_to_add )[0] )
        
    def _Duplicate( self ):
@@ -1446,16 +1550,16 @@ class BetterListCtrlPanel( QW.QWidget ):

    def _ImportObject( self, obj, can_present_messages = True ):
        
-       num_added = 0
+       objects_added = []
        bad_object_type_names = set()
        
        if isinstance( obj, HydrusSerialisable.SerialisableList ):
            
            for sub_obj in obj:
                
-               ( sub_num_added, sub_bad_object_type_names ) = self._ImportObject( sub_obj, can_present_messages = False )
+               ( sub_objects_added, sub_bad_object_type_names ) = self._ImportObject( sub_obj, can_present_messages = False )
                
-               num_added += sub_num_added
+               objects_added.extend( sub_objects_added )
                bad_object_type_names.update( sub_bad_object_type_names )

@@ -1465,7 +1569,7 @@ class BetterListCtrlPanel( QW.QWidget ):

                self._import_add_callable( obj )
                
-               num_added += 1
+               objects_added.append( obj )
                
            else:

@@ -1486,14 +1590,20 @@ class BetterListCtrlPanel( QW.QWidget ):

            ClientGUIDialogsMessage.ShowWarning( self, message )
            
        
+       num_added = len( objects_added )
+       
        if can_present_messages and num_added > 0:
            
            message = '{} objects added!'.format( HydrusData.ToHumanInt( num_added ) )
            
            ClientGUIDialogsMessage.ShowInformation( self, message )
            
+           self._listctrl.SelectDatas( objects_added )
+           self._listctrl.Sort()
+           self._listctrl.ScrollToData( objects_added[0] )
            
-       return ( num_added, bad_object_type_names )
+       return ( objects_added, bad_object_type_names )
        
    def _ImportJSONs( self, paths ):
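The `_ImportObject` change above swaps a running count for the actual list of added objects, so the caller can select and scroll to them afterwards. A standalone sketch of the recursive collection, with a plain list standing in for `HydrusSerialisable.SerialisableList` (illustrative names):

```python
def import_object( obj, permitted_types, add_callable ):
    
    objects_added = []
    bad_object_type_names = set()
    
    if isinstance( obj, list ): # stands in for SerialisableList
        
        for sub_obj in obj:
            
            ( sub_objects_added, sub_bad_object_type_names ) = import_object( sub_obj, permitted_types, add_callable )
            
            objects_added.extend( sub_objects_added )
            bad_object_type_names.update( sub_bad_object_type_names )
            
        
    elif isinstance( obj, permitted_types ):
        
        add_callable( obj )
        
        objects_added.append( obj )
        
    else:
        
        bad_object_type_names.add( type( obj ).__name__ )
        
    
    return ( objects_added, bad_object_type_names )
```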
@@ -1049,7 +1049,7 @@ class TimeDeltaButton( QW.QPushButton ):

        control.SetValue( self._value )
        
-       panel.SetControl( control )
+       panel.SetControl( control, perpendicular = True )
        
        dlg.SetPanel( panel )
@@ -257,7 +257,7 @@ class EditAccountTypesPanel( ClientGUIScrolledPanels.EditPanel ):

        new_account_type = panel.GetValue()
        
-       self._account_types_listctrl.AddDatas( ( new_account_type, ) )
+       self._account_types_listctrl.AddDatas( ( new_account_type, ), select_sort_and_scroll = True )

@@ -337,17 +337,22 @@ class EditAccountTypesPanel( ClientGUIScrolledPanels.EditPanel ):

    def _Edit( self ):
        
-       datas = self._account_types_listctrl.GetData( only_selected = True )
+       data = self._account_types_listctrl.GetTopSelectedData()
        
-       if True in ( at.IsNullAccount() for at in datas ):
+       if data is None:
+           
+           return
+           
+       
+       account_type = data
+       
+       if account_type.IsNullAccount():
            
            ClientGUIDialogsMessage.ShowWarning( self, 'You cannot edit the null account type!' )
            
            return
            
        
-       for account_type in datas:
-           
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit account type' ) as dlg_edit:
            
            panel = EditAccountTypePanel( dlg_edit, self._service_type, account_type )

@@ -358,12 +363,7 @@ class EditAccountTypesPanel( ClientGUIScrolledPanels.EditPanel ):

            edited_account_type = panel.GetValue()
            
-           self._account_types_listctrl.ReplaceData( account_type, edited_account_type )
-           
-       else:
-           
-           return
-           
+       self._account_types_listctrl.ReplaceData( account_type, edited_account_type, sort_and_scroll = True )
@@ -1,4 +1,5 @@

 import os
+import typing
 
 from qtpy import QtCore as QC
 from qtpy import QtWidgets as QW

@@ -475,9 +476,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
        
-       self._domains_and_login_info.AddDatas( ( domain_and_login_info, ) )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.AddDatas( ( domain_and_login_info, ), select_sort_and_scroll = True )
        
    def _CanDoLogin( self ):
@@ -524,9 +523,14 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

    def _CanEditCreds( self ):
        
-       domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )
+       data = self._domains_and_login_info.GetTopSelectedData()
        
-       for domain_and_login_info in domain_and_login_infos:
+       if data is None:
+           
+           return
+           
+       
+       domain_and_login_info = data
        
        ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason ) = domain_and_login_info

@@ -541,8 +545,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        except HydrusExceptions.DataMissing:
            
-           continue
+           return False
            
        
        return False
@@ -786,11 +789,14 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

    def _EditCredentials( self ):
        
-       edited_datas = []
+       data = self._domains_and_login_info.GetTopSelectedData()
        
-       domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )
+       if data is None:
+           
+           return
+           
+       
-       for domain_and_login_info in domain_and_login_infos:
+       domain_and_login_info = data
        
        ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason ) = domain_and_login_info

@@ -829,7 +835,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        else:
            
-           continue
+           return
            
        
        try:
@@ -876,24 +882,19 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        edited_domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
        
-       self._domains_and_login_info.DeleteDatas( ( domain_and_login_info, ) )
-       self._domains_and_login_info.AddDatas( ( edited_domain_and_login_info, ) )
-       
-       edited_datas.append( edited_domain_and_login_info )
-       
-       self._domains_and_login_info.SelectDatas( edited_datas )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.ReplaceData( domain_and_login_info, edited_domain_and_login_info, sort_and_scroll = True )
        
    def _EditLoginScript( self ):
        
-       edited_datas = []
+       data = self._domains_and_login_info.GetTopSelectedData()
        
-       domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )
+       if data is None:
+           
+           return
+           
+       
-       for domain_and_login_info in domain_and_login_infos:
+       domain_and_login_info = data
        
        ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason ) = domain_and_login_info
@@ -928,17 +929,17 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        except HydrusExceptions.CancelledException:
            
-           break
+           return
            
        
        if login_script is None:
            
-           break
+           return
            
        
        if login_script == current_login_script:
            
-           break
+           return
            
        
        login_script_key_and_name = login_script.GetLoginScriptKeyAndName()

@@ -959,7 +960,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        except HydrusExceptions.CancelledException:
            
-           break
+           return
            
        
        login_access_text = ClientNetworkingLogin.login_access_type_default_description_lookup[ login_access_type ]

@@ -972,7 +973,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        else:
            
-           break
+           return
@@ -1006,20 +1007,12 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

        edited_domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
        
-       self._domains_and_login_info.DeleteDatas( ( domain_and_login_info, ) )
-       self._domains_and_login_info.AddDatas( ( edited_domain_and_login_info, ) )
-       
-       edited_datas.append( edited_domain_and_login_info )
-       
-       self._domains_and_login_info.SelectDatas( edited_datas )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.ReplaceData( domain_and_login_info, edited_domain_and_login_info, sort_and_scroll = True )
        
    def _FlipActive( self ):
        
-       edited_datas = []
+       edit_tuples = []
        
        domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )

@@ -1031,15 +1024,10 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

            flipped_domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
            
-           self._domains_and_login_info.DeleteDatas( ( domain_and_login_info, ) )
-           self._domains_and_login_info.AddDatas( ( flipped_domain_and_login_info, ) )
-           
-           edited_datas.append( flipped_domain_and_login_info )
+           edit_tuples.append( ( domain_and_login_info, flipped_domain_and_login_info ) )
            
        
-       self._domains_and_login_info.SelectDatas( edited_datas )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.ReplaceDatas( edit_tuples, sort_and_scroll = True )
        
    def _GetLoginScript( self, login_script_key_and_name ):
@@ -1067,7 +1055,7 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

    def _ScrubDelays( self ):
        
-       edited_datas = []
+       edit_tuples = []
        
        domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )

@@ -1080,20 +1068,15 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

            scrubbed_domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
            
-           self._domains_and_login_info.DeleteDatas( ( domain_and_login_info, ) )
-           self._domains_and_login_info.AddDatas( ( scrubbed_domain_and_login_info, ) )
-           
-           edited_datas.append( scrubbed_domain_and_login_info )
+           edit_tuples.append( ( domain_and_login_info, scrubbed_domain_and_login_info ) )
            
        
-       self._domains_and_login_info.SelectDatas( edited_datas )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.ReplaceDatas( edit_tuples, sort_and_scroll = True )
        
    def _ScrubInvalidity( self ):
        
-       edited_datas = []
+       edit_tuples = []
        
        domain_and_login_infos = self._domains_and_login_info.GetData( only_selected = True )
@@ -1132,15 +1115,10 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):

            scrubbed_domain_and_login_info = ( login_domain, login_script_key_and_name, credentials_tuple, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
            
-           self._domains_and_login_info.DeleteDatas( ( domain_and_login_info, ) )
-           self._domains_and_login_info.AddDatas( ( scrubbed_domain_and_login_info, ) )
-           
-           edited_datas.append( scrubbed_domain_and_login_info )
+           edit_tuples.append( ( domain_and_login_info, scrubbed_domain_and_login_info ) )
            
        
-       self._domains_and_login_info.SelectDatas( edited_datas )
-       
-       self._domains_and_login_info.Sort()
+       self._domains_and_login_info.ReplaceDatas( edit_tuples, sort_and_scroll = True )
        
    def GetDomainsToLoginAfterOK( self ):
@@ -1439,9 +1417,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

        HydrusSerialisable.SetNonDupeName( new_credential_definition, self._GetExistingCredentialDefinitionNames() )
        
-       self._credential_definitions.AddDatas( ( new_credential_definition, ) )
-       
-       self._credential_definitions.Sort()
+       self._credential_definitions.AddDatas( ( new_credential_definition, ), select_sort_and_scroll = True )

@@ -1505,9 +1481,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

        example_domain_info = ( domain, access_type, access_text )
        
-       self._example_domains_info.AddDatas( ( example_domain_info, ) )
-       
-       self._example_domains_info.Sort()
+       self._example_domains_info.AddDatas( ( example_domain_info, ), select_sort_and_scroll = True )
        
    def _AddLoginStep( self ):
@@ -1577,11 +1551,14 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

    def _EditCredentialDefinitions( self ):
        
-       edited_datas = []
+       data = self._credential_definitions.GetTopSelectedData()
        
-       credential_definitions = self._credential_definitions.GetData( only_selected = True )
+       if data is None:
+           
+           return
+           
+       
-       for credential_definition in credential_definitions:
+       credential_definition = data
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit login script', frame_key = 'deeply_nested_dialog' ) as dlg:
@@ -1593,26 +1570,17 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

            edited_credential_definition = panel.GetValue()
            
-           self._credential_definitions.DeleteDatas( ( credential_definition, ) )
-           
-           HydrusSerialisable.SetNonDupeName( edited_credential_definition, self._GetExistingCredentialDefinitionNames() )
-           
-           self._credential_definitions.AddDatas( ( edited_credential_definition, ) )
-           
-           edited_datas.append( edited_credential_definition )
-           
-       else:
-           
-           break
-           
-       
-       self._credential_definitions.SelectDatas( edited_datas )
-       
-       self._credential_definitions.Sort()
+           existing_names = self._GetExistingCredentialDefinitionNames()
+           
+           existing_names.discard( credential_definition.GetName() )
+           
+           HydrusSerialisable.SetNonDupeName( edited_credential_definition, existing_names )
+           
+           self._credential_definitions.ReplaceData( credential_definition, edited_credential_definition, sort_and_scroll = True )
        
    def _DoTest( self ):
        
        def qt_add_result( test_result ):
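The edit above now discards the definition's own current name from the existing-names set before de-duping, so saving an object under its unchanged name no longer earns a spurious ' (2)' suffix. A sketch with a simplified stand-in for `HydrusSerialisable.SetNonDupeName` (illustrative, not the hydrus implementation):

```python
def set_non_dupe_name( name, existing_names ):
    
    # simplified stand-in: append ' (n)' until the name is free
    new_name = name
    i = 1
    
    while new_name in existing_names:
        
        i += 1
        new_name = f'{name} ({i})'
        
    
    return new_name

def dedupe_edited_name( edited_name, current_name, all_names ):
    
    existing_names = set( all_names )
    
    # do not let the object collide with its own old name
    existing_names.discard( current_name )
    
    return set_non_dupe_name( edited_name, existing_names )
```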
@@ -1759,11 +1727,14 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

    def _EditExampleDomainsInfo( self ):
        
-       edited_datas = []
+       data = self._example_domains_info.GetTopSelectedData()
        
-       selected_example_domains_info = self._example_domains_info.GetData( only_selected = True )
+       if data is None:
+           
+           return
+           
+       
-       for example_domain_info in selected_example_domains_info:
+       example_domain_info = data
        
        ( original_domain, access_type, access_text ) = example_domain_info
@@ -1775,7 +1746,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

        else:
            
-           break
+           return

@@ -1785,7 +1756,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

            ClientGUIDialogsMessage.ShowWarning( self, 'That domain already exists!' )
            
-           break
+           return
            
        
        a_types = [ ClientNetworkingLogin.LOGIN_ACCESS_TYPE_EVERYTHING, ClientNetworkingLogin.LOGIN_ACCESS_TYPE_NSFW, ClientNetworkingLogin.LOGIN_ACCESS_TYPE_SPECIAL, ClientNetworkingLogin.LOGIN_ACCESS_TYPE_USER_PREFS_ONLY ]

@@ -1798,7 +1769,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):

        except HydrusExceptions.CancelledException:
            
-           break
+           return
            
        
        if new_access_type != access_type:
@ -1816,22 +1787,13 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
else:
|
||||
|
||||
break
|
||||
return
|
||||
|
||||
|
||||
|
||||
self._example_domains_info.DeleteDatas( ( example_domain_info, ) )
|
||||
|
||||
edited_example_domain_info = ( domain, access_type, access_text )
|
||||
|
||||
self._example_domains_info.AddDatas( ( edited_example_domain_info, ) )
|
||||
|
||||
edited_datas.append( edited_example_domain_info )
|
||||
|
||||
|
||||
self._example_domains_info.SelectDatas( edited_datas )
|
||||
|
||||
self._example_domains_info.Sort()
|
||||
self._example_domains_info.ReplaceData( example_domain_info, edited_example_domain_info, sort_and_scroll = True )
|
||||
|
||||
|
||||
def _EditLoginStep( self, login_step ):
|
||||
|
@ -1855,7 +1817,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
|
||||
|
||||
def _GetExistingCredentialDefinitionNames( self ):
|
||||
def _GetExistingCredentialDefinitionNames( self ) -> typing.Set[ str ]:
|
||||
|
||||
return { credential_definition.GetName() for credential_definition in self._credential_definitions.GetData() }
|
||||
|
||||
|
@ -1973,8 +1935,6 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
self._AddLoginScript( new_login_script )
|
||||
|
||||
self._login_scripts.Sort()
|
||||
|
||||
|
||||
|
||||
|
||||
|
@ -1984,7 +1944,7 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
login_script.RegenerateLoginScriptKey()
|
||||
|
||||
self._login_scripts.AddDatas( ( login_script, ) )
|
||||
self._login_scripts.AddDatas( ( login_script, ), select_sort_and_scroll = True )
|
||||
|
||||
|
||||
def _ConvertLoginScriptToListCtrlTuples( self, login_script ):
|
||||
|
@ -2004,11 +1964,14 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
def _Edit( self ):
|
||||
|
||||
edited_datas = []
|
||||
data = self._login_scripts.GetTopSelectedData()
|
||||
|
||||
login_scripts = self._login_scripts.GetData( only_selected = True )
|
||||
if data is None:
|
||||
|
||||
for login_script in login_scripts:
|
||||
return
|
||||
|
||||
|
||||
login_script = data
|
||||
|
||||
with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit login script', frame_key = 'deeply_nested_dialog' ) as dlg:
|
||||
|
||||
|
@ -2020,26 +1983,16 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
edited_login_script = panel.GetValue()
|
||||
|
||||
self._login_scripts.DeleteDatas( ( login_script, ) )
|
||||
existing_names = self._GetExistingNames()
|
||||
existing_names.discard( login_script.GetName() )
|
||||
|
||||
HydrusSerialisable.SetNonDupeName( edited_login_script, self._GetExistingNames() )
|
||||
HydrusSerialisable.SetNonDupeName( edited_login_script, existing_names )
|
||||
|
||||
self._login_scripts.AddDatas( ( edited_login_script, ) )
|
||||
|
||||
edited_datas.append( edited_login_script )
|
||||
|
||||
else:
|
||||
|
||||
break
|
||||
self._login_scripts.ReplaceData( login_script, edited_login_script, sort_and_scroll = True )
|
||||
|
||||
|
||||
|
||||
|
||||
self._login_scripts.SelectDatas( edited_datas )
|
||||
|
||||
self._login_scripts.Sort()
|
||||
|
||||
|
||||
def _GetExistingNames( self ):
|
||||
|
||||
names = { login_script.GetName() for login_script in self._login_scripts.GetData() }
|
||||
|
@ -2052,6 +2005,7 @@ class EditLoginScriptsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
return self._login_scripts.GetData()
|
||||
|
||||
|
||||
|
||||
class EditLoginStepPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, login_step ):
|
||||
|
|
|
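Nearly every panel touched by this commit applies the same refactor: the old multi-select loop that deleted edited rows and re-added them (then re-selected and re-sorted) becomes an early-return on the top selected row plus a single `ReplaceData( ..., sort_and_scroll = True )` call. The sketch below mirrors the method names from the diff on a hypothetical stand-in list control; it is an illustration of the new flow, not the real `ClientGUIListCtrl` widget.

```python
# Minimal stand-in for the list-control API used in this commit's refactor.
# FakeListCtrl, its backing list, and edit_top_selected are all hypothetical.

class FakeListCtrl:
    
    def __init__(self, datas, selected_indices):
        self._datas = list(datas)
        self._selected = set(selected_indices)
    
    def GetTopSelectedData(self):
        # Return the first selected row's data, or None if nothing is selected.
        for i, data in enumerate(self._datas):
            if i in self._selected:
                return data
        return None
    
    def ReplaceData(self, old_data, new_data, sort_and_scroll=False):
        # Swap one row in place; optionally re-sort (scrolling is GUI-only here).
        index = self._datas.index(old_data)
        self._datas[index] = new_data
        if sort_and_scroll:
            self._datas.sort()

def edit_top_selected(listctrl, edit_func):
    # The new single-row edit flow: bail out early if nothing is selected,
    # otherwise edit and replace in one call, no delete/add/reselect dance.
    data = listctrl.GetTopSelectedData()
    if data is None:
        return
    edited = edit_func(data)
    listctrl.ReplaceData(data, edited, sort_and_scroll=True)

listctrl = FakeListCtrl(['parser b', 'parser a'], selected_indices={0})
edit_top_selected(listctrl, lambda s: s + ' (edited)')
print(listctrl._datas)  # → ['parser a', 'parser b (edited)']
```

The single `ReplaceData` keeps the row's selection stable, which is why the old `edited_datas` / `SelectDatas` / `Sort` bookkeeping disappears from every hunk below.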
```diff
@@ -263,7 +263,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
     exclude_deleted = advanced_import_options[ 'exclude_deleted' ]
     preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
     preimport_url_check_type = FileImportOptions.DO_CHECK
-    preimport_url_check_looks_for_neighbours = True
+    preimport_url_check_looks_for_neighbour_spam = True
     allow_decompression_bombs = False
     min_size = advanced_import_options[ 'min_size' ]
     max_size = None
@@ -278,7 +278,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
     file_import_options = FileImportOptions.FileImportOptions()
     
     file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-    file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
+    file_import_options.SetPreImportURLCheckLooksForNeighbourSpam( preimport_url_check_looks_for_neighbour_spam )
     file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
     
     paths_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in additional_service_keys_to_tags } for ( path, additional_service_keys_to_tags ) in paths_to_tags.items() }
```
```diff
@@ -247,6 +247,7 @@ class EditSingleCtrlPanel( CAC.ApplicationCommandProcessorMixin, EditPanel ):
             
             return self._control.toPlainText()
             
         
         return self._control.value()
         
+    
@@ -275,11 +276,26 @@ class EditSingleCtrlPanel( CAC.ApplicationCommandProcessorMixin, EditPanel ):
        return command_processed
        
    
-    def SetControl( self, control ):
+    def SetControl( self, control, perpendicular = False ):
        
        self._control = control
        
-        QP.AddToLayout( self._vbox, control, CC.FLAGS_EXPAND_BOTH_WAYS )
+        if perpendicular:
+            
+            flag = CC.FLAGS_EXPAND_PERPENDICULAR
+            
+        else:
+            
+            flag = CC.FLAGS_EXPAND_BOTH_WAYS
+            
+        
+        QP.AddToLayout( self._vbox, control, flag )
+        
+        if perpendicular:
+            
+            self._vbox.addStretch( 1 )
+            
        
    
class ManagePanel( ResizingScrolledPanel ):
@@ -289,6 +305,8 @@ class ManagePanel( ResizingScrolledPanel ):
        
        raise NotImplementedError()
        
    
+    
+    
class ReviewPanel( ResizingScrolledPanel ):
    
    pass
```
```diff
@@ -34,6 +34,7 @@ from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
 from hydrus.client.gui.lists import ClientGUIListCtrl
 from hydrus.client.gui.panels import ClientGUIScrolledPanels
 from hydrus.client.gui.widgets import ClientGUICommon
+from hydrus.client.gui.widgets import ClientGUIRegex
 from hydrus.client.importing.options import NoteImportOptions
 from hydrus.client.importing.options import TagImportOptions
 from hydrus.client.media import ClientMedia
@@ -2694,11 +2695,21 @@ class EditRegexFavourites( ClientGUIScrolledPanels.EditPanel ):
             
             ( regex_phrase, description ) = row
             
-            with ClientGUIDialogs.DialogTextEntry( self, 'Update regex.', default = regex_phrase ) as dlg:
+            with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit regex' ) as dlg:
                 
+                panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
+                
+                control = ClientGUIRegex.RegexInput( panel )
+                
+                control.SetValue( regex_phrase )
+                
+                panel.SetControl( control, perpendicular = True )
+                
+                dlg.SetPanel( panel )
+                
                 if dlg.exec() == QW.QDialog.Accepted:
                     
-                    regex_phrase = dlg.GetValue()
+                    regex_phrase = control.GetValue()
                     
                     with ClientGUIDialogs.DialogTextEntry( self, 'Update description.', default = description ) as dlg_2:
@@ -2714,6 +2725,10 @@ class EditRegexFavourites( ClientGUIScrolledPanels.EditPanel ):
                         
                         edited_datas.append( edited_row )
                         
+                    else:
+                        
+                        break
+                        
                     
                 
             else:
```
```diff
@@ -1560,7 +1560,14 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     
     def EditFrameLocations( self ):
         
-        for listctrl_list in self._frame_locations.GetData( only_selected = True ):
+        data = self._frame_locations.GetTopSelectedData()
+        
+        if data is None:
+            
+            return
+            
+        
+        listctrl_list = data
         
         title = 'set frame location information'
@@ -1574,8 +1581,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         
         new_listctrl_list = panel.GetValue()
         
-        self._frame_locations.ReplaceData( listctrl_list, new_listctrl_list )
-        
+        self._frame_locations.ReplaceData( listctrl_list, new_listctrl_list, sort_and_scroll = True )
         
     
@@ -2955,7 +2961,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     
     def EditMediaViewerOptions( self ):
         
-        for data in self._filetype_handling_listctrl.GetData( only_selected = True ):
+        data = self._filetype_handling_listctrl.GetTopSelectedData()
+        
+        if data is None:
+            
+            return
+            
         
         title = 'edit media view options information'
@@ -2969,8 +2980,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         
         new_data = panel.GetValue()
         
-        self._filetype_handling_listctrl.ReplaceData( data, new_data )
-        
+        self._filetype_handling_listctrl.ReplaceData( data, new_data, sort_and_scroll = True )
         
     
@@ -4921,7 +4931,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         
         control = ClientGUICommon.BetterSpinBox( panel, initial = 100, min = 0, max = 10000 )
         
-        panel.SetControl( control )
+        panel.SetControl( control, perpendicular = True )
         
         dlg_2.SetPanel( panel )
@@ -5006,7 +5016,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         
         control = ClientGUICommon.BetterSpinBox( panel, initial = weight, min = 0, max = 10000 )
         
-        panel.SetControl( control )
+        panel.SetControl( control, perpendicular = True )
         
         dlg.SetPanel( panel )
```
```diff
@@ -830,7 +830,7 @@ class MoveMediaFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
         
         control.SetValue( max_num_bytes )
         
-        panel.SetControl( control )
+        panel.SetControl( control, perpendicular = True )
         
         dlg.SetPanel( panel )
```
```diff
@@ -1693,20 +1693,18 @@ class EditParsersPanel( ClientGUIScrolledPanels.EditPanel ):
             
             new_parser = panel.GetValue()
             
-            self._AddParser( new_parser )
-            
-            self._parsers.Sort()
+            self._AddParser( new_parser, select_sort_and_scroll = True )
             
         
     
-    def _AddParser( self, parser ):
+    def _AddParser( self, parser, select_sort_and_scroll = False ):
        
        HydrusSerialisable.SetNonDupeName( parser, self._GetExistingNames() )
        
        parser.RegenerateParserKey()
        
-        self._parsers.AddDatas( ( parser, ) )
+        self._parsers.AddDatas( ( parser, ), select_sort_and_scroll = select_sort_and_scroll )
        
    
    def _ConvertParserToListCtrlTuples( self, parser ):
@@ -1732,11 +1730,14 @@ class EditParsersPanel( ClientGUIScrolledPanels.EditPanel ):
    
    def _Edit( self ):
        
-        edited_datas = []
+        data = self._parsers.GetTopSelectedData()
        
-        parsers = self._parsers.GetData( only_selected = True )
+        if data is None:
+            
+            return
+            
        
-        for parser in parsers:
+        parser: ClientParsing.PageParser = data
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit parser', frame_key = 'deeply_nested_dialog' ) as dlg:
@@ -1748,26 +1749,16 @@ class EditParsersPanel( ClientGUIScrolledPanels.EditPanel ):
            
            edited_parser = panel.GetValue()
            
-            self._parsers.DeleteDatas( ( parser, ) )
+            if edited_parser.GetName() != parser.GetName():
                
                HydrusSerialisable.SetNonDupeName( edited_parser, self._GetExistingNames() )
                
            
-            self._parsers.AddDatas( ( edited_parser, ) )
-            
-            edited_datas.append( edited_parser )
-            
-        else:
-            
-            break
+            self._parsers.ReplaceData( parser, edited_parser, sort_and_scroll = True )
        
-        self._parsers.SelectDatas( edited_datas )
-        
-        self._parsers.Sort()
-        
    
    def _GetExistingNames( self ):
        
        names = { parser.GetName() for parser in self._parsers.GetData() }
```
```diff
@@ -227,11 +227,17 @@ class EditNodes( QW.QWidget ):
    
    def Edit( self ):
        
-        for node in self._nodes.GetData( only_selected = True ):
+        data = self._nodes.GetTopSelectedData()
+        
+        if data is None:
+            
+            return
+            
+        
+        node = data
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit node', frame_key = 'deeply_nested_dialog' ) as dlg:
            
            referral_url = self._referral_url_callable()
            example_data = self._example_data_callable()
            
            if isinstance( node, ClientParsing.ContentParser ):
@@ -249,9 +255,7 @@ class EditNodes( QW.QWidget ):
            
            edited_node = panel.GetValue()
            
-            self._nodes.ReplaceData( node, edited_node )
-            
-            
+            self._nodes.ReplaceData( node, edited_node, sort_and_scroll = True )
            
        
    
@@ -1057,7 +1061,14 @@ class ManageParsingScriptsPanel( ClientGUIScrolledPanels.ManagePanel ):
    
    def Edit( self ):
        
-        for script in self._scripts.GetData( only_selected = True ):
+        data = self._scripts.GetTopSelectedData()
+        
+        if data is None:
+            
+            return
+            
+        
+        script = data
        
        if isinstance( script, ClientParsing.ParseRootFileLookup ):
@@ -1083,9 +1094,7 @@ class ManageParsingScriptsPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        self._scripts.SetNonDupeName( edited_script )
        
-        self._scripts.ReplaceData( script, edited_script )
-        
-        
+        self._scripts.ReplaceData( script, edited_script, sort_and_scroll = True )
```
```diff
@@ -20,9 +20,10 @@ from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIOptionsPanels
 from hydrus.client.gui import QtPorting as QP
 from hydrus.client.gui.metadata import ClientGUITime
-from hydrus.client.gui.widgets import ClientGUICommon
+from hydrus.client.gui.widgets import ClientGUIBytes
+from hydrus.client.gui.widgets import ClientGUICommon
 from hydrus.client.gui.widgets import ClientGUINumberTest
+from hydrus.client.gui.widgets import ClientGUIRegex
 from hydrus.client.search import ClientSearch
 
 class StaticSystemPredicateButton( QW.QWidget ):
@@ -1456,7 +1457,7 @@ class PanelPredicateSystemKnownURLsRegex( PanelPredicateSystemSingle ):
        self._operator.addItem( 'has', True )
        self._operator.addItem( 'does not have', False )
        
-        self._regex = QW.QLineEdit( self )
+        self._regex = ClientGUIRegex.RegexInput( self )
        
        #
@@ -1465,7 +1466,7 @@ class PanelPredicateSystemKnownURLsRegex( PanelPredicateSystemSingle ):
        ( operator, rule_type, rule, description ) = predicate.GetValue()
        
        self._operator.SetValue( operator )
-        self._regex.setText( rule )
+        self._regex.SetValue( rule )
        
        #
@@ -1491,7 +1492,7 @@ class PanelPredicateSystemKnownURLsRegex( PanelPredicateSystemSingle ):
    
    def CheckValid( self ):
        
-        regex = self._regex.text()
+        regex = self._regex.GetValue()
        
        try:
@@ -1518,7 +1519,7 @@ class PanelPredicateSystemKnownURLsRegex( PanelPredicateSystemSingle ):
        
        rule_type = 'regex'
        
-        regex = self._regex.text()
+        regex = self._regex.GetValue()
        
        rule = regex
```
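The `RegexInput` widget these panels switch to gives live valid/invalid feedback (green/red in the stylesheet) by trying to compile the current text. The core check is just a guarded `re.compile`; this standalone sketch shows the idea, not the actual widget code.

```python
import re

def regex_is_valid(pattern: str) -> bool:
    # Does the text compile as a regex? This is the whole valid/invalid test.
    try:
        re.compile(pattern)
        return True
    except re.error:
        return False

print(regex_is_valid(r'[1-9]+\d*'))  # True
print(regex_is_valid(r'[unclosed'))  # False: unterminated character set
```

Hooking this up to a Qt line edit is then just a `textChanged` handler that swaps a dynamic property the stylesheet can target.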
```diff
@@ -336,7 +336,7 @@ class ManageServerServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        self._SetNonDupePort( new_service )
        
-        self._services_listctrl.AddDatas( ( new_service, ) )
+        self._services_listctrl.AddDatas( ( new_service, ), select_sort_and_scroll = True )
        
    
@@ -368,7 +368,14 @@ class ManageServerServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
    
    def _Edit( self ):
        
-        for service in self._services_listctrl.GetData( only_selected = True ):
+        data = self._services_listctrl.GetTopSelectedData()
+        
+        if data is None:
+            
+            return
+            
+        
+        service = data
        
        original_name = service.GetName()
@@ -391,12 +398,7 @@ class ManageServerServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        self._SetNonDupePort( edited_service )
        
-        self._services_listctrl.ReplaceData( service, edited_service )
-        
-    elif dlg_edit.WasCancelled():
-        
-        break
-        
+        self._services_listctrl.ReplaceData( service, edited_service, sort_and_scroll = True )
```
```diff
@@ -1,6 +1,3 @@
 import collections.abc
-import os
-import re
 import typing
 
 from qtpy import QtCore as QC
@@ -1470,100 +1467,6 @@ class OnOffButton( QW.QPushButton ):
     
-class RegexButton( BetterButton ):
-    
-    def __init__( self, parent ):
-        
-        BetterButton.__init__( self, parent, 'regex shortcuts', self._ShowMenu )
-        
-    
-    def _ShowMenu( self ):
-        
-        menu = ClientGUIMenus.GenerateMenu( self )
-        
-        ClientGUIMenus.AppendMenuLabel( menu, 'click on a phrase to copy it to the clipboard' )
-        
-        ClientGUIMenus.AppendSeparator( menu )
-        
-        submenu = ClientGUIMenus.GenerateMenu( menu )
-        
-        ClientGUIMenus.AppendMenuItem( submenu, r'whitespace character - \s', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'\s' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'number character - \d', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'\d' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'alphanumeric or underscore character - \w', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'\w' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'any character - .', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'.' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'backslash character - \\', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'\\' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'beginning of line - ^', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'^' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'end of line - $', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'$' )
-        ClientGUIMenus.AppendMenuItem( submenu, f'any of these - [{HC.UNICODE_ELLIPSIS}]', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', f'text', '[{HC.UNICODE_ELLIPSIS}]' )
-        ClientGUIMenus.AppendMenuItem( submenu, f'anything other than these - [^{HC.UNICODE_ELLIPSIS}]', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', f'[^{HC.UNICODE_ELLIPSIS}]' )
-        
-        ClientGUIMenus.AppendSeparator( submenu )
-        
-        ClientGUIMenus.AppendMenuItem( submenu, r'0 or more matches, consuming as many as possible - *', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'*' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'1 or more matches, consuming as many as possible - +', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'+' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'0 or 1 matches, preferring 1 - ?', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'?' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'0 or more matches, consuming as few as possible - *?', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'*?' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'1 or more matches, consuming as few as possible - +?', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'+?' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'0 or 1 matches, preferring 0 - ??', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'??' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'exactly m matches - {m}', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'{m}' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'm to n matches, consuming as many as possible - {m,n}', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'{m,n}' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'm to n matches, consuming as few as possible - {m,n}?', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'{m,n}?' )
-        
-        ClientGUIMenus.AppendSeparator( submenu )
-        
-        ClientGUIMenus.AppendMenuItem( submenu, f'the next characters are: (non-consuming) - (?={HC.UNICODE_ELLIPSIS})', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', f'(?={HC.UNICODE_ELLIPSIS})' )
-        ClientGUIMenus.AppendMenuItem( submenu, f'the next characters are not: (non-consuming) - (?!{HC.UNICODE_ELLIPSIS})', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', f'(?!{HC.UNICODE_ELLIPSIS})' )
-        ClientGUIMenus.AppendMenuItem( submenu, f'the previous characters are: (non-consuming) - (?<={HC.UNICODE_ELLIPSIS})', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', f'(?<={HC.UNICODE_ELLIPSIS})' )
-        ClientGUIMenus.AppendMenuItem( submenu, f'the previous characters are not: (non-consuming) - (?<!{HC.UNICODE_ELLIPSIS})', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', f'(?<!{HC.UNICODE_ELLIPSIS})' )
-        
-        ClientGUIMenus.AppendSeparator( submenu )
-        
-        ClientGUIMenus.AppendMenuItem( submenu, r'0074 -> 74 - [1-9]+\d*', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', r'[1-9]+\d*' )
-        ClientGUIMenus.AppendMenuItem( submenu, r'filename - (?<=' + re.escape( os.path.sep ) + r')[^' + re.escape( os.path.sep ) + r']*?(?=\..*$)', 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', '(?<=' + re.escape( os.path.sep ) + r')[^' + re.escape( os.path.sep ) + r']*?(?=\..*$)' )
-        
-        ClientGUIMenus.AppendMenu( menu, submenu, 'regex components' )
-        
-        submenu = ClientGUIMenus.GenerateMenu( menu )
-        
-        ClientGUIMenus.AppendMenuItem( submenu, 'manage favourites', 'manage some custom favourite phrases', self._ManageFavourites )
-        
-        ClientGUIMenus.AppendSeparator( submenu )
-        
-        for ( regex_phrase, description ) in HC.options[ 'regex_favourites' ]:
-            
-            ClientGUIMenus.AppendMenuItem( submenu, description, 'copy this phrase to the clipboard', CG.client_controller.pub, 'clipboard', 'text', regex_phrase )
-            
-        
-        ClientGUIMenus.AppendMenu( menu, submenu, 'favourites' )
-        
-        CGC.core().PopupMenu( self, menu )
-        
-    
-    def _ManageFavourites( self ):
-        
-        regex_favourites = HC.options[ 'regex_favourites' ]
-        
-        from hydrus.client.gui import ClientGUITopLevelWindowsPanels
-        from hydrus.client.gui.panels import ClientGUIScrolledPanelsEdit
-        
-        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'manage regex favourites' ) as dlg:
-            
-            panel = ClientGUIScrolledPanelsEdit.EditRegexFavourites( dlg, regex_favourites )
-            
-            dlg.SetPanel( panel )
-            
-            if dlg.exec() == QW.QDialog.Accepted:
-                
-                regex_favourites = panel.GetValue()
-                
-                HC.options[ 'regex_favourites' ] = regex_favourites
-                
-                CG.client_controller.Write( 'save_options', HC.options )
-                
-            
-        
-    
 class StaticBox( QW.QFrame ):
     
     def __init__( self, parent, title ):
```
```diff
@@ -0,0 +1,220 @@
+import os
+import re
+
+from qtpy import QtCore as QC
+from qtpy import QtWidgets as QW
+
+from hydrus.core import HydrusConstants as HC
+
+from hydrus.client import ClientConstants as CC
+from hydrus.client import ClientGlobals as CG
+from hydrus.client import ClientPaths
+from hydrus.client.gui import ClientGUICore as CGC
+from hydrus.client.gui import ClientGUIMenus
+from hydrus.client.gui import QtPorting as QP
+from hydrus.client.gui import ClientGUIFunctions
+from hydrus.client.gui.widgets import ClientGUICommon
+
+class RegexButton( ClientGUICommon.BetterButton ):
+    
+    def __init__( self, parent, show_group_menu = False ):
+        
+        ClientGUICommon.BetterButton.__init__( self, parent, '.*', self._ShowMenu )
+        
+        self._show_group_menu = show_group_menu
+        
+        width = ClientGUIFunctions.ConvertTextToPixelWidth( self, 4 )
+        
+        self.setFixedWidth( width )
+        
+    
+    def _ShowMenu( self ):
+        
+        menu = ClientGUIMenus.GenerateMenu( self )
+        
+        submenu = ClientGUIMenus.GenerateMenu( menu )
+        
+        ClientGUIMenus.AppendMenuItem( submenu, 'a good regex introduction', 'If you have never heard of regex before, hit this!', ClientPaths.LaunchURLInWebBrowser, 'https://www.regular-expressions.info/index.html' )
+        ClientGUIMenus.AppendMenuItem( submenu, 'a full interactive tutorial', 'If you want to work through a full lesson with problem solving on your end, hit this!', ClientPaths.LaunchURLInWebBrowser, 'https://www.regexone.com/' )
+        ClientGUIMenus.AppendMenuItem( submenu, 'regex sandbox', 'You can play around here before you do something for real.', ClientPaths.LaunchURLInWebBrowser, 'https://regexr.com/3cvmf' )
+        
+        ClientGUIMenus.AppendMenu( menu, submenu, 'regex help' )
+        
+        #
+        
+        ClientGUIMenus.AppendSeparator( menu )
+        
+        #
+        
+        submenu = ClientGUIMenus.GenerateMenu( menu )
+        
+        ClientGUIMenus.AppendMenuLabel( submenu, 'click below to copy to clipboard', no_copy = True )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        copy_desc = 'copy this phrase to the clipboard'
+        
+        ClientGUIMenus.AppendMenuItem( submenu, r'whitespace character - \s', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\s' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'number character - \d', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\d' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'alphanumeric or underscore character - \w', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\w' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'any character - .', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'.' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'backslash character - \\', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\\' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'beginning of line - ^', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'^' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'end of line - $', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'$' )
+        ClientGUIMenus.AppendMenuItem( submenu, f'any of these - [{HC.UNICODE_ELLIPSIS}]', copy_desc, CG.client_controller.pub, 'clipboard', f'text', '[{HC.UNICODE_ELLIPSIS}]' )
+        ClientGUIMenus.AppendMenuItem( submenu, f'anything other than these - [^{HC.UNICODE_ELLIPSIS}]', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'[^{HC.UNICODE_ELLIPSIS}]' )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        ClientGUIMenus.AppendMenuItem( submenu, r'0 or more matches, consuming as many as possible - *', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'*' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'1 or more matches, consuming as many as possible - +', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'+' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'0 or 1 matches, preferring 1 - ?', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'?' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'0 or more matches, consuming as few as possible - *?', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'*?' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'1 or more matches, consuming as few as possible - +?', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'+?' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'0 or 1 matches, preferring 0 - ??', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'??' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'exactly m matches - {m}', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'{m}' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'm to n matches, consuming as many as possible - {m,n}', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'{m,n}' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'm to n matches, consuming as few as possible - {m,n}?', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'{m,n}?' )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        ClientGUIMenus.AppendMenuItem( submenu, f'the next characters are: (non-consuming) - (?={HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'(?={HC.UNICODE_ELLIPSIS})' )
+        ClientGUIMenus.AppendMenuItem( submenu, f'the next characters are not: (non-consuming) - (?!{HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'(?!{HC.UNICODE_ELLIPSIS})' )
+        ClientGUIMenus.AppendMenuItem( submenu, f'the previous characters are: (non-consuming) - (?<={HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'(?<={HC.UNICODE_ELLIPSIS})' )
+        ClientGUIMenus.AppendMenuItem( submenu, f'the previous characters are not: (non-consuming) - (?<!{HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'(?<!{HC.UNICODE_ELLIPSIS})' )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        ClientGUIMenus.AppendMenuItem( submenu, r'0074 -> 74 - [1-9]+\d*', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'[1-9]+\d*' )
+        ClientGUIMenus.AppendMenuItem( submenu, r'filename - (?<=' + re.escape( os.path.sep ) + r')[^' + re.escape( os.path.sep ) + r']*?(?=\..*$)', copy_desc, CG.client_controller.pub, 'clipboard', 'text', '(?<=' + re.escape( os.path.sep ) + r')[^' + re.escape( os.path.sep ) + r']*?(?=\..*$)' )
+        
+        ClientGUIMenus.AppendMenu( menu, submenu, 'regex components' )
+        
+        #
+        
+        if self._show_group_menu:
+            
+            submenu = ClientGUIMenus.GenerateMenu( menu )
+            
+            ClientGUIMenus.AppendMenuLabel( submenu, 'click below to copy to clipboard', no_copy = True )
+            
+            ClientGUIMenus.AppendSeparator( submenu )
+            
+            copy_desc = 'copy this phrase to the clipboard'
+            
+            ClientGUIMenus.AppendMenuLabel( submenu, '-in the pattern-', no_copy = True )
+            
+            ClientGUIMenus.AppendMenuItem( submenu, f'unnamed group - ({HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'({HC.UNICODE_ELLIPSIS})' )
+            ClientGUIMenus.AppendMenuItem( submenu, f'named group - (?P<name>{HC.UNICODE_ELLIPSIS})', copy_desc, CG.client_controller.pub, 'clipboard', 'text', f'(?P<name>{HC.UNICODE_ELLIPSIS})' )
+            
+            ClientGUIMenus.AppendSeparator( submenu )
+            
+            ClientGUIMenus.AppendMenuLabel( submenu, '-in the replacement-', no_copy = True )
+            
+            ClientGUIMenus.AppendMenuItem( submenu, r'reference nth unnamed group - \1', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\1' )
+            ClientGUIMenus.AppendMenuItem( submenu, r'reference named group - \g<name>', copy_desc, CG.client_controller.pub, 'clipboard', 'text', r'\g<name>' )
+            
+            ClientGUIMenus.AppendMenu( menu, submenu, 'regex replacement groups' )
+            
+        
+        #
+        
+        submenu = ClientGUIMenus.GenerateMenu( menu )
+        
+        ClientGUIMenus.AppendMenuItem( submenu, 'manage favourites', 'manage some custom favourite phrases', self._ManageFavourites )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        ClientGUIMenus.AppendMenuLabel( submenu, 'click below to copy to clipboard', no_copy = True )
+        
+        ClientGUIMenus.AppendSeparator( submenu )
+        
+        for ( regex_phrase, description ) in HC.options[ 'regex_favourites' ]:
+            
+            ClientGUIMenus.AppendMenuItem( submenu, description, copy_desc, CG.client_controller.pub, 'clipboard', 'text', regex_phrase )
+            
+        
+        ClientGUIMenus.AppendMenu( menu, submenu, 'favourites' )
+        
+        CGC.core().PopupMenu( self, menu )
+        
+    
+    def _ManageFavourites( self ):
+        
+        regex_favourites = HC.options[ 'regex_favourites' ]
```
|
||||
|
||||
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
|
||||
from hydrus.client.gui.panels import ClientGUIScrolledPanelsEdit
|
||||
|
||||
with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'manage regex favourites' ) as dlg:
|
||||
|
||||
panel = ClientGUIScrolledPanelsEdit.EditRegexFavourites( dlg, regex_favourites )
|
||||
|
||||
dlg.SetPanel( panel )
|
||||
|
||||
if dlg.exec() == QW.QDialog.Accepted:
|
||||
|
||||
regex_favourites = panel.GetValue()
|
||||
|
||||
HC.options[ 'regex_favourites' ] = regex_favourites
|
||||
|
||||
CG.client_controller.Write( 'save_options', HC.options )
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
class RegexInput( QW.QWidget ):
|
||||
|
||||
textChanged = QC.Signal()
|
||||
userHitEnter = QC.Signal()
|
||||
|
||||
def __init__( self, parent: QW.QWidget, show_group_menu = False ):
|
||||
|
||||
QW.QWidget.__init__( self, parent )
|
||||
|
||||
self._regex_text = QW.QLineEdit( self )
|
||||
self._regex_text.setPlaceholderText( 'regex input' )
|
||||
|
||||
self._regex_button = RegexButton( self, show_group_menu = show_group_menu )
|
||||
|
||||
hbox = QP.HBoxLayout( margin = 0 )
|
||||
|
||||
QP.AddToLayout( hbox, self._regex_text, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
QP.AddToLayout( hbox, self._regex_button, CC.FLAGS_CENTER_PERPENDICULAR )
|
||||
|
||||
self.setLayout( hbox )
|
||||
|
||||
self._regex_text.installEventFilter( ClientGUICommon.TextCatchEnterEventFilter( self, self.userHitEnter.emit ) )
|
||||
self._regex_text.textChanged.connect( self.textChanged )
|
||||
self._regex_text.textChanged.connect( self._UpdateValidityStyle )
|
||||
|
||||
self._UpdateValidityStyle()
|
||||
|
||||
|
||||
def _UpdateValidityStyle( self ):
|
||||
|
||||
try:
|
||||
|
||||
re.compile( self._regex_text.text() )
|
||||
|
||||
self._regex_text.setObjectName( 'HydrusValid' )
|
||||
|
||||
except:
|
||||
|
||||
self._regex_text.setObjectName( 'HydrusInvalid' )
|
||||
|
||||
|
||||
self._regex_text.style().polish( self._regex_text )
|
||||
|
||||
|
||||
def GetValue( self ) -> str:
|
||||
|
||||
return self._regex_text.text()
|
||||
|
||||
|
||||
def SetValue( self, regex: str ):
|
||||
|
||||
self._regex_text.setText( regex )
|
||||
|
||||
|
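The validity styling above is just "try to compile, flip an object name the stylesheet keys on". That check can be sketched without Qt; `classify_regex` below is a hypothetical standalone helper, not hydrus code (the widget itself uses a bare `except:` rather than catching `re.error` specifically):

```python
import re

def classify_regex( pattern: str ) -> str:
    """Return the object name a validity-styled widget would adopt.
    
    Mirrors the try/except re.compile check in RegexInput._UpdateValidityStyle:
    any pattern that compiles is 'HydrusValid', anything else 'HydrusInvalid'.
    """
    
    try:
        
        re.compile( pattern )
        
        return 'HydrusValid'
        
    except re.error:
        
        return 'HydrusInvalid'
        
    

# note the empty string compiles fine, so an empty input box styles as valid
```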
@@ -37,75 +37,91 @@ from hydrus.client.networking import ClientNetworkingFunctions
 FILE_SEED_TYPE_HDD = 0
 FILE_SEED_TYPE_URL = 1
 
-def FileURLMappingHasUntrustworthyNeighbours( hash: bytes, url: str ):
-    
-    # let's see if the file that has this url has any other interesting urls
-    # if the file has another url with the same url class, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy
-    
-    try:
-        
-        url = CG.client_controller.network_engine.domain_manager.NormaliseURL( url )
-        
-    except HydrusExceptions.URLClassException:
-        
-        # this url is so borked it doesn't parse. can't make neighbour inferences about it
-        return False
-        
-    
-    url_class = CG.client_controller.network_engine.domain_manager.GetURLClass( url )
-    
-    # direct file URLs do not care about neighbours, since that can mean tokenised or different CDN URLs
-    url_is_worried_about_neighbours = url_class is not None and url_class.GetURLType() not in ( HC.URL_TYPE_FILE, HC.URL_TYPE_UNKNOWN )
-    
-    if url_is_worried_about_neighbours:
-        
-        media_result = CG.client_controller.Read( 'media_result', hash )
-        
-        file_urls = media_result.GetLocationsManager().GetURLs()
-        
-        # normalise to collapse http/https dupes
-        file_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( file_urls )
-        
-        for file_url in file_urls:
-            
-            if file_url == url:
-                
-                # obviously when we find ourselves, that's not a dupe
-                continue
-                
-            
-            if ClientNetworkingFunctions.ConvertURLIntoDomain( file_url ) != ClientNetworkingFunctions.ConvertURLIntoDomain( url ):
-                
-                # checking here for the day when url classes can refer to multiple domains
-                continue
-                
-            
-            try:
-                
-                file_url_class = CG.client_controller.network_engine.domain_manager.GetURLClass( file_url )
-                
-            except HydrusExceptions.URLClassException:
-                
-                # this is borked text, not matchable
-                continue
-                
-            
-            if file_url_class is None or url_class.GetURLType() in ( HC.URL_TYPE_FILE, HC.URL_TYPE_UNKNOWN ):
-                
-                # being slightly superfluous here, but this file url can't be an untrustworthy neighbour
-                continue
-                
-            
-            if file_url_class == url_class:
-                
-                # oh no, the file this source url refers to has a different known url in this same domain
-                # it is more likely that an edit on this site points to the original elsewhere
-                
-                return True
-                
-            
-        
-    
-    return False
+def FilterOneFileURLs( urls ):
+    
+    one_file_urls = []
+    
+    for url in urls:
+        
+        url_class = CG.client_controller.network_engine.domain_manager.GetURLClass( url )
+        
+        if url_class is None:
+            
+            continue
+            
+        
+        # direct file URLs do not care about neighbours, since that can mean tokenised or different CDN URLs, so skip file/unknown
+        if url_class.GetURLType() != HC.URL_TYPE_POST:
+            
+            continue
+            
+        
+        if not url_class.RefersToOneFile():
+            
+            continue
+            
+        
+        one_file_urls.append( url )
+        
+    
+    return one_file_urls
+    
+
+def FileURLMappingHasUntrustworthyNeighbours( hash: bytes, lookup_urls: typing.Collection[ str ] ):
+    
+    # let's see if the file that has this url has any other interesting urls
+    # if the file has--or would have, after import--multiple URLs from the same domain with the same URL Class, but those URLs are supposed to only refer to one file, then we have a dodgy spam URL mapping so we cannot trust it
+    # maybe this is the correct file, but we can't trust that it is mate
+    
+    lookup_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( lookup_urls )
+    
+    # this has probably already been done by the caller, but let's be sure
+    lookup_urls = FilterOneFileURLs( lookup_urls )
+    
+    if len( lookup_urls ) == 0:
+        
+        # what is going on, yes, whatever garbage you just threw at me is not to be trusted to produce a dispositive result
+        return True
+        
+    
+    lookup_url_domains = { ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) for lookup_url in lookup_urls }
+    lookup_url_classes = { CG.client_controller.network_engine.domain_manager.GetURLClass( lookup_url ) for lookup_url in lookup_urls }
+    
+    media_result = CG.client_controller.Read( 'media_result', hash )
+    
+    existing_file_urls = media_result.GetLocationsManager().GetURLs()
+    
+    # normalise to collapse http/https dupes
+    existing_file_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( existing_file_urls )
+    
+    existing_file_urls = FilterOneFileURLs( existing_file_urls )
+    
+    for file_url in existing_file_urls:
+        
+        if file_url in lookup_urls:
+            
+            # obviously when we find ourselves, that's fine
+            # this should happen at least once every time this method is called, since that's how we found the file!
+            continue
+            
+        
+        if ClientNetworkingFunctions.ConvertURLIntoDomain( file_url ) not in lookup_url_domains:
+            
+            # this existing URL has a unique domain, so there is no domain spam here
+            continue
+            
+        
+        file_url_class = CG.client_controller.network_engine.domain_manager.GetURLClass( file_url )
+        
+        if file_url_class in lookup_url_classes:
+            
+            # oh no, the file these lookup urls refer to has a different known url in the same domain+url_class
+            # it is likely that an edit on this site points to the original elsewhere
+            
+            return True
+            
+        
+    
+    return False
+    
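Stripped of the hydrus plumbing, the new check reduces to: an existing URL on the matched file is suspicious if it is not one of the lookup URLs but shares a domain (standing in here for domain plus URL class) with one of them, and an empty lookup set is never trusted. A minimal sketch under those simplifications; `has_untrustworthy_neighbours` is a hypothetical name, and the real code also normalises URLs and filters by URL class:

```python
from urllib.parse import urlparse

def has_untrustworthy_neighbours( existing_urls, lookup_urls ) -> bool:
    
    # no usable lookup urls -> cannot trust any 'already in db' result
    if not lookup_urls:
        
        return True
        
    
    lookup_urls = set( lookup_urls )
    lookup_domains = { urlparse( url ).hostname for url in lookup_urls }
    
    for url in existing_urls:
        
        if url in lookup_urls:
            
            # finding ourselves is expected: that's how we matched the file
            continue
            
        
        if urlparse( url ).hostname in lookup_domains:
            
            # a different known URL on the same domain: dodgy 'alternate' source mapping
            return True
            
        
    
    return False
```

With this shape, two different posts on the same booru both claiming the same file trips the check, while URLs from unrelated domains do not.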
@@ -887,7 +903,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
         
         preimport_url_check_type = file_import_options.GetPreImportURLCheckType()
         
-        preimport_url_check_looks_for_neighbours = file_import_options.PreImportURLCheckLooksForNeighbours()
+        preimport_url_check_looks_for_neighbour_spam = file_import_options.PreImportURLCheckLooksForNeighbourSpam()
         
         match_found = False
         matches_are_dispositive = preimport_url_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
@@ -899,35 +915,50 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
         
         # urls
         
-        urls = []
+        lookup_urls = []
         
         if self.file_seed_type == FILE_SEED_TYPE_URL:
             
-            urls.append( self.file_seed_data_for_comparison )
+            lookup_urls.append( self.file_seed_data_for_comparison )
             
         
         if file_url is not None:
             
-            urls.append( file_url )
+            lookup_urls.append( file_url )
             
         
-        urls.extend( self._primary_urls )
+        lookup_urls.extend( self._primary_urls )
         
         # now that we store primary and source urls separately, we'll trust any primary but be careful about source
         # trusting classless source urls was too much of a hassle with too many boorus providing bad source urls like user account pages
         
-        urls.extend( ( url for url in self._source_urls if CG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) ) )
+        source_lookup_urls = [ url for url in self._source_urls if CG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) ]
+        
+        all_neighbour_useful_lookup_urls = list( lookup_urls )
+        all_neighbour_useful_lookup_urls.extend( source_lookup_urls )
+        
+        if file_import_options.ShouldAssociateSourceURLs():
+            
+            lookup_urls.extend( source_lookup_urls )
+            
         
         # now discard gallery pages or post urls that can hold multiple files
-        urls = [ url for url in urls if not CG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ) ]
+        lookup_urls = [ url for url in lookup_urls if not CG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ) ]
         
-        lookup_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( urls )
+        lookup_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( lookup_urls )
+        
+        all_neighbour_useful_lookup_urls = [ url for url in all_neighbour_useful_lookup_urls if not CG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ) ]
+        
+        all_neighbour_useful_lookup_urls = CG.client_controller.network_engine.domain_manager.NormaliseURLs( all_neighbour_useful_lookup_urls )
         
         untrustworthy_domains = set()
+        untrustworthy_hashes = set()
         
         for lookup_url in lookup_urls:
             
-            if ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) in untrustworthy_domains:
+            lookup_url_domain = ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url )
+            
+            if lookup_url_domain in untrustworthy_domains:
                 
                 continue
                 
             
@@ -948,9 +979,19 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
             
             file_import_status = ClientImportFiles.CheckFileImportStatus( file_import_status )
             
-            if preimport_url_check_looks_for_neighbours and FileURLMappingHasUntrustworthyNeighbours( file_import_status.hash, lookup_url ):
+            possible_hash = file_import_status.hash
+            
+            if possible_hash in untrustworthy_hashes:
                 
-                untrustworthy_domains.add( ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) )
+                untrustworthy_domains.add( lookup_url_domain )
                 
                 continue
                 
             
+            if preimport_url_check_looks_for_neighbour_spam and FileURLMappingHasUntrustworthyNeighbours( possible_hash, all_neighbour_useful_lookup_urls ):
+                
+                untrustworthy_domains.add( lookup_url_domain )
+                untrustworthy_hashes.add( possible_hash )
+                
+                continue
+                
@@ -24,6 +24,7 @@ from hydrus.client.importing import ClientImporting
 from hydrus.client.importing import ClientImportFileSeeds
 from hydrus.client.importing.options import FileImportOptions
 from hydrus.client.importing.options import TagImportOptions
+from hydrus.client.interfaces import ClientControllerInterface
 from hydrus.client.metadata import ClientContentUpdates
 from hydrus.client.metadata import ClientMetadataMigration
 from hydrus.client.metadata import ClientMetadataMigrationExporters
@@ -1204,6 +1205,26 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
         return self._last_modified_time_skip_period
         
     
+    def GetNextWorkTime( self ):
+        
+        if self._paused:
+            
+            return None
+            
+        
+        if self._check_now:
+            
+            return HydrusTime.GetNow()
+            
+        
+        if self._check_regularly:
+            
+            return self._last_checked + self._period
+            
+        
+        return None
+        
+    
     def GetMetadataRouters( self ):
         
         return list( self._metadata_routers )
@@ -1277,4 +1298,227 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
         self._publish_files_to_page = publish_files_to_page
         
     

 HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER ] = ImportFolder
 
+class ImportFoldersManager( object ):
+    
+    def __init__( self, controller: ClientControllerInterface.ClientControllerInterface ):
+        
+        self._controller = controller
+        
+        self._lock = threading.Lock()
+        
+        self._serious_error_encountered = False
+        
+        self._import_folder_names_fetched = False
+        self._import_folder_names_to_next_work_time_cache: typing.Dict[ str, int ] = {}
+        
+        self._wake_event = threading.Event()
+        self._shutdown = threading.Event()
+        
+        self._controller.sub( self, 'Shutdown', 'shutdown' )
+        self._controller.sub( self, 'NotifyImportFoldersHaveChanged', 'notify_new_import_folders' )
+        
+    
+    def _DoWork( self ):
+        
+        if self._controller.new_options.GetBoolean( 'pause_import_folders_sync' ):
+            
+            return
+            
+        
+        name = self._GetImportFolderNameThatIsDue()
+        
+        if name is None:
+            
+            return
+            
+        
+        try:
+            
+            import_folder = self._controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER, name )
+            
+        except HydrusExceptions.DBException as e:
+            
+            if isinstance( e.db_e, HydrusExceptions.DataMissing ):
+                
+                with self._lock:
+                    
+                    del self._import_folder_names_to_next_work_time_cache[ name ]
+                    
+                
+                return
+                
+            else:
+                
+                raise
+                
+            
+        
+        import_folder.DoWork()
+        
+        with self._lock:
+            
+            next_work_time = import_folder.GetNextWorkTime()
+            
+            if next_work_time is None:
+                
+                del self._import_folder_names_to_next_work_time_cache[ name ]
+                
+            else:
+                
+                self._import_folder_names_to_next_work_time_cache[ name ] = max( next_work_time, HydrusTime.GetNow() + 180 )
+                
+            
+        
+    
+    def _GetImportFolderNameThatIsDue( self ):
+        
+        if not self._import_folder_names_fetched:
+            
+            import_folder_names = self._controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER )
+            
+            with self._lock:
+                
+                for name in import_folder_names:
+                    
+                    self._import_folder_names_to_next_work_time_cache[ name ] = HydrusTime.GetNow()
+                    
+                
+                self._import_folder_names_fetched = True
+                
+            
+        
+        with self._lock:
+            
+            for ( name, time_due ) in self._import_folder_names_to_next_work_time_cache.items():
+                
+                if HydrusTime.TimeHasPassed( time_due ):
+                    
+                    return name
+                    
+                
+            
+        
+        return None
+        
+    
+    def _GetTimeUntilNextWork( self ):
+        
+        if self._controller.new_options.GetBoolean( 'pause_import_folders_sync' ):
+            
+            return 1800
+            
+        
+        if not self._import_folder_names_fetched:
+            
+            return 180
+            
+        
+        if len( self._import_folder_names_to_next_work_time_cache ) == 0:
+            
+            return 1800
+            
+        
+        next_work_time = min( self._import_folder_names_to_next_work_time_cache.values() )
+        
+        return max( HydrusTime.TimeUntil( next_work_time ), 1 )
+        
+    
+    def MainLoop( self ):
+        
+        def check_shutdown():
+            
+            if HydrusThreading.IsThreadShuttingDown() or self._shutdown.is_set() or self._serious_error_encountered:
+                
+                raise HydrusExceptions.ShutdownException()
+                
+            
+        
+        try:
+            
+            time_to_start = HydrusTime.GetNow() + 5
+            
+            while not HydrusTime.TimeHasPassed( time_to_start ):
+                
+                check_shutdown()
+                
+                time.sleep( 1 )
+                
+            
+            while True:
+                
+                check_shutdown()
+                
+                self._controller.WaitUntilViewFree()
+                
+                try:
+                    
+                    HG.import_folders_running = True
+                    
+                    self._DoWork()
+                    
+                except Exception as e:
+                    
+                    self._serious_error_encountered = True
+                    
+                    HydrusData.PrintException( e )
+                    
+                    message = 'There was an unexpected problem during import folders work! They will not run again this boot. A full traceback of this error should be written to the log.'
+                    message += '\n' * 2
+                    message += str( e )
+                    
+                    HydrusData.ShowText( message )
+                    
+                    return
+                    
+                finally:
+                    
+                    HG.import_folders_running = False
+                    
+                
+                with self._lock:
+                    
+                    wait_period = self._GetTimeUntilNextWork()
+                    
+                
+                self._wake_event.wait( wait_period )
+                
+                self._wake_event.clear()
+                
+            
+        except HydrusExceptions.ShutdownException:
+            
+            pass
+            
+        
+    
+    def NotifyImportFoldersHaveChanged( self ):
+        
+        with self._lock:
+            
+            self._import_folder_names_fetched = False
+            self._import_folder_names_to_next_work_time_cache = {}
+            
+        
+        self.Wake()
+        
+    
+    def Shutdown( self ):
+        
+        self._shutdown.set()
+        
+        self.Wake()
+        
+    
+    def Start( self ):
+        
+        self._controller.CallToThreadLongRunning( self.MainLoop )
+        
+    
+    def Wake( self ):
+        
+        self._wake_event.set()
+        
+    
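The wait period chosen by `_GetTimeUntilNextWork` is a pure function of the manager's state: long poll when paused or idle, short poll before the first name fetch, otherwise sleep until the earliest due time, clamped to at least one second. A hypothetical extraction with the state passed in explicitly (the real method reads `self` and uses `HydrusTime` helpers):

```python
import time

def time_until_next_work( paused: bool, names_fetched: bool, next_work_times: dict, now: float = None ) -> float:
    
    # long poll while the whole subsystem is paused
    if paused:
        
        return 1800
        
    
    # short poll until the folder names have been fetched from the db
    if not names_fetched:
        
        return 180
        
    
    # nothing scheduled: long poll
    if not next_work_times:
        
        return 1800
        
    
    if now is None:
        
        now = time.time()
        
    
    # wake at the earliest due time, but never busy-spin
    return max( min( next_work_times.values() ) - now, 1 )
```

Pairing this with `Event.wait( wait_period )` / `Event.set()` as in `MainLoop` gives a worker that sleeps efficiently but can be woken immediately when the folder list changes.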
@@ -49,7 +49,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
         self._exclude_deleted = True
         self._preimport_hash_check_type = DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
         self._preimport_url_check_type = DO_CHECK
-        self._preimport_url_check_looks_for_neighbours = True
+        self._preimport_url_check_looks_for_neighbour_spam = True
         self._allow_decompression_bombs = True
         self._filetype_filter_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, value = set( HC.GENERAL_FILETYPES ) )
         self._min_size = None
@@ -82,7 +82,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
         
         serialisable_filetype_filter_predicate = self._filetype_filter_predicate.GetSerialisableTuple()
         
-        pre_import_options = ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbours, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context )
+        pre_import_options = ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbour_spam, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context )
         post_import_options = ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls )
         serialisable_presentation_import_options = self._presentation_import_options.GetSerialisableTuple()
@@ -93,7 +93,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
         
         ( pre_import_options, post_import_options, serialisable_presentation_import_options, self._is_default ) = serialisable_info
         
-        ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbours, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context ) = pre_import_options
+        ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbour_spam, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context ) = pre_import_options
         ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls ) = post_import_options
         self._presentation_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_presentation_import_options )
@@ -268,9 +268,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
             preimport_url_check_type = DO_CHECK
             
-            preimport_url_check_looks_for_neighbours = True
+            preimport_url_check_looks_for_neighbour_spam = True
             
-            pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
+            pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbour_spam, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
             
             new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default )
@@ -281,7 +281,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
             
             ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default ) = old_serialisable_info
             
-            ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context ) = pre_import_options
+            ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbour_spam, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context ) = pre_import_options
             
             filetype_filter_predicate = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_filetype_filter_predicate )
@@ -297,7 +297,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
             
             serialisable_filetype_filter_predicate = filetype_filter_predicate.GetSerialisableTuple()
             
-            pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
+            pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbour_spam, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
             
             new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default )
@@ -518,9 +518,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
         return self._is_default
         
     
-    def PreImportURLCheckLooksForNeighbours( self ) -> bool:
+    def PreImportURLCheckLooksForNeighbourSpam( self ) -> bool:
         
-        return self._preimport_url_check_looks_for_neighbours
+        return self._preimport_url_check_looks_for_neighbour_spam
         
     
     def SetAllowedSpecificFiletypes( self, mimes ) -> None:
@@ -547,9 +547,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
         self._associate_source_urls = associate_source_urls
         
     
-    def SetPreImportURLCheckLooksForNeighbours( self, preimport_url_check_looks_for_neighbours: bool ):
+    def SetPreImportURLCheckLooksForNeighbourSpam( self, preimport_url_check_looks_for_neighbour_spam: bool ):
         
-        self._preimport_url_check_looks_for_neighbours = preimport_url_check_looks_for_neighbours
+        self._preimport_url_check_looks_for_neighbour_spam = preimport_url_check_looks_for_neighbour_spam
         
     
     def SetPresentationImportOptions( self, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):
@@ -105,7 +105,7 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 578
+SOFTWARE_VERSION = 579
 CLIENT_API_VERSION = 64
 
 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1296,7 +1296,7 @@ mime_mimetype_string_lookup = {
 mime_mimetype_string_lookup[ UNDETERMINED_WM ] = '{} or {}'.format( mime_mimetype_string_lookup[ AUDIO_WMA ], mime_mimetype_string_lookup[ VIDEO_WMV ] )
 mime_mimetype_string_lookup[ UNDETERMINED_MP4 ] = '{} or {}'.format( mime_mimetype_string_lookup[ AUDIO_MP4 ], mime_mimetype_string_lookup[ VIDEO_MP4 ] )
 mime_mimetype_string_lookup[ UNDETERMINED_PNG ] = '{} or {}'.format( mime_mimetype_string_lookup[ IMAGE_PNG ], mime_mimetype_string_lookup[ ANIMATION_APNG ] )
-mime_mimetype_string_lookup[ UNDETERMINED_WEBP ] = '{} or {}'.format( mime_mimetype_string_lookup[ IMAGE_WEBP ], mime_mimetype_string_lookup[ ANIMATION_WEBP ] )
+mime_mimetype_string_lookup[ UNDETERMINED_WEBP ] = 'image/webp, static or animated'
 
 mime_ext_lookup = {
     APPLICATION_HYDRUS_CLIENT_COLLECTION : '.collection',
@@ -331,7 +331,7 @@ def GetTimesToPlayPILAnimationFromPIL( pil_image: PILImage.Image ) -> int:
 
 def PILAnimationHasDuration( path ):
     
-    pil_image = HydrusImageHandling.GeneratePILImage( path, dequantize = False )
+    pil_image = HydrusImageOpening.RawOpenPILImage( path )
     
     try:
@@ -234,7 +234,7 @@ SYSTEM_PREDICATES = {
     'has tags': (Predicate.HAS_TAGS, None, None, None),
     'untagged|no tags': (Predicate.UNTAGGED, None, None, None),
     'num(ber)?( of)? tags': (Predicate.NUM_OF_TAGS, Operators.RELATIONAL, Value.NATURAL, None),
-    'num(ber)?( of)? (?=[^\\s].* tags)': (Predicate.NUM_OF_TAGS_WITH_NAMESPACE, None, Value.NAMESPACE_AND_NUM_TAGS, None),
+    r'num(ber)?( of)? (?=[^\s].* tags)': (Predicate.NUM_OF_TAGS_WITH_NAMESPACE, None, Value.NAMESPACE_AND_NUM_TAGS, None),
     'num(ber)?( of)? urls': (Predicate.NUM_OF_URLS, Operators.RELATIONAL, Value.NATURAL, None),
     'num(ber)?( of)? words': (Predicate.NUM_OF_WORDS, Operators.RELATIONAL_EXACT, Value.NATURAL, None),
     'height': (Predicate.HEIGHT, Operators.RELATIONAL, Value.NATURAL, Units.PIXELS_OR_NONE),
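The `r'...'` conversions in this hunk and the ones below all fix the same class of problem: in a normal string literal a bare `\s` or `\d` is an invalid string escape (a warning in current Python, slated to become an error), `'\\s'` works but is noisy, and a raw string spells the identical regex pattern cleanly. A quick illustration with hypothetical patterns:

```python
import re

# the raw string and the doubled-backslash string denote the same characters,
# so switching '\\s' -> r'\s' changes spelling, not behaviour
assert r'num \d+' == 'num \\d+'

# both spellings of the namespace lookahead match the same input
assert re.match( r'(?=[^\s].* tags)', 'creator tags' ) is not None
assert re.match( '(?=[^\\s].* tags)', 'creator tags' ) is not None
```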
@@ -254,8 +254,8 @@ SYSTEM_PREDICATES = {
     'num(ber)?( of)? frames': (Predicate.NUM_OF_FRAMES, Operators.RELATIONAL, Value.NATURAL, None),
     'file service': (Predicate.FILE_SERVICE, Operators.FILESERVICE_STATUS, Value.ANY_STRING, None),
     'num(ber)?( of)? file relationships': (Predicate.NUM_FILE_RELS, Operators.RELATIONAL, Value.NATURAL, Units.FILE_RELATIONSHIP_TYPE),
-    'ratio(?=.*\d)': (Predicate.RATIO, Operators.RATIO_OPERATORS, Value.RATIO, None),
-    'ratio(?!.*\d)': (Predicate.RATIO_SPECIAL, Operators.RATIO_OPERATORS_SPECIAL, Value.RATIO_SPECIAL, None),
+    r'ratio(?=.*\d)': (Predicate.RATIO, Operators.RATIO_OPERATORS, Value.RATIO, None),
+    r'ratio(?!.*\d)': (Predicate.RATIO_SPECIAL, Operators.RATIO_OPERATORS_SPECIAL, Value.RATIO_SPECIAL, None),
     'num pixels': (Predicate.NUM_PIXELS, Operators.RELATIONAL, Value.NATURAL, Units.PIXELS),
     'media views': (Predicate.MEDIA_VIEWS, Operators.RELATIONAL, Value.NATURAL, None),
     'preview views': (Predicate.PREVIEW_VIEWS, Operators.RELATIONAL, Value.NATURAL, None),
@@ -279,9 +279,9 @@ SYSTEM_PREDICATES = {
     '((has )?no|does not have( a)?|doesn\'t have( a)?) note (with name|named)': (Predicate.NO_NOTE_NAME, None, Value.ANY_STRING, None),
     'has( a)? rating( for)?': (Predicate.HAS_RATING, None, Value.ANY_STRING, None ),
     '((has )?no|does not have( a)?|doesn\'t have( a)?) rating( for)?': (Predicate.NO_RATING, None, Value.ANY_STRING, None ),
-    'rating( for)?(?=.+?\d+/\d+$)': (Predicate.RATING_SPECIFIC_NUMERICAL, Operators.RELATIONAL_FOR_RATING_SERVICE, Value.RATING_SERVICE_NAME_AND_NUMERICAL_VALUE, None ),
+    r'rating( for)?(?=.+?\d+/\d+$)': (Predicate.RATING_SPECIFIC_NUMERICAL, Operators.RELATIONAL_FOR_RATING_SERVICE, Value.RATING_SERVICE_NAME_AND_NUMERICAL_VALUE, None ),
     'rating( for)?(?=.+?(like|dislike)$)': (Predicate.RATING_SPECIFIC_LIKE_DISLIKE, None, Value.RATING_SERVICE_NAME_AND_LIKE_DISLIKE, None ),
-    'rating( for)?(?=.+?[^/]\d+$)': (Predicate.RATING_SPECIFIC_INCDEC, Operators.RELATIONAL_FOR_RATING_SERVICE, Value.RATING_SERVICE_NAME_AND_INCDEC, None ),
+    r'rating( for)?(?=.+?[^/]\d+$)': (Predicate.RATING_SPECIFIC_INCDEC, Operators.RELATIONAL_FOR_RATING_SERVICE, Value.RATING_SERVICE_NAME_AND_INCDEC, None ),
 }
 
 def string_looks_like_date( string ):
@ -426,9 +426,9 @@ def parse_value( string: str, spec ):
|
|||
|
||||
|
||||
elif spec == Value.SHA256_HASHLIST_WITH_DISTANCE:
|
||||
match = re.match( '(?P<hashes>([0-9a-f]{4}[0-9a-f]+(\s|,)*)+)(with\s+)?(distance\s+)?(of\s+)?(?P<distance>0|([1-9][0-9]*))?', string )
|
||||
match = re.match( r'(?P<hashes>([0-9a-f]{4}[0-9a-f]+(\s|,)*)+)(with\s+)?(distance\s+)?(of\s+)?(?P<distance>0|([1-9][0-9]*))?', string )
|
||||
if match:
|
||||
hashes = set( hsh.strip() for hsh in re.sub( '\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
hashes = set( hsh.strip() for hsh in re.sub( r'\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
|
||||
d = match.groupdict()
|
||||
|
||||
|
@ -444,9 +444,9 @@ def parse_value( string: str, spec ):
|
|||
return string[ len( match[ 0 ] ): ], (hashes, distance)
|
||||
raise ValueError( "Invalid value, expected a list of hashes with distance" )
|
||||
elif spec == Value.SIMILAR_TO_HASHLIST_WITH_DISTANCE:
|
||||
match = re.match( '(?P<hashes>([0-9a-f]{4}[0-9a-f]+(\s|,)*)+)(with\s+)?(distance\s+)?(of\s+)?(?P<distance>0|([1-9][0-9]*))?', string )
|
||||
match = re.match( r'(?P<hashes>([0-9a-f]{4}[0-9a-f]+(\s|,)*)+)(with\s+)?(distance\s+)?(of\s+)?(?P<distance>0|([1-9][0-9]*))?', string )
|
||||
if match:
|
||||
hashes = set( hsh.strip() for hsh in re.sub( '\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
hashes = set( hsh.strip() for hsh in re.sub( r'\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
pixel_hashes = { hash for hash in hashes if len( hash ) == 64 }
|
||||
perceptual_hashes = { hash for hash in hashes if len( hash ) == 16 }
|
||||
|
||||
|
@ -466,7 +466,7 @@ def parse_value( string: str, spec ):
|
|||
elif spec == Value.HASHLIST_WITH_ALGORITHM:
|
||||
|
||||
# hydev KISS hijack here, instead of clever regex to capture algorithm in all sorts of situations, let's just grab the hex we see and scan the rest for non-hex phrases mate
|
||||
# old pattern: match = re.match( '(?P<hashes>([0-9a-f]+(\s|,)*)+)((with\s+)?algorithm)?\s*(?P<algorithm>sha256|sha512|md5|sha1|)', string )
|
||||
# old pattern: match = re.match( r'(?P<hashes>([0-9a-f]+(\s|,)*)+)((with\s+)?algorithm)?\s*(?P<algorithm>sha256|sha512|md5|sha1|)', string )
|
||||
|
||||
algorithm = 'sha256'
|
||||
|
||||
|
@ -481,10 +481,10 @@ def parse_value( string: str, spec ):
|
|||
|
||||
|
||||
# {8} here to make sure we are looking at proper hash hex and not some short 'a' or 'de' word
|
||||
match = re.search( '(?P<hashes>([0-9a-f]{8}[0-9a-f]+(\s|,)*)+)', string )
|
||||
match = re.search( r'(?P<hashes>([0-9a-f]{8}[0-9a-f]+(\s|,)*)+)', string )
|
||||
|
||||
if match:
|
||||
hashes = set( hsh.strip() for hsh in re.sub( '\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
hashes = set( hsh.strip() for hsh in re.sub( r'\s', ' ', match[ 'hashes' ].replace( ',', ' ' ) ).split( ' ' ) if len( hsh ) > 0 )
|
||||
return string[ match.endpos : ], (hashes, algorithm)
|
||||
|
||||
raise ValueError( "Invalid value, expected a list of hashes and perhaps an algorithm" )
|
||||
|
@ -494,11 +494,11 @@ def parse_value( string: str, spec ):
|
|||
|
||||
valid_values = sorted( FILETYPES.keys(), key = lambda k: len( k ), reverse = True )
|
||||
ftype_regex = '(' + '|'.join( [ '(' + val + ')' for val in valid_values ] ) + ')'
|
||||
match = re.match( '(' + ftype_regex + '(\s|,)+)*' + ftype_regex, string )
|
||||
match = re.match( '(' + ftype_regex + r'(\s|,)+)*' + ftype_regex, string )
|
||||
|
||||
if match:
|
||||
|
||||
found_ftypes_all = re.sub( '\s', ' ', match[ 0 ].replace( ',', '|' ) ).split( '|' )
|
||||
found_ftypes_all = re.sub( r'\s', ' ', match[ 0 ].replace( ',', '|' ) ).split( '|' )
|
||||
found_ftypes_good = [ ]
|
||||
for ftype in found_ftypes_all:
|
||||
ftype = ftype.strip()
|
||||
|
@ -545,7 +545,7 @@ def parse_value( string: str, spec ):
|
|||
|
||||
else:
|
||||
|
||||
match = re.match( '((?P<year>0|([1-9][0-9]*))\s*(years|year))?\s*((?P<month>0|([1-9][0-9]*))\s*(months|month))?\s*((?P<day>0|([1-9][0-9]*))\s*(days|day))?\s*((?P<hour>0|([1-9][0-9]*))\s*(hours|hour|h))?', string )
|
||||
match = re.match( r'((?P<year>0|([1-9][0-9]*))\s*(years|year))?\s*((?P<month>0|([1-9][0-9]*))\s*(months|month))?\s*((?P<day>0|([1-9][0-9]*))\s*(days|day))?\s*((?P<hour>0|([1-9][0-9]*))\s*(hours|hour|h))?', string )
|
||||
if match and (match.group( 'year' ) or match.group( 'month' ) or match.group( 'day' ) or match.group( 'hour' )):
|
||||
years = int( match.group( 'year' ) ) if match.group( 'year' ) else 0
|
||||
months = int( match.group( 'month' ) ) if match.group( 'month' ) else 0
|
||||
|
@ -562,7 +562,7 @@ def parse_value( string: str, spec ):
|
|||
return string_result, (years, months, days, hours)
|
||||
|
||||
|
||||
match = re.match( '(?P<year>[0-9][0-9][0-9][0-9])-(?P<month>[0-9][0-9]?)-(?P<day>[0-9][0-9]?)', string )
|
||||
match = re.match( r'(?P<year>[0-9][0-9][0-9][0-9])-(?P<month>[0-9][0-9]?)-(?P<day>[0-9][0-9]?)', string )
|
||||
if match:
|
||||
# good expansion here would be to parse a full date with 08:20am kind of thing, but we'll wait for better datetime parsing library for that I think!
|
||||
return string[ len( match[ 0 ] ): ], datetime.datetime( int( match.group( 'year' ) ), int( match.group( 'month' ) ), int( match.group( 'day' ) ) )
|
||||
|
@ -577,7 +577,7 @@ def parse_value( string: str, spec ):
|
|||
return '', ( 0, 0 )
|
||||
|
||||
|
||||
match = re.match( '((?P<sec>0|([1-9][0-9]*))\s*(seconds|second|secs|sec|s))?\s*((?P<msec>0|([1-9][0-9]*))\s*(milliseconds|millisecond|msecs|msec|ms))?', string )
|
||||
match = re.match( r'((?P<sec>0|([1-9][0-9]*))\s*(seconds|second|secs|sec|s))?\s*((?P<msec>0|([1-9][0-9]*))\s*(milliseconds|millisecond|msecs|msec|ms))?', string )
|
||||
if match and (match.group( 'sec' ) or match.group( 'msec' )):
|
||||
seconds = int( match.group( 'sec' ) ) if match.group( 'sec' ) else 0
|
||||
mseconds = int( match.group( 'msec' ) ) if match.group( 'msec' ) else 0
|
||||
|
@ -588,7 +588,7 @@ def parse_value( string: str, spec ):
|
|||
elif spec == Value.ANY_STRING:
|
||||
return "", string
|
||||
elif spec == Value.TIME_INTERVAL:
|
||||
match = re.match( '((?P<day>0|([1-9][0-9]*))\s*(days|day))?\s*((?P<hour>0|([1-9][0-9]*))\s*(hours|hour|h))?\s*((?P<minute>0|([1-9][0-9]*))\s*(minutes|minute|mins|min))?\s*((?P<second>0|([1-9][0-9]*))\s*(seconds|second|secs|sec|s))?', string )
|
||||
match = re.match( r'((?P<day>0|([1-9][0-9]*))\s*(days|day))?\s*((?P<hour>0|([1-9][0-9]*))\s*(hours|hour|h))?\s*((?P<minute>0|([1-9][0-9]*))\s*(minutes|minute|mins|min))?\s*((?P<second>0|([1-9][0-9]*))\s*(seconds|second|secs|sec|s))?', string )
|
||||
if match and (match.group( 'day' ) or match.group( 'hour' ) or match.group( 'minute' ) or match.group( 'second' )):
|
||||
days = int( match.group( 'day' ) ) if match.group( 'day' ) else 0
|
||||
hours = int( match.group( 'hour' ) ) if match.group( 'hour' ) else 0
|
||||
|
@ -603,7 +603,7 @@ def parse_value( string: str, spec ):
|
|||
return string[ len( match[ 0 ] ): ], (days, hours, minutes, seconds)
|
||||
raise ValueError( "Invalid value, expected a time interval" )
|
||||
elif spec == Value.RATIO:
|
||||
match = re.match( '(?P<first>0|([1-9][0-9]*)):(?P<second>0|([1-9][0-9]*))', string )
|
||||
match = re.match( r'(?P<first>0|([1-9][0-9]*)):(?P<second>0|([1-9][0-9]*))', string )
|
||||
if match: return string[ len( match[ 0 ] ): ], (int( match[ 'first' ] ), int( match[ 'second' ] ))
|
||||
raise ValueError( "Invalid value, expected a ratio" )
|
||||
elif spec == Value.RATIO_SPECIAL:
|
||||
|
@ -616,7 +616,7 @@ def parse_value( string: str, spec ):
|
|||
|
||||
# 'my favourites 3/5' (no operator here)
|
||||
|
||||
match = re.match( '(?P<name>.+?)\s+(?P<num>\d+)/(?P<den>\d+)$', string )
|
||||
match = re.match( r'(?P<name>.+?)\s+(?P<num>\d+)/(?P<den>\d+)$', string )
|
||||
|
||||
if match:
|
||||
|
||||
|
@ -679,7 +679,7 @@ def parse_value( string: str, spec ):
|
|||
|
||||
# 'I'm cooooollecting counter 123' (no operator here)
|
||||
|
||||
match = re.match( '(?P<name>.+?)\s+(?P<num>\d+)$', string )
|
||||
match = re.match( r'(?P<name>.+?)\s+(?P<num>\d+)$', string )
|
||||
|
||||
if match:
|
||||
|
||||
|
@ -789,7 +789,7 @@ def parse_operator( string: str, spec ):
|
|||
|
||||
# "favourites service name > 3/5"
|
||||
# since service name can be all sorts of gubbins, we'll work backwards and KISS
|
||||
match = re.match( '(?P<first>.*?)(?P<second>(dislike|like|\d+/\d+|\d+))$', string )
|
||||
match = re.match( r'(?P<first>.*?)(?P<second>(dislike|like|\d+/\d+|\d+))$', string )
|
||||
|
||||
if match:
|
||||
|
||||
|
@ -855,7 +855,7 @@ def parse_operator( string: str, spec ):
|
|||
# note this is in the correct order, also, to eliminate = vs == ambiguity
|
||||
all_operators_piped = '|'.join( ( s_r[0] for s_r in operator_strings_and_results ) )
|
||||
|
||||
match = re.match( f'(?P<namespace>.*)\s+(?P<op>({all_operators_piped}))', string )
|
||||
match = re.match( r'(?P<namespace>.*)\s+' + f'(?P<op>({all_operators_piped}))', string )
|
||||
|
||||
if match:
|
||||
|
||||
|
|
|
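The change running through every hunk above is mechanical: regex pattern literals become raw strings. Recent Python versions warn on unrecognised escape sequences like `\d` in plain string literals (a `SyntaxWarning` as of 3.12), and a *recognised* escape can silently change the pattern. A minimal sketch of both cases:

```python
import re

# As regexes, '\d' and r'\d' currently compile to the same pattern, because
# \d is not a recognised string escape and is passed through unchanged
# (though newer Pythons warn about the non-raw form).
assert re.match('ratio(?=.*\\d)', 'ratio 4:3') is not None
assert re.match(r'ratio(?=.*\d)', 'ratio 4:3') is not None

# A recognised string escape is the real trap: '\b' is a backspace character
# in a plain literal, but a word boundary in a raw one.
assert re.search('\bword\b', 'a word here') is None
assert re.search(r'\bword\b', 'a word here') is not None
```

Raw strings make the two classes of literal behave identically, which is why the sweep touches even patterns that worked before.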
@@ -354,7 +354,7 @@ IRL_SIBLING_PAIRS = {
 ( 'tracer (overwatch', 'character:lena "tracer" oxton' ),
 ( 'tracer (overwatch)doll joints', 'character:lena "tracer" oxton' ),
 ( 'tracer overwatch', 'character:lena "tracer" oxton' ),
-( 'tracer\overwatch', 'character:lena "tracer" oxton' ),
+( r'tracer\overwatch', 'character:lena "tracer" oxton' ),
 ( 'tracer_(cosplay)', 'character:lena "tracer" oxton' ),
 ( 'tracer_(overwatch)', 'character:lena "tracer" oxton' ),
 ( 'tracer_(overwatch)_(cosplay)', 'character:lena "tracer" oxton' ),
@@ -1,5 +1,6 @@
 import os
+import shutil
 import time
 import unittest

 from hydrus.core import HydrusConstants as HC

@@ -54,7 +55,13 @@ class TestDaemons( unittest.TestCase ):
 HG.test_controller.ClearWrites( 'import_file' )
 HG.test_controller.ClearWrites( 'serialisable' )

-ClientDaemons.DAEMONCheckImportFolders()
+manager = ClientImportLocal.ImportFoldersManager( HG.controller )
+
+manager.Start()
+
 time.sleep( 8 )
+
+try:

 import_file = HG.test_controller.GetWrite( 'import_file' )

@@ -72,6 +79,11 @@ class TestDaemons( unittest.TestCase ):
 self.assertTrue( os.path.exists( os.path.join( test_dir, '3' ) ) )
 self.assertTrue( os.path.exists( os.path.join( test_dir, '4' ) ) )

+finally:
+
+manager.Shutdown()
+
+
 finally:

 shutil.rmtree( test_dir )
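The test refactor above swaps a one-shot daemon call for a manager object whose `Start` must always be paired with `Shutdown`, which is why the body gains a `try`/`finally`. A generic sketch of that lifecycle pattern, with a hypothetical `Manager` class standing in for the hydrus one:

```python
import threading

class Manager:
    """Hypothetical stand-in for a background-worker manager."""

    def __init__(self):
        self._stop = threading.Event()
        self._thread = None

    def Start(self):
        # worker simply waits until Shutdown signals the stop event
        self._thread = threading.Thread(target=self._stop.wait, daemon=True)
        self._thread.start()

    def Shutdown(self):
        self._stop.set()
        if self._thread is not None:
            self._thread.join()

manager = Manager()
manager.Start()

try:
    work_done = True  # the import-folder assertions would run here
finally:
    # guaranteed to run even if an assertion fails, so the test never leaks a thread
    manager.Shutdown()
```

The second, outer `finally` in the diff plays the same role for the temp directory: `shutil.rmtree` runs whether or not the manager block succeeded.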
@@ -24,7 +24,7 @@ IF ERRORLEVEL 1 (

 )

-REM You can copy this file to 'client-user.bat' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
+REM You can copy this file to 'hydrus_client-user.bat' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
 REM Just tack new params on like this:
 REM start "" "pythonw" hydrus_client.pyw -d="E:\hydrus"
@@ -14,7 +14,7 @@ if ! source venv/bin/activate; then
 exit 1
 fi

-# You can copy this file to 'hydrus_client-user.sh' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
+# You can copy this file to 'hydrus_client-user.command' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
 # Just tack new params on like this:
 # python hydrus_client.py -d="/path/to/hydrus/db"
@@ -14,7 +14,7 @@ if ! source venv/bin/activate; then
 exit 1
 fi

-# You can copy this file to 'client-user.sh' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
+# You can copy this file to 'hydrus_client-user.sh' and add in your own launch parameters here if you like, and a git pull won't overwrite the file.
 # Just tack new hardcoded params on like this:
 #
 # python hydrus_client.py -d="/path/to/hydrus/db" "$@"
@@ -10,7 +10,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 psd-tools>=1.9.28
 Pillow>=10.0.1
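The same one-line pin recurs across every requirements file below: numpy is held below 2.0, presumably to dodge the NumPy 2.0 API/ABI break for dependencies compiled against 1.x. The pin's effect can be sketched as a plain version-range check (a simplified stand-in for pip's real specifier matching, which also handles pre-releases and epochs):

```python
def satisfies(version, lower="1.16.0", upper="2.0.0"):
    # Compare dotted versions as integer tuples: lower <= version < upper.
    def key(v):
        return tuple(int(part) for part in v.split("."))
    return key(lower) <= key(version) < key(upper)

assert satisfies("1.26.4")       # a current 1.x release passes
assert not satisfies("2.0.0")    # the upper bound is exclusive
assert not satisfies("1.15.9")   # below the floor fails too
```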
@@ -3,6 +3,11 @@
 USER_ID=${UID}
 GROUP_ID=${GID}

+PYTHON_VERSION=$(python3 --version | awk '{print $2}')
+PYTHON_MAJOR_VERSION=$(echo $PYTHON_VERSION | cut -d. -f1)
+PYTHON_MINOR_VERSION=$(echo $PYTHON_VERSION | cut -d. -f2)
+
 #apk add xterm
 echo "Starting Hydrus with UID/GID : $USER_ID/$GROUP_ID"

 cd /opt/hydrus/

@@ -12,12 +17,29 @@ if [ -f "/opt/hydrus/static/build_files/docker/client/patch.patch" ]; then
 patch -f -p1 -i /opt/hydrus/static/build_files/docker/client/patch.patch
 fi

-if [ -f "/opt/hydrus/static/build_files/docker/client/requests.patch" ]; then
-cd /usr/lib/python3.10/site-packages/requests
-echo "Patching Requests"
-patch -f -p2 -i /opt/hydrus/static/build_files/docker/client/requests.patch
-cd /opt/hydrus/
+# Determine which requests patch file to use and warn on unsupported python version
+if [ "$PYTHON_MAJOR_VERSION" == "3" ]; then
+if [ "$PYTHON_MINOR_VERSION" -lt 11 ]; then
+PATCH_FILE="/opt/hydrus/static/build_files/docker/client/requests.patch"
+if [ -f "$PATCH_FILE" ]; then
+echo "Found and apply requests patch for py 3.10 and below"
+cd $(python3 -c "import sys; import requests; print(requests.__path__[0])")
+patch -f -p2 -i "$PATCH_FILE"
+fi
+elif [ "$PYTHON_MINOR_VERSION" -eq 11 ]; then
+PATCH_FILE="/opt/hydrus/static/build_files/docker/client/requests.311.patch"
+if [ -f "$PATCH_FILE" ]; then
+echo "Found and apply requests patch for py 3.11"
+cd $(python3 -c "import sys; import requests; print(requests.__path__[0])")
+patch -f -i "$PATCH_FILE"
+fi
+else
+echo "Unsupported Python minor version: $PYTHON_MINOR_VERSION"
+fi
+else
+echo "Unsupported Python major version: $PYTHON_MAJOR_VERSION"
 fi
+cd /opt/hydrus/

 #if [ $USER_ID != 0 ] && [ $GROUP_ID != 0 ]; then
 # find /opt/hydrus/ -not -path "/opt/hydrus/db/*" -exec chown hydrus:hydrus "{}" \;
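The new entrypoint logic keys entirely off the interpreter's major/minor version, split out of `python3 --version` output with `cut`, instead of the old hard-coded `python3.10` site-packages path. A small sketch of just that parsing-and-dispatch step, with a hard-coded sample version string standing in for the real interpreter output:

```shell
#!/bin/sh
# Hypothetical sample value; the real script derives this from `python3 --version`.
PYTHON_VERSION="3.11.9"

# Split "3.11.9" on dots: field 1 is the major version, field 2 the minor.
PYTHON_MAJOR_VERSION=$(echo "$PYTHON_VERSION" | cut -d. -f1)
PYTHON_MINOR_VERSION=$(echo "$PYTHON_VERSION" | cut -d. -f2)

if [ "$PYTHON_MAJOR_VERSION" = "3" ] && [ "$PYTHON_MINOR_VERSION" -eq 11 ]; then
    echo "would apply requests.311.patch"
elif [ "$PYTHON_MAJOR_VERSION" = "3" ] && [ "$PYTHON_MINOR_VERSION" -lt 11 ]; then
    echo "would apply requests.patch"
else
    echo "unsupported python version"
fi
```

Resolving the requests directory via `requests.__path__[0]` rather than a fixed `/usr/lib/python3.10/...` path is what lets the same script survive base-image Python upgrades.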
@@ -0,0 +1,37 @@
+--- sessions.py
++++ sessions.py
+@@ -576,6 +576,14 @@
+
+ proxies = proxies or {}
+
++ # Append proxies to self.proxies if necessary and update proxies with new list or use self.proxies for proxies
++ if isinstance(proxies,dict):
++ self_proxies_tmp = self.proxies.copy()
++ self_proxies_tmp.update(proxies)
++ proxies = self_proxies_tmp.copy()
++ else:
++ proxies = self.proxies.copy()
++
+ settings = self.merge_environment_settings(
+ prep.url, proxies, stream, verify, cert
+ )
+@@ -771,8 +779,18 @@
+ or verify
+ )
+
++ # Check for existing no_proxy and no since they could be loaded from environment
++ no_proxy = proxies.get('no_proxy') if proxies is not None else None
++ no = proxies.get('no') if proxies is not None else None
++ if any([no_proxy,no]):
++ no_proxy = ','.join(filter(None, (no_proxy, no)))
++
++ # Check if we should bypass proxy for this URL
+ # Merge all the kwargs.
+- proxies = merge_setting(proxies, self.proxies)
++ if should_bypass_proxies(url, no_proxy):
++ proxies = {}
++ else:
++ proxies = merge_setting(proxies, self.proxies)
+ stream = merge_setting(stream, self.stream)
+ verify = merge_setting(verify, self.verify)
+ cert = merge_setting(cert, self.cert)
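The patched behaviour above boils down to two small dict operations: per-request proxies are layered over the session-level `proxies` dict, and any `no_proxy`/`no` entries are folded into one exclusion list before the bypass check. Both halves can be sketched in plain Python, independent of requests internals (these helper names are illustrative, not the library's API):

```python
def merge_proxies(session_proxies, request_proxies):
    # Per-request proxies win over session-level ones, mirroring the patch's
    # self.proxies.copy() + update() dance.
    merged = dict(session_proxies)
    if isinstance(request_proxies, dict):
        merged.update(request_proxies)
    return merged

def combined_no_proxy(proxies):
    # Fold the 'no_proxy' and 'no' keys into one comma-separated exclusion
    # list; either key alone (or both) should survive.
    no_proxy = proxies.get('no_proxy')
    no = proxies.get('no')
    if any([no_proxy, no]):
        return ','.join(filter(None, (no_proxy, no)))
    return None

merged = merge_proxies({'http': 'http://proxy:8080', 'no': 'localhost'},
                       {'no_proxy': 'internal.example'})
```

With that exclusion string in hand, the patch then empties the proxy dict entirely for URLs the bypass check matches, instead of unconditionally merging.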
@@ -10,7 +10,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 Pillow>=10.0.1
 pillow-heif>=0.12.0
@@ -10,7 +10,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 Pillow>=10.0.1
 pillow-heif>=0.12.0
@@ -10,7 +10,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 Pillow>=10.0.1
 pillow-heif>=0.12.0
@@ -9,3 +9,7 @@ This is still a bit of a test. I think to do this properly we'll want to move to
 Here's some examples, there are some QSS files buried here:

 https://wiki.qt.io/Gallery_of_Qt_CSS_Based_Styles
+
+And a bunch of random projects have some too, such as:
+
+https://github.com/ModOrganizer2/modorganizer/tree/master/src/stylesheets
@@ -10,7 +10,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 psd-tools>=1.9.28
 psutil>=5.0.0
@@ -2,7 +2,7 @@ cryptography

 html5lib>=1.0.1
 lz4>=3.0.0
-numpy>=1.16.0
+numpy>=1.16.0,<2.0.0
 olefile>=0.47
 Pillow>=10.0.1
 pillow-heif>=0.12.0