Version 595
parent 072b44a03b
commit d0f2c97a04

@ -77,7 +77,7 @@ And when you are ready to close the shell cleanly, go:
.exit
It can be slow. A few MB a second is typical on an HDD (SSDs obviously faster), so expect a 10GB file to take a while. If it takes hours and hours, and your Task Manager suggests only 50KB/s read, consider again if your hard drive is healthy or not.
It can be slow. A few MB a second is typical on an HDD (SSDs obviously faster), so expect a 10GB file to take a while. A 60GB mappings.db may take two hours. If it takes way way too long, and your Task Manager suggests only 50KB/s read, consider again if your hard drive is healthy or not.
Please note that newer versions of SQLite support a second check command:
@ -97,6 +97,7 @@ PRAGMA integrity_check;
PRAGMA integrity_check;
.exit
(this one can take ages simply because of the size of the file, so only do it if you really need to check; if you know it is broke, just move on to cloning now, no need to waste time confirming it)
.open client.mappings.db
PRAGMA integrity_check;
.exit
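
If the file is healthy, the check will eventually come back with a single 'ok'. Anything else is a list of problems.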
@ -118,7 +119,7 @@ This instructs the database to copy itself to a new file. When it comes across d
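
As a sketch, assuming client.mappings.db is the broken file, the clone session will look something like:

.open client.mappings.db
.clone client.mappings_new.db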
And wait a bit. It'll report its progress as it tries to copy your db. It will be slow. Remember to go '.exit' once it is done to close the shell neatly.
If the clone says some errors like 'subtags_fts4_content already exists' but keeps on working, don't worry about it! That isn't a real error. Same if you get an error about a missing 'sqlite_stat1' table.
If the clone says some errors involving 'fts' or 'fts4', like 'subtags_fts4_content already exists', but it keeps on working, don't worry about it! That probably isn't a real error, and even if it is, 'fts' stuff will be automatically fixed on the next boot. Same if you get any errors about 'sqlite_stat1' table.
Once it is done, the cloned file may be significantly smaller than the original. 50% reduction in size is typical. This means some data has been lost. If this is in client.caches.db, client.master.db or client.mappings.db, it is probably fine (particularly if you sync to a tag repository like the PTR), as we can regenerate that data with some work once you are working again.
@ -140,6 +141,8 @@ If a different file is broken, then use of these:
Do not delete your original files for now. Just rename them to 'old' and move them somewhere safe. Also be careful not to lose track of which is which (filesize and creation date can help here)! Make sure your new cloned files are all named right and then try running the client!
You do not have to do an integrity_check after you clone. If you know your current hard drive is good, you can assume a clone works to create a 'clean' database.
** repair **
This command tries to fix a database file in place. I do not recommend it as it works very very slowly. While it may be able to recover some rows a clone would lose, it cannot fix everything and may leave you with a database that is still malformed, so you'll want to run integrity_check again and do a clone if needed.

@ -250,4 +253,4 @@ If you do not have a backup routine, this is the time to sort it out. If you can
Check out the hydrus 'getting started with installing/updating/backing up' help for some more info about routines and programs like FreeFileSync.
https://hydrusnetwork.github.io/hydrus/after_disaster.html

@ -7,6 +7,73 @@ title: Changelog
!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 595](https://github.com/hydrusnetwork/hydrus/releases/tag/v595)
### ugoiras
* thanks to a user who put in a lot of work, we finally have Ugoira rendering! all ugoiras will now animate using the hydrus native animation player. if the ugoira has json timing data in its zip (those downloaded with PixivUtil and gallery-dl will!), we will use that, but if it is just a zip of images (which is most older ugoiras you'll see in the wild), it'll check a couple of note names for the timing data, and, failing that, will assign a default 125ms per frame fallback. ugoiras without internal timing data will currently get no 'duration' metadata property, but right-clicking on them will show their note-based or simulated duration on the file info line
* all existing ugoiras will be metadata rescanned and thumbnail regenned on update
* technical info here: https://hydrusnetwork.github.io/hydrus/filetypes.html#ugoira
* ugoira metadata and thumbnail generation is cleaner
* a bug in ugoira thumbnail selection, when the file contains non-image files, is fixed
* a future step will be to write a special hook into the hydrus downloader engine to recognise ugoiras (typically on Pixiv) and splice the timing data into the zip on download, at which point we'll finally be able to turn on Ugoira downloading on Pixiv on our end. for now, please check out PixivUtil or gallery-dl to get rich Ugoiras
* I'd like to bake the simulated or note-based durations into the database somehow, as I don't like the underlying media object thinking these things have no duration, but it'll need more thought
### misc
* all multi-column lists now sort string columns in a caseless manner. a subscription called 'Tents' will now slot between 'sandwiches' and 'umbrellas'
* in 'favourite searches', the 'folder' name now has hacky nested folder support. just put '/' in the folder name and it'll make nested submenus. in future this will be implemented with a nicer tree widget
* file logs now load faster in a couple of ways, which should speed up UI session and subscriptions dialog load. previously, there were two rounds of URL normalisation on URL file import object load, one wasteful and one fixable with a cache; these are now dealt with. thanks to the users who sent in profiles of the subscriptions dialog opening; let me know how things seem now (hopefully this fixes/relieves #1612)
* added 'Swap in common resolution labels' to `options->media viewer`. this lets you turn off the '1080p' and '4k'-style label swap-ins for common resolutions on file descriptor strings
* the 'are you sure you want to exit the client? 3 pages say "I am still importing"' popup now says the page names, and in a pretty way, and it shows multiple messages nicer
* the primary 'sort these tags in a human way m8' routine now uses unicode tech to sort things like ß better
* the String Converter can decode 'hex' and 'base64' again (so you can now do '68656c6c6f20776f726c64' or 'aGVsbG8gd29ybGQ=' to 'hello world'). these functions were a holdover from hash parsing in the python 2 times, but I've brushed them off and cleared out the 'what if we put raw bytes in the parsing system bro' nonsense we used to have to deal with. these types are now explicitly UTF-8. I also added a couple unit tests for them (there is a small illustrative sketch at the end of this section)

* fixed an options initialisation bug where setting two files in the duplicate filter as 'not related' was updating the A file to have the B file's file modified time if that was earlier!! if you have files in this category, you will be asked on update if you want to reset their file modified date back to what is actually on disk (the duplicate merge would not have overwritten this; this only happens if you edit the time in the times dialog by hand). a unit test now checks this situation. sorry for the trouble, and thank you to the user who noticed and reported this
* the hydrus Docker package now sets the 'hydrus' process to `autorestart=unexpected`. I understand this makes `file->exit` stick without an automatic restart. it seems like commanding the whole Docker image to shut down still causes a near-instant unclean exit (some SIGTERM thing isn't being caught right, I think), but `file->exit` should now be doable beforehand. we will keep working here
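
As a tiny illustrative sketch of the decode behaviour described above (not hydrus's own code, just the underlying python calls):

```python
import base64

# hex and base64 decodes now produce UTF-8 text rather than raw bytes
assert bytes.fromhex( '68656c6c6f20776f726c64' ).decode( 'utf-8' ) == 'hello world'
assert base64.b64decode( 'aGVsbG8gd29ybGQ=' ).decode( 'utf-8' ) == 'hello world'
```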
### more OR preds
* the new 'replace selected with their OR' and the original 'add an OR of the selected' are now mutually exclusive, depending on whether the current selection is entirely in the active search list
* added 'start an OR with selected', which opens the 'edit OR predicate' panel on the current selection. this works if you only select one item, too
* added 'dissolve selected into single predicates', when you select only OR predicates. it does the opposite of the 'replace'
* the new OR menu gubbins is now in its own separated menu section on the tag right-click
* the indent for OR sub preds is moved up from two spaces to four
### urls
* wrote some help about the 'force page refetch' checkboxes in 'tag import options' here: https://hydrusnetwork.github.io/hydrus/getting_started_downloading.html#force_page_fetch
* added a new submenu `urls->force metadata refetch` that lets you quickly and automatically create a new urls downloader page with the selected files' 'x URL Class' urls with the tag import options set to the respective URLs' default but with these checkboxes all set for you. we finally have a simple answer to 'I messed up my tag parse, I need to redownload these files to get the tags'!
* the urls menu offers the 'for x url class' even when only one file is selected now. crazy files with fifty of the same url class can now be handled
### duplicates auto-resolution
* wrote some placeholder UI for the new system. anyone who happens to be in advanced mode will see another tab on duplicate filter pages. you can poke around if you like, but it is mostly just blank lists that aren't plugged into anything
* wrote some placeholder help too. same deal, just a placeholder that you have to go looking for; I'll keep working on it

* I still feel good about the duplicates auto-resolution system. there is much more work to do, but I'll keep iterating and fleshing things out
### client api
* the new `/get_files/file_path` command now returns the `filetype` and `size` of the file
* updated the Client API help and unit tests for this
* client api version is now 73
### new build stuff
* the library updates we've been testing the past few weeks have gone well, so I am rolling them into the normal builds for everyone. the libraries that do 'fetch stuff from the internet' and 'help python manage its packages' are being updated because of some security problems that I don't think matter for us at all (there's some persistent https verification thing in requests that I know we don't care about, and a malicious URL exploit in setuptools that only matters if you are using it to download packages, which, as I understand, we don't), but we are going to be good and update anyway
* `requests` is updated from `2.31.0` to `2.32.3`
* `setuptools` is updated from `69.1.1` to `70.3.0`
* `PyInstaller` is updated from `6.2` to `6.7` for Windows and Linux to handle the new `setuptools`
* there do not appear to be any update conflicts with dlls or anything, so just update like you normally do. I don't think the new pyinstaller will have problems with older/weirder Windows, but let me know if you run into anything
* users who run from source may like to reinstall their venvs after pulling to get the new libraries too
### boring cleanup
* refactored `ClientGUIDuplicates` to a new `duplicates` gui module and renamed it to `ClientGUIDuplicateActions`
* harmonised some duplicates auto-resolution terminology across the client to exactly that form. not auto-duplicates or duplicate auto resolution, but 'duplicates auto-resolution'
* fixed some bad help link anchors
* clarified a couple things in the 'help my db is broke.txt' document
* updated the new x.svg to a black version; it looks a bit better in light & dark styles
## [Version 594](https://github.com/hydrusnetwork/hydrus/releases/tag/v594)
### misc
@ -330,63 +397,3 @@ title: Changelog
* added `/add_urls/migrate_files` to copy files to new local file domains (essentially doing _files->add to_ from the thumbnail menu)
* with (I think) all multiple local file service capabilities added to the Client API, issue #251 is finally ticked off
* client api version is now 68
## [Version 585](https://github.com/hydrusnetwork/hydrus/releases/tag/v585)
### the new asynchronous siblings and parent dialogs
* the `tags->manage tag siblings/parents` dialogs now load quickly. rather than fetching all known pairs on every open, they now only load pertinent pairs as they are needed. if you type in tag A in the left or right side, all the pairs that involve A directly or point to a pair that involves A directly or indirectly are loaded in the background (usually so fast it seems instant). the dialog can still do 'ah, that would cause a conflict, what do you want to do?' logic, but it only fetches what it needs
* the main edit operations in this dialog are now 'asynchronous', which means there is actually a short delay between the action firing and the UI updating. most of the time it is so fast it isn't noticeable, and in general because of other cleanup it tends to be faster about everything it does
* the dialogs now have a sticky workspace 'memory'. when you type tags in, the dialog still shows the related rows as normal, but now it does not clear those rows away once you actually enter those new pairs. the 'workspace' shows anything related to anything you have typed until you hit the new 'wipe workspace' button, which will reset back to a blank view. I hope this makes it less frustrating to work on a large group--it now stays in view the whole time, rather than the 'current' stuff jumping in and out of view vs the pending/petitioned as you type and submit stuff. the 'wipe workspace' button also has the current workspace tags in its tooltip
* the 'show all pairs' checkbox remains. it may well take twenty seconds to load up the hundreds of thousands of pairs from the PTR, but you can do it
* also added is a 'show pending and petitioned groups', which will load up anything waiting to be uploaded to a tag repository, and all related pairs
* when a user with 'modify siblings/parents' adds a pair, the auto-assigned 'reason' is now 'Entered by a janitor.' (previously it was the enigmatic 'admin')

* some misc layout improvements across the board. the green/red text at the top is compressed; the 'num pairs' now shows the current number of pairs count; there are more rows for the pairs list, fewer for the input list; and the pairs list eats up all new expand space

* a great amount of misc code cleanup in all these panels and systems, and most of the logic is shared between both sibling and parent dialogs. a lot of janky old stuff is cleared up!
* these dialogs are better about showing invalid, duplicated, or loop-causing pairs. the idea is to show you everything as-is in storage so you can better directly edit problems out (previously, I am pretty sure it was sometimes collapsing stuff and obscuring problems)
* the 'manage tag parents' dialog now auto-petitions new loops when entering pairs (it was just siblings before)
* this tech now works on multiple potential loops, rather than just the first
* the 'manage tag parents' dialog now detects pre-existing loops in the database record and warns about this when trying to enter pairs that join the loop (it was just siblings before)
* this tech works better and now detects multiple loops, including completely invalid records that nonetheless exist (e.g. `a->b, a->c` siblings that point to more than one location), and when it reports them, it now reports them all in one dialog, and it shows the actual `a->b->c->d` route that forms the loop

* a bad final 'do not allow loop-inputs' backstop check in the main pair-add routine is removed--it was not helping
### misc
* hitting escape on any taglist will now deselect all tags
* added 'Do not allow mouse media drag-panning when the media has duration' to the _options->media viewer_ page. if you often misclick and pan when scrubbing through videos, try it out!
* the media viewer's top hover window no longer shows every 'added-to' time for all the local file services; it was spammy, so it now just says 'imported: (time)'. the related 'hide uninteresting import time' option is retired. I also removed the 'archived: (time)' label, so this is now pretty much just 'imported, modified'. if I bring detailed times back to the file summary, it'll be part of a more flexible system. note that all these timestamps are still available in the media top-row flyout menu
* the file log and gallery log now copy their urls/sources on a ctrl+c hit. also, the 'copy' right-click commands here also no longer unhelpfully double-newline-separates rows
* a `StringConverter` edit panel now throws up a yes/no confirmation if you try to ok on a regex substitution that seems to match a group in the pattern but has an empty string in the 'replacement' box
* updated the 'test' versions of OpenCV (4.10.0.84), Pyside6 (6.7.2), and python-mpv (1.0.7). I'll be testing these myself, and devving with them, mostly to iron out some Qt 6.7.x stuff we've seen, and then put out a future release with them
* added a note to the default_mpv.conf to say 'try commenting out the audio normalisation line if you get mpv problems and are on Arch'
* added different example launch paths to the 'external programs' options panel depending on the current OS
* added a note about running with `QT_QPA_PLATFORM=xcb` on Wayland to the install help
* refactored the `ClientGUIFileSeedCache` and `ClientGUIGallerySeedLog` files, which do the file and gallery log panels, up to the 'gui.importing' module
* thanks to a user, added a new darkmode 'Nord' stylesheet
### fixes
* fixed 'scrub invalidity' in the manage logins dialog--sorry, it was a stupid typo from the recent multiple-column list rework. also, this button is now only enabled if the login script is active
* fixed a bug opening the 'migrate files' dialog when you have no files!
* I force-added `Accept-Language: en-US,en;q=0.5` to the client's default http headers for pixiv.net. this appears to get the API to give us English tags again. let me know if this completely screws anything up
* updated the 'do we have enough disk space to do this transaction?' test to check for double the destination disk amount. thanks to the user who helped navigate this--regardless of temp dir work, when you do a vacuum or other gigantic single transaction, there is a very brief period as the transaction commits when either the stuffed WAL journal or (for a vacuum) cloned db file exists at the same time in the same folder as the original db file. I also updated the text in the 'review vacuum data' window to talk about this a bit. good luck vacuuming your client.mappings.db file bros
* improved the error handling when a sidecar import fails--it now says the original file path in the report
* improved failure-recovery of unicode decoding (usually used in webpage parsing) when the given text includes errors and the encoding is `ISO-8859-1` (or the encoding is unparseable and `requests` falls back to it) and/or if `chardet` is not available
* I hacked the menubar padding back to something sensible on the new 'windows11' style in Qt 6.7.x. for whatever reason, this new style adds about 15px of padding/margin to each menubar menu button. I am aware the collect-by combobox is still busted in this style--let me know if you spot anything else! btw switching from 'windows11' to 'windowsvista' seems to make all the menubar menus transparent, let's go

* improved the layout of the 'edit client api access key permissions' panel. it wasn't vertically expanding before
* fixed up some keypress handling in taglists. some stuff that was being swallowed or promoted unintentionally is fixed
* thanks to a user, fixed a weird bug in the 'repair missing file storage locations' boot repair dialog where it would always say you only had missing thumbs
* also thanks to that user, the 'repair missing file storage locations' dialog now checks `client_files` and `thumbnails` subdirectories when trying to auto-discover with the 'add a possibly correct location' action
### some hash-sorting stuff
* _you can probably ignore this section, don't worry about it_
* you can now sort by blurhash. this works at the database level too, when mixed with system:limit
* when sorting by pixel hash, a file search with system:limit now pre-sorts by pixel hash before the limit clips the resultset
* when sorting by pixel hash or blurhash, the files with no such hash (e.g. audio files) are now always put at the end
* searching many tens of thousands of files and sorting by hash, pixel hash, or blurhash is now just a tiny bit faster
### client api
* the new `/manage_services/get_pending_counts` command now includes the 'Services Object' in its response
* the client api version is now 67
@ -2193,11 +2193,13 @@ Only use one. As with metadata fetching, you may only use the hash argument if y
```
Response:
: The actual path to the file on the host system.
: The actual path to the file on the host system. Filetype and size are included for convenience.
``` json title="Example response"
{
    "path" : "D:\hydrus_files\f7f\7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a"
    "path" : "D:\hydrus_files\f7f\7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a",
    "filetype" : "image/jpeg",
    "size" : 95237
}
```
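
As a minimal usage sketch (not from the official docs; it assumes a client on the default port 45869 and a key with the right permissions):

```python
import requests

# ask the client for the on-disk path and basic metadata of a file
response = requests.get(
    'http://127.0.0.1:45869/get_files/file_path',
    params = { 'hash' : '7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a' },
    headers = { 'Hydrus-Client-API-Access-Key' : 'your access key here' }
)

info = response.json()

print( info[ 'path' ], info[ 'filetype' ], info[ 'size' ] )
```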
@ -2584,7 +2586,7 @@ If there are no potential duplicate groups in the search, this returns an empty
### **POST `/manage_file_relationships/remove_potentials`** { id="manage_file_relationships_remove_potentials" }
Remove all potential pairs that any of the given files are a part of. If you hit [/manage\_file\_relationships/get\_file\_relationships](#get-manage_file_relationshipsget_file_relationships) after this on any of these files, they will have no potential relationships, and any hashes that were potential to them before will no longer, conversely, refer to these files as potentials.
Remove all potential pairs that any of the given files are a part of. If you hit [/manage\_file\_relationships/get\_file\_relationships](#manage_file_relationships_get_file_relationships) after this on any of these files, they will have no potential relationships, and any hashes that were potential to them before will no longer, conversely, refer to these files as potentials.
Restricted access:
: YES. Manage File Relationships permission needed.
@ -0,0 +1,47 @@

---
title: Filtering Duplicates Automatically
---

## Hey, this is a draft for a system that is not yet working, you can ignore it for now
# the problem with duplicates processing
The duplicates filter can be pretty tedious to work with. Pairs that have trivial differences are easy to resolve, but working through dozens of obvious resizes or pixel duplicates that all follow the same pattern can get boring.
If only there were some way to automate common situations! We could have hydrus solve these trivial duplicates in the background, leaving us with less, and more interesting, work to do.

## duplicates auto-resolution
_This is a new system that I am still developing. The plan is to roll out a hardcoded rule that resolves jpeg and png pixel dupes and then iterate on the UI and workflow to let users add their own custom rules. If you try it, let me know how you find things!_
So, let's start with a simple and generally non-controversial example: pixel duplicate jpegs and pngs. When you save a jpeg, you get some 'fuzzy' artifacts, but when you save a png, it is always pixel perfect. Thus, when you have a normal jpeg and a png that are pixel duplicates, you _know_, for certain, that the png is a copy of the jpeg. This happens most often when someone is posting from one application to another, or with a phone, and rather than uploading the source jpeg, they do 'copy image' and paste that into the upload box--the browser creates the accursed 'Clipboard.png', and we are thus overwhelmed with spam.
In this case, we always want to keep the (almost always smaller) jpeg and ditch the (bloated, derived) png, which in the duplicates system would be:
- A two-part duplicates search, for 'system:filetype is jpg' and 'system:filetype is png', with 'must be pixel dupes'.
- Arranging 'the jpeg is A, the png is B'
- Sending the normal duplicate action of 'set A as better than B, and delete B'.
Let's check out the 'auto-resolution' tab under the duplicates filtering page:
(image)
The auto-resolution system lets you have multiple 'rules'. Each represents a search, a way of testing pairs, and then an action. Let's check the edit dialog:
(image of edit rules)
(image of edit rule, png vs jpeg)
Note that this adds the 'system:height/width > 128' predicates as a failsafe to ensure we are checking real images in this case, not tiny 16x16 icons where there might be a legitimate accidental jpeg/png pixel dupe, and where the decision on what to keep is not so simple. Automated systems are powerful magic wands, and we should always be careful waving them around.
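
As a purely hypothetical sketch (the names are illustrative, not the real UI or API), the whole rule amounts to something like:

```python
# a hypothetical data-shape for the jpeg/png pixel-dupe rule described above
jpeg_png_pixel_dupe_rule = {
    'name' : 'pixel-duplicate jpegs vs pngs',
    'search A' : [ 'system:filetype is jpg', 'system:width > 128', 'system:height > 128' ],
    'search B' : [ 'system:filetype is png', 'system:width > 128', 'system:height > 128' ],
    'pair requirement' : 'must be pixel dupes',
    'action' : 'set A as better than B, and delete B'
}
```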
Talk about metadata conditional objects here.
Talk about the pair Comparator stuff, 4x filesize and so on. Might be more UI, so maybe a picture of the sub-panel.
Hydrus will work these rules in its normal background maintenance time. You can force them to work a bit harder if you want to catch up somewhere, but normally you can just leave them alone and they'll stay up to date with new imports.
## future
I will expand the Metadata Conditional to cover more tests, including most of the hooks in the duplicates filter summaries, like 'this has exif data'. And, assuming the trivial cases go well, I'd like to push toward less-certain comparisons and have some sort of tools for 'A is at least 99.7% similar to B', which will help with resize comparisons and differentiating dupes from alternates.

I'd also eventually like auto-resolution to apply to files _as_ they are imported, so, in the vein of 'previously deleted', you could have an instant import result of 'duplicate discarded: (rule name)'.
@ -43,7 +43,7 @@ The filetype for a file can be overridden with `manage -> force filetype` in the
If there are no frame durations provided hydrus will assume each frame should last 125ms. Hydrus will look inside the zip for a file called `animation.json` and try to parse it as the 2 most common metadata formats that PixivUtil and gallery-dl generate. The Ugoira file will only have a duration in the database if it contains a valid `animation.json`.
When played hydrus will first attempt to use the `animation.json` file but if it can't it will look for notes containing frame delays. First it looks for a note named `ugoira json` and attempts to read it like the `animation.json`, it then looks for a note called `ugoira frame delay array` which should be a note containing a simple JSON array, for example: `#!json [90, 90, 40, 90]`.
When played hydrus will first attempt to use the `animation.json` file, but if that does not exist, it will look for notes containing frame delays. First it looks for a note named `ugoira json` and attempts to read it like the `animation.json`, it then looks for a note called `ugoira frame delay array` which should be a note containing a simple JSON array, for example: `#!json [90, 90, 40, 90]`.
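
As an illustrative sketch of that note-based fallback (the note names are real, but the helper itself is assumed, not hydrus's actual code):

```python
import json

def frame_delays_from_notes( notes: dict ):
    
    # 'ugoira json' is read like an animation.json: {"frames": [{"delay": 90}, ...]}
    if 'ugoira json' in notes:
        
        try:
            
            delays = [ frame[ 'delay' ] for frame in json.loads( notes[ 'ugoira json' ] )[ 'frames' ] ]
            
            if len( delays ) > 0 and isinstance( delays[ 0 ], int ):
                
                return delays
            
        except:
            
            pass
        
    
    # 'ugoira frame delay array' is a bare JSON array, e.g. [90, 90, 40, 90]
    if 'ugoira frame delay array' in notes:
        
        try:
            
            delays = json.loads( notes[ 'ugoira frame delay array' ] )
            
            if len( delays ) > 0 and isinstance( delays[ 0 ], int ):
                
                return delays
            
        except:
            
            pass
        
    
    return None # caller falls back to 125ms per frame
```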
## Video
@ -134,7 +134,7 @@ A few of the options have more information if you hover over them.
: See [multiple file services](advanced_multiple_local_file_services.md), an advanced feature.
**post import actions**
: See the [files section on filtering](getting_started_files.md#inbox-and-archive) for the first option, the other two have information if you hover over them.
: See the [files section on filtering](getting_started_files.md#inbox_and_archive) for the first option, the other two have information if you hover over them.
### Tag Parsing
By default, hydrus now starts with a local tag service called 'downloader tags' and it will parse (get) all the tags from normal gallery sites and put them in this service. You don't have to do anything, you will get some decent tags. As you use the client, you will figure out which tags you like and where you want them. On the downloader page, click `import options`:
@ -154,6 +154,14 @@ The blacklist button will let you skip downloading files that have certain tags
!!! warning
    The file limit and import options on the upper panel of a gallery or watcher page, if changed, will only apply to **new** queries. If you want to change the options for an existing queue, either do so on its highlight panel below or use the 'set options to queries' button.

#### Force Page Fetch
By default, hydrus will not revisit web pages or API endpoints for URLs it knows A) refer to one known file only, and B) that file is already in your database or has previously been deleted. The way it navigates this can be a complicated mix of hash and URL data, and in certain logical situations hydrus will determine its own records are untrustworthy and decide to check the source again. This saves bandwidth and time as you run successive queries that include the same results. You should not disable the capability for normal operation.
But if you mess up your tag import options somewhere and need to re-run a download with forced tag re-fetching, how to do it?
At the moment, this is in tag import options, the `force page fetch even if...` checkboxes. You can either set up a one-time downloader page with specific tag import options that check both of these checkboxes and then paste URLs in, or you can right-click a selection of thumbnails and have hydrus create the page for you under the _urls->force metadata refetch_ menu. Once you are done with the downloader page, delete it and do not use it for normal jobs--again, this method of downloading is inefficient and should not be used for repeating, long-term, or speculative jobs. Only use it to fill in specific holes.
### Note Parsing
Hydrus also parses 'notes' from some sites. This is a young feature, and a little advanced at times, but it generally means the comments that artists leave on certain gallery sites, or something like a tweet text. Notes are editable by you and appear in a hovering window on the right side of the media viewer.

@ -32,7 +32,7 @@ Each tag service comes with its own tags, siblings and parents.
The intent is to use this service for tags you yourself want to add.
### Downloader tags
The default [tag parse target](getting_started_downloading.md#parsing). Tags of things you download will end up here unless you change the settings. It's probably a good idea to set up some tag blacklists for tags you don't want.
The default place for tags coming from downloaders. Tags of things you download will end up here unless you change the settings. It is a good idea to set up some tag blacklists for tags you do not want.
## Tag repositories
@ -11,7 +11,7 @@ So, you have some files imported. Let's give them some tags so we can find them
[FAQ: what is a tag?](faq.md#tags)
Your client starts with two [local tags services](getting_started_tags.md#tag_services), called 'my tags' and 'downloader tags' which keep all of their file->tag mappings in your client's database where only you can see them. 'my tags' is a good place to practise.
Your client starts with two [local tags services](getting_started_more_tags.md#tag_services), called 'my tags' and 'downloader tags' which keep all of their file->tag mappings in your client's database where only you can see them. 'my tags' is a good place to practise.
Select a file and press F3 to open the _manage tags dialog_:
@ -34,6 +34,60 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_595"><a href="#version_595">version 595</a></h2>
<ul>
<li><h3>ugoiras</h3></li>
<li>thanks to a user who put in a lot of work, we finally have Ugoira rendering! all ugoiras will now animate using the hydrus native animation player. if the ugoira has json timing data in its zip (those downloaded with PixivUtil and gallery-dl will!), we will use that, but if it is just a zip of images (which is most older ugoiras you'll see in the wild), it'll check a couple of note names for the timing data, and, failing that, will assign a default 125ms per frame fallback. ugoiras without internal timing data will currently get no 'duration' metadata property, but right-clicking on them will show their note-based or simulated duration on the file info line</li>
<li>all existing ugoiras will be metadata rescanned and thumbnail regenned on update</li>
<li>technical info here: https://hydrusnetwork.github.io/hydrus/filetypes.html#ugoira</li>
<li>ugoira metadata and thumbnail generation is cleaner</li>
<li>a bug in ugoira thumbnail selection, when the file contains non-image files, is fixed</li>
<li>a future step will be to write a special hook into the hydrus downloader engine to recognise ugoiras (typically on Pixiv) and splice the timing data into the zip on download, at which point we'll finally be able to turn on Ugoira downloading on Pixiv on our end. for now, please check out PixivUtil or gallery-dl to get rich Ugoiras</li>
<li>I'd like to bake the simulated or note-based durations into the database somehow, as I don't like the underlying media object thinking these things have no duration, but it'll need more thought</li>
<li><h3>misc</h3></li>
<li>all multi-column lists now sort string columns in a caseless manner. a subscription called 'Tents' will now slot between 'sandwiches' and 'umbrellas'</li>
<li>in 'favourite searches', the 'folder' name now has hacky nested folder support. just put '/' in the folder name and it'll make nested submenus. in future this will be implemented with a nicer tree widget</li>
<li>file logs now load faster in a couple of ways, which should speed up UI session and subscriptions dialog load. previously, there were two rounds of URL normalisation on URL file import object load, one wasteful and one fixable with a cache; these are now dealt with. thanks to the users who sent in profiles of the subscriptions dialog opening; let me know how things seem now (hopefully this fixes/relieves #1612)</li>
<li>added 'Swap in common resolution labels' to `options->media viewer`. this lets you turn off the '1080p' and '4k'-style label swap-ins for common resolutions on file descriptor strings</li>
<li>the 'are you sure you want to exit the client? 3 pages say "I am still importing"' popup now says the page names, and in a pretty way, and it shows multiple messages nicer</li>
<li>the primary 'sort these tags in a human way m8' routine now uses unicode tech to sort things like ß better</li>
<li>the String Converter can decode 'hex' and 'base64' again (so you can now do '68656c6c6f20776f726c64' or 'aGVsbG8gd29ybGQ=' to 'hello world'). these functions were a holdover from hash parsing in the python 2 times, but I've brushed them off and cleared out the 'what if we put raw bytes in the parsing system bro' nonsense we used to have to deal with. these types are now explicitly UTF-8. I also added a couple unit tests for them</li>

<li>fixed an options initialisation bug where setting two files in the duplicate filter as 'not related' was updating the A file to have the B file's file modified time if that was earlier!! if you have files in this category, you will be asked on update if you want to reset their file modified date back to what is actually on disk (the duplicate merge would not have overwritten this; this only happens if you edit the time in the times dialog by hand). a unit test now checks this situation. sorry for the trouble, and thank you to the user who noticed and reported this</li>
<li>the hydrus Docker package now sets the 'hydrus' process to `autorestart=unexpected`. I understand this makes `file->exit` stick without an automatic restart. it seems like commanding the whole Docker image to shut down still causes a near-instant unclean exit (some SIGTERM thing isn't being caught right, I think), but `file->exit` should now be doable beforehand. we will keep working here</li>
<li><h3>more OR preds</h3></li>
<li>the new 'replace selected with their OR' and the original 'add an OR of the selected' are now mutually exclusive, depending on whether the current selection is entirely in the active search list</li>
<li>added 'start an OR with selected', which opens the 'edit OR predicate' panel on the current selection. this works if you only select one item, too</li>
<li>added 'dissolve selected into single predicates', when you select only OR predicates. it does the opposite of the 'replace'</li>
<li>the new OR menu gubbins is now in its own separated menu section on the tag right-click</li>
<li>the indent for OR sub preds is moved up from two spaces to four</li>
<li><h3>urls</h3></li>
<li>wrote some help about the 'force page refetch' checkboxes in 'tag import options' here: https://hydrusnetwork.github.io/hydrus/getting_started_downloading.html#force_page_fetch</li>
<li>added a new submenu `urls->force metadata refetch` that lets you quickly and automatically create a new urls downloader page with the selected files' 'x URL Class' urls with the tag import options set to the respective URLs' default but with these checkboxes all set for you. we finally have a simple answer to 'I messed up my tag parse, I need to redownload these files to get the tags'!</li>
<li>the urls menu offers the 'for x url class' even when only one file is selected now. crazy files with fifty of the same url class can now be handled</li>
<li><h3>duplicates auto-resolution</h3></li>
<li>wrote some placeholder UI for the new system. anyone who happens to be in advanced mode will see another tab on duplicate filter pages. you can poke around if you like, but it is mostly just blank lists that aren't plugged into anything</li>
<li>wrote some placeholder help too. same deal, just a placeholder that you have to go looking for; I'll keep working on it</li>

<li>I still feel good about the duplicates auto-resolution system. there is much more work to do, but I'll keep iterating and fleshing things out</li>
<li><h3>client api</h3></li>
<li>the new `/get_files/file_path` command now returns the `filetype` and `size` of the file</li>
<li>updated the Client API help and unit tests for this</li>
<li>client api version is now 73</li>
<li><h3>new build stuff</h3></li>
<li>the library updates we've been testing the past few weeks have gone well, so I am rolling them into the normal builds for everyone. the libraries that do 'fetch stuff from the internet' and 'help python manage its packages' are being updated because of some security problems that I don't think matter for us at all (there's some persistent https verification thing in requests that I know we don't care about, and a malicious URL exploit in setuptools that only matters if you are using it to download packages, which, as I understand, we don't), but we are going to be good and update anyway</li>
<li>`requests` is updated from `2.31.0` to `2.32.3`</li>
<li>`setuptools` is updated from `69.1.1` to `70.3.0`</li>
<li>`PyInstaller` is updated from `6.2` to `6.7` for Windows and Linux to handle the new `setuptools`</li>
<li>there do not appear to be any update conflicts with dlls or anything, so just update like you normally do. I don't think the new pyinstaller will have problems with older/weirder Windows, but let me know if you run into anything</li>
<li>users who run from source may like to reinstall their venvs after pulling to get the new libraries too</li>
<li><h3>boring cleanup</h3></li>
<li>refactored `ClientGUIDuplicates` to a new `duplicates` gui module and renamed it to `ClientGUIDuplicateActions`</li>
<li>harmonised some duplicates auto-resolution terminology across the client to exactly that form. not auto-duplicates or duplicate auto resolution, but 'duplicates auto-resolution'</li>
<li>fixed some bad help link anchors</li>
<li>clarified a couple things in the 'help my db is broke.txt' document</li>
<li>updated the new x.svg to a black version; it looks a bit better in light & dark styles</li>
</ul>
</li>
<li>
<h2 id="version_594"><a href="#version_594">version 594</a></h2>
<ul>
@ -4,8 +4,10 @@ import time

import traceback
import yaml

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusNumbers

from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientThreading

@ -130,6 +132,36 @@ def OrdIsNumber( o ):

    return 48 <= o <= 57


def ResolutionToPrettyString( resolution ):

    if resolution is None:

        return 'no resolution'

    if not isinstance( resolution, tuple ):

        try:

            resolution = tuple( resolution )

        except:

            return 'broken resolution'

    if resolution in HC.NICE_RESOLUTIONS and CG.client_controller.new_options.GetBoolean( 'use_nice_resolution_strings' ):

        return HC.NICE_RESOLUTIONS[ resolution ]

    ( width, height ) = resolution

    return '{}x{}'.format( HydrusNumbers.ToHumanInt( width ), HydrusNumbers.ToHumanInt( height ) )


def ShowExceptionClient( e, do_wait = True ):

    ( etype, value, tb ) = sys.exc_info()

@ -264,7 +264,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

            'command_palette_show_media_menu' : False,
            'disallow_media_drags_on_duration_media' : False,
            'show_all_my_files_on_page_chooser' : True,
            'show_local_files_on_page_chooser' : False
            'show_local_files_on_page_chooser' : False,
            'use_nice_resolution_strings' : True
        }

        #

@ -908,8 +908,8 @@ class RasterContainerVideo( RasterContainer ):

                self._times_to_play_animation = HydrusAnimationHandling.GetTimesToPlayAPNG( self._path )

            elif self._media.GetMime() == HC.ANIMATION_UGOIRA:

                self._times_to_play_animation = 1
                self._times_to_play_animation = 0

            else:

@ -203,8 +203,8 @@ class StringConverter( StringProcessingStep ):

        else:

            # due to py3, this is now a bit of a pain
            # _for now_, let's convert to bytes if not already and then spit out a str
            # this was originally an old py2 bytes/str hack to enable some file lookup script file id param generation
            # but we are now formalising it into some more 'real'

            if isinstance( s, str ):

@ -225,26 +225,50 @@ class StringConverter( StringProcessingStep ):

                s = str( s_bytes, 'utf-8' )

            else:

                raise Exception( 'unknown encode type!' )

        elif conversion_type == STRING_CONVERSION_DECODE:

            encode_type = data
            decode_type = data

            if encode_type == 'url percent encoding':
            if decode_type == 'url percent encoding':

                s = urllib.parse.unquote( s )

            elif encode_type == 'unicode escape characters':
            elif decode_type == 'unicode escape characters':

                s = s.encode( 'utf-8' ).decode( 'unicode-escape' )

            elif encode_type == 'html entities':
            elif decode_type == 'html entities':

                s = html.unescape( s )

            # the old 'hex' and 'base64' are now deprecated, no-ops
            else:

                # I originally didn't have these, only had them for the encode side, and it was always a py2 bytes/str hack to enable some file lookup script file id param generation
                #
                # due to py3, this is now a bit of a pain
                # _for now_, let's convert to bytes if not already and then spit out a str

                if decode_type == 'hex':

                    s_bytes = bytes.fromhex( s )

                elif decode_type == 'base64':

                    s_bytes = base64.b64decode( s )

                else:

                    raise Exception( 'unknown decode type!' )

                s = str( s_bytes, 'utf-8', errors = 'replace' )

        elif conversion_type == STRING_CONVERSION_REVERSE:

@ -439,11 +463,6 @@ class StringConverter( StringProcessingStep ):

        elif conversion_type == STRING_CONVERSION_DECODE:

            if data in ( 'hex', 'base64' ):

                return 'deprecated {} decode, now a no-op, can be deleted'.format( data )

            return 'decode from ' + data

        elif conversion_type == STRING_CONVERSION_REVERSE:

@ -1,7 +1,7 @@

from hydrus.core.files import HydrusUgoiraHandling
from hydrus.core.files.images import HydrusImageHandling
from hydrus.core.files import HydrusArchiveHandling
from hydrus.client.media import ClientMedia
from hydrus.client.media import ClientMediaResult
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientFiles

@ -10,26 +10,28 @@ import typing

UGOIRA_DEFAULT_FRAME_DURATION_MS = 125

def GetFrameDurationsUgoira( media: ClientMedia.MediaSingleton ):
def GetFrameDurationsUgoira( media: ClientMediaResult.MediaResult ):

    client_files_manager: ClientFiles.ClientFilesManager = CG.client_controller.client_files_manager

    path = client_files_manager.GetFilePath( media.GetHash(), media.GetMime() )

    try:

        frameData = HydrusUgoiraHandling.GetUgoiraFrameDataJSON( path )

        if frameData is not None:

            durations = [data['delay'] for data in frameData]

            return durations

    except:

        pass

    try:

        durations = GetFrameTimesFromNote(media)

@ -37,21 +39,24 @@ def GetFrameDurationsUgoira( media: ClientMedia.MediaSingleton ):

        if durations is not None:

            return durations

    except:

        pass

    num_frames = media.GetNumFrames()

    return [UGOIRA_DEFAULT_FRAME_DURATION_MS] * num_frames

def GetFrameTimesFromNote(media: ClientMedia.MediaSingleton):
def GetFrameTimesFromNote( media: ClientMediaResult.MediaResult ):

    if not media.HasNotes():

        return None

    noteManager = media.GetNotesManager()

@ -70,34 +75,52 @@ def GetFrameTimesFromNote(media: ClientMedia.MediaSingleton):

            else:

                frameData: typing.List[HydrusUgoiraHandling.UgoiraFrame] = ugoiraJson['frames']

                frames = [data['delay'] for data in frameData]

                if len(frames) > 0 and isinstance(frames[0], int):

                    return frames

        except:

            pass

    if 'ugoira frame delay array' in notes:

        try:

            ugoiraJsonArray: typing.List[int] = json.loads(notes['ugoira frame delay array'])

            if len(ugoiraJsonArray) > 0 and isinstance(ugoiraJsonArray[0], int):
            if len(ugoiraJsonArray) > 0 and isinstance(ugoiraJsonArray[0], int):

                return ugoiraJsonArray

        except:

            pass

    return None

def HasFrameTimesNote( media: ClientMediaResult.MediaResult ):

    if not media.HasNotes():

        return False

    names_to_notes = media.GetNotesManager().GetNamesToNotes()

    return 'ugoira json' in names_to_notes or 'ugoira frame delay array' in names_to_notes

class UgoiraRenderer(object):

@ -11112,6 +11112,79 @@ class DB( HydrusDB.HydrusDB ):

        if version == 594:

            try:

                all_local_hash_ids = self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id )

                with self._MakeTemporaryIntegerTable( all_local_hash_ids, 'hash_id' ) as temp_hash_ids_table_name:

                    mimes_we_want = ( HC.ANIMATION_UGOIRA, )

                    hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM {} CROSS JOIN files_info USING ( hash_id ) WHERE mime IN {};'.format( temp_hash_ids_table_name, HydrusData.SplayListForDB( mimes_we_want ) ) ) )

                    self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
                    self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )

            except Exception as e:

                HydrusData.PrintException( e )

                message = 'Some ugoira-scanning failed to schedule! This is not super important, but hydev would be interested in seeing the error that was printed to the log.'

                self.pub_initial_message( message )

            try:

                false_positive_alternates_group_ids = self._STS( self._Execute( 'SELECT smaller_alternates_group_id FROM duplicate_false_positives;' ) )
                false_positive_alternates_group_ids.update( self._STS( self._Execute( 'SELECT larger_alternates_group_id FROM duplicate_false_positives;' ) ) )

                false_positive_medias_ids = set()

                for alternates_group_id in false_positive_alternates_group_ids:

                    false_positive_medias_ids.update( self.modules_files_duplicates.GetAlternateMediaIds( alternates_group_id ) )

                db_location_context = self.modules_files_storage.GetDBLocationContext( ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_FILE_SERVICE_KEY ) )

                false_positive_hash_ids = self.modules_files_duplicates.GetDuplicatesHashIds( false_positive_medias_ids, db_location_context )

                if len( false_positive_hash_ids ) > 0:

                    def ask_what_to_do_false_positive_modified_dates():

                        message = 'Hey, due to a bug, some potential duplicate pairs that were set as "false positive/not related" in the duplicates system may have had their file modified date database records merged. The files\' true file modified dates on your hard drive were not affected.'
                        message += '\n' * 2
                        message += f'You have {len( false_positive_hash_ids)} files ever set as "not related". Shall I reset their file modified dates back to whatever they have on your hard drive? I recommend doing this unless you have a complicated file modified merging scheme already in place and would rather go through all these manually.'

                        from hydrus.client.gui import ClientGUIDialogsQuick

                        result = ClientGUIDialogsQuick.GetYesNo( None, message, title = 'Reset modified dates?', yes_label = 'do it', no_label = 'do not do it' )

                        return result == QW.QDialog.Accepted

                    do_it = self._controller.CallBlockingToQt( None, ask_what_to_do_false_positive_modified_dates )

                    if do_it:

                        self.modules_files_maintenance_queue.AddJobs( false_positive_hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_MODIFIED_TIMESTAMP )

            except Exception as e:

                HydrusData.PrintException( e )

                message = 'Some alternates metadata updates failed to schedule! This is not super important, but hydev would be interested in seeing the error that was printed to the log.'

                self.pub_initial_message( message )

        self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusNumbers.ToHumanInt( version + 1 ) ) )

        self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )

@ -599,7 +599,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):

        table_join = '{} CROSS JOIN {} USING ( media_id )'.format( temp_media_ids_table_name, 'duplicate_file_members' )

        if db_location_context is not None:

            table_join = db_location_context.GetTableJoinLimitedByFileDomain( table_join )

@ -14,7 +14,7 @@ from hydrus.client.db import ClientDBMaintenance

from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBSerialisable
from hydrus.client.db import ClientDBServices
from hydrus.client.duplicates import ClientAutoDuplicates
from hydrus.client.duplicates import ClientDuplicatesAutoResolution

def GenerateResolutionDecisionTableNames( resolution_rule_id ) -> typing.Dict[ int, str ]:

|

    results = {}

    for status in (
        ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED,
        ClientAutoDuplicates.DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED,
        ClientAutoDuplicates.DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH,
        ClientAutoDuplicates.DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST
        ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED,
        ClientDuplicatesAutoResolution.DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED,
        ClientDuplicatesAutoResolution.DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH,
        ClientDuplicatesAutoResolution.DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST
    ):

        results[ status ] = f'{table_core}_{status}'

@ -55,7 +55,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):

        super().__init__( 'client duplicates auto-resolution', cursor )

        self._ids_to_resolution_rules: typing.Dict[ int, ClientAutoDuplicates.DuplicatesAutoResolutionRule ] = {}
        self._ids_to_resolution_rules: typing.Dict[ int, ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ] = {}

        self._Reinit()

@ -76,10 +76,10 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):

    def _Reinit( self ):

        self._ids_to_resolution_rules = { rule.GetId() : rule for rule in self.modules_serialisable.GetJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE ) }
        self._ids_to_resolution_rules = { rule.GetId() : rule for rule in self.modules_serialisable.GetJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE ) }

    def AddRule( self, rule: ClientAutoDuplicates.DuplicatesAutoResolutionRule ):
    def AddRule( self, rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):

        self._Execute( 'INSERT INTO duplicate_files_auto_resolution_rules DEFAULT VALUES;' )

@ -100,12 +100,12 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
self._CreateIndex( table_name, ( 'larger_media_id', 'smaller_media_id' ) )
|
||||
|
||||
|
||||
self._ExecuteMany( f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) SELECT smaller_media_id, larger_media_id FROM potential_duplicate_pairs;' )
|
||||
self._ExecuteMany( f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) SELECT smaller_media_id, larger_media_id FROM potential_duplicate_pairs;' )
|
||||
|
||||
ClientAutoDuplicates.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
|
||||
|
||||
def DeleteRule( self, rule: ClientAutoDuplicates.DuplicatesAutoResolutionRule ):
|
||||
def DeleteRule( self, rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
|
||||
|
||||
resolution_rule_id = rule.GetId()
|
||||
|
||||
|
@ -120,7 +120,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
del self._ids_to_resolution_rules[ resolution_rule_id ]
|
||||
|
||||
self.modules_serialisable.DeleteJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE, dump_name = rule.GetName() )
|
||||
self.modules_serialisable.DeleteJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE, dump_name = rule.GetName() )
|
||||
|
||||
|
||||
def DoResolutionWork( self, resolution_rule_id: int, max_work_time = 0.5 ) -> bool:
|
||||
|
@ -135,7 +135,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
def get_row():
|
||||
|
||||
return self._Execute( f'SELECT smaller_media_id, larger_media_id FROM {statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED ]};' ).fetchone()
|
||||
return self._Execute( f'SELECT smaller_media_id, larger_media_id FROM {statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED ]};' ).fetchone()
|
||||
|
||||
|
||||
pair_to_work = get_row()
|
||||
|
@ -172,7 +172,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
def get_rows():
|
||||
|
||||
return self._Execute( f'SELECT smaller_media_id, larger_media_id FROM {statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED ]} LIMIT 256;' ).fetchone()
|
||||
return self._Execute( f'SELECT smaller_media_id, larger_media_id FROM {statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED ]} LIMIT 256;' ).fetchone()
|
||||
|
||||
|
||||
pairs_to_work = get_rows()
|
||||
|
@ -236,7 +236,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
for name in orphaned_object_names:
|
||||
|
||||
self.modules_serialisable.DeleteJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE, dump_name = name )
|
||||
self.modules_serialisable.DeleteJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE, dump_name = name )
|
||||
|
||||
|
||||
HydrusData.ShowText( f'Deleted {HydrusNumbers.ToHumanInt( len( orphaned_on_object_side ) )} orphaned auto-resolution rule objects!' )
|
||||
|
@ -255,12 +255,12 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
statuses_to_table_names = GenerateResolutionDecisionTableNames( resolution_rule_id )
|
||||
|
||||
self._ExecuteMany(
|
||||
f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) VALUES ( ?, ? );',
|
||||
f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) VALUES ( ?, ? );',
|
||||
pairs_to_add
|
||||
)
|
||||
|
||||
|
||||
ClientAutoDuplicates.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
|
||||
|
||||
def NotifyDeletePairs( self, pairs_to_remove ):
|
||||
|
@ -280,10 +280,10 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
|
||||
|
||||
ClientAutoDuplicates.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
|
||||
|
||||
def ResearchRule( self, rule: ClientAutoDuplicates.DuplicatesAutoResolutionRule ):
|
||||
def ResearchRule( self, rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
|
||||
# rule was edited or user wants to run it again (maybe it has num_tags or something in it)
|
||||
|
||||
resolution_rule_id = rule.GetId()
|
||||
|
@ -292,11 +292,11 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
statuses_to_table_names = GenerateResolutionDecisionTableNames( resolution_rule_id )
|
||||
|
||||
not_searched_table_name = statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED ]
|
||||
not_searched_table_name = statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED ]
|
||||
|
||||
for ( status, table_name ) in statuses_to_table_names.items():
|
||||
|
||||
if status == ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED:
|
||||
if status == ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED:
|
||||
|
||||
continue
|
||||
|
||||
|
@ -344,7 +344,7 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
if len( pairs_we_should_add ) > 0:
|
||||
|
||||
self._ExecuteMany(
|
||||
f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientAutoDuplicates.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) VALUES ( ?, ? );',
|
||||
f'INSERT OR IGNORE INTO {statuses_to_table_names[ ClientDuplicatesAutoResolution.DUPLICATE_STATUS_NOT_SEARCHED ]} ( smaller_media_id, larger_media_id ) VALUES ( ?, ? );',
|
||||
pairs_we_should_add
|
||||
)
|
||||
|
||||
|
@ -363,6 +363,6 @@ class ClientDBFilesDuplicatesAutoResolution( ClientDBModule.ClientDBModule ):
|
|||
|
||||
|
||||
|
||||
ClientAutoDuplicates.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().Wake()
|
||||
|
||||
|
||||
|
|
|
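Each potential pair lives in exactly one of four per-rule status tables, and GenerateResolutionDecisionTableNames maps the status constants to those table names. A self-contained sketch of that mapping follows; the exact table_core string is defined above the hunk and not shown here, so its format in this sketch is an assumption.

# illustrative sketch of the status-table naming; the real function is in the
# hunk above, and the table_core format here is assumed, not confirmed
DUPLICATE_STATUS_NOT_SEARCHED = 0
DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED = 1
DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH = 2
DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST = 3

def generate_table_names( resolution_rule_id: int ) -> dict:
    
    table_core = f'duplicate_files_auto_resolution_pair_decisions_{resolution_rule_id}'
    
    return {
        status : f'{table_core}_{status}'
        for status in (
            DUPLICATE_STATUS_NOT_SEARCHED,
            DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED,
            DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH,
            DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST
        )
    }

Splitting by status means a worker can grab the next untested pair with a plain SELECT on one small table; moving a pair between statuses is presumably a delete from one table and an insert into another.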
@@ -14,6 +14,7 @@ from hydrus.core.files.images import HydrusImageMetadata
from hydrus.core.files.images import HydrusImageOpening

from hydrus.client import ClientConstants as CC
+from hydrus.client import ClientData
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientThreading
from hydrus.client import ClientTime

@@ -247,7 +248,7 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
    else:
        
-        s_string = HydrusNumbers.ResolutionToPrettyString( s_resolution )
+        s_string = ClientData.ResolutionToPrettyString( s_resolution )
        
        if s_w % 2 == 1 or s_h % 2 == 1:

@@ -261,7 +262,7 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
    else:
        
-        c_string = HydrusNumbers.ResolutionToPrettyString( c_resolution )
+        c_string = ClientData.ResolutionToPrettyString( c_resolution )
        
        if c_w % 2 == 1 or c_h % 2 == 1:

@@ -851,13 +852,16 @@ class DuplicateContentMergeOptions( HydrusSerialisable.SerialisableBase ):
    super().__init__()
    
+    # it is important that the default init of this guy syncs absolutely nothing!
+    # we use empty dupe merge option guys to do some other processing, so empty must mean empty
    
    self._tag_service_actions = []
    self._rating_service_actions = []
    self._sync_notes_action = HC.CONTENT_MERGE_ACTION_NONE
    self._sync_note_import_options = NoteImportOptions.NoteImportOptions()
    self._sync_archive_action = SYNC_ARCHIVE_NONE
    self._sync_urls_action = HC.CONTENT_MERGE_ACTION_NONE
-    self._sync_file_modified_date_action = HC.CONTENT_MERGE_ACTION_COPY
+    self._sync_file_modified_date_action = HC.CONTENT_MERGE_ACTION_NONE
    
    def _GetSerialisableInfo( self ):
@@ -1,5 +1,6 @@
import random
import threading
import typing

from hydrus.core import HydrusSerialisable

@@ -26,8 +27,8 @@ LOOKING_AT_WORSE_CANDIDATE = 1
class PairComparatorOneFile( PairComparator ):
    
-    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_ONE_FILE
-    SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator - One File'
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_ONE_FILE
+    SERIALISABLE_NAME = 'Duplicates Auto-Resolution Pair Comparator - One File'
    SERIALISABLE_VERSION = 1
    
    def __init__( self ):

@@ -65,12 +66,12 @@ class PairComparatorOneFile( PairComparator ):
-HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_ONE_FILE ] = PairComparatorOneFile
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_ONE_FILE ] = PairComparatorOneFile

class PairComparatorRelative( PairComparator ):
    
-    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_TWO_FILES_RELATIVE
-    SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator - Relative'
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_TWO_FILES_RELATIVE
+    SERIALISABLE_NAME = 'Duplicates Auto-Resolution Pair Comparator - Relative'
    SERIALISABLE_VERSION = 1
    
    def __init__( self ):

@@ -105,12 +106,12 @@ class PairComparatorRelative( PairComparator ):
-HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_TWO_FILES_RELATIVE ] = PairComparatorRelative
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_TWO_FILES_RELATIVE ] = PairComparatorRelative

class PairSelectorAndComparator( HydrusSerialisable.SerialisableBase ):
    
-    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR
-    SERIALISABLE_NAME = 'Auto-Duplicates Pair Selector and Comparator'
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_SELECTOR_AND_COMPARATOR
+    SERIALISABLE_NAME = 'Duplicates Auto-Resolution Pair Selector and Comparator'
    SERIALISABLE_VERSION = 1
    
    def __init__( self ):

@@ -152,12 +153,12 @@ class PairSelectorAndComparator( HydrusSerialisable.SerialisableBase ):
-HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR ] = PairSelectorAndComparator
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_SELECTOR_AND_COMPARATOR ] = PairSelectorAndComparator

class DuplicatesAutoResolutionRule( HydrusSerialisable.SerialisableBaseNamed ):
    
-    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE
-    SERIALISABLE_NAME = 'Auto-Duplicates Rule'
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE
+    SERIALISABLE_NAME = 'Duplicates Auto-Resolution Rule'
    SERIALISABLE_VERSION = 1
    
    def __init__( self, name ):

@@ -170,8 +171,10 @@ class DuplicatesAutoResolutionRule( HydrusSerialisable.SerialisableBaseNamed ):
    # the id here will be for the database to match up rules to cached pair statuses. slightly wewmode, but we'll see
    self._id = -1
    
    # TODO: Yes, do this before we get too excited here
    # maybe make this search part into its own object? in ClientDuplicates
    # could wangle duplicate pages and client api dupe stuff to work in the same guy, great idea
    # duplicate search, too, rather than passing around a bunch of params
    self._file_search_context_1 = None
    self._file_search_context_2 = None
    self._dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH

@@ -180,11 +183,19 @@ class DuplicatesAutoResolutionRule( HydrusSerialisable.SerialisableBaseNamed ):
    self._selector_and_comparator = None
    
+    self._paused = False
    
    # action info
    # set as better
    # delete the other one
    # optional custom merge options
    
+    # a search cache that we can update on every run, just some nice numbers for the human to see or force-populate in UI that say 'ok for this search we have 700,000 pairs, and we already processed 220,000'
+    # I think a dict of numbers to strings
+    # number of pairs that match the search
+    # how many didn't pass the comparator test
+    # also would be neat just to remember how many pairs we have successfully processed
    
    # serialisable gubbins
    # get/set

@@ -195,14 +206,58 @@ class DuplicatesAutoResolutionRule( HydrusSerialisable.SerialisableBaseNamed ):
    return self._id
    
-    def SetId( self, id: int ):
-        
-        self._id = id
+    def GetActionSummary( self ) -> str:
+        
+        return 'set A as better, delete worse'
+        
+    def GetComparatorSummary( self ) -> str:
+        
+        return 'if A is jpeg and B is png'
+        
+    def GetRuleSummary( self ) -> str:
+        
+        return 'system:filetype is jpeg & system:filetype is png, pixel duplicates'
+        
+    def GetSearchSummary( self ) -> str:
+        
+        return 'unknown'
+        
+    def IsPaused( self ) -> bool:
+        
+        return self._paused
+        
+    def SetId( self, value: int ):
+        
+        self._id = value
    
-HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE ] = DuplicatesAutoResolutionRule
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE ] = DuplicatesAutoResolutionRule
+
+def GetDefaultRuleSuggestions() -> typing.List[ DuplicatesAutoResolutionRule ]:
+    
+    suggested_rules = []
+    
+    #
+    
+    duplicates_auto_resolution_rule = DuplicatesAutoResolutionRule( 'pixel-perfect jpegs vs pngs' )
+    
+    suggested_rules.append( duplicates_auto_resolution_rule )
+    
+    # add on a thing here about resolution. one(both) files need to be like at least 128x128
+    
+    #
+    
+    return suggested_rules
+    
+# TODO: get this guy to inherit that new MainLoop Daemon class and hook it into the other client controller managers
+# ditch the instance() stuff or don't, whatever you like
class DuplicatesAutoResolutionManager( object ):
    
    my_instance = None

@@ -216,7 +271,10 @@ class DuplicatesAutoResolutionManager( object ):
    DuplicatesAutoResolutionManager.my_instance = self
    
-    # my rules, start with empty and then load from db or whatever on controller init
+    self._ids_to_rules = {}
+    
+    # load rules from db or whatever on controller init
+    # on program first boot, we should initialise with the defaults set to paused!
    
    self._lock = threading.Lock()

@@ -232,6 +290,27 @@ class DuplicatesAutoResolutionManager( object ):
    return DuplicatesAutoResolutionManager.my_instance
    
+    def GetRules( self ):
+        
+        return []
+        
+    def GetRunningStatus( self, rule_id: int ) -> str:
+        
+        return 'idle'
+        
+    def SetRules( self, rules: typing.Collection[ DuplicatesAutoResolutionRule ] ):
+        
+        # save to database
+        
+        # make sure the rules that need ids now have them
+        
+        self._ids_to_rules = { rule.GetId() : rule for rule in rules }
+        
+        # send out an update signal
+        
    def Wake( self ):
        
        pass
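All the renames above preserve the same serialisable plumbing: a class advertises a numeric SERIALISABLE_TYPE and registers itself in a global lookup so stored dumps can be rehydrated to the right class. Here is a minimal sketch of that pattern, with an assumed constant value and a stripped-down base class; it is not hydrus's real HydrusSerialisable module.

# minimal sketch of the registration pattern; the constant value is assumed
SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE = 123

SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}

class SerialisableBaseNamed:
    
    SERIALISABLE_TYPE = None
    SERIALISABLE_NAME = 'Base'
    SERIALISABLE_VERSION = 1
    
    def __init__( self, name: str ):
        
        self._name = name

class DuplicatesAutoResolutionRule( SerialisableBaseNamed ):
    
    SERIALISABLE_TYPE = SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE
    SERIALISABLE_NAME = 'Duplicates Auto-Resolution Rule'
    SERIALISABLE_VERSION = 1

# the registry maps the stored numeric type id back to the class on load
SERIALISABLE_TYPES_TO_OBJECT_TYPES[ DuplicatesAutoResolutionRule.SERIALISABLE_TYPE ] = DuplicatesAutoResolutionRule

Since dumps record the numeric type id rather than the Python class name, renaming the constants and classes should be safe so long as the id values themselves do not change.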
@@ -8078,6 +8078,32 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
    self._controller.CallToThread( self._controller.SaveGUISession, session )
    
+    def RedownloadURLsForceFetch( self, urls ):
+        
+        if len( urls ) == 0:
+            
+            return
+            
+        urls = sorted( urls )
+        
+        tag_import_options = CG.client_controller.network_engine.domain_manager.GetDefaultTagImportOptionsForURL( urls[0] )
+        
+        tag_import_options = tag_import_options.Duplicate()
+        
+        tag_import_options.SetShouldFetchTagsEvenIfHashKnownAndFileAlreadyInDB( True )
+        tag_import_options.SetShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB( True )
+        
+        page = self._notebook.GetOrMakeURLImportPage( desired_page_name = 'forced urls downloader', destination_tag_import_options = tag_import_options )
+        
+        management_panel = page.GetManagementPanel()
+        
+        for url in urls:
+            
+            management_panel.PendURL( url )
+            
    def RefreshPage( self, page_key: bytes ):
        
        page = self._notebook.GetPageFromPageKey( page_key )
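The new method leans on a copy-and-tweak pattern: fetch the default tag import options for the first URL's domain, Duplicate() them so the stored default is untouched, then flip the two force-fetch flags on the copy only. A toy sketch of that pattern follows; the stand-in class models just the two flags, and hydrus's real TagImportOptions is much bigger.

import copy

# illustrative stand-in for hydrus's TagImportOptions
class TagImportOptions:
    
    def __init__( self ):
        
        self.fetch_tags_even_if_hash_known = False
        self.fetch_tags_even_if_url_known = False
        
    def Duplicate( self ):
        
        return copy.deepcopy( self )

default_options = TagImportOptions()

forced_options = default_options.Duplicate()
forced_options.fetch_tags_even_if_hash_known = True
forced_options.fetch_tags_even_if_url_known = True

# the shared default object is untouched
assert default_options.fetch_tags_even_if_hash_known is False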
@@ -719,13 +719,14 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
    for e in ( 'hex', 'base64', 'url percent encoding', 'unicode escape characters', 'html entities' ):
        
        self._data_encoding.addItem( e, e )
        
    for e in ( 'url percent encoding', 'unicode escape characters', 'html entities' ):
        
        self._data_decoding.addItem( e, e )
        
    utf8_tt = '"hex" and "base64" here will use UTF-8 encoding.'
    
    self._data_encoding.setToolTip( ClientGUIFunctions.WrapToolTip( utf8_tt ) )
    self._data_decoding.setToolTip( ClientGUIFunctions.WrapToolTip( utf8_tt ) )
    
    self._data_timezone_decode.addItem( 'UTC', HC.TIMEZONE_UTC )
    self._data_timezone_decode.addItem( 'Local', HC.TIMEZONE_LOCAL )
    self._data_timezone_decode.addItem( 'Offset', HC.TIMEZONE_OFFSET )
@@ -23,7 +23,6 @@ from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsManage
from hydrus.client.gui import ClientGUIDialogsMessage
from hydrus.client.gui import ClientGUIDialogsQuick
-from hydrus.client.gui import ClientGUIDuplicates
from hydrus.client.gui import ClientGUIFunctions
from hydrus.client.gui import ClientGUIMenus
from hydrus.client.gui import ClientGUIRatings

@@ -33,6 +32,7 @@ from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.canvas import ClientGUICanvasHoverFrames
from hydrus.client.gui.canvas import ClientGUICanvasMedia
+from hydrus.client.gui.duplicates import ClientGUIDuplicateActions
from hydrus.client.gui.media import ClientGUIMediaSimpleActions
from hydrus.client.gui.media import ClientGUIMediaModalActions
from hydrus.client.gui.media import ClientGUIMediaControls

@@ -1028,7 +1028,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
+    ClientGUIDuplicateActions.ClearFalsePositives( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_CLEAR_FALSE_POSITIVES:

@@ -1037,7 +1037,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
+    ClientGUIDuplicateActions.ClearFalsePositives( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_ALTERNATE_GROUP:

@@ -1046,7 +1046,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.DissolveAlternateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_ALTERNATE_GROUP:

@@ -1055,7 +1055,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.DissolveAlternateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_DUPLICATE_GROUP:

@@ -1064,7 +1064,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.DissolveDuplicateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_DUPLICATE_GROUP:

@@ -1073,7 +1073,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.DissolveDuplicateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_ALTERNATE_GROUP:

@@ -1082,7 +1082,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.RemoveFromAlternateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.RemoveFromAlternateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_DUPLICATE_GROUP:

@@ -1091,7 +1091,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.RemoveFromDuplicateGroup( self, ( hash, ) )
+    ClientGUIDuplicateActions.RemoveFromDuplicateGroup( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_FOCUSED_POTENTIAL_SEARCH:

@@ -1100,7 +1100,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
+    ClientGUIDuplicateActions.ResetPotentialSearch( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_POTENTIAL_SEARCH:

@@ -1109,7 +1109,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
+    ClientGUIDuplicateActions.ResetPotentialSearch( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_POTENTIALS:

@@ -1118,7 +1118,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
+    ClientGUIDuplicateActions.RemovePotentials( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_POTENTIALS:

@@ -1127,7 +1127,7 @@ class Canvas( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
    hash = self._current_media.GetHash()
    
-    ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
+    ClientGUIDuplicateActions.RemovePotentials( self, ( hash, ) )
    
    elif action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
@@ -782,6 +782,7 @@ class Animation( QW.QWidget ):
    self._video_container = None
    
    self._frame_durations = None
    self._duration = None
    
    if self._media is None:

@@ -796,12 +797,14 @@ class Animation( QW.QWidget ):
    self._duration = self._media.GetDurationMS()
    
    if self._media.GetMime() == HC.ANIMATION_UGOIRA:
        
-        self._frame_durations = ClientUgoiraHandling.GetFrameDurationsUgoira( media )
+        self._frame_durations = ClientUgoiraHandling.GetFrameDurationsUgoira( media.GetMediaResult() )
+        
+        if self._duration is None and self._frame_durations is not None:
+            
+            self._duration = sum( self._frame_durations )
    
    CG.client_controller.gui.RegisterAnimationUpdateWindow( self )

@@ -812,10 +815,12 @@ class Animation( QW.QWidget ):
    def GetDuration( self ):
        
        return self._duration
        
+    def GetNumFrames( self ):
+        
+        return self._num_frames
+        
    def TIMERAnimationUpdate( self ):
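Ugoira animations carry a per-frame duration list rather than one total duration, so when the media row has no duration the code above simply sums the frames. A tiny worked example with made-up values:

# per-frame durations in ms, as GetFrameDurationsUgoira might return them
# (values here are illustrative)
frame_durations = [ 100, 100, 50, 250 ]

duration = None

if duration is None and frame_durations is not None:
    
    duration = sum( frame_durations )

print( duration )  # 500, i.e. half a second total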
@@ -0,0 +1,353 @@
import typing

from qtpy import QtWidgets as QW

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions

from hydrus.client import ClientConstants as CC
from hydrus.client.duplicates import ClientDuplicatesAutoResolution
from hydrus.client.gui import ClientGUIDialogsQuick
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
from hydrus.client.gui.lists import ClientGUIListCtrl
from hydrus.client.gui.panels import ClientGUIScrolledPanels
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.gui.widgets import ClientGUIMenuButton

class EditDuplicatesAutoResolutionRulesPanel( ClientGUIScrolledPanels.EditPanel ):
    
    def __init__( self, parent, duplicates_auto_resolution_rules: typing.Collection[ ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ] ):
        
        super().__init__( parent )
        
        menu_items = []
        
        call = HydrusData.Call( ClientGUIDialogsQuick.OpenDocumentation, self, HC.DOCUMENTATION_DUPLICATES_AUTO_RESOLUTION )
        
        menu_items.append( ( 'normal', 'open the duplicates auto-resolution help', 'Open the help page for duplicates auto-resolution in your web browser.', call ) )
        
        help_button = ClientGUIMenuButton.MenuBitmapButton( self, CC.global_pixmaps().help, menu_items )
        
        help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', object_name = 'HydrusIndeterminate' )
        
        #
        
        self._duplicates_auto_resolution_rules_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
        
        model = ClientGUIListCtrl.HydrusListItemModel( self, CGLC.COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, self._ConvertRuleToDisplayTuple, self._ConvertRuleToSortTuple )
        
        self._duplicates_auto_resolution_rules = ClientGUIListCtrl.BetterListCtrlTreeView( self._duplicates_auto_resolution_rules_panel, CGLC.COLUMN_LIST_EXPORT_FOLDERS.ID, 12, model, use_simple_delete = True, activation_callback = self._Edit )
        
        self._duplicates_auto_resolution_rules_panel.SetListCtrl( self._duplicates_auto_resolution_rules )
        
        #self._duplicates_auto_resolution_rules_panel.AddButton( 'add', self._Add )
        self._duplicates_auto_resolution_rules_panel.AddButton( 'add suggested', self._AddSuggested )
        self._duplicates_auto_resolution_rules_panel.AddButton( 'edit', self._Edit, enabled_only_on_single_selection = True )
        self._duplicates_auto_resolution_rules_panel.AddDeleteButton()
        #self._duplicates_auto_resolution_rules_panel.AddImportExportButtons( ( ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule, ), self._ImportRule )
        
        #
        
        self._duplicates_auto_resolution_rules.AddDatas( duplicates_auto_resolution_rules )
        
        self._duplicates_auto_resolution_rules.Sort()
        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, help_hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        st = ClientGUICommon.BetterStaticText( self, 'Hey, this system does not work yet! This UI is just a placeholder!' )
        st.setWordWrap( True )
        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        QP.AddToLayout( vbox, self._duplicates_auto_resolution_rules_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
        
        self.widget().setLayout( vbox )
        
    def _Add( self ):
        
        name = 'new rule'
        
        duplicates_auto_resolution_rule = ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule( name )
        
        # TODO: set some good defaults
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit rule' ) as dlg:
            
            panel = EditDuplicatesAutoResolutionRulePanel( dlg, duplicates_auto_resolution_rule )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                duplicates_auto_resolution_rule = panel.GetValue()
                
                duplicates_auto_resolution_rule.SetNonDupeName( self._GetExistingNames() )
                
                self._duplicates_auto_resolution_rules.AddDatas( ( duplicates_auto_resolution_rule, ), select_sort_and_scroll = True )
                
    def _AddSuggested( self ):
        
        suggested_rules = ClientDuplicatesAutoResolution.GetDefaultRuleSuggestions()
        
        choice_tuples = [ ( rule.GetName(), rule ) for rule in suggested_rules ]
        
        try:
            
            duplicates_auto_resolution_rule = ClientGUIDialogsQuick.SelectFromList( self, 'Select which to add', choice_tuples )
            
        except HydrusExceptions.CancelledException:
            
            return
            
        self._ImportRule( duplicates_auto_resolution_rule )
        
    def _ConvertRuleToDisplayTuple( self, duplicates_auto_resolution_rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
        
        name = duplicates_auto_resolution_rule.GetName()
        rule_summary = duplicates_auto_resolution_rule.GetRuleSummary()
        comparator_summary = duplicates_auto_resolution_rule.GetComparatorSummary()
        action_summary = duplicates_auto_resolution_rule.GetActionSummary()
        search_status = duplicates_auto_resolution_rule.GetSearchSummary()
        paused = duplicates_auto_resolution_rule.IsPaused()
        
        pretty_paused = 'yes' if paused else ''
        
        return ( name, rule_summary, comparator_summary, action_summary, search_status, pretty_paused )
        
    _ConvertRuleToSortTuple = _ConvertRuleToDisplayTuple
    
    def _Edit( self ):
        
        duplicates_auto_resolution_rule: typing.Optional[ ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ] = self._duplicates_auto_resolution_rules.GetTopSelectedData()
        
        if duplicates_auto_resolution_rule is None:
            
            return
            
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit export folder' ) as dlg:
            
            panel = EditDuplicatesAutoResolutionRulePanel( dlg, duplicates_auto_resolution_rule )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                edited_duplicates_auto_resolution_rule = panel.GetValue()
                
                if edited_duplicates_auto_resolution_rule.GetName() != duplicates_auto_resolution_rule.GetName():
                    
                    existing_names = self._GetExistingNames()
                    
                    existing_names.discard( duplicates_auto_resolution_rule.GetName() )
                    
                    edited_duplicates_auto_resolution_rule.SetNonDupeName( existing_names )
                    
                self._duplicates_auto_resolution_rules.ReplaceData( duplicates_auto_resolution_rule, edited_duplicates_auto_resolution_rule, sort_and_scroll = True )
                
    def _GetExistingNames( self ):
        
        return { duplicates_auto_resolution_rule.GetName() for duplicates_auto_resolution_rule in self._duplicates_auto_resolution_rules.GetData() }
        
    def _ImportRule( self, duplicates_auto_resolution_rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
        
        duplicates_auto_resolution_rule.SetNonDupeName( self._GetExistingNames() )
        
        self._duplicates_auto_resolution_rules.AddDatas( ( duplicates_auto_resolution_rule, ), select_sort_and_scroll = True )
        
    def GetValue( self ) -> typing.List[ ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ]:
        
        return self._duplicates_auto_resolution_rules.GetData()
        
class EditDuplicatesAutoResolutionRulePanel( ClientGUIScrolledPanels.EditPanel ):
    
    def __init__( self, parent, duplicates_auto_resolution_rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
        
        super().__init__( parent )
        
        self._duplicates_auto_resolution_rule = duplicates_auto_resolution_rule
        
        self._rule_panel = ClientGUICommon.StaticBox( self, 'rule' )
        
        self._name = QW.QLineEdit( self._rule_panel )
        
        # paused
        # search gubbins
        # comparator gubbins
        # some way to test-run searches and see pair counts, and, eventually, a way to preview some pairs and the auto-choices we'd see
        
        #
        
        self._name.setText( self._duplicates_auto_resolution_rule.GetName() )
        
        #
        
        rows = []
        
        rows.append( ( 'name: ', self._name ) )
        
        gridbox = ClientGUICommon.WrapInGrid( self._rule_panel, rows )
        
        self._rule_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
        
        #
        
        vbox = QP.VBoxLayout()
        
        st = ClientGUICommon.BetterStaticText( self, 'Hey, this system does not work yet! This UI is just a placeholder!' )
        st.setWordWrap( True )
        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        QP.AddToLayout( vbox, self._rule_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        vbox.addStretch( 1 )
        
        self.widget().setLayout( vbox )
        
    def GetValue( self ):
        
        name = self._name.text()
        
        duplicates_auto_resolution_rule = ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule( name )
        
        # paused and search gubbins, everything else
        
        # TODO: transfer any cached search data, including what we may have re-fetched in this panel's work, to the new folder
        
        return duplicates_auto_resolution_rule
        
class ReviewDuplicatesAutoResolutionPanel( QW.QWidget ):
    
    def __init__( self, parent: QW.QWidget ):
        
        super().__init__( parent )
        
        menu_items = []
        
        call = HydrusData.Call( ClientGUIDialogsQuick.OpenDocumentation, self, HC.DOCUMENTATION_DUPLICATES_AUTO_RESOLUTION )
        
        menu_items.append( ( 'normal', 'open the duplicates auto-resolution help', 'Open the help page for duplicates auto-resolution in your web browser.', call ) )
        
        help_button = ClientGUIMenuButton.MenuBitmapButton( self, CC.global_pixmaps().help, menu_items )
        
        help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', object_name = 'HydrusIndeterminate' )
        
        #
        
        # TODO: A cog icon with 'run in normal/idle time' stuff
        
        #
        
        self._duplicates_auto_resolution_rules_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
        
        model = ClientGUIListCtrl.HydrusListItemModel( self, CGLC.COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID, self._ConvertRuleToDisplayTuple, self._ConvertRuleToSortTuple )
        
        self._duplicates_auto_resolution_rules = ClientGUIListCtrl.BetterListCtrlTreeView( self._duplicates_auto_resolution_rules_panel, CGLC.COLUMN_LIST_EXPORT_FOLDERS.ID, 12, model, use_simple_delete = True, activation_callback = self._Edit )
        
        self._duplicates_auto_resolution_rules_panel.SetListCtrl( self._duplicates_auto_resolution_rules )
        
        self._duplicates_auto_resolution_rules_panel.AddButton( 'run now', self._RunNow, enabled_check_func = self._CanRunNow )
        self._duplicates_auto_resolution_rules_panel.AddButton( 'edit rules', self._Edit )
        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, help_hbox, CC.FLAGS_ON_RIGHT )
        
        st = ClientGUICommon.BetterStaticText( self, 'Hey, this system does not work yet! This UI is just a placeholder!' )
        st.setWordWrap( True )
        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        QP.AddToLayout( vbox, self._duplicates_auto_resolution_rules_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        vbox.addStretch( 1 )
        
        self.setLayout( vbox )
        
        #
        
        self._ResetListData()
        
        # TODO: hook into the manager's pubsub update signal, _ResetListData
        
    def _CanRunNow( self ):
        
        return False
        
    def _ConvertRuleToDisplayTuple( self, duplicates_auto_resolution_rule: ClientDuplicatesAutoResolution.DuplicatesAutoResolutionRule ):
        
        name = duplicates_auto_resolution_rule.GetName()
        search_status = duplicates_auto_resolution_rule.GetSearchSummary()
        
        if duplicates_auto_resolution_rule.IsPaused():
            
            running_status = 'paused'
            
        else:
            
            running_status = ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().GetRunningStatus( duplicates_auto_resolution_rule.GetId() )
            
        return ( name, search_status, running_status )
        
    _ConvertRuleToSortTuple = _ConvertRuleToDisplayTuple
    
    def _Edit( self ):
        
        # TODO: Some sort of async delay as we wait for any current work to halt and the manager to save
        
        duplicates_auto_resolution_rules = self._duplicates_auto_resolution_rules.GetData()
        
        with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit rules' ) as dlg:
            
            panel = EditDuplicatesAutoResolutionRulesPanel( dlg, duplicates_auto_resolution_rules )
            
            dlg.SetPanel( panel )
            
            if dlg.exec() == QW.QDialog.Accepted:
                
                edited_duplicates_auto_resolution_rules = panel.GetValue()
                
                ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().SetRules( edited_duplicates_auto_resolution_rules )
                
    def _ResetListData( self ):
        
        rules = ClientDuplicatesAutoResolution.DuplicatesAutoResolutionManager.instance().GetRules()
        
        self._duplicates_auto_resolution_rules.SetData( rules )
        
    def _RunNow( self ):
        
        pass
@@ -561,6 +561,7 @@ If you select synchronise, be careful!'''
    return export_folder
    

class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
    
    def __init__( self, parent, flat_media, do_export_and_then_quit = False ):
@@ -3314,18 +3314,6 @@ class ListBoxTags( ListBox ):
    ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( predicates_selection_string ), 'Add the selected predicates to the current search.', self._ProcessMenuPredicateEvent, 'add_predicates' )
    
-    if or_predicate is not None and or_predicate not in predicates:
-        
-        ClientGUIMenus.AppendMenuItem( search_menu, 'add an OR of {} to current search'.format( predicates_selection_string ), 'Add the selected predicates as an OR predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_or_predicate' )
-        
-    all_selected_in_current = predicates.issubset( current_predicates )
-    
-    if all_selected_in_current:
-        
-        ClientGUIMenus.AppendMenuItem( search_menu, f'replace {predicates_selection_string} with their OR', 'Remove the selected predicates and replace them with an OR predicate that searches for any of them.', self._ProcessMenuPredicateEvent, 'replace_or_predicate')
-        
    some_selected_in_current = HydrusLists.SetsIntersect( predicates, current_predicates )
    
    if some_selected_in_current:

@@ -3365,6 +3353,36 @@ class ListBoxTags( ListBox ):
    ClientGUIMenus.AppendMenuItem( search_menu, text, desc, self._ProcessMenuPredicateEvent, 'add_inverse_predicates' )
    
+    ClientGUIMenus.AppendSeparator( search_menu )
+    
+    if or_predicate is not None and or_predicate not in predicates:
+        
+        all_selected_in_current = predicates.issubset( current_predicates )
+        
+        if all_selected_in_current:
+            
+            ClientGUIMenus.AppendMenuItem( search_menu, f'replace {predicates_selection_string} with their OR', 'Remove the selected predicates and replace them with an OR predicate that searches for any of them.', self._ProcessMenuPredicateEvent, 'replace_or_predicate')
+            
+        else:
+            
+            ClientGUIMenus.AppendMenuItem( search_menu, 'add an OR of {} to current search'.format( predicates_selection_string ), 'Add the selected predicates as an OR predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_or_predicate' )
+            
+    if True not in ( p.IsORPredicate() for p in predicates ):
+        
+        ClientGUIMenus.AppendMenuItem( search_menu, f'start an OR predicate with {predicates_selection_string}', 'Start up the Edit OR Predicate panel starting with this.', self._ProcessMenuPredicateEvent, 'start_or_predicate' )
+        
+    if False not in ( p.IsORPredicate() for p in predicates ):
+        
+        label = f'dissolve {predicates_selection_string} into single predicates'
+        
+        ClientGUIMenus.AppendMenuItem( search_menu, label, 'Convert OR predicates to their constituent parts.', self._ProcessMenuPredicateEvent, 'dissolve_or_predicate' )
+        
    ClientGUIMenus.AppendSeparator( search_menu )
    
    if namespace_predicate is not None and namespace_predicate not in current_predicates:
        
        ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( namespace_predicate.ToString( with_count = False ) ), 'Add the namespace predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_namespace_predicate' )
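The relocated OR block boils down to two questions: is every selected predicate already in the current search (offer 'replace with OR' rather than 'add an OR'), and does the selection contain no OR predicates (offer 'start') or only OR predicates (offer 'dissolve'). Below is a plain-data sketch of just that branching, with strings standing in for predicate objects and a lambda standing in for IsORPredicate.

# plain-data sketch of the menu branching above
def or_menu_actions( predicates, current_predicates, or_predicate, is_or ):
    
    actions = []
    
    if or_predicate is not None and or_predicate not in predicates:
        
        if predicates.issubset( current_predicates ):
            
            actions.append( 'replace_or_predicate' )
            
        else:
            
            actions.append( 'add_or_predicate' )
            
    # 'True not in ( ... )' in the hunk is just 'no predicate is an OR'
    if not any( is_or( p ) for p in predicates ):
        
        actions.append( 'start_or_predicate' )
        
    # 'False not in ( ... )' is 'every predicate is an OR'
    if all( is_or( p ) for p in predicates ):
        
        actions.append( 'dissolve_or_predicate' )
        
    return actions

print( or_menu_actions( { 'a', 'b' }, { 'a', 'b', 'c' }, 'a OR b', lambda p: ' OR ' in p ) )
# ['replace_or_predicate', 'start_or_predicate']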
@@ -567,7 +567,7 @@ class ListBoxItemPredicate( ListBoxItem ):
    for sub_pred in self._predicate.GetORPredicates():
        
-        rows_of_texts_and_namespaces.append( sub_pred.GetTextsAndNamespaces( render_for_user, prefix = ' ' ) )
+        rows_of_texts_and_namespaces.append( sub_pred.GetTextsAndNamespaces( render_for_user, prefix = '    ' ) )
@@ -1587,3 +1587,43 @@ register_column_type( COLUMN_LIST_METADATA_ROUTER_TEST_RESULTS.ID, COLUMN_LIST_M
register_column_type( COLUMN_LIST_METADATA_ROUTER_TEST_RESULTS.ID, COLUMN_LIST_METADATA_ROUTER_TEST_RESULTS.PROCESSED_STRINGS, 'processed strings', False, 48, True )

default_column_list_sort_lookup[ COLUMN_LIST_METADATA_ROUTER_TEST_RESULTS.ID ] = ( COLUMN_LIST_METADATA_ROUTER_TEST_RESULTS.TEST_OBJECT, True )

+class COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES( COLUMN_LIST_DEFINITION ):
+    
+    ID = 74
+    
+    NAME = 0
+    RULE_SUMMARY = 1
+    COMPARATOR_SUMMARY = 2
+    ACTION_SUMMARY = 3
+    PROGRESS = 4
+    PAUSED = 5
+    
+column_list_type_name_lookup[ COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID ] = 'edit duplicates auto-resolution rules'
+
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.NAME, 'name', False, 48, True )
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.RULE_SUMMARY, 'search', False, 64, True )
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.COMPARATOR_SUMMARY, 'comparison', False, 64, True )
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ACTION_SUMMARY, 'action', False, 64, True )
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.PROGRESS, 'progress', False, 64, True )
+register_column_type( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.PAUSED, 'paused', False, 18, True )
+
+default_column_list_sort_lookup[ COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.ID ] = ( COLUMN_LIST_EDIT_DUPLICATES_AUTO_RESOLUTION_RULES.NAME, True )
+
+class COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES( COLUMN_LIST_DEFINITION ):
+    
+    ID = 75
+    
+    NAME = 0
+    PROGRESS = 1
+    RUNNING_STATUS = 2
+    
+column_list_type_name_lookup[ COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID ] = 'review duplicates auto-resolution rules'
+
+register_column_type( COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.NAME, 'name', False, 48, True )
+register_column_type( COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.PROGRESS, 'progress', False, 64, True )
+register_column_type( COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID, COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.RUNNING_STATUS, 'status', False, 18, True )
+
+default_column_list_sort_lookup[ COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.ID ] = ( COLUMN_LIST_REVIEW_DUPLICATES_AUTO_RESOLUTION_RULES.NAME, True )
@@ -7,6 +7,7 @@ from qtpy import QtWidgets as QW

from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusLists
from hydrus.core import HydrusNumbers
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusText

@@ -363,7 +364,7 @@ class HydrusListItemModel( QC.QAbstractItemModel ):
    if data not in self._data_to_sort_tuples:
        
-        sort_tuple = self._data_to_sort_tuple_func( data )
+        sort_tuple = HydrusLists.ConvertTupleOfDatasToCasefolded( self._data_to_sort_tuple_func( data ) )
        
        self._data_to_sort_tuples[ data ] = sort_tuple

@@ -457,7 +458,7 @@ class HydrusListItemModel( QC.QAbstractItemModel ):
    existing_sort_tuple = self._data_to_sort_tuples[ data ]
    
-    new_sort_tuple = self._data_to_sort_tuple_func( data )
+    new_sort_tuple = HydrusLists.ConvertTupleOfDatasToCasefolded( self._data_to_sort_tuple_func( data ) )
    
    if existing_sort_tuple[ existing_sort_logical_index ] != new_sort_tuple[ existing_sort_logical_index ]:

@@ -739,44 +740,6 @@ class BetterListCtrlTreeView( QW.QTreeView ):
    return status
    
-    def _GetDisplayAndSortTuples( self, data ):
-        
-        try:
-            
-            ( display_tuple, sort_tuple ) = self._data_to_tuples_func( data )
-            
-        except Exception as e:
-            
-            if not self._have_shown_a_column_data_error:
-                
-                HydrusData.ShowText( 'A multi-column list was unable to generate text or sort data for one or more rows! Please send hydrus dev the traceback!' )
-                HydrusData.ShowException( e )
-                
-                self._have_shown_a_column_data_error = True
-                
-            error_display_tuple = [ 'unable to display' for i in range( self._column_list_status.GetColumnCount() ) ]
-            
-            return ( error_display_tuple, None )
-            
-        better_sort = []
-        
-        for item in sort_tuple:
-            
-            if isinstance( item, str ):
-                
-                item = HydrusData.HumanTextSortKey( item )
-                
-            better_sort.append( item )
-            
-        sort_tuple = tuple( better_sort )
-        
-        return ( display_tuple, sort_tuple )
-        
    def _GetRowHeightEstimate( self ):
        
        # this straight-up returns 0 during dialog init wew, I guess when I ask during init the text isn't initialised or whatever
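Both sort-tuple call sites now funnel through HydrusLists.ConvertTupleOfDatasToCasefolded, replacing the per-widget HumanTextSortKey loop deleted just above, so every string in the tuple gets the same case-insensitive treatment. The real helper is not shown in this diff, so the following is only a plausible sketch of what it does to strings in a mixed tuple; the actual implementation may be richer, for example number-aware natural sorting.

# hedged sketch of a casefolding sort-tuple helper; not hydrus's real code
def convert_tuple_of_datas_to_casefolded( sort_tuple: tuple ) -> tuple:
    
    return tuple(
        item.casefold() if isinstance( item, str ) else item
        for item in sort_tuple
    )

print( convert_tuple_of_datas_to_casefolded( ( 'Zebra', 3, 'apple' ) ) )
# ('zebra', 3, 'apple') -- 'Zebra' no longer sorts before 'apple'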
@@ -412,7 +412,7 @@ def AddKnownURLsViewCopyMenu( win, menu, focus_media, num_files_selected: int, s
    selected_media = ClientMedia.FlattenMedia( selected_media )
    
-    if len( selected_media ) > 1:
+    if len( selected_media ) > 0:
        
        SAMPLE_SIZE = 256

@@ -475,15 +475,13 @@ def AddKnownURLsViewCopyMenu( win, menu, focus_media, num_files_selected: int, s
    urls_visit_menu = ClientGUIMenus.GenerateMenu( urls_menu )
    urls_copy_menu = ClientGUIMenus.GenerateMenu( urls_menu )
+    urls_force_refetch_menu = ClientGUIMenus.GenerateMenu( urls_menu )
    
-    if len( focus_labels_and_urls ) > 0:
-        
-        urls_open_page_menu = ClientGUIMenus.GenerateMenu( urls_menu )
-        
-        # copy each this file's urls (of a particular type)
-        
+    urls_open_page_menu = ClientGUIMenus.GenerateMenu( urls_menu )
+    
    if len( focus_labels_and_urls ) > 0:
        
+        # copy each this file's urls (of a particular type)
+        
        MAX_TO_SHOW = 15

@@ -587,10 +585,10 @@ def AddKnownURLsViewCopyMenu( win, menu, focus_media, num_files_selected: int, s
    ClientGUIMenus.AppendMenuItem( urls_visit_menu, label, 'Open this url class in your web browser for all files.', ClientGUIMediaModalActions.OpenMediaURLClassURLs, win, selected_media, url_class )
    
    label = 'these files\' ' + url_class.GetName() + ' urls'
    
    ClientGUIMenus.AppendMenuItem( urls_copy_menu, label, 'Copy this url class for all files.', ClientGUIMediaSimpleActions.CopyMediaURLClassURLs, selected_media, url_class )
    
+    ClientGUIMenus.AppendMenuItem( urls_force_refetch_menu, label, 'Re-download these URLs with forced metadata re-fetch enabled.', ClientGUIMediaModalActions.RedownloadURLClassURLsForceRefetch, win, selected_media, url_class )
    
    # now everything

@@ -617,6 +615,10 @@ def AddKnownURLsViewCopyMenu( win, menu, focus_media, num_files_selected: int, s
    ClientGUIMenus.AppendMenu( urls_menu, urls_copy_menu, 'copy' )
    
+    ClientGUIMenus.AppendSeparator( urls_menu )
+    
+    ClientGUIMenus.AppendMenu( urls_menu, urls_force_refetch_menu, 'force metadata refetch' )
+    
    ClientGUIMenus.AppendMenu( menu, urls_menu, 'urls' )
@@ -941,6 +941,43 @@ def OpenMediaURLClassURLs( win: QW.QWidget, medias, url_class ):
    OpenURLs( win, urls )
    
+def RedownloadURLClassURLsForceRefetch( win: QW.QWidget, medias, url_class ):
+    
+    urls = set()
+    
+    for media in medias:
+        
+        media_urls = media.GetLocationsManager().GetURLs()
+        
+        for url in media_urls:
+            
+            # can't do 'url_class.matches', as it will match too many
+            if CG.client_controller.network_engine.domain_manager.GetURLClass( url ) == url_class:
+                
+                urls.add( url )
+                
+    if len( urls ) == 0:
+        
+        return
+        
+    message = f'Open a new search page and force metadata redownload for {len( urls )} "{url_class.GetName()}" URLs? This is inefficient and should only be done to fill in known gaps in one-time jobs.'
+    message += '\n' * 2
+    message += 'DO NOT USE THIS TO RECHECK TEN THOUSAND URLS EVERY MONTH JUST FOR MAYBE A FEW NEW TAGS.'
+    
+    result = ClientGUIDialogsQuick.GetYesNo( win, message )
+    
+    if result != QW.QDialog.Accepted:
+        
+        return
+        
+    CG.client_controller.gui.RedownloadURLsForceFetch( urls )
+    
def SetFilesForcedFiletypes( win: QW.QWidget, medias: typing.Collection[ ClientMedia.Media ] ):
    
    # boot a panel, it shows the user what current mimes are, what forced mimes are, and they have the choice to set all to x
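The comment inside the loop is the key detail: a broad URL class can match many URLs that a more specific class also matches, so the code asks the domain manager which class actually claims each URL and compares that result to the target class. A toy illustration of why equality on the resolved class is stricter than a raw match check; all names here are illustrative, not hydrus's real API.

# toy illustration; hydrus's real URLClass/domain manager logic is far richer
class URLClass:
    
    def __init__( self, name: str, prefix: str ):
        
        self.name = name
        self.prefix = prefix
        
    def Matches( self, url: str ) -> bool:
        
        return url.startswith( self.prefix )

file_page = URLClass( 'site file page', 'https://site.com/post/' )
site_any = URLClass( 'site generic', 'https://site.com/' )

def get_url_class( url, url_classes ):
    
    # pick the most specific matching class, roughly what a domain manager does
    matching = [ c for c in url_classes if c.Matches( url ) ]
    
    return max( matching, key = lambda c: len( c.prefix ), default = None )

url = 'https://site.com/post/123'

print( site_any.Matches( url ) )                                      # True: a raw match is too broad
print( get_url_class( url, [ file_page, site_any ] ) is file_page )   # True: the specific class claims it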
@@ -353,8 +353,7 @@ def OpenFileWithDialog( media: typing.Optional[ ClientMedia.MediaSingleton ] ) -
    ClientPaths.OpenFileWithDialog( path )
    
    return True
    

def ShowDuplicatesInNewPage( location_context: ClientLocation.LocationContext, hash, duplicate_type ):
@@ -143,14 +143,14 @@ def CreateManagementControllerImportMultipleWatcher( page_name = None, url = Non
    return management_controller
    
-def CreateManagementControllerImportURLs( page_name = None, destination_location_context = None ):
+def CreateManagementControllerImportURLs( page_name = None, destination_location_context = None, destination_tag_import_options = None ):
    
    if page_name is None:
        
        page_name = 'url import'
        
-    urls_import = ClientImportSimpleURLs.URLsImport( destination_location_context = destination_location_context )
+    urls_import = ClientImportSimpleURLs.URLsImport( destination_location_context = destination_location_context, destination_tag_import_options = destination_tag_import_options )
    
    management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_IMPORT_URLS )
@ -37,6 +37,7 @@ from hydrus.client.gui import ClientGUITopLevelWindowsPanels
|
|||
from hydrus.client.gui import QtPorting as QP
|
||||
from hydrus.client.gui.canvas import ClientGUICanvas
|
||||
from hydrus.client.gui.canvas import ClientGUICanvasFrame
|
||||
from hydrus.client.gui.duplicates import ClientGUIDuplicatesAutoResolution
|
||||
from hydrus.client.gui.importing import ClientGUIFileSeedCache
|
||||
from hydrus.client.gui.importing import ClientGUIGallerySeedLog
|
||||
from hydrus.client.gui.importing import ClientGUIImport
|
||||
|
@ -210,6 +211,15 @@ class ListBoxTagsMediaManagementPanel( ClientGUIListBoxes.ListBoxTagsMedia ):
|
|||
p = ( or_predicate, )
|
||||
permit_remove = False
|
||||
|
||||
elif command == 'dissolve_or_predicate':
|
||||
|
||||
or_preds = [ p for p in predicates if p.IsORPredicate() ]
|
||||
|
||||
sub_preds = HydrusLists.MassUnion( [ p.GetValue() for p in or_preds ] )
|
||||
|
||||
CG.client_controller.pub( 'enter_predicates', self._page_key, or_preds, permit_remove = True, permit_add = False )
|
||||
CG.client_controller.pub( 'enter_predicates', self._page_key, sub_preds, permit_remove = False, permit_add = True )
|
||||
|
||||
elif command == 'replace_or_predicate':
|
||||
|
||||
if or_predicate is None:
|
||||
|
@ -220,6 +230,10 @@ class ListBoxTagsMediaManagementPanel( ClientGUIListBoxes.ListBoxTagsMedia ):
|
|||
CG.client_controller.pub( 'enter_predicates', self._page_key, predicates, permit_remove = True, permit_add = False )
|
||||
CG.client_controller.pub( 'enter_predicates', self._page_key, ( or_predicate, ), permit_remove = False, permit_add = True )
|
||||
|
||||
elif command == 'start_or_predicate':
|
||||
|
||||
CG.client_controller.pub( 'enter_predicates', self._page_key, predicates, start_or_predicate = True )
|
||||
|
||||
elif command == 'remove_predicates':
|
||||
|
||||
p = predicates
|
||||
|
@@ -481,9 +495,12 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        self._main_notebook = ClientGUICommon.BetterNotebook( self )
        
        # TODO: make these two panels into their own classes and rewire everything into panel signals
        self._main_left_panel = QW.QWidget( self._main_notebook )
        self._main_right_panel = QW.QWidget( self._main_notebook )
        
        self._duplicates_auto_resolution_panel = ClientGUIDuplicatesAutoResolution.ReviewDuplicatesAutoResolutionPanel( self )
        
        #
        
        self._refresh_maintenance_status = ClientGUICommon.BetterStaticText( self._main_left_panel, ellipsize_end = True )
@@ -601,6 +618,16 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        self._main_notebook.addTab( self._main_left_panel, 'preparation' )
        self._main_notebook.addTab( self._main_right_panel, 'filtering' )
        
        if CG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
            
            self._main_notebook.addTab( self._duplicates_auto_resolution_panel, 'auto-resolution' )
            
        else:
            
            self._duplicates_auto_resolution_panel.setVisible( False )
            
        
        self._main_notebook.setCurrentWidget( self._main_right_panel )
        
        #
@@ -25,13 +25,13 @@ from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsManage
from hydrus.client.gui import ClientGUIDialogsMessage
from hydrus.client.gui import ClientGUIDialogsQuick
from hydrus.client.gui import ClientGUIDuplicates
from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.gui import ClientGUITags
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.canvas import ClientGUICanvas
from hydrus.client.gui.canvas import ClientGUICanvasFrame
from hydrus.client.gui.duplicates import ClientGUIDuplicateActions
from hydrus.client.gui.media import ClientGUIMediaSimpleActions
from hydrus.client.gui.media import ClientGUIMediaModalActions
from hydrus.client.gui.networking import ClientGUIHydrusNetwork
@@ -2038,7 +2038,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
                ClientGUIDuplicateActions.ClearFalsePositives( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_CLEAR_FALSE_POSITIVES:
@@ -2047,7 +2047,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
            if len( hashes ) > 0:
                
                ClientGUIDuplicates.ClearFalsePositives( self, hashes )
                ClientGUIDuplicateActions.ClearFalsePositives( self, hashes )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_ALTERNATE_GROUP:
@@ -2058,7 +2058,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
                ClientGUIDuplicateActions.DissolveAlternateGroup( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_ALTERNATE_GROUP:
@@ -2067,7 +2067,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
            if len( hashes ) > 0:
                
                ClientGUIDuplicates.DissolveAlternateGroup( self, hashes )
                ClientGUIDuplicateActions.DissolveAlternateGroup( self, hashes )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_DUPLICATE_GROUP:
@@ -2078,7 +2078,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
                ClientGUIDuplicateActions.DissolveDuplicateGroup( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_DUPLICATE_GROUP:
@@ -2087,7 +2087,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
            if len( hashes ) > 0:
                
                ClientGUIDuplicates.DissolveDuplicateGroup( self, hashes )
                ClientGUIDuplicateActions.DissolveDuplicateGroup( self, hashes )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_ALTERNATE_GROUP:
@@ -2098,7 +2098,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.RemoveFromAlternateGroup( self, ( hash, ) )
                ClientGUIDuplicateActions.RemoveFromAlternateGroup( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_DUPLICATE_GROUP:
@@ -2109,7 +2109,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.RemoveFromDuplicateGroup( self, ( hash, ) )
                ClientGUIDuplicateActions.RemoveFromDuplicateGroup( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_FOCUSED_POTENTIAL_SEARCH:
@@ -2120,7 +2120,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
                ClientGUIDuplicateActions.ResetPotentialSearch( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_POTENTIAL_SEARCH:
@@ -2129,7 +2129,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
            if len( hashes ) > 0:
                
                ClientGUIDuplicates.ResetPotentialSearch( self, hashes )
                ClientGUIDuplicateActions.ResetPotentialSearch( self, hashes )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_POTENTIALS:
@@ -2140,7 +2140,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
                hash = media.GetHash()
                
                ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
                ClientGUIDuplicateActions.RemovePotentials( self, ( hash, ) )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_POTENTIALS:
@@ -2149,7 +2149,7 @@ class MediaResultsPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.Liste
            if len( hashes ) > 0:
                
                ClientGUIDuplicates.RemovePotentials( self, hashes )
                ClientGUIDuplicateActions.RemovePotentials( self, hashes )
                
        elif action == CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE:
@@ -1,5 +1,6 @@
import collections
import os
import random
import time
import typing
@@ -859,8 +860,6 @@ class Page( QW.QWidget ):
        
        except HydrusExceptions.VetoException as e:
            
            reason = str( e )
            
            message = '{} Are you sure you want to close it?'.format( str( e ) )
            
            result = ClientGUIDialogsQuick.GetYesNo( self, message )
@@ -2246,7 +2245,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
    
    def GetOrMakeURLImportPage( self, desired_page_name = None, desired_page_key = None, select_page = True, destination_location_context = None ):
    def GetOrMakeURLImportPage( self, desired_page_name = None, desired_page_key = None, select_page = True, destination_location_context = None, destination_tag_import_options = None ):
        
        potential_url_import_pages = [ page for page in self._GetMediaPages( False ) if page.IsURLImportPage() ]
@@ -2278,6 +2277,25 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
            potential_url_import_pages = good_url_import_pages
            
        
        if destination_tag_import_options is not None:
            
            good_url_import_pages = []
            
            for url_import_page in potential_url_import_pages:
                
                urls_import = url_import_page.GetManagementController().GetVariable( 'urls_import' )
                
                tag_import_options = urls_import.GetTagImportOptions()
                
                if tag_import_options.GetSerialisableTuple() == destination_tag_import_options.GetSerialisableTuple():
                    
                    good_url_import_pages.append( url_import_page )
                    
                
            
            potential_url_import_pages = good_url_import_pages
            
        
        if len( potential_url_import_pages ) > 0:
            
            # ok, we can use an existing one. should we use the current?
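The page-matching test above compares TagImportOptions by serialised state rather than object identity, so two separately constructed but logically identical options objects count as the same destination. A minimal sketch of the idea, with a hypothetical Options class standing in for hydrus's serialisables:

    # sketch only: compare option objects by their serialised form
    class Options:
        
        def __init__( self, fetch_tags = False ):
            
            self.fetch_tags = fetch_tags
            
        
        def GetSerialisableTuple( self ):
            
            return ( 'options', 1, ( self.fetch_tags, ) )
        
    
    ( a, b ) = ( Options( fetch_tags = True ), Options( fetch_tags = True ) )
    
    assert a is not b
    assert a.GetSerialisableTuple() == b.GetSerialisableTuple()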
@@ -2295,7 +2313,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
        
        else:
            
            return self.NewPageImportURLs( page_name = desired_page_name, on_deepest_notebook = True, select_page = select_page, destination_location_context = destination_location_context )
            return self.NewPageImportURLs( page_name = desired_page_name, on_deepest_notebook = True, select_page = select_page, destination_location_context = destination_location_context, destination_tag_import_options = destination_tag_import_options )
@@ -2424,6 +2442,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
    
    def GetTestAbleToCloseStatement( self ):
        
        reasons_to_names = collections.defaultdict( list )
        count = collections.Counter()
        
        for page in self._GetMediaPages( False ):
@@ -2436,28 +2455,34 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
            
            reason = str( e )
            
            reasons_to_names[ reason ].append( page.GetName() )
            count[ reason ] += 1
            
        
        if len( count ) > 0:
            
            message = ''
            message_blocks = []
            
            for ( reason, c ) in list(count.items()):
            for ( reason, c ) in sorted( count.items() ):
                
                names = sorted( reasons_to_names[ reason ], key = HydrusData.HumanTextSortKey )
                
                if c == 1:
                    
                    message = '1 page says: ' + reason
                    message_block = f'page "{names[0]}" says: {reason}'
                    
                else:
                    
                    message = HydrusNumbers.ToHumanInt( c ) + ' pages say:' + reason
                    message_block = f'pages{HydrusText.ConvertManyStringsToNiceInsertableHumanSummary( names )}say: {reason}'
                    
                
                message += '\n'
                message_blocks.append( message_block )
                
            
            message = '\n----\n'.join( message_blocks )
            
            return message
            
        else:
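The close-statement rework above replaces a single message string, which each loop iteration silently overwrote, with one block per distinct reason, joined with '----' separators and naming the pages involved. A toy version of the same grouping pattern, with invented page names:

    import collections
    
    pages = [ ( 'gallery dl', 'importer still working' ), ( 'watcher', 'importer still working' ), ( 'my search', 'unsaved changes' ) ]
    
    reasons_to_names = collections.defaultdict( list )
    
    for ( name, reason ) in pages:
        
        reasons_to_names[ reason ].append( name )
        
    
    message_blocks = []
    
    for ( reason, names ) in sorted( reasons_to_names.items() ):
        
        if len( names ) == 1:
            
            message_blocks.append( f'page "{names[0]}" says: {reason}' )
            
        else:
            
            message_blocks.append( f'pages {", ".join( sorted( names ) )} say: {reason}' )
            
        
    
    print( '\n----\n'.join( message_blocks ) )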
@@ -2995,9 +3020,9 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
        return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook, select_page = select_page )
        
    
    def NewPageImportURLs( self, page_name = None, on_deepest_notebook = False, select_page = True, destination_location_context = None ):
    def NewPageImportURLs( self, page_name = None, on_deepest_notebook = False, select_page = True, destination_location_context = None, destination_tag_import_options = None ):
        
        management_controller = ClientGUIManagementController.CreateManagementControllerImportURLs( page_name = page_name, destination_location_context = destination_location_context )
        management_controller = ClientGUIManagementController.CreateManagementControllerImportURLs( page_name = page_name, destination_location_context = destination_location_context, destination_tag_import_options = destination_tag_import_options )
        
        return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook, select_page = select_page )
@@ -17,6 +17,7 @@ from hydrus.core import HydrusTags
from hydrus.core.files.images import HydrusImageHandling

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientGlobals as CG
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.gui import ClientGUIDialogs
@@ -2608,6 +2609,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        self._draw_bottom_right_index_in_media_viewer_background = QW.QCheckBox( media_canvas_panel )
        self._draw_bottom_right_index_in_media_viewer_background.setToolTip( ClientGUIFunctions.WrapToolTip( 'Draw the bottom-right index string in the background of the media viewer.' ) )
        
        self._use_nice_resolution_strings = QW.QCheckBox( media_canvas_panel )
        self._use_nice_resolution_strings.setToolTip( ClientGUIFunctions.WrapToolTip( 'Use "1080p" instead of "1920x1080" for common resolutions.' ) )
        
        self._hide_uninteresting_modified_time = QW.QCheckBox( media_canvas_panel )
        self._hide_uninteresting_modified_time.setToolTip( ClientGUIFunctions.WrapToolTip( 'If the file has a modified time similar to its import time (i.e. the number of seconds since both events differs by less than 10%), hide the modified time in the top of the media viewer.' ) )
@@ -2652,6 +2656,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        self._draw_top_right_hover_in_media_viewer_background.setChecked( self._new_options.GetBoolean( 'draw_top_right_hover_in_media_viewer_background' ) )
        self._draw_notes_hover_in_media_viewer_background.setChecked( self._new_options.GetBoolean( 'draw_notes_hover_in_media_viewer_background' ) )
        self._draw_bottom_right_index_in_media_viewer_background.setChecked( self._new_options.GetBoolean( 'draw_bottom_right_index_in_media_viewer_background' ) )
        self._use_nice_resolution_strings.setChecked( self._new_options.GetBoolean( 'use_nice_resolution_strings' ) )
        self._hide_uninteresting_modified_time.setChecked( self._new_options.GetBoolean( 'hide_uninteresting_modified_time' ) )
        
        self._media_viewer_cursor_autohide_time_ms.SetValue( self._new_options.GetNoneableInteger( 'media_viewer_cursor_autohide_time_ms' ) )
@@ -2692,6 +2697,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        rows.append( ( 'Duplicate top-right hover-window information in the background of the viewer:', self._draw_top_right_hover_in_media_viewer_background ) )
        rows.append( ( 'Duplicate notes hover-window information in the background of the viewer:', self._draw_notes_hover_in_media_viewer_background ) )
        rows.append( ( 'Draw bottom-right index text in the background of the viewer:', self._draw_bottom_right_index_in_media_viewer_background ) )
        rows.append( ( 'Swap in common resolution labels:', self._use_nice_resolution_strings ) )
        rows.append( ( 'Hide uninteresting modified times:', self._hide_uninteresting_modified_time ) )
        
        media_canvas_gridbox = ClientGUICommon.WrapInGrid( media_canvas_panel, rows )
@@ -2753,6 +2759,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        self._new_options.SetBoolean( 'draw_top_right_hover_in_media_viewer_background', self._draw_top_right_hover_in_media_viewer_background.isChecked() )
        self._new_options.SetBoolean( 'draw_notes_hover_in_media_viewer_background', self._draw_notes_hover_in_media_viewer_background.isChecked() )
        self._new_options.SetBoolean( 'draw_bottom_right_index_in_media_viewer_background', self._draw_bottom_right_index_in_media_viewer_background.isChecked() )
        self._new_options.SetBoolean( 'use_nice_resolution_strings', self._use_nice_resolution_strings.isChecked() )
        self._new_options.SetBoolean( 'hide_uninteresting_modified_time', self._hide_uninteresting_modified_time.isChecked() )
        
        self._new_options.SetBoolean( 'disallow_media_drags_on_duration_media', self._disallow_media_drags_on_duration_media.isChecked() )
@@ -2904,7 +2911,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        rows.append( ( 'Centerpoint for media zooming:', self._media_viewer_zoom_center ) )
        rows.append( ( 'Media zooms:', self._media_zooms ) )
        rows.append( ( 'Start animations this % in:', self._animation_start_position ) )
        rows.append( ( 'Always Loop GIFs/APNGs:', self._always_loop_animations ) )
        rows.append( ( 'Always Loop Animations:', self._always_loop_animations ) )
        rows.append( ( 'Draw image transparency as checkerboard:', self._draw_transparency_checkerboard_media_canvas ) )
        
        gridbox = ClientGUICommon.WrapInGrid( media_panel, rows )
@@ -3966,7 +3973,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        resolution = ( int( 16 * unit_length ), int( 9 * unit_length ) )
        
        self._image_cache_storage_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusNumbers.ResolutionToPrettyString( resolution ) ) )
        self._image_cache_storage_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), ClientData.ResolutionToPrettyString( resolution ) ) )
        
        num_pixels = cache_size * ( self._image_cache_prefetch_limit_percentage.value() / 100 ) / 3
        
@@ -3976,7 +3983,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        resolution = ( int( 16 * unit_length ), int( 9 * unit_length ) )
        
        self._image_cache_prefetch_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), HydrusNumbers.ResolutionToPrettyString( resolution ) ) )
        self._image_cache_prefetch_limit_percentage_st.setText( '% - {} pixels, or about a {} image'.format( HydrusNumbers.ToHumanInt( num_pixels ), ClientData.ResolutionToPrettyString( resolution ) ) )
        
        #
@@ -4022,7 +4029,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
        
        ( thumbnail_width, thumbnail_height ) = HC.options[ 'thumbnail_dimensions' ]
        
        res_string = HydrusNumbers.ResolutionToPrettyString( ( thumbnail_width, thumbnail_height ) )
        res_string = ClientData.ResolutionToPrettyString( ( thumbnail_width, thumbnail_height ) )
        
        estimated_bytes_per_thumb = 3 * thumbnail_width * thumbnail_height
@@ -2328,7 +2328,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
    
    def _CreateNewOR( self ):
        
        predicates = { ClientSearchPredicate.Predicate( ClientSearchPredicate.PREDICATE_TYPE_OR_CONTAINER, value = [ ] ) }
        predicates = { ClientSearchPredicate.Predicate( ClientSearchPredicate.PREDICATE_TYPE_OR_CONTAINER, value = [] ) }
        
        try:
@@ -2365,47 +2365,46 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
        ClientGUIMenus.AppendMenuItem( menu, 'save this search', 'Save this search for later.', self._SaveFavouriteSearch )
        
        
        folders_to_names = CG.client_controller.favourite_search_manager.GetFoldersToNames()
        # what the hell, this will work for now
        # I am bodging this weird 'string and None' system to support '/' for nested menu structure, let's go
        nested_folders_to_names = CG.client_controller.favourite_search_manager.GetNestedFoldersToNames()
        
        if len( folders_to_names ) > 0:
        if len( nested_folders_to_names ) > 0:
            
            ClientGUIMenus.AppendSeparator( menu )
            
            folder_names = list( folders_to_names.keys() )
            
            if None in folder_names:
            def populate_a_folder( folder_menu, folder_dict ):
                
                folder_names.remove( None )
                subfolder_names = list( folder_dict.keys() )
                
                folder_names.sort()
                if None in subfolder_names:
                    
                    subfolder_names.remove( None )
                    
                    folder_names_and_names_on_this_level = folder_dict[ None ]
                    
                    # trust me on the key lambda, in some annoying situations the folder name can be none or '/'
                    for ( full_folder_name, name ) in sorted( folder_names_and_names_on_this_level, key = lambda a: a[1] ):
                        
                        ClientGUIMenus.AppendMenuItem( folder_menu, name, 'Load the {} search.'.format( name ), self._LoadFavouriteSearch, full_folder_name, name )
                        
                    
                
                folder_names.insert( 0, None )
                subfolder_names.sort()
                
            else:
                
                folder_names.sort()
                for subfolder_name in subfolder_names:
                    
                    subfolder_menu = ClientGUIMenus.GenerateMenu( menu )
                    
                    ClientGUIMenus.AppendMenu( folder_menu, subfolder_menu, subfolder_name )
                    
                    subfolder_dict = folder_dict[ subfolder_name ]
                    
                    populate_a_folder( subfolder_menu, subfolder_dict )
                    
                
            
            for folder_name in folder_names:
                
                if folder_name is None:
                    
                    menu_to_use = menu
                    
                else:
                    
                    menu_to_use = ClientGUIMenus.GenerateMenu( menu )
                    
                    ClientGUIMenus.AppendMenu( menu, menu_to_use, folder_name )
                    
                
                names = sorted( folders_to_names[ folder_name ] )
                
                for name in names:
                    
                    ClientGUIMenus.AppendMenuItem( menu_to_use, name, 'Load the {} search.'.format( name ), self._LoadFavouriteSearch, folder_name, name )
                    
                
            populate_a_folder( menu, nested_folders_to_names )
            
        
        CGC.core().PopupMenu( self, menu )
@@ -3034,13 +3033,36 @@ class ListBoxTagsActiveSearchPredicates( ClientGUIListBoxes.ListBoxTagsPredicate
        self._DataHasChanged()
        
    
    def _EnterPredicates( self, predicates, permit_add = True, permit_remove = True ):
    def _EnterPredicates( self, predicates, permit_add = True, permit_remove = True, start_or_predicate = False ):
        
        if len( predicates ) == 0:
            
            return
            
        
        if start_or_predicate:
            
            or_based_predicates = { ClientSearchPredicate.Predicate( ClientSearchPredicate.PREDICATE_TYPE_OR_CONTAINER, value = list( predicates ) ) }
            
            try:
                
                empty_file_search_context = self._file_search_context.Duplicate()
                
                empty_file_search_context.SetPredicates( or_based_predicates )
                
                or_based_predicates = ClientGUISearch.EditPredicates( self, or_based_predicates, empty_file_search_context = empty_file_search_context )
                
            except HydrusExceptions.CancelledException:
                
                return
                
            
            self._EnterPredicates( predicates, permit_add = False )
            self._EnterPredicates( or_based_predicates, permit_remove = False )
            
            return
            
        
        terms_to_be_added = set()
        terms_to_be_removed = set()
@@ -3127,6 +3149,15 @@ class ListBoxTagsActiveSearchPredicates( ClientGUIListBoxes.ListBoxTagsPredicate
            self._EnterPredicates( ( or_predicate, ), permit_remove = False )
            
        
        elif command == 'dissolve_or_predicate':
            
            or_preds = [ p for p in predicates if p.IsORPredicate() ]
            
            sub_preds = HydrusLists.MassUnion( [ p.GetValue() for p in or_preds ] )
            
            self._EnterPredicates( or_preds, permit_add = False )
            self._EnterPredicates( sub_preds, permit_remove = False )
            
        elif command == 'replace_or_predicate':
            
            if or_predicate is not None:
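Dissolving an OR predicate is just a mass union: every selected OR container is removed and its sub-predicates are re-entered at the top level. The HydrusLists.MassUnion step in isolation, with plain lists standing in for predicate objects:

    # sketch only: 'dissolve' two OR groups back into their parts
    def mass_union( lists ):
        
        result = set()
        
        for l in lists:
            
            result.update( l )
            
        
        return result
    
    or_groups = [ [ 'blue eyes', 'green eyes' ], [ 'solo', 'blue eyes' ] ]
    
    print( mass_union( or_groups ) )
    # -> {'blue eyes', 'green eyes', 'solo'} (set order is arbitrary)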
@@ -3135,6 +3166,10 @@ class ListBoxTagsActiveSearchPredicates( ClientGUIListBoxes.ListBoxTagsPredicate
            self._EnterPredicates( ( or_predicate, ), permit_remove = False )
            
        
        elif command == 'start_or_predicate':
            
            self._EnterPredicates( predicates, start_or_predicate = True )
            
        elif command == 'remove_predicates':
            
            self._EnterPredicates( predicates, permit_add = False )
@@ -3153,11 +3188,11 @@ class ListBoxTagsActiveSearchPredicates( ClientGUIListBoxes.ListBoxTagsPredicate
    
    
    def EnterPredicates( self, page_key, predicates, permit_add = True, permit_remove = True ):
    def EnterPredicates( self, page_key, predicates, permit_add = True, permit_remove = True, start_or_predicate = False ):
        
        if page_key == self._page_key:
            
            self._EnterPredicates( predicates, permit_add = permit_add, permit_remove = permit_remove )
            self._EnterPredicates( predicates, permit_add = permit_add, permit_remove = permit_remove, start_or_predicate = start_or_predicate )
@@ -74,7 +74,7 @@ class EditFavouriteSearchPanel( ClientGUIScrolledPanels.EditPanel ):
        
        rows = []
        
        rows.append( ( 'folder (blank for none): ', self._foldername ) )
        rows.append( ( 'folder (blank for none, "/" for nested): ', self._foldername ) )
        rows.append( ( 'name: ', self._name ) )
        
        top_gridbox = ClientGUICommon.WrapInGrid( self, rows )
@@ -131,7 +131,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_SEED
    SERIALISABLE_NAME = 'File Import'
    SERIALISABLE_VERSION = 7
    SERIALISABLE_VERSION = 8
    
    def __init__( self, file_seed_type: int = None, file_seed_data: str = None ):
@@ -140,9 +140,11 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            file_seed_type = FILE_SEED_TYPE_URL
            
        
        top_wew_default = 'https://big-guys.4u/monica_lewinsky_hott.tiff.exe.vbs'
        
        if file_seed_data is None:
            
            file_seed_data = 'https://big-guys.4u/monica_lewinsky_hott.tiff.exe.vbs'
            file_seed_data = top_wew_default
            
        
        super().__init__()
@@ -151,7 +153,10 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
        self.file_seed_data = file_seed_data
        self.file_seed_data_for_comparison = file_seed_data
        
        self.Normalise() # this fixes the comparison file seed data and fails safely
        if self.file_seed_data != top_wew_default:
            
            self.Normalise() # this fixes the comparison file seed data and fails safely
            
        
        self.created = HydrusTime.GetNow()
        self.modified = self.created
@@ -280,6 +285,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
        return (
            self.file_seed_type,
            self.file_seed_data,
            self.file_seed_data_for_comparison,
            self.created,
            self.modified,
            self.source_time,
@@ -302,6 +308,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
        (
            self.file_seed_type,
            self.file_seed_data,
            self.file_seed_data_for_comparison,
            self.created,
            self.modified,
            self.source_time,
@@ -318,20 +325,6 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            serialisable_hashes
        ) = serialisable_info
        
        self.file_seed_data_for_comparison = self.file_seed_data
        
        if self.file_seed_type == FILE_SEED_TYPE_URL:
            
            try:
                
                self.file_seed_data_for_comparison = CG.client_controller.network_engine.domain_manager.NormaliseURL( self.file_seed_data )
                
            except:
                
                pass
                
            
        
        self._external_filterable_tags = set( serialisable_external_filterable_tags )
        self._external_additional_service_keys_to_tags = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_external_additional_service_keys_to_tags )
@@ -566,6 +559,64 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            return ( 7, new_serialisable_info )
            
        
        if version == 7:
            
            (
                file_seed_type,
                file_seed_data,
                created,
                modified,
                source_time,
                status,
                note,
                referral_url,
                request_headers,
                serialisable_external_filterable_tags,
                serialisable_external_additional_service_keys_to_tags,
                serialisable_primary_urls,
                serialisable_source_urls,
                serialisable_tags,
                names_and_notes,
                serialisable_hashes
            ) = old_serialisable_info
            
            file_seed_data_for_comparison = None
            
            if file_seed_type == FILE_SEED_TYPE_URL:
                
                try:
                    
                    file_seed_data_for_comparison = CG.client_controller.network_engine.domain_manager.NormaliseURL( file_seed_data )
                    
                except:
                    
                    pass
                    
                
            
            new_serialisable_info = (
                file_seed_type,
                file_seed_data,
                file_seed_data_for_comparison,
                created,
                modified,
                source_time,
                status,
                note,
                referral_url,
                request_headers,
                serialisable_external_filterable_tags,
                serialisable_external_additional_service_keys_to_tags,
                serialisable_primary_urls,
                serialisable_source_urls,
                serialisable_tags,
                names_and_notes,
                serialisable_hashes
            )
            
            return ( 8, new_serialisable_info )
            
        
    
    def AddExternalAdditionalServiceKeysToTags( self, service_keys_to_tags ):
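The new 'if version == 7:' block follows the standard hydrus serialisable migration pattern: each block rewrites the previous version's tuple into the new shape and returns ( version + 1, new_info ), so an object saved at any old version walks forward one step at a time on load. A minimal sketch of the chain, with a hypothetical two-field record:

    # sketch of the stepwise migration idea, not hydrus code
    def update_serialisable_info( version, old_info ):
        
        if version == 1:
            
            ( name, ) = old_info
            
            return ( 2, ( name, None ) ) # v2 added a 'created' field, backfilled with None
            
        
        if version == 2:
            
            ( name, created ) = old_info
            
            return ( 3, ( name, created, name.casefold() ) ) # v3 added a cached comparison name
            
        
        return ( version, old_info )
    
    ( version, info ) = ( 1, ( 'MyRecord', ) )
    
    while version < 3:
        
        ( version, info ) = update_serialisable_info( version, info )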
@@ -839,7 +839,7 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
    SERIALISABLE_NAME = 'URL Import'
    SERIALISABLE_VERSION = 4
    
    def __init__( self, destination_location_context = None ):
    def __init__( self, destination_location_context = None, destination_tag_import_options = None ):
        
        super().__init__()
@@ -856,7 +856,14 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
        self._file_import_options.SetDestinationLocationContext( destination_location_context )
        
        
        self._tag_import_options = TagImportOptions.TagImportOptions( is_default = True )
        if destination_tag_import_options is not None:
            
            self._tag_import_options = destination_tag_import_options
            
        else:
            
            self._tag_import_options = TagImportOptions.TagImportOptions( is_default = True )
            
        
        self._note_import_options = NoteImportOptions.NoteImportOptions()
        self._note_import_options.SetIsDefault( True )
@@ -9,6 +9,7 @@ from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTime

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
from hydrus.client.importing.options import PresentationImportOptions
@@ -368,7 +369,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        
        if too_thin or too_short:
            
            raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusNumbers.ResolutionToPrettyString( ( width, height ) ) + ' but the lower limit is ' + HydrusNumbers.ResolutionToPrettyString( self._min_resolution ) )
            raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + ClientData.ResolutionToPrettyString( ( width, height ) ) + ' but the lower limit is ' + ClientData.ResolutionToPrettyString( self._min_resolution ) )
            
        
@@ -381,7 +382,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        
        if too_wide or too_tall:
            
            raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + HydrusNumbers.ResolutionToPrettyString( ( width, height ) ) + ' but the upper limit is ' + HydrusNumbers.ResolutionToPrettyString( self._max_resolution ) )
            raise HydrusExceptions.FileImportRulesException( 'File had resolution ' + ClientData.ResolutionToPrettyString( ( width, height ) ) + ' but the upper limit is ' + ClientData.ResolutionToPrettyString( self._max_resolution ) )
@@ -250,7 +250,15 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
    SERIALISABLE_NAME = 'Tag Import Options'
    SERIALISABLE_VERSION = 8
    
    def __init__( self, fetch_tags_even_if_url_recognised_and_file_already_in_db = False, fetch_tags_even_if_hash_recognised_and_file_already_in_db = False, tag_blacklist = None, tag_whitelist = None, service_keys_to_service_tag_import_options = None, is_default = False ):
    def __init__(
        self,
        fetch_tags_even_if_url_recognised_and_file_already_in_db = False,
        fetch_tags_even_if_hash_recognised_and_file_already_in_db = False,
        tag_blacklist = None,
        tag_whitelist = None,
        service_keys_to_service_tag_import_options = None,
        is_default = False
    ):
        
        super().__init__()
@@ -641,6 +649,16 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
        self._is_default = value
        
    
    def SetShouldFetchTagsEvenIfHashKnownAndFileAlreadyInDB( self, value: bool ):
        
        self._fetch_tags_even_if_hash_recognised_and_file_already_in_db = value
        
    
    def SetShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB( self, value: bool ):
        
        self._fetch_tags_even_if_url_recognised_and_file_already_in_db = value
        
    
    def ShouldFetchTagsEvenIfHashKnownAndFileAlreadyInDB( self ):
        
        return self._fetch_tags_even_if_hash_recognised_and_file_already_in_db
@@ -656,6 +674,7 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
        return True in ( service_tag_import_options.WorthFetchingTags() for service_tag_import_options in self._service_keys_to_service_tag_import_options.values() )
        
    

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_TAG_IMPORT_OPTIONS ] = TagImportOptions

def NewInboxArchiveMatch( new_files, inbox_files, archive_files, status, inbox ):
@@ -12,10 +12,12 @@ from hydrus.core import HydrusTime
from hydrus.core.files import HydrusPSDHandling

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
from hydrus.client import ClientServices
from hydrus.client import ClientTime
from hydrus.client import ClientUgoiraHandling
from hydrus.client.media import ClientMediaManagers
from hydrus.client.media import ClientMediaResult
from hydrus.client.metadata import ClientContentUpdates
@@ -1941,13 +1943,45 @@ class MediaSingleton( Media ):
        
        if width is not None and height is not None:
            
            info_string += f' ({HydrusNumbers.ResolutionToPrettyString( ( width, height ) )})'
            info_string += f' ({ClientData.ResolutionToPrettyString( ( width, height ) )})'
            
        
        if duration is not None:
            
            info_string += f', {HydrusTime.MillisecondsDurationToPrettyTime( duration )}'
            
        elif mime == HC.ANIMATION_UGOIRA:
            
            if ClientUgoiraHandling.HasFrameTimesNote( self.GetMediaResult() ):
                
                try:
                    
                    # this is more work than we'd normally want to do, but prettyinfolines is called on a per-file basis so I think we are good. a tiny no-latency json load per human click is fine
                    # we'll see how it goes
                    frame_times = ClientUgoiraHandling.GetFrameTimesFromNote( self.GetMediaResult() )
                    
                    if frame_times is not None:
                        
                        note_duration = sum( frame_times )
                        
                        info_string += f', {HydrusTime.MillisecondsDurationToPrettyTime( note_duration )} (note-based)'
                        
                    
                except:
                    
                    info_string += f', unknown note-based duration'
                    
                
            else:
                
                if num_frames is not None:
                    
                    simulated_duration = num_frames * ClientUgoiraHandling.UGOIRA_DEFAULT_FRAME_DURATION_MS
                    
                    info_string += f', {HydrusTime.MillisecondsDurationToPrettyTime( simulated_duration )} (speculated)'
                    
                
            
        
        if num_frames is not None:
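A ugoira zip carries no single duration stamp, so the viewer derives one: sum the per-frame delays when a frame-times note is available ('note-based'), otherwise multiply the frame count by a default frame duration ('speculated'). In miniature, assuming millisecond delays and an invented value for the fallback constant:

    frame_times = [ 80, 80, 120, 40 ]   # from the animation.json note
    note_duration = sum( frame_times )  # 320ms, the 'note-based' figure
    
    DEFAULT_FRAME_DURATION_MS = 125     # hypothetical stand-in for the real constant
    num_frames = 4
    
    simulated_duration = num_frames * DEFAULT_FRAME_DURATION_MS # 500ms, the 'speculated' figure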
@@ -859,6 +859,7 @@ class HydrusResourceClientAPIRestrictedGetFilesGetFilePath( HydrusResourceClient
        
        hash = media_result.GetHash()
        mime = media_result.GetMime()
        size = media_result.GetSize()
        
        path = CG.client_controller.client_files_manager.GetFilePath( hash, mime )
@@ -873,7 +874,9 @@ class HydrusResourceClientAPIRestrictedGetFilesGetFilePath( HydrusResourceClient
        
        
        body_dict = {
            'path' : path
            'path' : path,
            'filetype' : HC.mime_mimetype_string_lookup[ mime ],
            'size' : size
        }
        
        mime = request.preferred_mime
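With the two added keys, a Client API file-path response now reports the filetype and size alongside the path, saving a follow-up metadata call. A hypothetical response body (path and values invented for illustration):

    body_dict = {
        'path' : '/opt/hydrus/db/client_files/f77/77a1b2(...).png',
        'filetype' : 'image/png',
        'size' : 100
    }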
@@ -22,6 +22,9 @@ class FavouriteSearchManager( HydrusSerialisable.SerialisableBase ):
    
    def _GetSerialisableInfo( self ):
        
        # TODO: overhaul this whole thing, and the edit dialog, to not use None but '' for 'base folder path'
        # just needs a serialisable update on this end
        
        serialisable_favourite_search_info = []
        
        for row in self._favourite_search_rows:
@@ -111,18 +114,45 @@ class FavouriteSearchManager( HydrusSerialisable.SerialisableBase ):
        return list( self._favourite_search_rows )
        
    
    def GetFoldersToNames( self ):
    def GetNestedFoldersToNames( self ):
        
        with self._lock:
            
            folders_to_names = collections.defaultdict( list )
            nested_folders_to_names = {}
            
            for ( folder, name, file_search_context, synchronised, media_sort, media_collect ) in self._favourite_search_rows:
                
                folders_to_names[ folder ].append( name )
                current_dict = nested_folders_to_names
                
                if folder is not None:
                    
                    folder_parts = folder.split( '/' )
                    
                    for folder_part in folder_parts:
                        
                        if folder_part == '':
                            
                            continue
                            
                        
                        if folder_part not in current_dict:
                            
                            current_dict[ folder_part ] = {}
                            
                        
                        current_dict = current_dict[ folder_part ]
                        
                    
                
                if None not in current_dict:
                    
                    current_dict[ None ] = []
                    
                
                current_dict[ None ].append( ( folder, name ) )
                
            
            return folders_to_names
            return nested_folders_to_names
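The structure this builds is a dict of dicts keyed by folder part, with the None key at each level holding the ( full folder path, name ) pairs that live there; the menu code above recurses over it. A sketch of the shape produced by one top-level favourite plus 'searches' and 'searches/anime' folders, names invented:

    nested_folders_to_names = {
        None : [ ( None, 'my top-level search' ) ],
        'searches' : {
            None : [ ( 'searches', 'all pngs' ) ],
            'anime' : {
                None : [ ( 'searches/anime', 'best girls' ) ]
            }
        }
    }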
@@ -105,8 +105,8 @@ options = {}
# Misc

NETWORK_VERSION = 20
SOFTWARE_VERSION = 594
CLIENT_API_VERSION = 72
SOFTWARE_VERSION = 595
CLIENT_API_VERSION = 73

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1465,6 +1465,7 @@ DOCUMENTATION_DOWNLOADER_URL_CLASSES = 'downloader_url_classes.html'
DOCUMENTATION_GETTING_STARTED_SUBSCRIPTIONS = 'getting_started_subscriptions.html'
DOCUMENTATION_DATABASE_MIGRATION = 'database_migration.html'
DOCUMENTATION_DUPLICATES = 'duplicates.html'
DOCUMENTATION_DUPLICATES_AUTO_RESOLUTION = 'duplicates_auto_resolution.html'
DOCUMENTATION_DOWNLOADER_SHARING = 'downloader_sharing.html'
DOCUMENTATION_DOWNLOADER_PARSERS_PAGE_PARSERS_PAGE_PARSERS = 'downloader_parsers_page_parsers.html#page_parsers'
DOCUMENTATION_DOWNLOADER_PARSERS_CONTENT_PARSERS_CONTENT_PARSERS = 'downloader_parsers_content_parsers.html#content_parsers'
@@ -265,7 +265,7 @@ def GenerateHumanTextSortKey():
    
    int_convert = lambda t: int( t ) if t.isdecimal() else t
    
    split_alphanum = lambda t: tuple( ( int_convert( sub_t ) for sub_t in re.split( '([0-9]+)', t.lower() ) ) )
    split_alphanum = lambda t: tuple( ( int_convert( sub_t ) for sub_t in re.split( '([0-9]+)', t.casefold() ) ) )
    
    return split_alphanum
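str.casefold() is a more aggressive lowercasing than str.lower(), folding characters like the German sharp s that lower() leaves alone, so it is the better normaliser for a human sort key. A quick check of both, plus the natural-sort behaviour the numeric split provides:

    import re
    
    print( 'Straße'.lower() )    # straße
    print( 'Straße'.casefold() ) # strasse
    
    int_convert = lambda t: int( t ) if t.isdecimal() else t
    key = lambda t: tuple( int_convert( s ) for s in re.split( '([0-9]+)', t.casefold() ) )
    
    print( sorted( [ 'page 22', 'page 3' ], key = key ) ) # ['page 3', 'page 22']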
@@ -307,6 +307,29 @@ class FastIndexUniqueList( collections.abc.MutableSequence ):
    
    

def ConvertTupleOfDatasToCasefolded( l: typing.Sequence ) -> typing.Tuple:
    
    # TODO: We could convert/augment this guy to do HumanTextSort too so we have 3 < 22
    
    def casefold_obj( o ):
        
        if isinstance( o, str ):
            
            return o.casefold()
            
        elif isinstance( o, collections.abc.Sequence ):
            
            return ConvertTupleOfDatasToCasefolded( o )
            
        else:
            
            return o
            
        
    
    return tuple( ( casefold_obj( obj ) for obj in l ) )

def IntelligentMassIntersect( sets_to_reduce: typing.Collection[ set ] ):
    
    answer = None
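The helper recurses through nested sequences, casefolding every string and passing everything else through untouched, which makes mixed tuple sort keys case-insensitive end to end. For example:

    ConvertTupleOfDatasToCasefolded( ( 'Page', 3, ( 'SubTag', None ) ) )
    # -> ( 'page', 3, ( 'subtag', None ) )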
@@ -101,35 +101,6 @@ def PixelsToInt( unit ):
    elif unit == 'megapixels': return 1000000
    

def ResolutionToPrettyString( resolution ):
    
    if resolution is None:
        
        return 'no resolution'
        
    
    if not isinstance( resolution, tuple ):
        
        try:
            
            resolution = tuple( resolution )
            
        except:
            
            return 'broken resolution'
            
        
    
    if resolution in HC.NICE_RESOLUTIONS:
        
        return HC.NICE_RESOLUTIONS[ resolution ]
        
    
    ( width, height ) = resolution
    
    return '{}x{}'.format( ToHumanInt( width ), ToHumanInt( height ) )

def ToHumanInt( num ):
    
    try:
@@ -142,10 +142,10 @@ SERIALISABLE_TYPE_PETITION_HEADER = 124
SERIALISABLE_TYPE_STRING_JOINER = 125
SERIALISABLE_TYPE_FILE_FILTER = 126
SERIALISABLE_TYPE_URL_CLASS_PARAMETER_FIXED_NAME = 127
SERIALISABLE_TYPE_AUTO_DUPLICATES_RULE = 128
SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR = 129
SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_ONE_FILE = 130
SERIALISABLE_TYPE_AUTO_DUPLICATES_PAIR_COMPARATOR_TWO_FILES_RELATIVE = 131
SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_RULE = 128
SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_SELECTOR_AND_COMPARATOR = 129
SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_ONE_FILE = 130
SERIALISABLE_TYPE_DUPLICATES_AUTO_RESOLUTION_PAIR_COMPARATOR_TWO_FILES_RELATIVE = 131
SERIALISABLE_TYPE_METADATA_CONDITIONAL = 132
SERIALISABLE_TYPE_PARSE_FORMULA_NESTED = 133
@@ -10,7 +10,6 @@ from hydrus.core.files.images import HydrusImageHandling

from PIL import Image as PILImage


# handle getting a list of frame paths from a ugoira without json metadata:
def GetFramePathsFromUgoiraZip( path ):
@@ -35,10 +34,11 @@ def GetUgoiraProperties( path_to_zip ):
    try:
        
        return GetUgoiraPropertiesFromJSON( path_to_zip )
        
    except:
        
        pass
        
    
    try:
@@ -58,6 +58,7 @@ def GetUgoiraProperties( path_to_zip ):
    except:
        
        num_frames = None
        
    
    return ( ( width, height ), None, num_frames )
@@ -68,7 +69,7 @@ def ZipLooksLikeUgoira( path_to_zip ):
    try:
        
        frames = GetUgoiraFrameDataJSON( path_to_zip )
        
        if frames is not None and len( frames ) > 0 and all(('delay' in frame and 'file' in frame) for frame in frames):
            
            return True
            
@@ -77,6 +78,7 @@ def ZipLooksLikeUgoira( path_to_zip ):
    except:
        
        pass
        
    
    # what does an Ugoira look like? it has a standard, but this is not always followed, so be somewhat forgiving
    # it is a list of images named in the format 000123.jpg. this is 6-figure, starting at 000000
@@ -173,16 +175,18 @@ def ZipLooksLikeUgoira( path_to_zip ):
### Handling ugoira files with frame data json:

def GetUgoiraJSON( path: str ):
    
    jsonFile = HydrusArchiveHandling.GetZipAsPath( path, 'animation.json' )
    
    if not jsonFile.exists():
        
        raise HydrusExceptions.LimitedSupportFileException( 'Zip file has no animation.json!' )
        
    
    with jsonFile.open('rb') as jsonData:
        
        return json.load( jsonData )
@@ -202,20 +206,22 @@ def GetUgoiraFrameDataJSON( path: str ) -> typing.Optional[typing.List[UgoiraFra
        if isinstance(ugoiraJson, list):
            
            return ugoiraJson
            
        else:
            
            return ugoiraJson['frames']
            
        
    except:
        
        return None
        
    

def GetUgoiraPropertiesFromJSON( path ):
    
    frameData = GetUgoiraFrameDataJSON( path )
    
    if frameData is None:
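For reference, the animation.json these helpers read is either a bare list of frame objects or a wrapper object with a 'frames' key; each frame names a file inside the zip and a delay in milliseconds. A hypothetical minimal example:

    {
        "frames" : [
            { "file" : "000000.jpg", "delay" : 80 },
            { "file" : "000001.jpg", "delay" : 120 }
        ]
    }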
@@ -224,20 +230,19 @@ def GetUgoiraPropertiesFromJSON( path ):
    
    durations = [data['delay'] for data in frameData]
    
    duration = sum( durations )
    num_frames = len( durations )
    
    firstFrame = GetUgoiraFramePIL( path, 0 )
    
    return ( firstFrame.size, duration, num_frames )
    

# Combined Ugoira functions:

def GetFramePathsUgoira( path ):
    
    try:
        
        frameData = GetUgoiraFrameDataJSON( path )
@@ -245,36 +250,34 @@ def GetFramePathsUgoira( path ):
        if frameData is not None:
            
            return [data['file'] for data in frameData]
            
        
    except:
        
        pass
        
    
    return GetFramePathsFromUgoiraZip( path )
    

def GetUgoiraFramePIL( path: str, frameIndex: int ) -> PILImage.Image:
    
    framePaths = GetFramePathsUgoira( path )
    
    frameName = framePaths[frameIndex]
    
    frameFromZip = HydrusArchiveHandling.GetZipAsPath( path, frameName ).open( 'rb' )
    
    return HydrusImageHandling.GeneratePILImage( frameFromZip )
    

def GenerateThumbnailNumPyFromUgoiraPath( path: str, target_resolution: typing.Tuple[int, int], frame_index: int ):
    
    pil_image = GetUgoiraFramePIL( path, frame_index )
    
    thumbnail_pil_image = pil_image.resize( target_resolution, PILImage.LANCZOS )
    
    numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( thumbnail_pil_image )
    
    return numpy_image
@@ -6993,6 +6993,8 @@ class TestClientAPI( unittest.TestCase ):
        self.assertEqual( response.status, 200 )
        
        self.assertEqual( d[ 'path' ], os.path.join( TG.test_controller.db_dir, 'client_files', f'f{hash_hex[:2]}', f'{hash_hex}.png' ) )
        self.assertEqual( d[ 'filetype' ], 'image/png' )
        self.assertEqual( d[ 'size' ], 100 )
        
        # thumbnail path
@@ -263,13 +263,25 @@ class TestStringConverter( unittest.TestCase ):
        
        string_converter = ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_ENCODE, 'hex' ) ] )
        
        self.assertEqual( string_converter.Convert( b'\xe5\xafW\xa6\x87\xf0\x89\x89O^\xce\xdeP\x04\x94X' ), 'e5af57a687f089894f5ecede50049458' )
        self.assertEqual( string_converter.Convert( 'hello world' ), '68656c6c6f20776f726c64' )
        
        #
        
        string_converter = ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_ENCODE, 'base64' ) ] )
        
        self.assertEqual( string_converter.Convert( b'\xe5\xafW\xa6\x87\xf0\x89\x89O^\xce\xdeP\x04\x94X' ), '5a9XpofwiYlPXs7eUASUWA==' )
        self.assertEqual( string_converter.Convert( 'hello world' ), 'aGVsbG8gd29ybGQ=' )
        
        #
        
        string_converter = ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_DECODE, 'hex' ) ] )
        
        self.assertEqual( string_converter.Convert( '68656c6c6f20776f726c64' ), 'hello world' )
        
        #
        
        string_converter = ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_DECODE, 'base64' ) ] )
        
        self.assertEqual( string_converter.Convert( 'aGVsbG8gd29ybGQ=' ), 'hello world' )
        
        #
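The new expected values match the standard library's handling of UTF-8 text, so the str/bytes behaviour these conversions now exercise is easy to sanity-check directly:

    import base64
    
    print( 'hello world'.encode( 'utf-8' ).hex() )                          # 68656c6c6f20776f726c64
    print( base64.b64encode( 'hello world'.encode( 'utf-8' ) ).decode() )   # aGVsbG8gd29ybGQ=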
@@ -184,6 +184,8 @@ class TestSerialisables( unittest.TestCase ):
        duplicate_content_merge_options_merge.SetTagServiceActions( [ ( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, HydrusTags.TagFilter() ) ] )
        duplicate_content_merge_options_merge.SetRatingServiceActions( [ ( TC.LOCAL_RATING_LIKE_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE ), ( TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE ), ( TC.LOCAL_RATING_INCDEC_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE ) ] )
        
        duplicate_content_merge_options_empty = ClientDuplicates.DuplicateContentMergeOptions()
        
        inbox = True
        size = 40960
        mime = HC.IMAGE_JPEG
@@ -430,6 +432,20 @@ class TestSerialisables( unittest.TestCase ):
        
        HF.compare_content_update_packages( self, result, content_update_package )
        
        #
        
        result = duplicate_content_merge_options_empty.ProcessPairIntoContentUpdatePackage( one_media, two_media )
        
        content_update_package = ClientContentUpdates.ContentUpdatePackage()
        
        HF.compare_content_update_packages( self, result, content_update_package )
        
        result = duplicate_content_merge_options_empty.ProcessPairIntoContentUpdatePackage( two_media, one_media )
        
        content_update_package = ClientContentUpdates.ContentUpdatePackage()
        
        HF.compare_content_update_packages( self, result, content_update_package )
        
    
    def test_SERIALISABLE_TYPE_SHORTCUT( self ):
@@ -51,7 +51,7 @@ directory=/opt/hydrus
command=python3 /opt/hydrus/hydrus_client.py --db_dir %(ENV_DB_DIR)s %(ENV_HYDRUS_EXTRA)s
startretries=89
autostart=true
autorestart=true
autorestart=unexpected
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes=0
@@ -1,3 +1 @@
<svg width="1200" height="1227" viewBox="0 0 1200 1227" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M714.163 519.284L1160.89 0H1055.03L667.137 450.887L357.328 0H0L468.492 681.821L0 1226.37H105.866L515.491 750.218L842.672 1226.37H1200L714.137 519.284H714.163ZM569.165 687.828L521.697 619.934L144.011 79.6944H306.615L611.412 515.685L658.88 583.579L1055.08 1150.3H892.476L569.165 687.854V687.828Z" fill="white"/>
</svg>
<svg role="img" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg"><title>X</title><path d="M18.901 1.153h3.68l-8.04 9.19L24 22.846h-7.406l-5.8-7.584-6.638 7.584H.474l8.6-9.83L0 1.154h7.594l5.243 6.932ZM17.61 20.644h2.039L6.486 3.24H4.298Z"/></svg>
Before: 430 B | After: 252 B
@@ -0,0 +1,3 @@
<svg width="1200" height="1227" viewBox="0 0 1200 1227" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M714.163 519.284L1160.89 0H1055.03L667.137 450.887L357.328 0H0L468.492 681.821L0 1226.37H105.866L515.491 750.218L842.672 1226.37H1200L714.137 519.284H714.163ZM569.165 687.828L521.697 619.934L144.011 79.6944H306.615L611.412 515.685L658.88 583.579L1055.08 1150.3H892.476L569.165 687.854V687.828Z" fill="white"/>
</svg>
After: 430 B