Version 591

This commit is contained in: parent 66ed5f1167, commit 7e4b17a415

@ -7,6 +7,42 @@ title: Changelog

!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 591](https://github.com/hydrusnetwork/hydrus/releases/tag/v591)

### misc

* fixed a stupid oversight with last week's "move page focus left/right after closing tab" thing where it was firing even when the page closed was not the current tab!! it now correctly only moves your focus if you close the _current_ tab, not if you just middle click some other one
* fixed the _share->export files_ menu command not showing if you right-clicked on just one file
* cleaned some of the broader thumbnail menu code, separating the 'stuff to show if we have a focus' and 'stuff to show if we have a selection'; the various 'manage' commands now generally show even if there is no current 'focus' in the preview (which happens if you select with ctrl+click or ctrl+a and then right-click in whitespace)
* the 'migrate tags' dialog now allows you to filter the sibling or parent pairs by whether the child/worse or parent/ideal tag has actual mapping counts on an arbitrary tag service. some new unit tests ensure this capability
* fixed an error in the duplicate metadata merge system where if files were exchanging known URLs, and one of those URLs was not actually an URL (e.g. it was garbage data, or human-entered 'location' info), a secondary system that tried to merge correlated domain-based timestamps was throwing an exception
* to reduce comma-confusion, the template for 'show num files and import status' on page names is now "name - (num_files - import_status)"
* the option that governs whether page names have the file count after them (under _options->gui pages_) has a new choice--'show for all pages, but only if greater than zero'--which is now the default for new users

### some boring code cleanup

* broke up the over-coupled 'migrate tags' unit tests into separate content types and the new count-filtering stuff
* cleaned up the 'share' menu construction code--it was messy after some recent rewrites
* added some better error handling around some of the file/thumbnail path fetching/regen routines

### client api

* the client api gets a new permissions state this week: the permissions structure you edit for an access key can now be (and, as a convenient default, starts as) a simple 'permits everything' state. if the permissions are set to 'permit everything', then this overrules all the specific rules and tag search filter gubbins. nice and simple, and a permissions set this way will automatically inherit new permissions in the future. any api access keys that have all the permissions up to 'edit ratings' will be auto-updated to 'permits everything' and you will get an update saying this happened--check your permissions in _review services_ if you need finer control
* added a new permission, `13`, for 'see local paths'
* added `/get_files/file_path`, which fetches the local path of a file. it needs the new permission (there is a brief usage sketch after this list)
* added `/get_files/thumbnail_path`, which fetches the local path of a thumbnail and optionally the filetype of the actual thumb (jpeg or png). it needs the new permission
* the `/request_new_permissions` command now accepts a `permits_everything` bool as a selective alternate to the `basic_permissions` list
* the `/verify_access_key` command now responds with the name of the access key and the new `permits_everything` value
* the API help is updated for the above
* new unit tests test all the above
* the Client API version is now 71
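
As a rough illustration of the new endpoints above, here is a minimal Python sketch (using the `requests` library) that checks an access key and then asks for a file's local path. The `localhost:45869` address assumes the Client API is on its default port, and the access key and file id are the placeholders from the API help--swap in your own.

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae' }  # placeholder key

# the /verify_access_key response now includes 'name' and 'permits_everything'
r = requests.get( f'{API}/verify_access_key', headers = HEADERS )
r.raise_for_status()
info = r.json()
print( info[ 'name' ], info.get( 'permits_everything' ) )

# the new /get_files/file_path call needs the new 'see local paths' permission (13)
r = requests.get( f'{API}/get_files/file_path', headers = HEADERS, params = { 'file_id' : 452158 } )

if r.status_code == 404:
    print( 'file is not stored locally' )
else:
    r.raise_for_status()
    print( r.json()[ 'path' ] )
```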

### client api refactoring

* the main `ClientLocalServerResources` file has been getting too huge (5,000 lines), so I've moved it and `ClientLocalServer` to their own `api` module and broken the Resources file up into core functions, the superclass, and the main verbs
* fixed permissions check for `/manage_popups/update_popup`, which was checking for pages permission rather than popup permission
* did a general linting pass of these easier-to-handle files; cleaned up some silly stuff

## [Version 590](https://github.com/hydrusnetwork/hydrus/releases/tag/v590)

### misc

@ -338,51 +374,3 @@ title: Changelog
* when you enter a wildcard into a Read tag autocomplete, it no longer always delivers the 'always autocompleting' version. so, if you enter `sa*s`, it will suggest `sa*s (wildcard search)` and perhaps `sa*s (any namespace)`, but it will no longer suggest the `sa*s*` variants until you, obviously, actually type that trailing asterisk yourself. I intermittently had no idea what the hell I was doing when I originally developed this stuff
* the 'unnamespaced input gives `(any namespace)` wildcard results' tag display option is now correctly negatively enforced when entering unnamespaced wildcards. previously it was always adding them, and sometimes inserting them at the top of the list. the `(any namespace)` variant is now always below the unnamespaced when both are present
* fixed up a bunch of jank unit tests that were testing this badly
## [Version 581](https://github.com/hydrusnetwork/hydrus/releases/tag/v581)

### misc

* thanks to a user, we have a much improved shimmie parser, for both file and gallery urls, that fetches md5 better, improves gallery navigation, stops grabbing bad urls and related tags by accident, and can handle namespaces for those shimmies that use them. for our purposes, this improves r34h and r34@paheal downloaders by default
* thanks to a user, we have a new 'Dark Blue 1.1' stylesheet with some improvements. the recommendation is: check the different scrollbar styling to see if you prefer the old version
* timedelta widgets now enforce their minimum time on focus-out rather than value change. if it wants at least 20 minutes, you can now type in '5...' in the minutes column without it going nuts. let me know if you discover a way to out-fox the focus-out detection!
* added a checkbox to file import options to govern whether 'import destinations' and 'archive all imports' apply to 'already in db' files. this turns on/off the logic that I made more reliable last week. default is that they do
* added 'do sleep check' to _options->system_ to try some things out on systems that often false-positive this check
* the 'review current network jobs' multi-column list has a new right-click menu to show a bit more debug info about each job: each of its network contexts, how the bandwidth is on each context, if the domain is ok, if it is waiting on a connection error, if it is waiting on serverside bandwidth, if it obeys bandwidth, and if its tokens are ok. if you have been working with me on gallery jobs that just sit on 'starting soon', please check it out and let me know what you see. also, 'review current network jobs' is duplicated to the help->debug menu. I forgot where it was, so let's have it in both places
* on the filename-import tagging panel, the filename and directory checkbox-and-text-edit widgets no longer emit a (sometimes laggy) update signal when typing when the checkbox is unchecked

### janitor stuff

* if you are a repository janitor, right-clicking on any tag shows a new 'admin' menu
* if you have 'change options' permission, you will see 'block x'/'re-allow x' to let you quickly see if tags are blocked and then edit the repository tag filter respectively
* if you have 'mappings petition resolution' permission, you can 'purge' the selected tags, which will delete them from the service entirely. this launches a review window that previews the job and allows adding of more tags using the standard autocomplete interface. when 'fired off', it launches a tag migration job to queue up the full petition/delete upload
* this new 'purge' window is also available from the normal 'administrate services' menu in the main gui
* also under the 'administrate services' is a new 'purge tag filter' command, which applies the existing repository tag filter to its own mappings store, retroactively syncing you to it

### tag filters and migration

* I wrote a database routine that quickly converts a hydrus tag filter into the list of tags within a file and tag search context. this tech will have a variety of uses in the genre of 'hey please delete/fetch/check all these tags'
* to start with, it is now plugged into the tag migration system, so when you set up, say, an 'all known files' tag migration that only looks for a namespace or a bunch of single tags, the 'setup' phase is now massively, massively faster (previously, with something like the PTR, this would be scanning through tens of millions of files for minutes; now it just targets the 50k or whatever using existing tag search tech, usually within less than a second)
* cleaned (KISSed) and reworked the tag filter logic a bit--it can now, underlyingly, handle 'no namespaced tags, except for creator:anything, but still allowing creator:blah' (there is a rough sketch of this rule precedence after this list)
* optimised how tag filters do 'apply unnamespaced rules to namespaced tags' (which happens in some blacklists that want to be expansive)
* improved how the tag filter describes itself in many cases. it should make more grammatical sense and repeat itself less now (e.g. no more 'all tags and namespaced tags and unnamespaced tags' rubbish)
* improved how some tag filter rules are handled across the program, including fixing some edge-case false-positive namespace-rule detection
* deleted some ancient and no longer used tag filtering code
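
To make the 'no namespaced tags, except for creator:anything, but still allowing creator:blah' idea concrete, here is a hypothetical, standalone Python sketch of that rule precedence--specific-tag rules beat namespace rules, which beat the general namespaced/unnamespaced rules. This is an illustration only, not the actual hydrus TagFilter code.

```python
# hypothetical illustration of layered tag filter rules: the most specific rule wins
def tag_allowed( tag, tag_rules, namespace_rules, allow_namespaced, allow_unnamespaced ):
    
    if tag in tag_rules: # a rule for this exact tag, e.g. { 'creator:blah' : True }
        
        return tag_rules[ tag ]
        
    
    namespace = tag.split( ':', 1 )[0] if ':' in tag else ''
    
    if namespace in namespace_rules: # a rule for the whole namespace, e.g. { 'creator' : True }
        
        return namespace_rules[ namespace ]
        
    
    return allow_namespaced if namespace != '' else allow_unnamespaced
    

# 'no namespaced tags, except for creator:anything, but still allowing creator:blah'
rules = dict( tag_rules = { 'creator:blah' : True }, namespace_rules = { 'creator' : True }, allow_namespaced = False, allow_unnamespaced = True )

print( tag_allowed( 'creator:blah', **rules ) )    # True (specific rule)
print( tag_allowed( 'creator:other', **rules ) )   # True (namespace rule)
print( tag_allowed( 'series:whatever', **rules ) ) # False (general namespaced rule)
```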

### boring multi-column list stuff

* did more 'select, sort, and scroll' code cleanup in my multi-column lists, specifically: manage import folders; manage export folders; the string-to-string dict list; edit ngug; edit downloader display (both gugs and url classes, and with a one-shot show/hide choice on a multi-selection rather than asking for each in turn); the special 'duplicate' command of edit shortcut set; and the string converter conversions list (including better select logic on move up/down)
* in keeping with the new general policy of 'when you edit a multi-column list, you just edit one row', the various 'edit' buttons under these lists across the program are now generally only enabled when you have one row selected
* the new 'select, sort, and scroll to new item when a human adds it' tech now _deselects_ the previous selection. let me know if this screws up anywhere (maybe in a hacky multi-add somewhere it'll only select the last added?)
* the aggravating 'clear the focus of the list on most changes bro' jank seems to be fixed--it was a dumb legacy thing
* whenever the multi-column list does its new 'scroll-to' action, it now takes focus to better highlight where we are (rather than, for instance, leaving focus on the 'add' button you just clicked)

### other boring stuff

* worked a little more on a routine that collapses an arbitrary list of strings to a human-presentable summary and replaced the hardcoded hacky version that presents the 'paste queries' result in the 'edit subscription' panel with it
* wrote a similar new routine to collapse an arbitrary list of strings to a single-line summary, appropriate for menu labels and such
* fixed a layout issue in the 'manage downloader display' dialog that caused the 'edit' button on the 'media viewer urls' side to not show, lmaooooooo
* ephemeral 'watcher' and 'gallery' network contexts now describe themselves with a nicer string
* decoupled how some service admin stuff works behind the scenes to make it easier to launch this stuff from different UI widgets
* refactored `ToHumanInt` and the `ToPrettyOrdinalString` guys to a new `HydrusNumbers.py` file
* fixed some bad Client API documentation for the params in `/get_files/search_files`

@ -86,7 +86,7 @@ If the client does not support CBOR, you'll get 406.

## Access and permissions

The client gives access to its API through different 'access keys', which are the typical random 64-character hex used in many other places across hydrus. Each guarantees different permissions such as handling files or tags. Most of the time, a user will provide full access, but do not assume this. If an access key header or parameter is not provided, you will get 401, and all insufficient permission problems will return 403 with appropriate error text.

Access is required for every request. You can provide this as an http header, like so:

@ -94,7 +94,9 @@ Access is required for every request. You can provide this as an http header, like so:

```
Hydrus-Client-API-Access-Key : 0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae
```

Or you can include it in the normal parameters of any request (except _POST /add\_files/add\_file_, which uses the entire POST body for the file's bytes).

For GET, this means including it into the URL parameters:

```
/get_files/thumbnail?file_id=452158&Hydrus-Client-API-Access-Key=0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae
```
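
As a small, non-authoritative sketch of the two options above in Python with the `requests` library (the `localhost:45869` address assumes the default Client API port, and the key is the placeholder from the examples):

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address
KEY = '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae'  # placeholder

# option 1: access key as an http header
r = requests.get( f'{API}/get_files/thumbnail', params = { 'file_id' : 452158 }, headers = { 'Hydrus-Client-API-Access-Key' : KEY } )

# option 2: access key as a normal GET parameter
r = requests.get( f'{API}/get_files/thumbnail', params = { 'file_id' : 452158, 'Hydrus-Client-API-Access-Key' : KEY } )
```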

@ -389,7 +391,8 @@ Required Headers: n/a

Arguments:
:   * `name`: (descriptive name of your access)
    * `permits_everything`: (selective, bool, whether to permit all tasks now and in future)
    * `basic_permissions`: Selective. A JSON-encoded list of numerical permission identifiers you want to request.

The permissions are currently:

@ -406,11 +409,17 @@ Arguments:

* 10 - Manage Popups
* 11 - Edit File Times
* 12 - Commit Pending
* 13 - See Local Paths

``` title="Example request"
/request_new_permissions?name=migrator&permits_everything=true
```

``` title="Example request (for permissions [0,1])"
/request_new_permissions?name=my%20import%20script&basic_permissions=%5B0%2C1%5D
```

Response:
:   Some JSON with your access key, which is 64 characters of hex. This will not be valid until the user approves the request in the client ui.

```json title="Example response"

@ -419,6 +428,8 @@ Response:

}
```

The `permits_everything` overrules all the individual permissions and will encompass any new permissions added in future. It is a convenient catch-all for local-only services where you are running things yourself or the user otherwise implicitly trusts you.
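
A rough sketch of this registration flow in Python with `requests` (the port is the assumed default, the script name is arbitrary, and the `access_key` response field name is assumed here); the returned key only starts working once the user approves the request in the client ui:

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address

# ask for a blanket-permission key rather than listing individual permissions
r = requests.get( f'{API}/request_new_permissions', params = { 'name' : 'migrator', 'permits_everything' : 'true' } )
r.raise_for_status()

access_key = r.json()[ 'access_key' ]  # 64 characters of hex; not valid until approved in the client ui

print( 'ask the user to approve "migrator" in the client, then use:', access_key )
```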

### **GET `/session_key`** { id="session_key" }

_Get a new session key._

@ -455,13 +466,15 @@ Arguments: n/a

Response:
:   401/403/419 and some error text if the provided access/session key is invalid, otherwise some JSON with basic permission info.

```json title="Example response"
{
    "name" : "autotagger",
    "permits_everything" : false,
    "basic_permissions" : [0, 1, 3],
    "human_description" : "API Permissions (autotagger): add tags to files, import files, search for files: Can search: only autotag this"
}
```
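
For instance, a hedged sketch of a client-side check against this response (Python, `requests`, assumed default port and a placeholder key): if `permits_everything` is true there is no need to inspect the `basic_permissions` list at all.

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae' }  # placeholder

info = requests.get( f'{API}/verify_access_key', headers = HEADERS ).json()

if info.get( 'permits_everything' ):
    
    print( f"{info['name']}: full access" )
    
elif 3 in info[ 'basic_permissions' ]:  # 3 is the 'Search for Files' permission
    
    print( f"{info['name']}: can search, within: {info['human_description']}" )
    
else:
    
    print( 'this key cannot search files' )
    
```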

### **GET `/get_service`** { id="get_service" }

@ -2089,7 +2102,6 @@ If you add `detailed_url_information=true`, a new entry, `detailed_known_urls`,

}
```

### **GET `/get_files/file`** { id="get_files_file" }

_Get a file._

@ -2105,7 +2117,7 @@ Arguments :

* `hash`: (selective, a hexadecimal SHA256 hash for the file)
* `download`: (optional, boolean, default `false`)

Only use one of `file_id` or `hash`. As with metadata fetching, you may only use the hash argument if you have access to all files. If you are tag-restricted, you will have to use a file_id in the last search you ran.

``` title="Example request"
/get_files/file?file_id=452158
```

@ -2119,6 +2131,8 @@ Response:

By default, this will set the `Content-Disposition` header to `inline`, which causes a web browser to show the file. If you set `download=true`, it will set it to `attachment`, which triggers the browser to automatically download it (or open the 'save as' dialog) instead.

This stuff supports `Range` requests, so if you want to build a video player, go nuts.
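
To illustrate the `Range` support mentioned above, here is a rough Python sketch that pulls just the first megabyte of a file (assumed default port, placeholder key, and the example file id):

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address
HEADERS = {
    'Hydrus-Client-API-Access-Key' : '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae',  # placeholder
    'Range' : 'bytes=0-1048575'  # first 1 MiB only
}

r = requests.get( f'{API}/get_files/file', params = { 'file_id' : 452158 }, headers = HEADERS )

print( r.status_code )                        # expect 206 Partial Content when the range is honoured
print( len( r.content ) )                     # at most 1048576 bytes
print( r.headers.get( 'Content-Type' ) )
```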

### **GET `/get_files/thumbnail`** { id="get_files_thumbnail" }

_Get a file's thumbnail._

@ -2133,7 +2147,7 @@ Arguments:

* `file_id`: (selective, numerical file id for the file)
* `hash`: (selective, a hexadecimal SHA256 hash for the file)

Only use one. As with metadata fetching, you may only use the hash argument if you have access to all files. If you are tag-restricted, you will have to use a file_id in the last search you ran.

``` title="Example request"
/get_files/thumbnail?file_id=452158
```

@ -2155,6 +2169,84 @@ Response:

!!! note "Size of Defaults"
    If you get a 'default' filetype thumbnail like the pdf or hydrus one, you will be pulling the pngs straight from the hydrus/static folder. They will most likely be 200x200 pixels.

### **GET `/get_files/file_path`** { id="get_files_file_path" }

_Get a local file path._

Restricted access:
:   YES. Search for Files permission and See Local Paths permission needed. Additional search permission limits may apply.

Required Headers: n/a

Arguments:
:   * `file_id`: (selective, numerical file id for the file)
    * `hash`: (selective, a hexadecimal SHA256 hash for the file)

Only use one. As with metadata fetching, you may only use the hash argument if you have access to all files. If you are tag-restricted, you will have to use a file_id in the last search you ran.

``` title="Example request"
/get_files/file_path?file_id=452158
```
``` title="Example request"
/get_files/file_path?hash=7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a
```

Response:
:   The actual path to the file on the host system.

```json title="Example response"
{
    "path" : "D:\\hydrus_files\\f7f\\7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a.jpg"
}
```

This will give 404 if the file is not stored locally (which includes if it _should_ exist but is actually missing from the file store).
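
A short sketch of using this, again in Python with `requests` and the assumed defaults from earlier; it asks for the path and then reads the file straight from disk, falling back when the file is not local:

```python
import requests

API = 'http://localhost:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae' }  # placeholder

r = requests.get( f'{API}/get_files/file_path', params = { 'file_id' : 452158 }, headers = HEADERS )

if r.status_code == 404:
    
    print( 'not stored locally--fall back to /get_files/file instead' )
    
else:
    
    r.raise_for_status()
    
    path = r.json()[ 'path' ]
    
    with open( path, 'rb' ) as f:
        
        print( f'{path}: {len( f.read() )} bytes' )
        
    
```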

### **GET `/get_files/thumbnail_path`** { id="get_files_thumbnail_path" }

_Get a local thumbnail path._

Restricted access:
:   YES. Search for Files permission and See Local Paths permission needed. Additional search permission limits may apply.

Required Headers: n/a

Arguments:
:   * `file_id`: (selective, numerical file id for the file)
    * `hash`: (selective, a hexadecimal SHA256 hash for the file)
    * `include_thumbnail_filetype`: (optional, boolean, defaults to `false`)

Only use one of `file_id` or `hash`. As with metadata fetching, you may only use the hash argument if you have access to all files. If you are tag-restricted, you will have to use a file_id in the last search you ran.

``` title="Example request"
/get_files/thumbnail_path?file_id=452158
```
``` title="Example request"
/get_files/thumbnail_path?hash=7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a&include_thumbnail_filetype=true
```

Response:
:   The actual path to the thumbnail on the host system.

```json title="Example response"
{
    "path" : "D:\\hydrus_files\\f7f\\7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a.thumbnail"
}
```

```json title="Example response with include_thumbnail_filetype=true"
{
    "path" : "C:\\hydrus_thumbs\\f85\\85daaefdaa662761d7cb1b026d7b101e74301be08e50bf09a235794ec8656f79.thumbnail",
    "filetype" : "image/png"
}
```

All thumbnails in hydrus have the .thumbnail file extension and in content are either jpeg (almost always) or png (to handle transparency).

This will 400 if the given file type does not have a thumbnail in hydrus, and it will 404 if there should be a thumbnail but one does not exist and cannot be generated from the source file (which probably would mean that the source file was itself Not Found).
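
Since the `.thumbnail` extension does not tell you the actual image format, here is a quick hedged sketch (same assumed defaults as above) of using `include_thumbnail_filetype` to copy a thumbnail out with a sensible extension:

```python
import shutil

import requests

API = 'http://localhost:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae' }  # placeholder

r = requests.get( f'{API}/get_files/thumbnail_path', params = { 'file_id' : 452158, 'include_thumbnail_filetype' : 'true' }, headers = HEADERS )
r.raise_for_status()

response = r.json()

path = response[ 'path' ]
ext = '.png' if response.get( 'filetype' ) == 'image/png' else '.jpg'  # thumbs are jpeg or png

shutil.copy( path, 'my_thumb' + ext )
```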

### **GET `/get_files/render`** { id="get_files_render" }

_Get an image file as rendered by Hydrus._

@ -24,6 +24,7 @@ CLIENT_API_PERMISSION_EDIT_RATINGS = 9
|
|||
CLIENT_API_PERMISSION_MANAGE_POPUPS = 10
|
||||
CLIENT_API_PERMISSION_EDIT_TIMES = 11
|
||||
CLIENT_API_PERMISSION_COMMIT_PENDING = 12
|
||||
CLIENT_API_PERMISSION_SEE_LOCAL_PATHS = 13
|
||||
|
||||
ALLOWED_PERMISSIONS = (
|
||||
CLIENT_API_PERMISSION_ADD_FILES,
|
||||
|
@ -38,7 +39,8 @@ ALLOWED_PERMISSIONS = (
|
|||
CLIENT_API_PERMISSION_EDIT_RATINGS,
|
||||
CLIENT_API_PERMISSION_MANAGE_POPUPS,
|
||||
CLIENT_API_PERMISSION_EDIT_TIMES,
|
||||
CLIENT_API_PERMISSION_COMMIT_PENDING
|
||||
CLIENT_API_PERMISSION_COMMIT_PENDING,
|
||||
CLIENT_API_PERMISSION_SEE_LOCAL_PATHS
|
||||
)
|
||||
|
||||
basic_permission_to_str_lookup = {}
|
||||
|
@ -56,6 +58,7 @@ basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_EDIT_RATINGS ] = 'edit fil
|
|||
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_POPUPS ] = 'manage popups'
|
||||
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_EDIT_TIMES ] = 'edit file times'
|
||||
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_COMMIT_PENDING ] = 'commit pending'
|
||||
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_SEE_LOCAL_PATHS ] = 'see local file paths'
|
||||
|
||||
SEARCH_RESULTS_CACHE_TIMEOUT = 4 * 3600
|
||||
|
||||
|
@ -235,9 +238,9 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CLIENT_API_PERMISSIONS
|
||||
SERIALISABLE_NAME = 'Client API Permissions'
|
||||
SERIALISABLE_VERSION = 1
|
||||
SERIALISABLE_VERSION = 2
|
||||
|
||||
def __init__( self, name = 'new api permissions', access_key = None, basic_permissions = None, search_tag_filter = None ):
|
||||
def __init__( self, name = 'new api permissions', access_key = None, permits_everything = True, basic_permissions = None, search_tag_filter = None ):
|
||||
|
||||
if access_key is None:
|
||||
|
||||
|
@ -258,6 +261,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
self._access_key = access_key
|
||||
|
||||
self._permits_everything = permits_everything
|
||||
self._basic_permissions = set( basic_permissions )
|
||||
self._search_tag_filter = search_tag_filter
|
||||
|
||||
|
@ -274,17 +278,17 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
serialisable_basic_permissions = list( self._basic_permissions )
|
||||
serialisable_search_tag_filter = self._search_tag_filter.GetSerialisableTuple()
|
||||
|
||||
return ( serialisable_access_key, serialisable_basic_permissions, serialisable_search_tag_filter )
|
||||
return ( serialisable_access_key, self._permits_everything, serialisable_basic_permissions, serialisable_search_tag_filter )
|
||||
|
||||
|
||||
def _HasPermission( self, permission ):
|
||||
|
||||
return permission in self._basic_permissions
|
||||
return self._permits_everything or permission in self._basic_permissions
|
||||
|
||||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
( serialisable_access_key, serialisable_basic_permissions, serialisable_search_tag_filter ) = serialisable_info
|
||||
( serialisable_access_key, self._permits_everything, serialisable_basic_permissions, serialisable_search_tag_filter ) = serialisable_info
|
||||
|
||||
self._access_key = bytes.fromhex( serialisable_access_key )
|
||||
|
||||
|
@ -292,10 +296,43 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
self._search_tag_filter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_search_tag_filter )
|
||||
|
||||
|
||||
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
|
||||
|
||||
if version == 1:
|
||||
|
||||
( serialisable_access_key, serialisable_basic_permissions, serialisable_search_tag_filter ) = old_serialisable_info
|
||||
|
||||
basic_permissions = set( serialisable_basic_permissions )
|
||||
|
||||
# note this isn't everything as of 2024-09, but everything until recently. we want to capture more people for the whole convenience point of doing this
|
||||
permits_everything = basic_permissions.issubset( {
|
||||
CLIENT_API_PERMISSION_ADD_FILES,
|
||||
CLIENT_API_PERMISSION_ADD_TAGS,
|
||||
CLIENT_API_PERMISSION_ADD_URLS,
|
||||
CLIENT_API_PERMISSION_SEARCH_FILES,
|
||||
CLIENT_API_PERMISSION_MANAGE_PAGES,
|
||||
CLIENT_API_PERMISSION_MANAGE_HEADERS,
|
||||
CLIENT_API_PERMISSION_MANAGE_DATABASE,
|
||||
CLIENT_API_PERMISSION_ADD_NOTES,
|
||||
CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS,
|
||||
CLIENT_API_PERMISSION_EDIT_RATINGS
|
||||
} )
|
||||
|
||||
new_serialisable_info = ( serialisable_access_key, permits_everything, serialisable_basic_permissions, serialisable_search_tag_filter )
|
||||
|
||||
return ( 2, new_serialisable_info )
|
||||
|
||||
|
||||
|
||||
def CheckAtLeastOnePermission( self, permissions ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
if self._permits_everything:
|
||||
|
||||
return
|
||||
|
||||
|
||||
if True not in ( self._HasPermission( permission ) for permission in permissions ):
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( 'You need at least one these permissions: {}'.format( ', '.join( basic_permission_to_str_lookup[ permission ] for permission in permissions ) ) )
|
||||
|
@ -307,7 +344,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._search_tag_filter.AllowsEverything():
|
||||
if self._permits_everything or self._search_tag_filter.AllowsEverything():
|
||||
|
||||
return
|
||||
|
||||
|
@ -330,6 +367,11 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._permits_everything:
|
||||
|
||||
return
|
||||
|
||||
|
||||
if not ( self._HasPermission( CLIENT_API_PERMISSION_SEARCH_FILES ) and self._search_tag_filter.AllowsEverything() ):
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( 'You do not have permission to see all files, so you cannot do this.' )
|
||||
|
@ -349,7 +391,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._search_tag_filter.AllowsEverything():
|
||||
if self._permits_everything or self._search_tag_filter.AllowsEverything():
|
||||
|
||||
return
|
||||
|
||||
|
@ -377,7 +419,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._search_tag_filter.AllowsEverything():
|
||||
if self._permits_everything or self._search_tag_filter.AllowsEverything():
|
||||
|
||||
return predicates
|
||||
|
||||
|
@ -406,6 +448,11 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._permits_everything:
|
||||
|
||||
return ''
|
||||
|
||||
|
||||
p_strings = []
|
||||
|
||||
if self._HasPermission( CLIENT_API_PERMISSION_SEARCH_FILES ):
|
||||
|
@ -429,6 +476,11 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
with self._lock:
|
||||
|
||||
if self._permits_everything:
|
||||
|
||||
return 'can do anything'
|
||||
|
||||
|
||||
sorted_perms = sorted( ( basic_permission_to_str_lookup[ p ] for p in self._basic_permissions ) )
|
||||
|
||||
return ', '.join( sorted_perms )
|
||||
|
@ -462,6 +514,14 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
|
||||
|
||||
def PermitsEverything( self ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
return self._permits_everything
|
||||
|
||||
|
||||
|
||||
def SetLastSearchResults( self, hash_ids ):
|
||||
|
||||
with self._lock:
|
||||
|
|
|
@ -306,11 +306,13 @@ network_context_type_description_lookup = {
|
|||
PAGE_FILE_COUNT_DISPLAY_ALL = 0
|
||||
PAGE_FILE_COUNT_DISPLAY_NONE = 1
|
||||
PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS = 2
|
||||
PAGE_FILE_COUNT_DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO = 3
|
||||
|
||||
page_file_count_display_string_lookup = {
|
||||
PAGE_FILE_COUNT_DISPLAY_ALL : 'for all pages',
|
||||
PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS : 'for import pages',
|
||||
PAGE_FILE_COUNT_DISPLAY_NONE : 'for no pages'
|
||||
PAGE_FILE_COUNT_DISPLAY_NONE : 'for no pages',
|
||||
PAGE_FILE_COUNT_DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO : 'for all pages, but only if greater than zero'
|
||||
}
|
||||
|
||||
PAGE_STATE_NORMAL = 0
|
||||
|
|
|
@ -1943,7 +1943,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon
|
|||
context_factory = twisted.internet.ssl.DefaultOpenSSLContextFactory( ssl_key_path, ssl_cert_path, sslmethod )
|
||||
|
||||
|
||||
from hydrus.client.networking import ClientLocalServer
|
||||
from hydrus.client.networking.api import ClientLocalServer
|
||||
|
||||
if service_type == HC.CLIENT_API_SERVICE:
|
||||
|
||||
|
|
|
@ -1800,6 +1800,11 @@ class ClientFilesManager( object ):
|
|||
|
||||
def RegenerateThumbnail( self, media ):
|
||||
|
||||
if not media.GetLocationsManager().IsLocal():
|
||||
|
||||
raise HydrusExceptions.FileMissingException( 'I was called to regenerate a thumbnail from source, but the source file does not think it is in the local file store!' )
|
||||
|
||||
|
||||
hash = media.GetHash()
|
||||
mime = media.GetMime()
|
||||
|
||||
|
|
|
@ -614,15 +614,19 @@ class MigrationSourceHTA( MigrationSource ):
|
|||
|
||||
class MigrationSourceHTPA( MigrationSource ):
|
||||
|
||||
def __init__( self, controller, path, left_tag_filter, right_tag_filter ):
|
||||
def __init__( self, controller, path, content_type, left_tag_filter, right_tag_filter, left_side_needs_count, right_side_needs_count, needs_count_service_key ):
|
||||
|
||||
name = os.path.basename( path )
|
||||
|
||||
super().__init__( controller, name )
|
||||
|
||||
self._path = path
|
||||
self._content_type = content_type
|
||||
self._left_tag_filter = left_tag_filter
|
||||
self._right_tag_filter = right_tag_filter
|
||||
self._left_side_needs_count = left_side_needs_count
|
||||
self._right_side_needs_count = right_side_needs_count
|
||||
self._needs_count_service_key = needs_count_service_key
|
||||
|
||||
self._htpa = None
|
||||
self._iterator = None
|
||||
|
@ -654,6 +658,11 @@ class MigrationSourceHTPA( MigrationSource ):
|
|||
data = [ ( left_tag, right_tag ) for ( left_tag, right_tag ) in data if self._left_tag_filter.TagOK( left_tag ) and self._right_tag_filter.TagOK( right_tag ) ]
|
||||
|
||||
|
||||
if self._left_side_needs_count or self._right_side_needs_count:
|
||||
|
||||
data = self._controller.Read( 'migration_filter_pairs_by_count', data, self._content_type, self._left_side_needs_count, self._right_side_needs_count, self._needs_count_service_key )
|
||||
|
||||
|
||||
return data
|
||||
|
||||
|
||||
|
@ -739,7 +748,7 @@ class MigrationSourceTagServiceMappings( MigrationSource ):
|
|||
|
||||
class MigrationSourceTagServicePairs( MigrationSource ):
|
||||
|
||||
def __init__( self, controller, tag_service_key, content_type, left_tag_filter, right_tag_filter, content_statuses ):
|
||||
def __init__( self, controller, tag_service_key, content_type, left_tag_filter, right_tag_filter, content_statuses, left_side_needs_count, right_side_needs_count, needs_count_service_key ):
|
||||
|
||||
name = controller.services_manager.GetName( tag_service_key )
|
||||
|
||||
|
@ -750,6 +759,9 @@ class MigrationSourceTagServicePairs( MigrationSource ):
|
|||
self._left_tag_filter = left_tag_filter
|
||||
self._right_tag_filter = right_tag_filter
|
||||
self._content_statuses = content_statuses
|
||||
self._left_side_needs_count = left_side_needs_count
|
||||
self._right_side_needs_count = right_side_needs_count
|
||||
self._needs_count_service_key = needs_count_service_key
|
||||
|
||||
self._database_temp_job_name = 'migrate_{}'.format( os.urandom( 16 ).hex() )
|
||||
|
||||
|
@ -768,6 +780,11 @@ class MigrationSourceTagServicePairs( MigrationSource ):
|
|||
self._work_to_do = False
|
||||
|
||||
|
||||
if self._left_side_needs_count or self._right_side_needs_count:
|
||||
|
||||
data = self._controller.Read( 'migration_filter_pairs_by_count', data, self._content_type, self._left_side_needs_count, self._right_side_needs_count, self._needs_count_service_key )
|
||||
|
||||
|
||||
return data
|
||||
|
||||
|
||||
|
|
|
@ -373,7 +373,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
'close_page_focus_goes' : CC.CLOSED_PAGE_FOCUS_GOES_RIGHT,
|
||||
'num_recent_petition_reasons' : 5,
|
||||
'max_page_name_chars' : 20,
|
||||
'page_file_count_display' : CC.PAGE_FILE_COUNT_DISPLAY_ALL,
|
||||
'page_file_count_display' : CC.PAGE_FILE_COUNT_DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO,
|
||||
'network_timeout' : 10,
|
||||
'connection_error_wait_time' : 15,
|
||||
'serverside_bandwidth_wait_time' : 60,
|
||||
|
|
|
@ -5343,6 +5343,60 @@ class DB( HydrusDB.HydrusDB ):
|
|||
self._Execute( 'DROP TABLE {};'.format( database_temp_job_name ) )
|
||||
|
||||
|
||||
def _MigrationFilterPairsByCount( self, pairs, content_type, left_side_needs_count, right_side_needs_count, needs_count_service_key ):
|
||||
|
||||
def tag_has_count( tag_id ):
|
||||
|
||||
results = self.modules_mappings_counts.GetCountsForTag( ClientTags.TAG_DISPLAY_STORAGE, self.modules_services.combined_file_service_id, tag_service_id, tag_id )
|
||||
|
||||
if len( results ) == 0:
|
||||
|
||||
return False
|
||||
|
||||
|
||||
( gumpf_id, current_count, pending_count ) = results[0]
|
||||
|
||||
return current_count + pending_count > 0
|
||||
|
||||
|
||||
tag_service_id = self.modules_services.GetServiceId( needs_count_service_key )
|
||||
|
||||
good_pairs = []
|
||||
|
||||
for ( a, b ) in pairs:
|
||||
|
||||
if left_side_needs_count:
|
||||
|
||||
a_id = self.modules_tags_local_cache.GetTagId( a )
|
||||
|
||||
if not tag_has_count( a_id ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
|
||||
if right_side_needs_count:
|
||||
|
||||
b_id = self.modules_tags_local_cache.GetTagId( b )
|
||||
|
||||
if content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:
|
||||
|
||||
# siblings tests the ideal, not the 'right' alone
|
||||
b_id = self.modules_tag_siblings.GetIdealTagId( ClientTags.TAG_DISPLAY_DISPLAY_IDEAL, tag_service_id, b_id )
|
||||
|
||||
|
||||
if not tag_has_count( b_id ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
|
||||
good_pairs.append( ( a, b ) )
|
||||
|
||||
|
||||
return good_pairs
|
||||
|
||||
|
||||
def _MigrationGetMappings( self, database_temp_job_name, location_context: ClientLocation.LocationContext, tag_service_key, hash_type, tag_filter, content_statuses ):
|
||||
|
||||
time_started_precise = HydrusTime.GetNowPrecise()
|
||||
|
@ -6891,6 +6945,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
elif action == 'media_result': result = self._GetMediaResultFromHash( *args, **kwargs )
|
||||
elif action == 'media_results': result = self._GetMediaResultsFromHashes( *args, **kwargs )
|
||||
elif action == 'media_results_from_ids': result = self._GetMediaResults( *args, **kwargs )
|
||||
elif action == 'migration_filter_pairs_by_count': result = self._MigrationFilterPairsByCount( *args, **kwargs )
|
||||
elif action == 'migration_get_mappings': result = self._MigrationGetMappings( *args, **kwargs )
|
||||
elif action == 'migration_get_pairs': result = self._MigrationGetPairs( *args, **kwargs )
|
||||
elif action == 'missing_repository_update_hashes': result = self.modules_repositories.GetRepositoryUpdateHashesIDoNotHave( *args, **kwargs )
|
||||
|
@ -10803,6 +10858,40 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
if version == 590:
|
||||
|
||||
try:
|
||||
|
||||
client_api_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_CLIENT_API_MANAGER )
|
||||
|
||||
all_permissions = client_api_manager.GetAllPermissions()
|
||||
|
||||
for permissions in all_permissions:
|
||||
|
||||
if permissions.PermitsEverything():
|
||||
|
||||
message = 'Hey, for convenience, at least one of your Client API access permissions was upgraded to "permits everything". This is a simpler state that will auto-inherit new permissions as they are added in future. If you need finer control, please check the settings in "services->review services".'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
break
|
||||
|
||||
|
||||
|
||||
#
|
||||
|
||||
self.modules_serialisable.SetJSONDump( client_api_manager )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
message = 'Trying to check some API stuff failed! Please let hydrus dev know!'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
|
||||
|
||||
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusNumbers.ToHumanInt( version + 1 ) ) )
|
||||
|
||||
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
|
||||
|
|
|
@ -793,7 +793,21 @@ def get_updated_domain_modified_timestamp_datas( destination_media: ClientMedia.
|
|||
|
||||
from hydrus.client.networking import ClientNetworkingFunctions
|
||||
|
||||
domains = { ClientNetworkingFunctions.ConvertURLIntoDomain( url ) for url in urls }
|
||||
domains = set()
|
||||
|
||||
for url in urls:
|
||||
|
||||
try:
|
||||
|
||||
domain = ClientNetworkingFunctions.ConvertURLIntoDomain( url )
|
||||
|
||||
domains.add( domain )
|
||||
|
||||
except:
|
||||
|
||||
continue # not an url in the strict sense, let's skip since this method really wants to be dealing with nice URLs
|
||||
|
||||
|
||||
|
||||
timestamp_datas = []
|
||||
source_timestamp_manager = source_media.GetLocationsManager().GetTimesManager()
|
||||
|
|
|
@ -5941,7 +5941,7 @@ QMenuBar::item { padding: 2px 8px; margin: 0px; }'''
|
|||
|
||||
#
|
||||
|
||||
api_permissions = ClientAPI.APIPermissions( name = 'hydrus test access', basic_permissions = list( ClientAPI.ALLOWED_PERMISSIONS ), search_tag_filter = HydrusTags.TagFilter() )
|
||||
api_permissions = ClientAPI.APIPermissions( name = 'hydrus test access', permits_everything = True )
|
||||
|
||||
access_key = api_permissions.GetAccessKey()
|
||||
|
||||
|
|
|
@ -66,7 +66,7 @@ class CaptureAPIAccessPermissionsRequestPanel( ClientGUIScrolledPanels.ReviewPan
|
|||
|
||||
class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, api_permissions ):
|
||||
def __init__( self, parent, api_permissions: ClientAPI.APIPermissions ):
|
||||
|
||||
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
|
||||
|
||||
|
@ -84,6 +84,8 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
self._permissions_panel = ClientGUICommon.StaticBox( self, 'permissions' )
|
||||
|
||||
self._permits_everything = QW.QCheckBox( self._permissions_panel )
|
||||
|
||||
self._basic_permissions = ClientGUICommon.BetterCheckBoxList( self._permissions_panel )
|
||||
|
||||
for permission in ClientAPI.ALLOWED_PERMISSIONS:
|
||||
|
@ -113,6 +115,8 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
self._name.setText( name )
|
||||
|
||||
self._permits_everything.setChecked( api_permissions.PermitsEverything() )
|
||||
|
||||
basic_permissions = api_permissions.GetBasicPermissions()
|
||||
|
||||
self._basic_permissions.SetValue( basic_permissions )
|
||||
|
@ -126,6 +130,7 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
gridbox = ClientGUICommon.WrapInGrid( self, rows )
|
||||
|
||||
self._permissions_panel.Add( ClientGUICommon.WrapInText( self._permits_everything, self._permissions_panel, 'permits everything: ' ), CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._permissions_panel.Add( self._basic_permissions, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
self._permissions_panel.Add( self._check_all_permissions_button, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._permissions_panel.Add( ClientGUICommon.WrapInText( self._search_tag_filter, self._permissions_panel, 'tag search permissions: ' ), CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
|
@ -141,6 +146,7 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
self._UpdateEnabled()
|
||||
|
||||
self._permits_everything.clicked.connect( self._UpdateEnabled )
|
||||
self._basic_permissions.checkBoxListChanged.connect( self._UpdateEnabled )
|
||||
|
||||
|
||||
|
@ -154,17 +160,30 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
def _UpdateEnabled( self ):
|
||||
|
||||
can_search = ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES in self._basic_permissions.GetValue()
|
||||
|
||||
self._search_tag_filter.setEnabled( can_search )
|
||||
|
||||
self._check_all_permissions_button.setEnabled( False )
|
||||
|
||||
for i in range( self._basic_permissions.count() ):
|
||||
if self._permits_everything.isChecked():
|
||||
|
||||
if not self._basic_permissions.IsChecked( i ):
|
||||
self._basic_permissions.setEnabled( False )
|
||||
self._check_all_permissions_button.setEnabled( False )
|
||||
self._search_tag_filter.setEnabled( False )
|
||||
|
||||
else:
|
||||
|
||||
self._basic_permissions.setEnabled( True )
|
||||
self._check_all_permissions_button.setEnabled( True )
|
||||
self._search_tag_filter.setEnabled( True )
|
||||
|
||||
can_search = ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES in self._basic_permissions.GetValue()
|
||||
|
||||
self._search_tag_filter.setEnabled( can_search )
|
||||
|
||||
self._check_all_permissions_button.setEnabled( False )
|
||||
|
||||
for i in range( self._basic_permissions.count() ):
|
||||
|
||||
self._check_all_permissions_button.setEnabled( True )
|
||||
if not self._basic_permissions.IsChecked( i ):
|
||||
|
||||
self._check_all_permissions_button.setEnabled( True )
|
||||
|
||||
|
||||
|
||||
|
||||
|
@ -174,10 +193,11 @@ class EditAPIPermissionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
name = self._name.text()
|
||||
access_key = bytes.fromhex( self._access_key.text() )
|
||||
|
||||
permits_everything = self._permits_everything.isChecked()
|
||||
basic_permissions = self._basic_permissions.GetValue()
|
||||
search_tag_filter = self._search_tag_filter.GetValue()
|
||||
|
||||
api_permissions = ClientAPI.APIPermissions( name = name, access_key = access_key, basic_permissions = basic_permissions, search_tag_filter = search_tag_filter )
|
||||
api_permissions = ClientAPI.APIPermissions( name = name, access_key = access_key, permits_everything = permits_everything, basic_permissions = basic_permissions, search_tag_filter = search_tag_filter )
|
||||
|
||||
return api_permissions
|
||||
|
||||
|
|
|
@ -1472,8 +1472,6 @@ class CanvasPanel( Canvas ):
|
|||
|
||||
new_options = CG.client_controller.new_options
|
||||
|
||||
advanced_mode = new_options.GetBoolean( 'advanced_mode' )
|
||||
|
||||
if self._current_media is not None:
|
||||
|
||||
services = CG.client_controller.services_manager.GetServices()
|
||||
|
|
|
@ -357,12 +357,17 @@ def AddKnownURLsViewCopyMenu( win, menu, focus_media, num_files_selected: int, s
|
|||
|
||||
# figure out which urls this focused file has
|
||||
|
||||
if focus_media.IsCollection():
|
||||
|
||||
focus_media = focus_media.GetDisplayMedia()
|
||||
|
||||
focus_urls = []
|
||||
|
||||
focus_urls = focus_media.GetLocationsManager().GetURLs()
|
||||
if focus_media is not None:
|
||||
|
||||
if focus_media.IsCollection():
|
||||
|
||||
focus_media = focus_media.GetDisplayMedia()
|
||||
|
||||
|
||||
focus_urls = focus_media.GetLocationsManager().GetURLs()
|
||||
|
||||
|
||||
focus_matched_labels_and_urls = []
|
||||
focus_unmatched_urls = []
|
||||
|
@ -924,32 +929,31 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
|
||||
focused_is_local = focused_media is not None and focused_media.GetLocationsManager().IsLocal()
|
||||
|
||||
selection_is_useful = len( selected_media ) > 0 and not ( len( selected_media ) == 1 and focused_media in selected_media )
|
||||
# i.e. we aren't just clicking on one guy
|
||||
selection_verbs_are_appropriate = len( selected_media ) > 0 and not ( len( selected_media ) == 1 and focused_media in selected_media )
|
||||
|
||||
local_selection = [ m for m in selected_media if m.GetLocationsManager().IsLocal() ]
|
||||
|
||||
local_selection_is_useful = len( local_selection ) > 0 and not ( len( local_selection ) == 1 and focused_media in local_selection )
|
||||
# i.e. we aren't just clicking on one local guy
|
||||
local_selection_verbs_are_appropriate = len( local_selection ) > 0 and not ( len( local_selection ) == 1 and focused_media in local_selection )
|
||||
|
||||
share_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
|
||||
if local_selection_is_useful:
|
||||
if len( local_selection ) > 0:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'export files', 'Export the selected files to an external folder.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_EXPORT_FILES ) )
|
||||
|
||||
ClientGUIMenus.AppendSeparator( share_menu )
|
||||
|
||||
if local_selection_is_useful:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy files', 'Copy these files to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILES, simple_data = CAC.FILE_COMMAND_TARGET_SELECTED_FILES ) )
|
||||
|
||||
|
||||
if local_selection_verbs_are_appropriate:
|
||||
|
||||
if local_selection_is_useful:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy paths', 'Copy these files\' paths to your clipboard, just as raw text.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_PATHS, simple_data = CAC.FILE_COMMAND_TARGET_SELECTED_FILES ) )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy files', 'Copy these files to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILES, simple_data = CAC.FILE_COMMAND_TARGET_SELECTED_FILES ) )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy paths', 'Copy these files\' paths to your clipboard, just as raw text.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_PATHS, simple_data = CAC.FILE_COMMAND_TARGET_SELECTED_FILES ) )
|
||||
|
||||
|
||||
if selection_is_useful:
|
||||
if selection_verbs_are_appropriate:
|
||||
|
||||
ipfs_service_keys_to_num_filenames = collections.Counter()
|
||||
|
||||
|
@ -974,22 +978,19 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
ClientGUIMenus.AppendMenuItem( share_menu, f'copy {name} multihashes ({HydrusNumbers.ToHumanInt(ipfs_service_keys_to_num_filenames[ipfs_service_key])} hashes)', 'Copy the selected files\' multihashes to the clipboard.', win.ProcessApplicationCommand, application_command )
|
||||
|
||||
|
||||
|
||||
if selection_is_useful:
|
||||
copy_hashes_menu = ClientGUIMenus.GenerateMenu( share_menu )
|
||||
|
||||
copy_hash_menu = ClientGUIMenus.GenerateMenu( share_menu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, 'sha256', 'Copy these files\' SHA256 hashes to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha256' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, 'md5', 'Copy these files\' MD5 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'md5' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, 'sha1', 'Copy these files\' SHA1 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha1' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, 'sha512', 'Copy these files\' SHA512 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha512' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, 'sha256', 'Copy these files\' SHA256 hashes to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha256' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, 'md5', 'Copy these files\' MD5 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'md5' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, 'sha1', 'Copy these files\' SHA1 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha1' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, 'sha512', 'Copy these files\' SHA512 hashes to your clipboard. Your client may not know all of these.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'sha512' ) ) )
|
||||
|
||||
blurhashes = [ media.GetFileInfoManager().blurhash for media in selected_media ]
|
||||
blurhashes = [ b for b in blurhashes if b is not None ]
|
||||
|
||||
if len( blurhashes ) > 0:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, f'blurhash ({HydrusNumbers.ToHumanInt(len(blurhashes))} hashes)', 'Copy these files\' blurhashes.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'blurhash' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, f'blurhash ({HydrusNumbers.ToHumanInt(len(blurhashes))} hashes)', 'Copy these files\' blurhashes.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'blurhash' ) ) )
|
||||
|
||||
|
||||
pixel_hashes = [ media.GetFileInfoManager().pixel_hash for media in selected_media ]
|
||||
|
@@ -997,19 +998,13 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
|
||||
if len( pixel_hashes ):
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, f'pixel hashes ({HydrusNumbers.ToHumanInt(len(pixel_hashes))} hashes)', 'Copy these files\' pixel hashes.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'pixel_hash' ) ) )
|
||||
ClientGUIMenus.AppendMenuItem( copy_hashes_menu, f'pixel hashes ({HydrusNumbers.ToHumanInt(len(pixel_hashes))} hashes)', 'Copy these files\' pixel hashes.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_SELECTED_FILES, 'pixel_hash' ) ) )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendMenu( share_menu, copy_hash_menu, 'copy hashes' )
|
||||
|
||||
|
||||
if selection_is_useful:
|
||||
ClientGUIMenus.AppendMenu( share_menu, copy_hashes_menu, 'copy hashes' )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy file ids', 'Copy these files\' internal file/hash_ids.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_ID, simple_data = CAC.FILE_COMMAND_TARGET_SELECTED_FILES ) )
|
||||
|
||||
|
||||
if focused_media is not None and selection_is_useful:
|
||||
|
||||
ClientGUIMenus.AppendSeparator( share_menu )
|
||||
|
||||
|
||||
|
@@ -1017,9 +1012,6 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy file', 'Copy this file to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILES, simple_data = CAC.FILE_COMMAND_TARGET_FOCUSED_FILE ) )
|
||||
|
||||
|
||||
if focused_is_local:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy path', 'Copy this file\'s path to your clipboard, just as raw text.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_PATHS, simple_data = CAC.FILE_COMMAND_TARGET_FOCUSED_FILE ) )
|
||||
|
||||
|
||||
|
@@ -1041,9 +1033,6 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
ClientGUIMenus.AppendMenuItem( share_menu, f'copy {name} multihash ({multihash})', 'Copy the selected file\'s multihash to the clipboard.', win.ProcessApplicationCommand, application_command )
|
||||
|
||||
|
||||
|
||||
if focused_media is not None:
|
||||
|
||||
copy_hash_menu = ClientGUIMenus.GenerateMenu( share_menu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( copy_hash_menu, 'sha256 ({})'.format( focused_media.GetHash().hex() ), 'Copy this file\'s SHA256 hash to your clipboard.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_HASHES, simple_data = ( CAC.FILE_COMMAND_TARGET_FOCUSED_FILE, 'sha256' ) ) )
|
||||
|
@@ -1067,9 +1056,6 @@ def AddShareMenu( win: QW.QWidget, menu: QW.QMenu, focused_media: typing.Optiona
|
|||
|
||||
ClientGUIMenus.AppendMenu( share_menu, copy_hash_menu, 'copy hash' )
|
||||
|
||||
|
||||
if focused_media is not None:
|
||||
|
||||
hash_id_str = HydrusNumbers.ToHumanInt( focused_media.GetHashId() )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( share_menu, 'copy file id ({})'.format( hash_id_str ), 'Copy this file\'s internal file/hash_id.', win.ProcessApplicationCommand, CAC.ApplicationCommand.STATICCreateSimpleCommand( CAC.SIMPLE_COPY_FILE_ID, simple_data = CAC.FILE_COMMAND_TARGET_FOCUSED_FILE ) )
|
||||
|
|
|
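Aside: the AddShareMenu hunks above distinguish commands aimed at the whole selection (`FILE_COMMAND_TARGET_SELECTED_FILES`, the 'copy hashes' submenu) from commands aimed at the focused file (`FILE_COMMAND_TARGET_FOCUSED_FILE`, the 'copy hash' submenu). A minimal sketch of that gating follows; all names are hypothetical and it uses no hydrus imports.

```python
# Sketch only: shows selection-wide entries vs a focused-file entry,
# mirroring the selected/focused split in the menu code above.

def build_copy_entries( selected_hashes, focused_hash ):
    
    entries = []
    
    if len( selected_hashes ) > 0:
        
        # selection-wide submenu, one entry per hash type
        for hash_type in ( 'sha256', 'md5', 'sha1', 'sha512' ):
            
            entries.append( ( f'copy hashes/{hash_type}', ( 'selected', hash_type ) ) )
            
        
    
    if focused_hash is not None:
        
        # focused-file submenu shows the actual hash in the label
        entries.append( ( f'copy hash/sha256 ({focused_hash})', ( 'focused', 'sha256' ) ) )
        
    
    return entries
    

print( build_copy_entries( [ 'abc', 'def' ], 'abc' ) )
```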
@@ -134,6 +134,34 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._migration_source_right_tag_pair_filter = ClientGUITags.TagFilterButton( self._migration_panel, message, tag_filter, label_prefix = 'right: ' )

#

self._pair_have_count_panel = QW.QWidget( self._migration_panel )

self._migration_source_child_must_have_count = QW.QCheckBox( self._pair_have_count_panel )
self._migration_source_worse_must_have_count = QW.QCheckBox( self._pair_have_count_panel )
self._migration_source_parent_must_have_count = QW.QCheckBox( self._pair_have_count_panel )
self._migration_source_ideal_must_have_count = QW.QCheckBox( self._pair_have_count_panel )

self._migration_source_child_must_have_count.setText( 'only if child (left) side has count' )
self._migration_source_worse_must_have_count.setText( 'only if worse (left) side has count' )
self._migration_source_parent_must_have_count.setText( 'only if parent (right) side has count' )
self._migration_source_ideal_must_have_count.setText( 'only if ideal (where right side terminates) has count' )

self._migration_source_child_must_have_count.setToolTip( ClientGUIFunctions.WrapToolTip( 'Only include this pair if the child (left) side has an actual real mappings count in the service.' ) )
self._migration_source_worse_must_have_count.setToolTip( ClientGUIFunctions.WrapToolTip( 'Only include this pair if the worse (left) side has an actual real mappings count in the service.' ) )
self._migration_source_parent_must_have_count.setToolTip( ClientGUIFunctions.WrapToolTip( 'Only include this pair if the parent (right) side has an actual real mappings count in the service.' ) )
self._migration_source_ideal_must_have_count.setToolTip( ClientGUIFunctions.WrapToolTip( 'Only include this pair if the ideal (where the chain of the right side terminates) has an actual real mappings count in the service.' ) )

self._migration_source_have_count_service = ClientGUICommon.BetterChoice( self._pair_have_count_panel )

for service in CG.client_controller.services_manager.GetServices( HC.REAL_TAG_SERVICES ):

self._migration_source_have_count_service.addItem( service.GetName(), service.GetServiceKey() )

#

self._migration_destination = ClientGUICommon.BetterChoice( self._migration_panel )

self._migration_destination_archive_path_button = ClientGUICommon.BetterButton( self._migration_panel, 'no path set', self._SetDestinationArchivePath )

@@ -161,10 +189,21 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

QP.AddToLayout( file_left_vbox, self._migration_source_location_context_button, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( file_left_vbox, self._migration_source_left_tag_pair_filter, CC.FLAGS_EXPAND_BOTH_WAYS )

have_count_vbox = QP.VBoxLayout()

QP.AddToLayout( have_count_vbox, self._migration_source_child_must_have_count, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( have_count_vbox, self._migration_source_worse_must_have_count, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( have_count_vbox, self._migration_source_parent_must_have_count, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( have_count_vbox, self._migration_source_ideal_must_have_count, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( have_count_vbox, ClientGUICommon.WrapInText( self._migration_source_have_count_service, self._pair_have_count_panel, 'in service: ' ), CC.FLAGS_EXPAND_BOTH_WAYS )

self._pair_have_count_panel.setLayout( have_count_vbox )

tag_right_vbox = QP.VBoxLayout()

QP.AddToLayout( tag_right_vbox, self._migration_source_tag_filter, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( tag_right_vbox, self._migration_source_right_tag_pair_filter, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( tag_right_vbox, self._pair_have_count_panel, CC.FLAGS_EXPAND_BOTH_WAYS )

dest_hash_type_hbox = QP.HBoxLayout()

@@ -212,13 +251,13 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

#

message = 'The content from the SOURCE that the FILTER ALLOWS is applied using the ACTION to the DESTINATION.'
message = 'The CONTENT TYPE from the SOURCE that the FILTER ALLOWS is applied using the ACTION to the DESTINATION.'
message += '\n' * 2
message += 'To delete content en masse from one location, select what you want to delete with the filter and set the source and destination the same.'
message += '\n' * 2
message += 'These migrations can be powerful, so be very careful that you understand what you are doing and choose what you want. Large jobs may have a significant initial setup time, during which case the client may hang briefly, but once they start they are pausable or cancellable. If you do want to perform a large action, it is a good idea to back up your database first, just in case you get a result you did not intend.'
message += 'These migrations can be powerful, so be very careful that you understand what you are doing and choose what you want. Large jobs may have a significant initial setup time, during which the client may hang briefly, but once they start they are pausable or cancellable. If you do want to perform a large action, it is a good idea to back up your database first, just in case you get a result you did not intend.'
message += '\n' * 2
message += 'You may need to restart your client to see their effect.'
message += 'You may need to restart your client to see the full effect.'

st = ClientGUICommon.BetterStaticText( self, message )

st.setWordWrap( True )

@@ -244,6 +283,11 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._migration_source_content_status_filter.activated.connect( self._UpdateMigrationControlsActions )
self._migration_source_file_filtering_type.activated.connect( self._UpdateMigrationControlsFileFilter )

self._migration_source_worse_must_have_count.clicked.connect( self._UpdateMigrationControlsPairCount )
self._migration_source_child_must_have_count.clicked.connect( self._UpdateMigrationControlsPairCount )
self._migration_source_ideal_must_have_count.clicked.connect( self._UpdateMigrationControlsPairCount )
self._migration_source_parent_must_have_count.clicked.connect( self._UpdateMigrationControlsPairCount )

def _MigrationGo( self ):

@@ -377,6 +421,19 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

extra_info = ' for "{}" on the left and "{}" on the right'.format( left_s, right_s )

if content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:

left_side_needs_count = self._migration_source_worse_must_have_count.isChecked()
right_side_needs_count = self._migration_source_ideal_must_have_count.isChecked()

else:

left_side_needs_count = self._migration_source_child_must_have_count.isChecked()
right_side_needs_count = self._migration_source_parent_must_have_count.isChecked()

needs_count_service_key = self._migration_source_have_count_service.GetValue()

if source_service_key == self.HTPA_SERVICE_KEY:

if self._source_archive_path is None:

@@ -386,11 +443,11 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

return

source = ClientMigration.MigrationSourceHTPA( CG.client_controller, self._source_archive_path, left_tag_pair_filter, right_tag_pair_filter )
source = ClientMigration.MigrationSourceHTPA( CG.client_controller, self._source_archive_path, content_type, left_tag_pair_filter, right_tag_pair_filter, left_side_needs_count, right_side_needs_count, needs_count_service_key )

else:

source = ClientMigration.MigrationSourceTagServicePairs( CG.client_controller, source_service_key, content_type, left_tag_pair_filter, right_tag_pair_filter, content_statuses )
source = ClientMigration.MigrationSourceTagServicePairs( CG.client_controller, source_service_key, content_type, left_tag_pair_filter, right_tag_pair_filter, content_statuses, left_side_needs_count, right_side_needs_count, needs_count_service_key )

@@ -744,6 +801,8 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._migration_source_content_status_filter.setEnabled( True )

self._migration_source_have_count_service.SetValue( source )

self._UpdateMigrationControlsActions()

@@ -783,6 +842,7 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._migration_source_tag_filter.show()
self._migration_source_left_tag_pair_filter.hide()
self._migration_source_right_tag_pair_filter.hide()
self._pair_have_count_panel.hide()

self._UpdateMigrationControlsFileFilter()

@@ -796,6 +856,16 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._migration_source_tag_filter.hide()
self._migration_source_left_tag_pair_filter.show()
self._migration_source_right_tag_pair_filter.show()
self._pair_have_count_panel.show()

we_siblings = content_type == HC.CONTENT_TYPE_TAG_SIBLINGS

self._migration_source_child_must_have_count.setVisible( not we_siblings )
self._migration_source_parent_must_have_count.setVisible( not we_siblings )

self._migration_source_worse_must_have_count.setVisible( we_siblings )
self._migration_source_ideal_must_have_count.setVisible( we_siblings )

self._migration_source.SetValue( self._service_key )

@@ -803,5 +873,22 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._UpdateMigrationControlsNewSource()
self._UpdateMigrationControlsNewDestination()
self._UpdateMigrationControlsPairCount()

def _UpdateMigrationControlsPairCount( self ):

content_type = self._migration_content_type.GetValue()

if content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:

enable_it = self._migration_source_worse_must_have_count.isChecked() or self._migration_source_ideal_must_have_count.isChecked()

else:

enable_it = self._migration_source_child_must_have_count.isChecked() or self._migration_source_parent_must_have_count.isChecked()

self._migration_source_have_count_service.setEnabled( enable_it )
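Aside: the checkboxes and service dropdown added above feed a 'must have count' filter on sibling/parent pairs. A minimal sketch of that filter follows, under the assumption that the counts are just a tag-to-count mapping; names are hypothetical and this is not the ClientMigration implementation.

```python
# Sketch only: keep a pair when each side that 'needs count' has a
# nonzero mappings count in the chosen service's counts.

def filter_pairs( pairs, counts, left_side_needs_count, right_side_needs_count ):
    
    result = []
    
    for ( left, right ) in pairs:
        
        if left_side_needs_count and counts.get( left, 0 ) == 0:
            
            continue
            
        
        if right_side_needs_count and counts.get( right, 0 ) == 0:
            
            continue
            
        
        result.append( ( left, right ) )
        
    
    return result
    

pairs = [ ( 'samus_aran', 'character:samus aran' ), ( 'lara', 'character:lara croft' ) ]
counts = { 'character:samus aran' : 1200, 'samus_aran' : 3 }

# keep only pairs whose right/ideal side actually has mappings
print( filter_pairs( pairs, counts, False, True ) )
```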
@@ -876,6 +876,7 @@ class Page( QW.QWidget ):

self._management_panel.REPEATINGPageUpdate()

directions_for_notebook_tabs = {}

directions_for_notebook_tabs[ CC.DIRECTION_UP ] = QW.QTabWidget.North

@@ -1051,6 +1052,8 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

we_are_closing_the_current_focus = index == self.currentIndex()

page.CleanBeforeClose()

page_key = page.GetPageKey()

@@ -1070,22 +1073,25 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

self._controller.pub( 'notify_closed_page', page )

focus_goes_to = self._controller.new_options.GetInteger( 'close_page_focus_goes' )

new_page_focus = None

if focus_goes_to == CC.CLOSED_PAGE_FOCUS_GOES_LEFT:

if we_are_closing_the_current_focus:

new_page_focus = index - 1

focus_goes_to = self._controller.new_options.GetInteger( 'close_page_focus_goes' )

elif focus_goes_to == CC.CLOSED_PAGE_FOCUS_GOES_RIGHT:

new_page_focus = None

new_page_focus = index

if focus_goes_to == CC.CLOSED_PAGE_FOCUS_GOES_LEFT:

new_page_focus = index - 1

elif focus_goes_to == CC.CLOSED_PAGE_FOCUS_GOES_RIGHT:

new_page_focus = index

if new_page_focus is not None and index >= 0 or index <= self.count() - 1 and new_page_focus != self.currentIndex():

self.setCurrentIndex( new_page_focus )

if new_page_focus is not None and index >= 0 or index <= self.count() - 1 and new_page_focus != self.currentIndex():

self.setCurrentIndex( new_page_focus )

self._UpdatePreviousPageIndex()
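Aside: the hunk above wraps the existing focus-move logic in the new `we_are_closing_the_current_focus` check, so focus only moves when the closed tab was the current one. A minimal standalone sketch of that decision follows; the explicit clamp at the end is an illustration-only addition, not the real method's bounds check.

```python
# Sketch only: where focus should go after a tab close, given the option.

GOES_LEFT = 'left'
GOES_RIGHT = 'right'

def new_focus_after_close( closed_index, current_index, count_after_close, focus_goes_to ):
    
    if closed_index != current_index:
        
        return None # we closed some other tab; focus stays where it is
        
    
    if count_after_close == 0:
        
        return None # nothing left to focus
        
    
    if focus_goes_to == GOES_LEFT:
        
        new_index = closed_index - 1
        
    else:
        
        new_index = closed_index # the tab that slid into the closed slot
        
    
    # clamp to the remaining tabs (illustrative)
    return min( max( new_index, 0 ), count_after_close - 1 )
    

print( new_focus_after_close( 2, 2, 4, GOES_LEFT ) )  # 1
print( new_focus_after_close( 2, 5, 4, GOES_LEFT ) )  # None, closed tab was not current
```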
@@ -1359,7 +1365,11 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

( num_files, ( num_value, num_range ) ) = page.GetNumFileSummary()

if page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ALL or ( page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS and page.IsImporter() ):

a = page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ALL
b = page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS and page.IsImporter()
c = page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO and num_files > 0

if a or b or c:

num_string += HydrusNumbers.ToHumanInt( num_files )

@@ -1370,7 +1380,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

if len( num_string ) > 0:

num_string += ', '
num_string += ' - '

num_string += HydrusNumbers.ValueRangeToPrettyString( num_value, num_range )

@@ -1379,7 +1389,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

if len( num_string ) > 0:

page_name += ' (' + num_string + ')'
page_name += f' ({num_string})'

safe_page_name = ClientGUIFunctions.EscapeMnemonics( page_name )
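Aside: the `a or b or c` test above adds the new 'show for all pages, but only if greater than zero' display mode. A minimal standalone version follows; the constant values are made up here and merely stand in for the `CC.PAGE_FILE_COUNT_DISPLAY_*` enums.

```python
# Sketch only: decide whether a page tab name gets its file count appended.

DISPLAY_ALL = 0
DISPLAY_ONLY_IMPORTERS = 1
DISPLAY_NONE = 2
DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO = 3

def should_show_count( display_mode, num_files, page_is_importer ):
    
    a = display_mode == DISPLAY_ALL
    b = display_mode == DISPLAY_ONLY_IMPORTERS and page_is_importer
    c = display_mode == DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO and num_files > 0
    
    return a or b or c
    

print( should_show_count( DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO, 0, False ) )   # False
print( should_show_count( DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO, 12, False ) )  # True
```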
@@ -3824,312 +3824,314 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
num_inbox = self.GetNumInbox()
|
||||
num_archive = self.GetNumArchive()
|
||||
|
||||
any_selected = num_selected > 0
|
||||
multiple_selected = num_selected > 1
|
||||
|
||||
menu = ClientGUIMenus.GenerateMenu( self.window() )
|
||||
|
||||
if self._HasFocusSingleton():
|
||||
# variables
|
||||
|
||||
collections_selected = True in ( media.IsCollection() for media in self._selected_media )
|
||||
|
||||
services_manager = CG.client_controller.services_manager
|
||||
|
||||
services = services_manager.GetServices()
|
||||
|
||||
file_repositories = [ service for service in services if service.GetServiceType() == HC.FILE_REPOSITORY ]
|
||||
|
||||
ipfs_services = [ service for service in services if service.GetServiceType() == HC.IPFS ]
|
||||
|
||||
local_ratings_services = [ service for service in services if service.GetServiceType() in HC.RATINGS_SERVICES ]
|
||||
|
||||
i_can_post_ratings = len( local_ratings_services ) > 0
|
||||
|
||||
local_media_file_service_keys = { service.GetServiceKey() for service in services if service.GetServiceType() == HC.LOCAL_FILE_DOMAIN }
|
||||
|
||||
file_repository_service_keys = { repository.GetServiceKey() for repository in file_repositories }
|
||||
upload_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_CREATE ) }
|
||||
petition_resolve_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_MODERATE ) }
|
||||
petition_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_PETITION ) } - petition_resolve_permission_file_service_keys
|
||||
user_manage_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_ACCOUNTS, HC.PERMISSION_ACTION_MODERATE ) }
|
||||
ipfs_service_keys = { service.GetServiceKey() for service in ipfs_services }
|
||||
|
||||
if multiple_selected:
|
||||
|
||||
focus_singleton = self._GetFocusSingleton()
|
||||
download_phrase = 'download all possible selected'
|
||||
rescind_download_phrase = 'cancel downloads for all possible selected'
|
||||
upload_phrase = 'upload all possible selected to'
|
||||
rescind_upload_phrase = 'rescind pending selected uploads to'
|
||||
petition_phrase = 'petition all possible selected for removal from'
|
||||
rescind_petition_phrase = 'rescind selected petitions for'
|
||||
remote_delete_phrase = 'delete all possible selected from'
|
||||
modify_account_phrase = 'modify the accounts that uploaded selected to'
|
||||
|
||||
# variables
|
||||
pin_phrase = 'pin all to'
|
||||
rescind_pin_phrase = 'rescind pin to'
|
||||
unpin_phrase = 'unpin all from'
|
||||
rescind_unpin_phrase = 'rescind unpin from'
|
||||
|
||||
collections_selected = True in ( media.IsCollection() for media in self._selected_media )
|
||||
archive_phrase = 'archive selected'
|
||||
inbox_phrase = 're-inbox selected'
|
||||
local_delete_phrase = 'delete selected'
|
||||
delete_physically_phrase = 'delete selected physically now'
|
||||
undelete_phrase = 'undelete selected'
|
||||
clear_deletion_phrase = 'clear deletion record for selected'
|
||||
|
||||
services_manager = CG.client_controller.services_manager
|
||||
else:
|
||||
|
||||
services = services_manager.GetServices()
|
||||
download_phrase = 'download'
|
||||
rescind_download_phrase = 'cancel download'
|
||||
upload_phrase = 'upload to'
|
||||
rescind_upload_phrase = 'rescind pending upload to'
|
||||
petition_phrase = 'petition for removal from'
|
||||
rescind_petition_phrase = 'rescind petition for'
|
||||
remote_delete_phrase = 'delete from'
|
||||
modify_account_phrase = 'modify the account that uploaded this to'
|
||||
|
||||
service_keys_to_names = { service.GetServiceKey() : service.GetName() for service in services }
|
||||
pin_phrase = 'pin to'
|
||||
rescind_pin_phrase = 'rescind pin to'
|
||||
unpin_phrase = 'unpin from'
|
||||
rescind_unpin_phrase = 'rescind unpin from'
|
||||
|
||||
file_repositories = [ service for service in services if service.GetServiceType() == HC.FILE_REPOSITORY ]
|
||||
archive_phrase = 'archive'
|
||||
inbox_phrase = 're-inbox'
|
||||
local_delete_phrase = 'delete'
|
||||
delete_physically_phrase = 'delete physically now'
|
||||
undelete_phrase = 'undelete'
|
||||
clear_deletion_phrase = 'clear deletion record'
|
||||
|
||||
ipfs_services = [ service for service in services if service.GetServiceType() == HC.IPFS ]
|
||||
|
||||
# info about the files
|
||||
|
||||
remote_service_keys = CG.client_controller.services_manager.GetRemoteFileServiceKeys()
|
||||
|
||||
groups_of_current_remote_service_keys = [ locations_manager.GetCurrent().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_pending_remote_service_keys = [ locations_manager.GetPending().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_petitioned_remote_service_keys = [ locations_manager.GetPetitioned().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_deleted_remote_service_keys = [ locations_manager.GetDeleted().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
|
||||
current_remote_service_keys = HydrusLists.MassUnion( groups_of_current_remote_service_keys )
|
||||
pending_remote_service_keys = HydrusLists.MassUnion( groups_of_pending_remote_service_keys )
|
||||
petitioned_remote_service_keys = HydrusLists.MassUnion( groups_of_petitioned_remote_service_keys )
|
||||
deleted_remote_service_keys = HydrusLists.MassUnion( groups_of_deleted_remote_service_keys )
|
||||
|
||||
common_current_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_current_remote_service_keys )
|
||||
common_pending_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_pending_remote_service_keys )
|
||||
common_petitioned_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_petitioned_remote_service_keys )
|
||||
common_deleted_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_deleted_remote_service_keys )
|
||||
|
||||
disparate_current_remote_service_keys = current_remote_service_keys - common_current_remote_service_keys
|
||||
disparate_pending_remote_service_keys = pending_remote_service_keys - common_pending_remote_service_keys
|
||||
disparate_petitioned_remote_service_keys = petitioned_remote_service_keys - common_petitioned_remote_service_keys
|
||||
disparate_deleted_remote_service_keys = deleted_remote_service_keys - common_deleted_remote_service_keys
|
||||
|
||||
pending_file_service_keys = pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
petitioned_file_service_keys = petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
common_current_file_service_keys = common_current_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_pending_file_service_keys = common_pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_petitioned_file_service_keys = common_petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_deleted_file_service_keys = common_deleted_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
disparate_current_file_service_keys = disparate_current_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_pending_file_service_keys = disparate_pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_petitioned_file_service_keys = disparate_petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_deleted_file_service_keys = disparate_deleted_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
pending_ipfs_service_keys = pending_remote_service_keys.intersection( ipfs_service_keys )
|
||||
petitioned_ipfs_service_keys = petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
common_current_ipfs_service_keys = common_current_remote_service_keys.intersection( ipfs_service_keys )
|
||||
common_pending_ipfs_service_keys = common_pending_file_service_keys.intersection( ipfs_service_keys )
|
||||
common_petitioned_ipfs_service_keys = common_petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
disparate_current_ipfs_service_keys = disparate_current_remote_service_keys.intersection( ipfs_service_keys )
|
||||
disparate_pending_ipfs_service_keys = disparate_pending_remote_service_keys.intersection( ipfs_service_keys )
|
||||
disparate_petitioned_ipfs_service_keys = disparate_petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
# valid commands for the files
|
||||
|
||||
current_file_service_keys = set()
|
||||
|
||||
uploadable_file_service_keys = set()
|
||||
|
||||
downloadable_file_service_keys = set()
|
||||
|
||||
petitionable_file_service_keys = set()
|
||||
|
||||
deletable_file_service_keys = set()
|
||||
|
||||
modifyable_file_service_keys = set()
|
||||
|
||||
pinnable_ipfs_service_keys = set()
|
||||
|
||||
unpinnable_ipfs_service_keys = set()
|
||||
|
||||
remote_file_service_keys = ipfs_service_keys.union( file_repository_service_keys )
|
||||
|
||||
for locations_manager in selected_locations_managers:
|
||||
|
||||
local_ratings_services = [ service for service in services if service.GetServiceType() in HC.RATINGS_SERVICES ]
|
||||
current = locations_manager.GetCurrent()
|
||||
deleted = locations_manager.GetDeleted()
|
||||
pending = locations_manager.GetPending()
|
||||
petitioned = locations_manager.GetPetitioned()
|
||||
|
||||
i_can_post_ratings = len( local_ratings_services ) > 0
|
||||
# ALL
|
||||
|
||||
local_media_file_service_keys = { service.GetServiceKey() for service in services if service.GetServiceType() == HC.LOCAL_FILE_DOMAIN }
|
||||
current_file_service_keys.update( current )
|
||||
|
||||
file_repository_service_keys = { repository.GetServiceKey() for repository in file_repositories }
|
||||
upload_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_CREATE ) }
|
||||
petition_resolve_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_MODERATE ) }
|
||||
petition_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_PETITION ) } - petition_resolve_permission_file_service_keys
|
||||
user_manage_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_ACCOUNTS, HC.PERMISSION_ACTION_MODERATE ) }
|
||||
ipfs_service_keys = { service.GetServiceKey() for service in ipfs_services }
|
||||
# FILE REPOS
|
||||
|
||||
if multiple_selected:
|
||||
# we can upload (set pending) to a repo_id when we have permission, a file is local, not current, not pending, and either ( not deleted or we_can_overrule )
|
||||
|
||||
if locations_manager.IsLocal():
|
||||
|
||||
download_phrase = 'download all possible selected'
|
||||
rescind_download_phrase = 'cancel downloads for all possible selected'
|
||||
upload_phrase = 'upload all possible selected to'
|
||||
rescind_upload_phrase = 'rescind pending selected uploads to'
|
||||
petition_phrase = 'petition all possible selected for removal from'
|
||||
rescind_petition_phrase = 'rescind selected petitions for'
|
||||
remote_delete_phrase = 'delete all possible selected from'
|
||||
modify_account_phrase = 'modify the accounts that uploaded selected to'
|
||||
cannot_upload_to = current.union( pending ).union( deleted.difference( petition_resolve_permission_file_service_keys ) )
|
||||
|
||||
pin_phrase = 'pin all to'
|
||||
rescind_pin_phrase = 'rescind pin to'
|
||||
unpin_phrase = 'unpin all from'
|
||||
rescind_unpin_phrase = 'rescind unpin from'
|
||||
can_upload_to = upload_permission_file_service_keys.difference( cannot_upload_to )
|
||||
|
||||
archive_phrase = 'archive selected'
|
||||
inbox_phrase = 're-inbox selected'
|
||||
local_delete_phrase = 'delete selected'
|
||||
delete_physically_phrase = 'delete selected physically now'
|
||||
undelete_phrase = 'undelete selected'
|
||||
clear_deletion_phrase = 'clear deletion record for selected'
|
||||
|
||||
else:
|
||||
|
||||
download_phrase = 'download'
|
||||
rescind_download_phrase = 'cancel download'
|
||||
upload_phrase = 'upload to'
|
||||
rescind_upload_phrase = 'rescind pending upload to'
|
||||
petition_phrase = 'petition for removal from'
|
||||
rescind_petition_phrase = 'rescind petition for'
|
||||
remote_delete_phrase = 'delete from'
|
||||
modify_account_phrase = 'modify the account that uploaded this to'
|
||||
|
||||
pin_phrase = 'pin to'
|
||||
rescind_pin_phrase = 'rescind pin to'
|
||||
unpin_phrase = 'unpin from'
|
||||
rescind_unpin_phrase = 'rescind unpin from'
|
||||
|
||||
archive_phrase = 'archive'
|
||||
inbox_phrase = 're-inbox'
|
||||
local_delete_phrase = 'delete'
|
||||
delete_physically_phrase = 'delete physically now'
|
||||
undelete_phrase = 'undelete'
|
||||
clear_deletion_phrase = 'clear deletion record'
|
||||
uploadable_file_service_keys.update( can_upload_to )
|
||||
|
||||
|
||||
# info about the files
|
||||
# we can download (set pending to local) when we have permission, a file is not local and not already downloading and current
|
||||
|
||||
remote_service_keys = CG.client_controller.services_manager.GetRemoteFileServiceKeys()
|
||||
|
||||
groups_of_current_remote_service_keys = [ locations_manager.GetCurrent().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_pending_remote_service_keys = [ locations_manager.GetPending().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_petitioned_remote_service_keys = [ locations_manager.GetPetitioned().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
groups_of_deleted_remote_service_keys = [ locations_manager.GetDeleted().intersection( remote_service_keys ) for locations_manager in selected_locations_managers ]
|
||||
|
||||
current_remote_service_keys = HydrusLists.MassUnion( groups_of_current_remote_service_keys )
|
||||
pending_remote_service_keys = HydrusLists.MassUnion( groups_of_pending_remote_service_keys )
|
||||
petitioned_remote_service_keys = HydrusLists.MassUnion( groups_of_petitioned_remote_service_keys )
|
||||
deleted_remote_service_keys = HydrusLists.MassUnion( groups_of_deleted_remote_service_keys )
|
||||
|
||||
common_current_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_current_remote_service_keys )
|
||||
common_pending_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_pending_remote_service_keys )
|
||||
common_petitioned_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_petitioned_remote_service_keys )
|
||||
common_deleted_remote_service_keys = HydrusLists.IntelligentMassIntersect( groups_of_deleted_remote_service_keys )
|
||||
|
||||
disparate_current_remote_service_keys = current_remote_service_keys - common_current_remote_service_keys
|
||||
disparate_pending_remote_service_keys = pending_remote_service_keys - common_pending_remote_service_keys
|
||||
disparate_petitioned_remote_service_keys = petitioned_remote_service_keys - common_petitioned_remote_service_keys
|
||||
disparate_deleted_remote_service_keys = deleted_remote_service_keys - common_deleted_remote_service_keys
|
||||
|
||||
pending_file_service_keys = pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
petitioned_file_service_keys = petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
common_current_file_service_keys = common_current_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_pending_file_service_keys = common_pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_petitioned_file_service_keys = common_petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
common_deleted_file_service_keys = common_deleted_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
disparate_current_file_service_keys = disparate_current_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_pending_file_service_keys = disparate_pending_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_petitioned_file_service_keys = disparate_petitioned_remote_service_keys.intersection( file_repository_service_keys )
|
||||
disparate_deleted_file_service_keys = disparate_deleted_remote_service_keys.intersection( file_repository_service_keys )
|
||||
|
||||
pending_ipfs_service_keys = pending_remote_service_keys.intersection( ipfs_service_keys )
|
||||
petitioned_ipfs_service_keys = petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
common_current_ipfs_service_keys = common_current_remote_service_keys.intersection( ipfs_service_keys )
|
||||
common_pending_ipfs_service_keys = common_pending_file_service_keys.intersection( ipfs_service_keys )
|
||||
common_petitioned_ipfs_service_keys = common_petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
disparate_current_ipfs_service_keys = disparate_current_remote_service_keys.intersection( ipfs_service_keys )
|
||||
disparate_pending_ipfs_service_keys = disparate_pending_remote_service_keys.intersection( ipfs_service_keys )
|
||||
disparate_petitioned_ipfs_service_keys = disparate_petitioned_remote_service_keys.intersection( ipfs_service_keys )
|
||||
|
||||
# valid commands for the files
|
||||
|
||||
current_file_service_keys = set()
|
||||
|
||||
uploadable_file_service_keys = set()
|
||||
|
||||
downloadable_file_service_keys = set()
|
||||
|
||||
petitionable_file_service_keys = set()
|
||||
|
||||
deletable_file_service_keys = set()
|
||||
|
||||
modifyable_file_service_keys = set()
|
||||
|
||||
pinnable_ipfs_service_keys = set()
|
||||
|
||||
unpinnable_ipfs_service_keys = set()
|
||||
|
||||
remote_file_service_keys = ipfs_service_keys.union( file_repository_service_keys )
|
||||
|
||||
for locations_manager in selected_locations_managers:
|
||||
if not locations_manager.IsLocal() and not locations_manager.IsDownloading():
|
||||
|
||||
current = locations_manager.GetCurrent()
|
||||
deleted = locations_manager.GetDeleted()
|
||||
pending = locations_manager.GetPending()
|
||||
petitioned = locations_manager.GetPetitioned()
|
||||
|
||||
# ALL
|
||||
|
||||
current_file_service_keys.update( current )
|
||||
|
||||
# FILE REPOS
|
||||
|
||||
# we can upload (set pending) to a repo_id when we have permission, a file is local, not current, not pending, and either ( not deleted or we_can_overrule )
|
||||
|
||||
if locations_manager.IsLocal():
|
||||
|
||||
cannot_upload_to = current.union( pending ).union( deleted.difference( petition_resolve_permission_file_service_keys ) )
|
||||
|
||||
can_upload_to = upload_permission_file_service_keys.difference( cannot_upload_to )
|
||||
|
||||
uploadable_file_service_keys.update( can_upload_to )
|
||||
|
||||
|
||||
# we can download (set pending to local) when we have permission, a file is not local and not already downloading and current
|
||||
|
||||
if not locations_manager.IsLocal() and not locations_manager.IsDownloading():
|
||||
|
||||
downloadable_file_service_keys.update( remote_file_service_keys.intersection( current ) )
|
||||
|
||||
|
||||
# we can petition when we have permission and a file is current and it is not already petitioned
|
||||
|
||||
petitionable_file_service_keys.update( ( petition_permission_file_service_keys & current ) - petitioned )
|
||||
|
||||
# we can delete remote when we have permission and a file is current and it is not already petitioned
|
||||
|
||||
deletable_file_service_keys.update( ( petition_resolve_permission_file_service_keys & current ) - petitioned )
|
||||
|
||||
# we can modify users when we have permission and the file is current or deleted
|
||||
|
||||
modifyable_file_service_keys.update( user_manage_permission_file_service_keys & ( current | deleted ) )
|
||||
|
||||
# IPFS
|
||||
|
||||
# we can pin if a file is local, not current, not pending
|
||||
|
||||
if locations_manager.IsLocal():
|
||||
|
||||
pinnable_ipfs_service_keys.update( ipfs_service_keys - current - pending )
|
||||
|
||||
|
||||
# we can unpin a file if it is current and not petitioned
|
||||
|
||||
unpinnable_ipfs_service_keys.update( ( ipfs_service_keys & current ) - petitioned )
|
||||
downloadable_file_service_keys.update( remote_file_service_keys.intersection( current ) )
|
||||
|
||||
|
||||
# do the actual menu
|
||||
# we can petition when we have permission and a file is current and it is not already petitioned
|
||||
|
||||
selection_info_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
petitionable_file_service_keys.update( ( petition_permission_file_service_keys & current ) - petitioned )
|
||||
|
||||
selected_files_string = ClientMedia.GetMediasFiletypeSummaryString( self._selected_media )
|
||||
# we can delete remote when we have permission and a file is current and it is not already petitioned
|
||||
|
||||
selection_info_menu_label = f'{selected_files_string}, {self._GetPrettyTotalSize( only_selected = True )}'
|
||||
deletable_file_service_keys.update( ( petition_resolve_permission_file_service_keys & current ) - petitioned )
|
||||
|
||||
if multiple_selected:
|
||||
# we can modify users when we have permission and the file is current or deleted
|
||||
|
||||
modifyable_file_service_keys.update( user_manage_permission_file_service_keys & ( current | deleted ) )
|
||||
|
||||
# IPFS
|
||||
|
||||
# we can pin if a file is local, not current, not pending
|
||||
|
||||
if locations_manager.IsLocal():
|
||||
|
||||
pretty_total_duration = self._GetPrettyTotalDuration( only_selected = True )
|
||||
pinnable_ipfs_service_keys.update( ipfs_service_keys - current - pending )
|
||||
|
||||
if pretty_total_duration != '':
|
||||
|
||||
selection_info_menu_label += ', {}'.format( pretty_total_duration )
|
||||
|
||||
|
||||
# we can unpin a file if it is current and not petitioned
|
||||
|
||||
unpinnable_ipfs_service_keys.update( ( ipfs_service_keys & current ) - petitioned )
|
||||
|
||||
|
||||
# do the actual menu
|
||||
|
||||
selection_info_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
|
||||
selected_files_string = ClientMedia.GetMediasFiletypeSummaryString( self._selected_media )
|
||||
|
||||
selection_info_menu_label = f'{selected_files_string}, {self._GetPrettyTotalSize( only_selected = True )}'
|
||||
|
||||
if multiple_selected:
|
||||
|
||||
pretty_total_duration = self._GetPrettyTotalDuration( only_selected = True )
|
||||
|
||||
if pretty_total_duration != '':
|
||||
|
||||
else:
|
||||
selection_info_menu_label += ', {}'.format( pretty_total_duration )
|
||||
|
||||
# TODO: move away from this hell function GetPrettyInfoLines and set the timestamp tooltips to the be the full ISO time
|
||||
|
||||
else:
|
||||
|
||||
# TODO: move away from this hell function GetPrettyInfoLines and set the timestamp tooltips to the be the full ISO time
|
||||
|
||||
if self._HasFocusSingleton():
|
||||
|
||||
focus_singleton = self._GetFocusSingleton()
|
||||
|
||||
pretty_info_lines = list( focus_singleton.GetPrettyInfoLines() )
|
||||
|
||||
ClientGUIMediaMenus.AddPrettyInfoLines( selection_info_menu, pretty_info_lines )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendSeparator( selection_info_menu )
|
||||
|
||||
ClientGUIMenus.AppendSeparator( selection_info_menu )
|
||||
|
||||
ClientGUIMediaMenus.AddFileViewingStatsMenu( selection_info_menu, self._selected_media )
|
||||
|
||||
if len( disparate_current_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddFileViewingStatsMenu( selection_info_menu, self._selected_media )
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_current_file_service_keys, 'some uploaded to' )
|
||||
|
||||
if len( disparate_current_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_current_file_service_keys, 'some uploaded to' )
|
||||
|
||||
|
||||
if multiple_selected and len( common_current_file_service_keys ) > 0:
|
||||
|
||||
if multiple_selected and len( common_current_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_current_file_service_keys, 'selected uploaded to' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_current_file_service_keys, 'selected uploaded to' )
|
||||
|
||||
if len( disparate_pending_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_pending_file_service_keys, 'some pending to' )
|
||||
|
||||
|
||||
if len( disparate_pending_file_service_keys ) > 0:
|
||||
|
||||
if len( common_pending_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_pending_file_service_keys, 'pending to' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_pending_file_service_keys, 'some pending to' )
|
||||
|
||||
if len( disparate_petitioned_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_petitioned_file_service_keys, 'some petitioned for removal from' )
|
||||
|
||||
|
||||
if len( common_pending_file_service_keys ) > 0:
|
||||
|
||||
if len( common_petitioned_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_petitioned_file_service_keys, 'petitioned for removal from' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_pending_file_service_keys, 'pending to' )
|
||||
|
||||
if len( disparate_deleted_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_deleted_file_service_keys, 'some deleted from' )
|
||||
|
||||
|
||||
if len( disparate_petitioned_file_service_keys ) > 0:
|
||||
|
||||
if len( common_deleted_file_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_deleted_file_service_keys, 'deleted from' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_petitioned_file_service_keys, 'some petitioned for removal from' )
|
||||
|
||||
if len( disparate_current_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_current_ipfs_service_keys, 'some pinned to' )
|
||||
|
||||
|
||||
if len( common_petitioned_file_service_keys ) > 0:
|
||||
|
||||
if multiple_selected and len( common_current_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_current_ipfs_service_keys, 'selected pinned to' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_petitioned_file_service_keys, 'petitioned for removal from' )
|
||||
|
||||
if len( disparate_pending_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_pending_ipfs_service_keys, 'some to be pinned to' )
|
||||
|
||||
|
||||
if len( disparate_deleted_file_service_keys ) > 0:
|
||||
|
||||
if len( common_pending_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_pending_ipfs_service_keys, 'to be pinned to' )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_deleted_file_service_keys, 'some deleted from' )
|
||||
|
||||
if len( disparate_petitioned_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_petitioned_ipfs_service_keys, 'some to be unpinned from' )
|
||||
|
||||
|
||||
if len( common_deleted_file_service_keys ) > 0:
|
||||
|
||||
if len( common_petitioned_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_petitioned_ipfs_service_keys, unpin_phrase )
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_deleted_file_service_keys, 'deleted from' )
|
||||
|
||||
|
||||
if len( disparate_current_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_current_ipfs_service_keys, 'some pinned to' )
|
||||
|
||||
|
||||
if multiple_selected and len( common_current_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_current_ipfs_service_keys, 'selected pinned to' )
|
||||
|
||||
|
||||
if len( disparate_pending_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_pending_ipfs_service_keys, 'some to be pinned to' )
|
||||
|
||||
|
||||
if len( common_pending_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_pending_ipfs_service_keys, 'to be pinned to' )
|
||||
|
||||
|
||||
if len( disparate_petitioned_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, disparate_petitioned_ipfs_service_keys, 'some to be unpinned from' )
|
||||
|
||||
|
||||
if len( common_petitioned_ipfs_service_keys ) > 0:
|
||||
|
||||
ClientGUIMediaMenus.AddServiceKeyLabelsToMenu( selection_info_menu, common_petitioned_ipfs_service_keys, unpin_phrase )
|
||||
|
||||
|
||||
if any_selected:
|
||||
|
||||
if len( selection_info_menu.actions() ) == 0:
|
||||
|
||||
|
@@ -4142,8 +4144,8 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
ClientGUIMenus.AppendMenu( menu, selection_info_menu, selection_info_menu_label )
|
||||
|
||||
|
||||
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, 'refresh', 'Refresh the current search.', self.refreshQuery.emit )
|
||||
|
||||
|
@@ -4171,9 +4173,7 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
|
||||
earliest_index = self._sorted_media.index( ordered_selected_media[0] )
|
||||
|
||||
num_selected = len( self._selected_media )
|
||||
|
||||
selection_is_contiguous = num_selected > 0 and self._sorted_media.index( ordered_selected_media[-1] ) - earliest_index == num_selected - 1
|
||||
selection_is_contiguous = any_selected and self._sorted_media.index( ordered_selected_media[-1] ) - earliest_index == num_selected - 1
|
||||
|
||||
AddMoveMenu( self, menu, self._selected_media, self._sorted_media, self._focused_media, selection_is_contiguous, earliest_index )
|
||||
|
||||
|
@@ -4191,59 +4191,55 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
|
||||
|
||||
|
||||
if self._HasFocusSingleton():
|
||||
if selection_has_inbox:
|
||||
|
||||
focus_singleton = self._GetFocusSingleton()
|
||||
ClientGUIMenus.AppendMenuItem( menu, archive_phrase, 'Archive the selected files.', self._Archive )
|
||||
|
||||
if selection_has_inbox:
|
||||
|
||||
if selection_has_archive:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, inbox_phrase, 'Put the selected files back in the inbox.', self._Inbox )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
|
||||
user_command_deletable_file_service_keys = local_media_file_service_keys.union( [ CC.LOCAL_UPDATE_SERVICE_KEY ] )
|
||||
|
||||
local_file_service_keys_we_are_in = sorted( current_file_service_keys.intersection( user_command_deletable_file_service_keys ), key = CG.client_controller.services_manager.GetName )
|
||||
|
||||
if len( local_file_service_keys_we_are_in ) > 0:
|
||||
|
||||
delete_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
|
||||
for file_service_key in local_file_service_keys_we_are_in:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, archive_phrase, 'Archive the selected files.', self._Archive )
|
||||
service_name = CG.client_controller.services_manager.GetName( file_service_key )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( delete_menu, f'from {service_name}', f'Delete the selected files from {service_name}.', self._Delete, file_service_key )
|
||||
|
||||
|
||||
if selection_has_archive:
|
||||
ClientGUIMenus.AppendMenu( menu, delete_menu, local_delete_phrase )
|
||||
|
||||
|
||||
if selection_has_trash:
|
||||
|
||||
if selection_has_local_file_domain:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, inbox_phrase, 'Put the selected files back in the inbox.', self._Inbox )
|
||||
ClientGUIMenus.AppendMenuItem( menu, 'delete trash physically now', 'Completely delete the selected trashed files, forcing an immediate physical delete from your hard drive.', self._Delete, CC.COMBINED_LOCAL_FILE_SERVICE_KEY, only_those_in_file_service_key = CC.TRASH_SERVICE_KEY )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
ClientGUIMenus.AppendMenuItem( menu, delete_physically_phrase, 'Completely delete the selected files, forcing an immediate physical delete from your hard drive.', self._Delete, CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
|
||||
ClientGUIMenus.AppendMenuItem( menu, undelete_phrase, 'Restore the selected files back to \'my files\'.', self._Undelete )
|
||||
|
||||
user_command_deletable_file_service_keys = local_media_file_service_keys.union( [ CC.LOCAL_UPDATE_SERVICE_KEY ] )
|
||||
|
||||
if selection_has_deletion_record:
|
||||
|
||||
local_file_service_keys_we_are_in = sorted( current_file_service_keys.intersection( user_command_deletable_file_service_keys ), key = CG.client_controller.services_manager.GetName )
|
||||
ClientGUIMenus.AppendMenuItem( menu, clear_deletion_phrase, 'Clear the deletion record for these files, allowing them to reimport even if previously deleted files are set to be discarded.', self._ClearDeleteRecord )
|
||||
|
||||
if len( local_file_service_keys_we_are_in ) > 0:
|
||||
|
||||
delete_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
|
||||
for file_service_key in local_file_service_keys_we_are_in:
|
||||
|
||||
service_name = CG.client_controller.services_manager.GetName( file_service_key )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( delete_menu, f'from {service_name}', f'Delete the selected files from {service_name}.', self._Delete, file_service_key )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, delete_menu, local_delete_phrase )
|
||||
|
||||
|
||||
if selection_has_trash:
|
||||
|
||||
if selection_has_local_file_domain:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, 'delete trash physically now', 'Completely delete the selected trashed files, forcing an immediate physical delete from your hard drive.', self._Delete, CC.COMBINED_LOCAL_FILE_SERVICE_KEY, only_those_in_file_service_key = CC.TRASH_SERVICE_KEY )
|
||||
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, delete_physically_phrase, 'Completely delete the selected files, forcing an immediate physical delete from your hard drive.', self._Delete, CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
|
||||
ClientGUIMenus.AppendMenuItem( menu, undelete_phrase, 'Restore the selected files back to \'my files\'.', self._Undelete )
|
||||
|
||||
|
||||
if selection_has_deletion_record:
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( menu, clear_deletion_phrase, 'Clear the deletion record for these files, allowing them to reimport even if previously deleted files are set to be discarded.', self._ClearDeleteRecord )
|
||||
|
||||
|
||||
#
|
||||
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
|
||||
ClientGUIMenus.AppendSeparator( menu )
|
||||
|
||||
if any_selected:
|
||||
|
||||
manage_menu = ClientGUIMenus.GenerateMenu( menu )
|
||||
|
||||
|
@@ -4254,7 +4250,14 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
ClientGUIMenus.AppendMenuItem( manage_menu, 'ratings', 'Manage ratings for the selected files.', self._ManageRatings )
|
||||
|
||||
|
||||
num_notes = focus_singleton.GetNotesManager().GetNumNotes()
|
||||
num_notes = 0
|
||||
|
||||
if self._HasFocusSingleton():
|
||||
|
||||
focus_singleton = self._GetFocusSingleton()
|
||||
|
||||
num_notes = focus_singleton.GetNotesManager().GetNumNotes()
|
||||
|
||||
|
||||
notes_str = 'notes'
|
||||
|
||||
|
@@ -4268,7 +4271,12 @@ class MediaPanelThumbnails( MediaPanel ):
|
|||
ClientGUIMenus.AppendMenuItem( manage_menu, 'times', 'Edit the timestamps for your files.', self._ManageTimestamps )
|
||||
ClientGUIMenus.AppendMenuItem( manage_menu, 'force filetype', 'Force your files to appear as a different filetype.', ClientGUIMediaModalActions.SetFilesForcedFiletypes, self, self._selected_media )
|
||||
|
||||
ClientGUIMediaMenus.AddDuplicatesMenu( self, manage_menu, self._location_context, focus_singleton, num_selected, collections_selected )
|
||||
if self._HasFocusSingleton():
|
||||
|
||||
focus_singleton = self._GetFocusSingleton()
|
||||
|
||||
ClientGUIMediaMenus.AddDuplicatesMenu( self, manage_menu, self._location_context, focus_singleton, num_selected, collections_selected )
|
||||
|
||||
|
||||
regen_menu = ClientGUIMenus.GenerateMenu( manage_menu )
|
||||
|
||||
|
|
|
@@ -1755,9 +1755,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

self._page_file_count_display = ClientGUICommon.BetterChoice( self._page_names_panel )

for display_type in ( CC.PAGE_FILE_COUNT_DISPLAY_ALL, CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS, CC.PAGE_FILE_COUNT_DISPLAY_NONE ):
for display_type in ( CC.PAGE_FILE_COUNT_DISPLAY_ALL, CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS, CC.PAGE_FILE_COUNT_DISPLAY_NONE, CC.PAGE_FILE_COUNT_DISPLAY_ALL_BUT_ONLY_IF_GREATER_THAN_ZERO ):

self._page_file_count_display.addItem( CC.page_file_count_display_string_lookup[ display_type], display_type )
self._page_file_count_display.addItem( CC.page_file_count_display_string_lookup[ display_type ], display_type )

self._import_page_progress_display = QW.QCheckBox( self._page_names_panel )

@@ -1846,7 +1846,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

rows.append( ( 'In new page chooser, show "all my files" if appropriate: ', self._show_all_my_files_on_page_chooser ) )
rows.append( ( 'In new page chooser, show "local files": ', self._show_local_files_on_page_chooser ) )
rows.append( ( 'Put new page tabs on: ', self._default_new_page_goes ) )
rows.append( ( 'When closing tabs, move focus: ', self._close_page_focus_goes ) )
rows.append( ( 'When closing the current tab, move focus: ', self._close_page_focus_goes ) )
rows.append( ( 'Notebook tab alignment: ', self._notebook_tab_alignment ) )
rows.append( ( 'Selection chases dropped page after drag and drop: ', self._page_drop_chase_normally ) )
rows.append( ( ' With shift held down?: ', self._page_drop_chase_with_shift ) )
@@ -1987,7 +1987,7 @@ class FileSeedCacheStatus( HydrusSerialisable.SerialisableBase ):

if num_ignored > 0:

simple_status_strings.append( '{}Ig'.format( HydrusNumbers.ToHumanInt( num_ignored ) ) )

simple_status_strings.append( '{}Ign'.format( HydrusNumbers.ToHumanInt( num_ignored ) ) )

show_deleted_on_file_seed_short_summary = CG.client_controller.new_options.GetBoolean( 'show_deleted_on_file_seed_short_summary' )
@@ -1,167 +0,0 @@
from twisted.web.resource import NoResource
|
||||
|
||||
from hydrus.core.networking import HydrusServer
|
||||
|
||||
from hydrus.client.networking import ClientLocalServerResources
|
||||
|
||||
class HydrusClientService( HydrusServer.HydrusService ):
|
||||
|
||||
def __init__( self, service, allow_non_local_connections ):
|
||||
|
||||
if allow_non_local_connections:
|
||||
|
||||
self._client_requests_domain = HydrusServer.REMOTE_DOMAIN
|
||||
|
||||
else:
|
||||
|
||||
self._client_requests_domain = HydrusServer.LOCAL_DOMAIN
|
||||
|
||||
|
||||
HydrusServer.HydrusService.__init__( self, service )
|
||||
|
||||
|
||||
|
||||
class HydrusServiceClientAPI( HydrusClientService ):
|
||||
|
||||
def _InitRoot( self ):
|
||||
|
||||
root = HydrusClientService._InitRoot( self )
|
||||
|
||||
root.putChild( b'api_version', ClientLocalServerResources.HydrusResourceClientAPIVersion( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'request_new_permissions', ClientLocalServerResources.HydrusResourceClientAPIPermissionsRequest( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'session_key', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAccountSessionKey( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'verify_access_key', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAccountVerify( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'get_services', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetServices( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'get_service', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetService( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_files = NoResource()
|
||||
|
||||
root.putChild( b'add_files', add_files )
|
||||
|
||||
add_files.putChild( b'add_file', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesAddFile( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'clear_file_deletion_record', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesClearDeletedFileRecord( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'delete_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesDeleteFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'undelete_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesUndeleteFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'migrate_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesMigrateFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'archive_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesArchiveFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'unarchive_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesUnarchiveFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'generate_hashes', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddFilesGenerateHashes( self._service, self._client_requests_domain ) )
|
||||
|
||||
edit_ratings = NoResource()
|
||||
|
||||
root.putChild( b'edit_ratings', edit_ratings )
|
||||
|
||||
edit_ratings.putChild( b'set_rating', ClientLocalServerResources.HydrusResourceClientAPIRestrictedEditRatingsSetRating( self._service, self._client_requests_domain ) )
|
||||
|
||||
edit_times = NoResource()
|
||||
|
||||
root.putChild( b'edit_times', edit_times )
|
||||
|
||||
edit_times.putChild( b'set_time', ClientLocalServerResources.HydrusResourceClientAPIRestrictedEditTimesSetTime( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_tags = NoResource()
|
||||
|
||||
root.putChild( b'add_tags', add_tags )
|
||||
|
||||
add_tags.putChild( b'add_tags', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsAddTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'clean_tags', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsCleanTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'search_tags', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsSearchTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'get_siblings_and_parents', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsGetTagSiblingsParents( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_urls = NoResource()
|
||||
|
||||
root.putChild( b'add_urls', add_urls )
|
||||
|
||||
add_urls.putChild( b'get_url_info', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsGetURLInfo( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'get_url_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsGetURLFiles( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'add_url', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsImportURL( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'associate_url', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsAssociateURL( self._service, self._client_requests_domain ) )
|
||||
|
||||
get_files = NoResource()
|
||||
|
||||
root.putChild( b'get_files', get_files )
|
||||
|
||||
get_files.putChild( b'search_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesSearchFiles( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file_metadata', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesFileMetadata( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file_hashes', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesFileHashes( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetFile( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'thumbnail', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'render', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetRenderedFile( self._service, self._client_requests_domain) )
|
||||
|
||||
add_notes = NoResource()
|
||||
|
||||
root.putChild( b'add_notes', add_notes )
|
||||
|
||||
add_notes.putChild( b'set_notes', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddNotesSetNotes( self._service, self._client_requests_domain ) )
|
||||
add_notes.putChild( b'delete_notes', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddNotesDeleteNotes( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_cookies = NoResource()
|
||||
|
||||
root.putChild( b'manage_cookies', manage_cookies )
|
||||
|
||||
manage_cookies.putChild( b'get_cookies', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesGetCookies( self._service, self._client_requests_domain ) )
|
||||
manage_cookies.putChild( b'set_cookies', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetCookies( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_database = NoResource()
|
||||
|
||||
root.putChild( b'manage_database', manage_database )
|
||||
|
||||
manage_database.putChild( b'mr_bones', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseMrBones( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'lock_on', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseLockOn( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'lock_off', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseLockOff( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'get_client_options', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageDatabaseGetClientOptions( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_services = NoResource()
|
||||
|
||||
root.putChild( b'manage_services', manage_services )
|
||||
|
||||
manage_services.putChild( b'get_pending_counts', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesPendingCounts( self._service, self._client_requests_domain ) )
|
||||
manage_services.putChild( b'commit_pending', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesCommitPending( self._service, self._client_requests_domain ) )
|
||||
manage_services.putChild( b'forget_pending', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageServicesForgetPending( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_file_relationships = NoResource()
|
||||
|
||||
root.putChild( b'manage_file_relationships', manage_file_relationships )
|
||||
|
||||
manage_file_relationships.putChild( b'get_file_relationships', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRelationships( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_potentials_count', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialsCount( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_potential_pairs', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialPairs( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_random_potentials', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRandomPotentials( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'remove_potentials', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsRemovePotentials( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'set_file_relationships', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'set_kings', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetKings( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_headers = NoResource()
|
||||
|
||||
root.putChild( b'manage_headers', manage_headers )
|
||||
|
||||
manage_headers.putChild( b'set_user_agent', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( self._service, self._client_requests_domain ) )
|
||||
manage_headers.putChild( b'get_headers', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesGetHeaders( self._service, self._client_requests_domain ) )
|
||||
manage_headers.putChild( b'set_headers', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_pages = NoResource()
|
||||
|
||||
root.putChild( b'manage_pages', manage_pages )
|
||||
|
||||
manage_pages.putChild( b'add_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesAddFiles( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'focus_page', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesFocusPage( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'get_pages', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesGetPages( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'get_page_info', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'refresh_page', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesRefreshPage( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_popups = NoResource()
|
||||
|
||||
root.putChild( b'manage_popups', manage_popups )
|
||||
|
||||
manage_popups.putChild( b'get_popups', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsGetPopups( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'cancel_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsCancelPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'dismiss_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'finish_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'finish_and_dismiss_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsFinishAndDismissPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'call_user_callable', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'add_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsAddPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'update_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( self._service, self._client_requests_domain ) )
|
||||
|
||||
return root
|
||||
|
||||
|
File diff suppressed because it is too large
@@ -0,0 +1,182 @@
from twisted.web.resource import NoResource
|
||||
|
||||
from hydrus.core.networking import HydrusServer
|
||||
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesAccess
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesAddFiles
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesAddNotes
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesAddTags
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesAddURLs
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesEditRatings
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesEditTimes
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesGetFiles
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManageCookies
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManageDatabase
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManageFileRelationships
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManagePages
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManagePopups
|
||||
from hydrus.client.networking.api import ClientLocalServerResourcesManageServices
|
||||
|
||||
class HydrusClientService( HydrusServer.HydrusService ):
|
||||
|
||||
def __init__( self, service, allow_non_local_connections ):
|
||||
|
||||
if allow_non_local_connections:
|
||||
|
||||
self._client_requests_domain = HydrusServer.REMOTE_DOMAIN
|
||||
|
||||
else:
|
||||
|
||||
self._client_requests_domain = HydrusServer.LOCAL_DOMAIN
|
||||
|
||||
|
||||
HydrusServer.HydrusService.__init__( self, service )
|
||||
|
||||
|
||||
|
||||
class HydrusServiceClientAPI( HydrusClientService ):
|
||||
|
||||
def _InitRoot( self ):
|
||||
|
||||
root = HydrusClientService._InitRoot( self )
|
||||
|
||||
root.putChild( b'api_version', ClientLocalServerResourcesAccess.HydrusResourceClientAPIVersion( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'request_new_permissions', ClientLocalServerResourcesAccess.HydrusResourceClientAPIPermissionsRequest( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'session_key', ClientLocalServerResourcesAccess.HydrusResourceClientAPIRestrictedAccountSessionKey( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'verify_access_key', ClientLocalServerResourcesAccess.HydrusResourceClientAPIRestrictedAccountVerify( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'get_services', ClientLocalServerResourcesAccess.HydrusResourceClientAPIRestrictedGetServices( self._service, self._client_requests_domain ) )
|
||||
root.putChild( b'get_service', ClientLocalServerResourcesAccess.HydrusResourceClientAPIRestrictedGetService( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_files = NoResource()
|
||||
|
||||
root.putChild( b'add_files', add_files )
|
||||
|
||||
add_files.putChild( b'add_file', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesAddFile( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'clear_file_deletion_record', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesClearDeletedFileRecord( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'delete_files', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesDeleteFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'undelete_files', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesUndeleteFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'migrate_files', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesMigrateFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'archive_files', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesArchiveFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'unarchive_files', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesUnarchiveFiles( self._service, self._client_requests_domain ) )
|
||||
add_files.putChild( b'generate_hashes', ClientLocalServerResourcesAddFiles.HydrusResourceClientAPIRestrictedAddFilesGenerateHashes( self._service, self._client_requests_domain ) )
|
||||
|
||||
edit_ratings = NoResource()
|
||||
|
||||
root.putChild( b'edit_ratings', edit_ratings )
|
||||
|
||||
edit_ratings.putChild( b'set_rating', ClientLocalServerResourcesEditRatings.HydrusResourceClientAPIRestrictedEditRatingsSetRating( self._service, self._client_requests_domain ) )
|
||||
|
||||
edit_times = NoResource()
|
||||
|
||||
root.putChild( b'edit_times', edit_times )
|
||||
|
||||
edit_times.putChild( b'set_time', ClientLocalServerResourcesEditTimes.HydrusResourceClientAPIRestrictedEditTimesSetTime( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_tags = NoResource()
|
||||
|
||||
root.putChild( b'add_tags', add_tags )
|
||||
|
||||
add_tags.putChild( b'add_tags', ClientLocalServerResourcesAddTags.HydrusResourceClientAPIRestrictedAddTagsAddTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'clean_tags', ClientLocalServerResourcesAddTags.HydrusResourceClientAPIRestrictedAddTagsCleanTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'search_tags', ClientLocalServerResourcesAddTags.HydrusResourceClientAPIRestrictedAddTagsSearchTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'get_siblings_and_parents', ClientLocalServerResourcesAddTags.HydrusResourceClientAPIRestrictedAddTagsGetTagSiblingsParents( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_urls = NoResource()
|
||||
|
||||
root.putChild( b'add_urls', add_urls )
|
||||
|
||||
add_urls.putChild( b'get_url_info', ClientLocalServerResourcesAddURLs.HydrusResourceClientAPIRestrictedAddURLsGetURLInfo( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'get_url_files', ClientLocalServerResourcesAddURLs.HydrusResourceClientAPIRestrictedAddURLsGetURLFiles( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'add_url', ClientLocalServerResourcesAddURLs.HydrusResourceClientAPIRestrictedAddURLsImportURL( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'associate_url', ClientLocalServerResourcesAddURLs.HydrusResourceClientAPIRestrictedAddURLsAssociateURL( self._service, self._client_requests_domain ) )
|
||||
|
||||
get_files = NoResource()
|
||||
|
||||
root.putChild( b'get_files', get_files )
|
||||
|
||||
get_files.putChild( b'search_files', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesSearchFiles( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file_metadata', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesFileMetadata( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file_hashes', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesFileHashes( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesGetFile( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'file_path', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesGetFilePath( self._service, self._client_requests_domain) )
|
||||
get_files.putChild( b'thumbnail', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( self._service, self._client_requests_domain ) )
|
||||
get_files.putChild( b'thumbnail_path', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesGetThumbnailPath( self._service, self._client_requests_domain) )
|
||||
get_files.putChild( b'render', ClientLocalServerResourcesGetFiles.HydrusResourceClientAPIRestrictedGetFilesGetRenderedFile( self._service, self._client_requests_domain) )
|
||||
|
||||
add_notes = NoResource()
|
||||
|
||||
root.putChild( b'add_notes', add_notes )
|
||||
|
||||
add_notes.putChild( b'set_notes', ClientLocalServerResourcesAddNotes.HydrusResourceClientAPIRestrictedAddNotesSetNotes( self._service, self._client_requests_domain ) )
|
||||
add_notes.putChild( b'delete_notes', ClientLocalServerResourcesAddNotes.HydrusResourceClientAPIRestrictedAddNotesDeleteNotes( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_cookies = NoResource()
|
||||
|
||||
root.putChild( b'manage_cookies', manage_cookies )
|
||||
|
||||
manage_cookies.putChild( b'get_cookies', ClientLocalServerResourcesManageCookies.HydrusResourceClientAPIRestrictedManageCookiesGetCookies( self._service, self._client_requests_domain ) )
|
||||
manage_cookies.putChild( b'set_cookies', ClientLocalServerResourcesManageCookies.HydrusResourceClientAPIRestrictedManageCookiesSetCookies( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_database = NoResource()
|
||||
|
||||
root.putChild( b'manage_database', manage_database )
|
||||
|
||||
manage_database.putChild( b'mr_bones', ClientLocalServerResourcesManageDatabase.HydrusResourceClientAPIRestrictedManageDatabaseMrBones( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'lock_on', ClientLocalServerResourcesManageDatabase.HydrusResourceClientAPIRestrictedManageDatabaseLockOn( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'lock_off', ClientLocalServerResourcesManageDatabase.HydrusResourceClientAPIRestrictedManageDatabaseLockOff( self._service, self._client_requests_domain ) )
|
||||
manage_database.putChild( b'get_client_options', ClientLocalServerResourcesManageDatabase.HydrusResourceClientAPIRestrictedManageDatabaseGetClientOptions( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_services = NoResource()
|
||||
|
||||
root.putChild( b'manage_services', manage_services )
|
||||
|
||||
manage_services.putChild( b'get_pending_counts', ClientLocalServerResourcesManageServices.HydrusResourceClientAPIRestrictedManageServicesPendingCounts( self._service, self._client_requests_domain ) )
|
||||
manage_services.putChild( b'commit_pending', ClientLocalServerResourcesManageServices.HydrusResourceClientAPIRestrictedManageServicesCommitPending( self._service, self._client_requests_domain ) )
|
||||
manage_services.putChild( b'forget_pending', ClientLocalServerResourcesManageServices.HydrusResourceClientAPIRestrictedManageServicesForgetPending( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_file_relationships = NoResource()
|
||||
|
||||
root.putChild( b'manage_file_relationships', manage_file_relationships )
|
||||
|
||||
manage_file_relationships.putChild( b'get_file_relationships', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRelationships( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_potentials_count', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialsCount( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_potential_pairs', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialPairs( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'get_random_potentials', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRandomPotentials( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'remove_potentials', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsRemovePotentials( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'set_file_relationships', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships( self._service, self._client_requests_domain ) )
|
||||
manage_file_relationships.putChild( b'set_kings', ClientLocalServerResourcesManageFileRelationships.HydrusResourceClientAPIRestrictedManageFileRelationshipsSetKings( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_headers = NoResource()
|
||||
|
||||
root.putChild( b'manage_headers', manage_headers )
|
||||
|
||||
manage_headers.putChild( b'set_user_agent', ClientLocalServerResourcesManageCookies.HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( self._service, self._client_requests_domain ) )
|
||||
manage_headers.putChild( b'get_headers', ClientLocalServerResourcesManageCookies.HydrusResourceClientAPIRestrictedManageCookiesGetHeaders( self._service, self._client_requests_domain ) )
|
||||
manage_headers.putChild( b'set_headers', ClientLocalServerResourcesManageCookies.HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_pages = NoResource()
|
||||
|
||||
root.putChild( b'manage_pages', manage_pages )
|
||||
|
||||
manage_pages.putChild( b'add_files', ClientLocalServerResourcesManagePages.HydrusResourceClientAPIRestrictedManagePagesAddFiles( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'focus_page', ClientLocalServerResourcesManagePages.HydrusResourceClientAPIRestrictedManagePagesFocusPage( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'get_pages', ClientLocalServerResourcesManagePages.HydrusResourceClientAPIRestrictedManagePagesGetPages( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'get_page_info', ClientLocalServerResourcesManagePages.HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( self._service, self._client_requests_domain ) )
|
||||
manage_pages.putChild( b'refresh_page', ClientLocalServerResourcesManagePages.HydrusResourceClientAPIRestrictedManagePagesRefreshPage( self._service, self._client_requests_domain ) )
|
||||
|
||||
manage_popups = NoResource()
|
||||
|
||||
root.putChild( b'manage_popups', manage_popups )
|
||||
|
||||
manage_popups.putChild( b'get_popups', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsGetPopups( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'cancel_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsCancelPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'dismiss_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'finish_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'finish_and_dismiss_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsFinishAndDismissPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'call_user_callable', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'add_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsAddPopup( self._service, self._client_requests_domain ) )
|
||||
manage_popups.putChild( b'update_popup', ClientLocalServerResourcesManagePopups.HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( self._service, self._client_requests_domain ) )
|
||||
|
||||
return root
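# Illustrative sketch only, not part of the diff: how a caller might hit the two new
# path-fetching routes registered above. The base URL, the 'hash' query parameter and
# the key value are assumptions for the example; the header name matches the
# 'Hydrus-Client-API-Access-Key' parsing later in this commit.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key': '0123456789abcdef' * 4 } # hypothetical 64-char key

def get_local_paths( sha256_hex: str ):
    
    # both endpoints are simple GETs keyed by file, answering with JSON
    file_response = requests.get( f'{API_BASE}/get_files/file_path', params = { 'hash': sha256_hex }, headers = HEADERS )
    thumb_response = requests.get( f'{API_BASE}/get_files/thumbnail_path', params = { 'hash': sha256_hex }, headers = HEADERS )
    
    return ( file_response.json(), thumb_response.json() )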
File diff suppressed because it is too large
@@ -0,0 +1,200 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
|
||||
class HydrusResourceClientAPI( HydrusServerResources.HydrusResource ):
|
||||
|
||||
BLOCKED_WHEN_BUSY = True
|
||||
|
||||
def _callbackParseGETArgs( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
parsed_request_args = ClientLocalServerCore.ParseClientAPIGETArgs( request.args )
|
||||
|
||||
request.parsed_request_args = parsed_request_args
|
||||
|
||||
requested_response_mime = ClientLocalServerCore.ParseRequestedResponseMime( request )
|
||||
|
||||
if requested_response_mime == HC.APPLICATION_CBOR and not ClientLocalServerCore.CBOR_AVAILABLE:
|
||||
|
||||
raise HydrusExceptions.NotAcceptable( 'Sorry, this service does not support CBOR!' )
|
||||
|
||||
|
||||
request.preferred_mime = requested_response_mime
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _callbackParsePOSTArgs( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
( parsed_request_args, total_bytes_read ) = ClientLocalServerCore.ParseClientAPIPOSTArgs( request )
|
||||
|
||||
self._reportDataUsed( request, total_bytes_read )
|
||||
|
||||
request.parsed_request_args = parsed_request_args
|
||||
|
||||
requested_response_mime = ClientLocalServerCore.ParseRequestedResponseMime( request )
|
||||
|
||||
if requested_response_mime == HC.APPLICATION_CBOR and not ClientLocalServerCore.CBOR_AVAILABLE:
|
||||
|
||||
raise HydrusExceptions.NotAcceptable( 'Sorry, this service does not support CBOR!' )
|
||||
|
||||
|
||||
request.preferred_mime = requested_response_mime
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _reportDataUsed( self, request, num_bytes ):
|
||||
|
||||
self._service.ReportDataUsed( num_bytes )
|
||||
|
||||
|
||||
def _reportRequestStarted( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
HydrusServerResources.HydrusResource._reportRequestStarted( self, request )
|
||||
|
||||
CG.client_controller.ResetIdleTimerFromClientAPI()
|
||||
|
||||
|
||||
def _checkService( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
HydrusServerResources.HydrusResource._checkService( self, request )
|
||||
|
||||
if self.BLOCKED_WHEN_BUSY and HG.client_busy.locked():
|
||||
|
||||
raise HydrusExceptions.ServerBusyException( 'This server is busy, please try again later.' )
|
||||
|
||||
|
||||
if not self._service.BandwidthOK():
|
||||
|
||||
raise HydrusExceptions.BandwidthException( 'This service has run out of bandwidth. Please try again later.' )
|
||||
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestricted( HydrusResourceClientAPI ):
|
||||
|
||||
def _callbackCheckAccountRestrictions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
HydrusResourceClientAPI._callbackCheckAccountRestrictions( self, request )
|
||||
|
||||
self._CheckAPIPermissions( request )
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _callbackEstablishAccountFromHeader( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
access_key = self._ParseClientAPIAccessKey( request, 'header' )
|
||||
|
||||
if access_key is not None:
|
||||
|
||||
self._EstablishAPIPermissions( request, access_key )
|
||||
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _callbackEstablishAccountFromArgs( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if request.client_api_permissions is None:
|
||||
|
||||
access_key = self._ParseClientAPIAccessKey( request, 'args' )
|
||||
|
||||
if access_key is not None:
|
||||
|
||||
self._EstablishAPIPermissions( request, access_key )
|
||||
|
||||
|
||||
|
||||
if request.client_api_permissions is None:
|
||||
|
||||
raise HydrusExceptions.MissingCredentialsException( 'No access key or session key provided!' )
|
||||
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
raise NotImplementedError()
|
||||
|
||||
|
||||
def _EstablishAPIPermissions( self, request, access_key ):
|
||||
|
||||
try:
|
||||
|
||||
api_permissions = CG.client_controller.client_api_manager.GetPermissions( access_key )
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( str( e ) )
|
||||
|
||||
|
||||
request.client_api_permissions = api_permissions
|
||||
|
||||
|
||||
def _ParseClientAPIKey( self, request, source, name_of_key ):
|
||||
|
||||
key = None
|
||||
|
||||
if source == 'header':
|
||||
|
||||
if request.requestHeaders.hasHeader( name_of_key ):
|
||||
|
||||
key_texts = request.requestHeaders.getRawHeaders( name_of_key )
|
||||
|
||||
key_text = key_texts[0]
|
||||
|
||||
try:
|
||||
|
||||
key = bytes.fromhex( key_text )
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Problem parsing {}!'.format( name_of_key ) )
|
||||
|
||||
|
||||
|
||||
elif source == 'args':
|
||||
|
||||
if name_of_key in request.parsed_request_args:
|
||||
|
||||
key = request.parsed_request_args.GetValue( name_of_key, bytes )
|
||||
|
||||
|
||||
|
||||
return key
|
||||
|
||||
|
||||
def _ParseClientAPIAccessKey( self, request, source ):
|
||||
|
||||
access_key = self._ParseClientAPIKey( request, source, 'Hydrus-Client-API-Access-Key' )
|
||||
|
||||
if access_key is None:
|
||||
|
||||
session_key = self._ParseClientAPIKey( request, source, 'Hydrus-Client-API-Session-Key' )
|
||||
|
||||
if session_key is None:
|
||||
|
||||
return None
|
||||
|
||||
|
||||
try:
|
||||
|
||||
access_key = CG.client_controller.client_api_manager.GetAccessKey( session_key )
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.SessionException( str( e ) )
|
||||
|
||||
|
||||
|
||||
return access_key
|
||||
|
||||
|
|
@@ -0,0 +1,271 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
class HydrusResourceClientAPIPermissionsRequest( ClientLocalServerResources.HydrusResourceClientAPI ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if not ClientAPI.api_request_dialog_open:
|
||||
|
||||
raise HydrusExceptions.ConflictException( 'The permission registration dialog is not open. Please open it under "review services" in the hydrus client.' )
|
||||
|
||||
|
||||
name = request.parsed_request_args.GetValue( 'name', str )
|
||||
|
||||
permits_everything = request.parsed_request_args.GetValue( 'permits_everything', bool, default_value = False )
|
||||
|
||||
basic_permissions = request.parsed_request_args.GetValue( 'basic_permissions', list, expected_list_type = int, default_value = [] )
|
||||
|
||||
basic_permissions = [ int( value ) for value in basic_permissions ]
|
||||
|
||||
api_permissions = ClientAPI.APIPermissions( name = name, permits_everything = permits_everything, basic_permissions = basic_permissions )
|
||||
|
||||
ClientAPI.last_api_permissions_request = api_permissions
|
||||
|
||||
access_key = api_permissions.GetAccessKey()
|
||||
|
||||
body_dict = {}
|
||||
|
||||
body_dict[ 'access_key' ] = access_key.hex()
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
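# Illustrative sketch only, not part of the diff: registering a new access key against the
# handler above. It sends 'name' plus the new 'permits_everything' flag and reads the
# 'access_key' hex that _threadDoGETJob returns. The base URL and the string encoding of
# the boolean query argument are assumptions; the request 409s unless the permission
# registration dialog is open in the client.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address

def request_all_permissions( name: str ) -> str:
    
    response = requests.get( f'{API_BASE}/request_new_permissions', params = { 'name': name, 'permits_everything': 'true' } )
    
    response.raise_for_status()
    
    return response.json()[ 'access_key' ] # hex string, as produced by access_key.hex()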
class HydrusResourceClientAPIVersion( ClientLocalServerResources.HydrusResourceClientAPI ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
body_dict = {}
|
||||
|
||||
body_dict[ 'version' ] = HC.CLIENT_API_VERSION
|
||||
body_dict[ 'hydrus_version' ] = HC.SOFTWARE_VERSION
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAccount( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
pass
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAccountSessionKey( HydrusResourceClientAPIRestrictedAccount ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
new_session_key = CG.client_controller.client_api_manager.GenerateSessionKey( request.client_api_permissions.GetAccessKey() )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
body_dict[ 'session_key' ] = new_session_key.hex()
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAccountVerify( HydrusResourceClientAPIRestrictedAccount ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
api_permissions = request.client_api_permissions
|
||||
|
||||
permits_everything = api_permissions.PermitsEverything()
|
||||
|
||||
if permits_everything:
|
||||
|
||||
basic_permissions = ClientAPI.ALLOWED_PERMISSIONS
|
||||
|
||||
else:
|
||||
|
||||
basic_permissions = api_permissions.GetBasicPermissions()
|
||||
|
||||
|
||||
human_description = api_permissions.ToHumanString()
|
||||
|
||||
body_dict = {}
|
||||
|
||||
body_dict[ 'name' ] = api_permissions.GetName()
|
||||
body_dict[ 'permits_everything' ] = api_permissions.PermitsEverything()
|
||||
body_dict[ 'basic_permissions' ] = sorted( basic_permissions ) # set->list for json
|
||||
body_dict[ 'human_description' ] = human_description
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
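# Illustrative sketch only, not part of the diff: asking the verify handler above what a
# key can do. The response fields mirror the body_dict it builds ('name',
# 'permits_everything', 'basic_permissions', 'human_description'); the base URL and key
# value are assumptions for the example.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address

def describe_key( access_key_hex: str ) -> str:
    
    response = requests.get( f'{API_BASE}/verify_access_key', headers = { 'Hydrus-Client-API-Access-Key': access_key_hex } )
    
    info = response.json()
    
    # 'basic_permissions' covers every permission when 'permits_everything' is set
    return f"{info[ 'name' ]}: {info[ 'human_description' ]} (permits everything: {info[ 'permits_everything' ]})"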
class HydrusResourceClientAPIRestrictedGetService( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckAtLeastOnePermission(
|
||||
(
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_FILES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_EDIT_RATINGS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_NOTES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
allowed_service_types = {
|
||||
HC.LOCAL_TAG,
|
||||
HC.TAG_REPOSITORY,
|
||||
HC.LOCAL_FILE_DOMAIN,
|
||||
HC.LOCAL_FILE_UPDATE_DOMAIN,
|
||||
HC.FILE_REPOSITORY,
|
||||
HC.COMBINED_LOCAL_FILE,
|
||||
HC.COMBINED_LOCAL_MEDIA,
|
||||
HC.COMBINED_FILE,
|
||||
HC.COMBINED_TAG,
|
||||
HC.LOCAL_RATING_LIKE,
|
||||
HC.LOCAL_RATING_NUMERICAL,
|
||||
HC.LOCAL_RATING_INCDEC,
|
||||
HC.LOCAL_FILE_TRASH_DOMAIN
|
||||
}
|
||||
|
||||
if 'service_key' in request.parsed_request_args:
|
||||
|
||||
service_key = request.parsed_request_args.GetValue( 'service_key', bytes )
|
||||
|
||||
elif 'service_name' in request.parsed_request_args:
|
||||
|
||||
service_name = request.parsed_request_args.GetValue( 'service_name', str )
|
||||
|
||||
try:
|
||||
|
||||
service_key = CG.client_controller.services_manager.GetServiceKeyFromName( allowed_service_types, service_name )
|
||||
|
||||
except HydrusExceptions.DataMissing:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Sorry, did not find a service with name "{}"!'.format( service_name ) )
|
||||
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you need to give a service_key or service_name!' )
|
||||
|
||||
|
||||
try:
|
||||
|
||||
service = CG.client_controller.services_manager.GetService( service_key )
|
||||
|
||||
except HydrusExceptions.DataMissing:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Sorry, did not find a service with key "{}"!'.format( service_key.hex() ) )
|
||||
|
||||
|
||||
if service.GetServiceType() not in allowed_service_types:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, for now, you cannot ask about this service!' )
|
||||
|
||||
|
||||
body_dict = {
|
||||
'service' : {
|
||||
'name' : service.GetName(),
|
||||
'type' : service.GetServiceType(),
|
||||
'type_pretty' : HC.service_string_lookup[ service.GetServiceType() ],
|
||||
'service_key' : service.GetServiceKey().hex()
|
||||
}
|
||||
}
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
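# Illustrative sketch only, not part of the diff: looking a service up by name through the
# handler above and reading the nested 'service' object it returns. The base URL, header
# value and example usage are assumptions; a 404 means no service with that name, a 400
# means neither service_key nor service_name was supplied.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key': 'replace-with-real-key-hex' } # hypothetical

def get_service_key_by_name( service_name: str ) -> str:
    
    response = requests.get( f'{API_BASE}/get_service', params = { 'service_name': service_name }, headers = HEADERS )
    
    response.raise_for_status()
    
    return response.json()[ 'service' ][ 'service_key' ]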
class HydrusResourceClientAPIRestrictedGetServices( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckAtLeastOnePermission(
|
||||
(
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_FILES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_EDIT_RATINGS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_ADD_NOTES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES,
|
||||
ClientAPI.CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS,
|
||||
ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
jobs = [
|
||||
( ( HC.LOCAL_TAG, ), 'local_tags' ),
|
||||
( ( HC.TAG_REPOSITORY, ), 'tag_repositories' ),
|
||||
( ( HC.LOCAL_FILE_DOMAIN, ), 'local_files' ),
|
||||
( ( HC.LOCAL_FILE_UPDATE_DOMAIN, ), 'local_updates' ),
|
||||
( ( HC.FILE_REPOSITORY, ), 'file_repositories' ),
|
||||
( ( HC.COMBINED_LOCAL_FILE, ), 'all_local_files' ),
|
||||
( ( HC.COMBINED_LOCAL_MEDIA, ), 'all_local_media' ),
|
||||
( ( HC.COMBINED_FILE, ), 'all_known_files' ),
|
||||
( ( HC.COMBINED_TAG, ), 'all_known_tags' ),
|
||||
( ( HC.LOCAL_FILE_TRASH_DOMAIN, ), 'trash' )
|
||||
]
|
||||
|
||||
body_dict = {}
|
||||
|
||||
for ( service_types, name ) in jobs:
|
||||
|
||||
services = CG.client_controller.services_manager.GetServices( service_types )
|
||||
|
||||
services_list = []
|
||||
|
||||
for service in services:
|
||||
|
||||
service_dict = {
|
||||
'name' : service.GetName(),
|
||||
'type' : service.GetServiceType(),
|
||||
'type_pretty' : HC.service_string_lookup[ service.GetServiceType() ],
|
||||
'service_key' : service.GetServiceKey().hex()
|
||||
}
|
||||
|
||||
services_list.append( service_dict )
|
||||
|
||||
|
||||
body_dict[ name ] = services_list
|
||||
|
||||
|
||||
body_dict[ 'services' ] = ClientLocalServerCore.GetServicesDict()
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
|
@@ -0,0 +1,360 @@
|
||||
import os
|
||||
import traceback
|
||||
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusPaths
|
||||
from hydrus.core import HydrusTemp
|
||||
from hydrus.core import HydrusText
|
||||
from hydrus.core.files import HydrusFileHandling
|
||||
from hydrus.core.files.images import HydrusImageHandling
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientLocation
|
||||
from hydrus.client import ClientPaths
|
||||
from hydrus.client import ClientImageHandling
|
||||
from hydrus.client.importing import ClientImportFiles
|
||||
from hydrus.client.importing.options import FileImportOptions
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.metadata import ClientFileMigration
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddFiles( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_FILES )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddFilesAddFile( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
path = None
|
||||
delete_after_successful_import = False
|
||||
|
||||
if not hasattr( request, 'temp_file_info' ):
|
||||
|
||||
# ok the caller has not sent us a file in the POST content, we have a 'path'
|
||||
|
||||
path = request.parsed_request_args.GetValue( 'path', str )
|
||||
|
||||
if not os.path.exists( path ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Path "{}" does not exist!'.format( path ) )
|
||||
|
||||
|
||||
if not os.path.isfile( path ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Path "{}" is not a file!'.format( path ) )
|
||||
|
||||
|
||||
delete_after_successful_import = request.parsed_request_args.GetValue( 'delete_after_successful_import', bool, default_value = False )
|
||||
|
||||
( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()
|
||||
|
||||
request.temp_file_info = ( os_file_handle, temp_path )
|
||||
|
||||
HydrusPaths.MirrorFile( path, temp_path )
|
||||
|
||||
|
||||
( os_file_handle, temp_path ) = request.temp_file_info
|
||||
|
||||
file_import_options = CG.client_controller.new_options.GetDefaultFileImportOptions( FileImportOptions.IMPORT_TYPE_QUIET ).Duplicate()
|
||||
|
||||
custom_location_context = ClientLocalServerCore.ParseLocalFileDomainLocationContext( request )
|
||||
|
||||
if custom_location_context is not None:
|
||||
|
||||
file_import_options.SetDestinationLocationContext( custom_location_context )
|
||||
|
||||
|
||||
file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = f'API POSTed File' )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
try:
|
||||
|
||||
file_import_status = file_import_job.DoWork()
|
||||
|
||||
except Exception as e:
|
||||
|
||||
if isinstance( e, ( HydrusExceptions.VetoException, HydrusExceptions.UnsupportedFileException ) ):
|
||||
|
||||
note = str( e )
|
||||
|
||||
else:
|
||||
|
||||
note = HydrusText.GetFirstLine( repr( e ) )
|
||||
|
||||
|
||||
file_import_status = ClientImportFiles.FileImportStatus( CC.STATUS_ERROR, file_import_job.GetHash(), note = note )
|
||||
|
||||
body_dict[ 'traceback' ] = traceback.format_exc()
|
||||
|
||||
|
||||
if path is not None:
|
||||
|
||||
if delete_after_successful_import and file_import_status.status in CC.SUCCESSFUL_IMPORT_STATES:
|
||||
|
||||
ClientPaths.DeletePath( path )
|
||||
|
||||
|
||||
|
||||
body_dict[ 'status' ] = file_import_status.status
|
||||
body_dict[ 'hash' ] = HydrusData.BytesToNoneOrHex( file_import_status.hash )
|
||||
body_dict[ 'note' ] = file_import_status.note
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
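# Illustrative sketch only, not part of the diff: importing a file by local path through the
# handler above instead of POSTing the file bytes. The JSON body shape, base URL and header
# value are assumptions; the returned keys ('status', 'hash', 'note') mirror the body_dict
# assembled in _threadDoPOSTJob.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key': 'replace-with-real-key-hex' } # hypothetical

def import_by_path( path: str, delete_original: bool = False ) -> dict:
    
    body = { 'path': path, 'delete_after_successful_import': delete_original }
    
    response = requests.post( f'{API_BASE}/add_files/add_file', json = body, headers = HEADERS )
    
    return response.json() # e.g. { 'status': 1, 'hash': '...', 'note': '' }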
class HydrusResourceClientAPIRestrictedAddFilesArchiveFiles( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, hashes )
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_update )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddFilesClearDeletedFileRecord( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
media_results = [ media_result for media_result in media_results if CC.COMBINED_LOCAL_FILE_SERVICE_KEY in media_result.GetLocationsManager().GetDeleted() ]
|
||||
|
||||
clearee_hashes = { m.GetHash() for m in media_results }
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, clearee_hashes )
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_update )
|
||||
|
||||
CG.client_controller.Write( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddFilesDeleteFiles( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ), deleted_allowed = False )
|
||||
|
||||
if 'reason' in request.parsed_request_args:
|
||||
|
||||
reason = request.parsed_request_args.GetValue( 'reason', str )
|
||||
|
||||
else:
|
||||
|
||||
reason = 'Deleted via Client API.'
|
||||
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
location_context.LimitToServiceTypes( CG.client_controller.services_manager.GetServiceType, ( HC.COMBINED_LOCAL_FILE, HC.COMBINED_LOCAL_MEDIA, HC.LOCAL_FILE_DOMAIN ) )
|
||||
|
||||
if CG.client_controller.new_options.GetBoolean( 'delete_lock_for_archived_files' ):
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
undeletable_media_results = [ m for m in media_results if m.IsDeleteLocked() ]
|
||||
|
||||
if len( undeletable_media_results ) > 0:
|
||||
|
||||
message = 'Sorry, some of the files you selected are currently delete locked. Their hashes are:'
|
||||
message += '\n' * 2
|
||||
message += '\n'.join( sorted( [ m.GetHash().hex() for m in undeletable_media_results ] ) )
|
||||
|
||||
raise HydrusExceptions.ConflictException( message )
|
||||
|
||||
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = reason )
|
||||
|
||||
for service_key in location_context.current_service_keys:
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( service_key, content_update )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
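# Illustrative sketch only, not part of the diff: deleting files with an explicit reason via
# the handler above. 'reason' matches the parsed argument; the 'hashes' key, base URL and
# header value are assumptions. Expect a 409 if any selected file is delete-locked.

import requests

API_BASE = 'http://127.0.0.1:45869' # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key': 'replace-with-real-key-hex' } # hypothetical

def delete_files( sha256_hexes: list, reason: str = 'cleanup' ):
    
    body = { 'hashes': sha256_hexes, 'reason': reason }
    
    response = requests.post( f'{API_BASE}/add_files/delete_files', json = body, headers = HEADERS )
    
    response.raise_for_status() # plain 200 on success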
class HydrusResourceClientAPIRestrictedAddFilesMigrateFiles( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocalFileDomainLocationContext( request )
|
||||
|
||||
if location_context is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you need to set a destination for the migration!' )
|
||||
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
for media_result in media_results:
|
||||
|
||||
if CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY not in media_result.GetLocationsManager().GetCurrent():
|
||||
|
||||
raise HydrusExceptions.BadRequestException( f'The file "{media_result.GetHash().hex()}" is not in any local file domains, so I cannot copy!' )
|
||||
|
||||
|
||||
|
||||
for service_key in location_context.current_service_keys:
|
||||
|
||||
CG.client_controller.CallToThread( ClientFileMigration.MoveOrDuplicateLocalFiles, service_key, HC.CONTENT_UPDATE_ADD, media_results )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
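A rough sketch of how a migrate call might look. The handler needs a local file domain destination via ParseLocalFileDomainLocationContext; the 'file_service_key' parameter name and the endpoint path below are assumptions, and both keys are placeholders.

import requests

payload = {
    'hashes' : [ '0123456789abcdef' * 4 ],  # placeholder sha256 hex
    'file_service_key' : 'abcd' * 16        # assumed parameter name; placeholder destination domain key
}

r = requests.post(
    'http://127.0.0.1:45869/add_files/migrate_files',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)
r.raise_for_status()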
class HydrusResourceClientAPIRestrictedAddFilesUnarchiveFiles( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_INBOX, hashes )
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_update )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
class HydrusResourceClientAPIRestrictedAddFilesUndeleteFiles( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
location_context.LimitToServiceTypes( CG.client_controller.services_manager.GetServiceType, ( HC.LOCAL_FILE_DOMAIN, HC.COMBINED_LOCAL_MEDIA ) )
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
# this is the only scan I have to do. all the stuff like 'can I undelete from here' and 'what does an undelete to combined local media mean' is all sorted at the db level no worries
|
||||
media_results = [ media_result for media_result in media_results if CC.COMBINED_LOCAL_FILE_SERVICE_KEY in media_result.GetLocationsManager().GetCurrent() ]
|
||||
|
||||
hashes = { media_result.GetHash() for media_result in media_results }
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_UNDELETE, hashes )
|
||||
|
||||
for service_key in location_context.current_service_keys:
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( service_key, content_update )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
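An illustrative undelete call, assuming the handler sits at /add_files/undelete_files; only files that still have a combined-local-file record survive the filter above, everything else is silently skipped.

import requests

r = requests.post(
    'http://127.0.0.1:45869/add_files/undelete_files',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = { 'hashes' : [ '0123456789abcdef' * 4 ] }    # placeholder sha256 hex
)
r.raise_for_status()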
class HydrusResourceClientAPIRestrictedAddFilesGenerateHashes( HydrusResourceClientAPIRestrictedAddFiles ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if not hasattr( request, 'temp_file_info' ):
|
||||
|
||||
path = request.parsed_request_args.GetValue( 'path', str )
|
||||
|
||||
if not os.path.exists( path ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Path "{}" does not exist!'.format( path ) )
|
||||
|
||||
|
||||
if not os.path.isfile( path ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Path "{}" is not a file!'.format( path ) )
|
||||
|
||||
|
||||
( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()
|
||||
|
||||
request.temp_file_info = ( os_file_handle, temp_path )
|
||||
|
||||
HydrusPaths.MirrorFile( path, temp_path )
|
||||
|
||||
|
||||
( os_file_handle, temp_path ) = request.temp_file_info
|
||||
|
||||
mime = HydrusFileHandling.GetMime( temp_path )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
sha256_hash = HydrusFileHandling.GetHashFromPath( temp_path )
|
||||
|
||||
body_dict['hash'] = sha256_hash.hex()
|
||||
|
||||
if mime in HC.FILES_THAT_HAVE_PERCEPTUAL_HASH or mime in HC.FILES_THAT_CAN_HAVE_PIXEL_HASH:
|
||||
|
||||
numpy_image = HydrusImageHandling.GenerateNumPyImage( temp_path, mime )
|
||||
|
||||
if mime in HC.FILES_THAT_HAVE_PERCEPTUAL_HASH:
|
||||
|
||||
perceptual_hashes = ClientImageHandling.GenerateShapePerceptualHashesNumPy( numpy_image )
|
||||
|
||||
body_dict['perceptual_hashes'] = [ perceptual_hash.hex() for perceptual_hash in perceptual_hashes ]
|
||||
|
||||
if mime in HC.FILES_THAT_CAN_HAVE_PIXEL_HASH:
|
||||
|
||||
pixel_hash = HydrusImageHandling.GetImagePixelHashNumPy( numpy_image )
|
||||
|
||||
body_dict['pixel_hash'] = pixel_hash.hex()
|
||||
|
||||
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
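A sketch of asking the client to hash a file on disk via this handler, assuming the /add_files/generate_hashes path; the 'path' key is taken from the parsing above and the response keys from the body_dict it builds.

import requests

r = requests.post(
    'http://127.0.0.1:45869/add_files/generate_hashes',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = { 'path' : '/tmp/example.png' }                # placeholder local path
)

result = r.json()

print( result[ 'hash' ] )                   # sha256 hex, always present
print( result.get( 'perceptual_hashes' ) )  # only for filetypes that support them
print( result.get( 'pixel_hash' ) )         # only for filetypes that can have one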
@@ -0,0 +1,127 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddNotes( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_NOTES )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddNotesSetNotes( HydrusResourceClientAPIRestrictedAddNotes ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if 'hash' in request.parsed_request_args:
|
||||
|
||||
hash = request.parsed_request_args.GetValue( 'hash', bytes )
|
||||
|
||||
elif 'file_id' in request.parsed_request_args:
|
||||
|
||||
hash_id = request.parsed_request_args.GetValue( 'file_id', int )
|
||||
|
||||
hash_ids_to_hashes = CG.client_controller.Read( 'hash_ids_to_hashes', hash_ids = [ hash_id ] )
|
||||
|
||||
hash = hash_ids_to_hashes[ hash_id ]
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'There was no file identifier or hash given!' )
|
||||
|
||||
|
||||
new_names_to_notes = request.parsed_request_args.GetValue( 'notes', dict, expected_dict_types = ( str, str ) )
|
||||
|
||||
merge_cleverly = request.parsed_request_args.GetValue( 'merge_cleverly', bool, default_value = False )
|
||||
|
||||
if merge_cleverly:
|
||||
|
||||
from hydrus.client.importing.options import NoteImportOptions
|
||||
|
||||
extend_existing_note_if_possible = request.parsed_request_args.GetValue( 'extend_existing_note_if_possible', bool, default_value = True )
|
||||
conflict_resolution = request.parsed_request_args.GetValue( 'conflict_resolution', int, default_value = NoteImportOptions.NOTE_IMPORT_CONFLICT_RENAME )
|
||||
|
||||
if conflict_resolution not in NoteImportOptions.note_import_conflict_str_lookup:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The given conflict resolution type was not in the allowed range!' )
|
||||
|
||||
|
||||
note_import_options = NoteImportOptions.NoteImportOptions()
|
||||
|
||||
note_import_options.SetIsDefault( False )
|
||||
note_import_options.SetExtendExistingNoteIfPossible( extend_existing_note_if_possible )
|
||||
note_import_options.SetConflictResolution( conflict_resolution )
|
||||
|
||||
media_result = CG.client_controller.Read( 'media_result', hash )
|
||||
|
||||
existing_names_to_notes = media_result.GetNotesManager().GetNamesToNotes()
|
||||
|
||||
names_and_notes = list( new_names_to_notes.items() )
|
||||
|
||||
new_names_to_notes = note_import_options.GetUpdateeNamesToNotes( existing_names_to_notes, names_and_notes )
|
||||
|
||||
|
||||
content_updates = [ ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( hash, name, note ) ) for ( name, note ) in new_names_to_notes.items() ]
|
||||
|
||||
if len( content_updates ) > 0:
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( CC.LOCAL_NOTES_SERVICE_KEY, content_updates )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
|
||||
body_dict = {
|
||||
'notes': new_names_to_notes
|
||||
}
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
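A sketch of setting notes through this handler, assuming /add_notes/set_notes; 'hash', 'notes', 'merge_cleverly' and 'extend_existing_note_if_possible' all mirror the parsing above, and the response echoes the notes as actually saved.

import requests

payload = {
    'hash' : '0123456789abcdef' * 4,                      # placeholder sha256 hex
    'notes' : { 'source note' : 'found on example.com' },
    'merge_cleverly' : True,
    'extend_existing_note_if_possible' : True
}

r = requests.post(
    'http://127.0.0.1:45869/add_notes/set_notes',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)

print( r.json()[ 'notes' ] )  # what was written after any clever merging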
class HydrusResourceClientAPIRestrictedAddNotesDeleteNotes( HydrusResourceClientAPIRestrictedAddNotes ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if 'hash' in request.parsed_request_args:
|
||||
|
||||
hash = request.parsed_request_args.GetValue( 'hash', bytes )
|
||||
|
||||
elif 'file_id' in request.parsed_request_args:
|
||||
|
||||
hash_id = request.parsed_request_args.GetValue( 'file_id', int )
|
||||
|
||||
hash_ids_to_hashes = CG.client_controller.Read( 'hash_ids_to_hashes', hash_ids = [ hash_id ] )
|
||||
|
||||
hash = hash_ids_to_hashes[ hash_id ]
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'There was no file identifier or hash given!' )
|
||||
|
||||
|
||||
note_names = request.parsed_request_args.GetValue( 'note_names', list, expected_list_type = str )
|
||||
|
||||
content_updates = [ ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_DELETE, ( hash, name ) ) for name in note_names ]
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( CC.LOCAL_NOTES_SERVICE_KEY, content_updates )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
@@ -0,0 +1,399 @@
import collections
|
||||
import collections.abc
|
||||
import typing
|
||||
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusLists
|
||||
from hydrus.core import HydrusTags
|
||||
from hydrus.core.networking import HydrusNetworkVariableHandling
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientLocation
|
||||
from hydrus.client import ClientThreading
|
||||
from hydrus.client.importing.options import ClientImportOptions
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.metadata import ClientTags
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
from hydrus.client.search import ClientSearchAutocomplete
|
||||
from hydrus.client.search import ClientSearchFileSearchContext
|
||||
from hydrus.client.search import ClientSearchPredicate
|
||||
from hydrus.client.search import ClientSearchTagContext
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddTags( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRestrictedAddTags ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
#
|
||||
|
||||
override_previously_deleted_mappings = request.parsed_request_args.GetValue( 'override_previously_deleted_mappings', bool, default_value = True )
|
||||
create_new_deleted_mappings = request.parsed_request_args.GetValue( 'create_new_deleted_mappings', bool, default_value = True )
|
||||
|
||||
service_keys_to_actions_to_tags = None
|
||||
|
||||
if 'service_keys_to_tags' in request.parsed_request_args:
|
||||
|
||||
service_keys_to_tags = request.parsed_request_args.GetValue( 'service_keys_to_tags', dict )
|
||||
|
||||
service_keys_to_actions_to_tags = {}
|
||||
|
||||
for ( service_key, tags ) in service_keys_to_tags.items():
|
||||
|
||||
service = ClientLocalServerCore.CheckTagService( service_key )
|
||||
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'tags in service_keys_to_tags', tags, list, expected_list_type = str )
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
if len( tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
if service.GetServiceType() == HC.LOCAL_TAG:
|
||||
|
||||
content_action = HC.CONTENT_UPDATE_ADD
|
||||
|
||||
else:
|
||||
|
||||
content_action = HC.CONTENT_UPDATE_PEND
|
||||
|
||||
|
||||
service_keys_to_actions_to_tags[ service_key ] = collections.defaultdict( set )
|
||||
|
||||
service_keys_to_actions_to_tags[ service_key ][ content_action ].update( tags )
|
||||
|
||||
|
||||
|
||||
if 'service_keys_to_actions_to_tags' in request.parsed_request_args:
|
||||
|
||||
parsed_service_keys_to_actions_to_tags = request.parsed_request_args.GetValue( 'service_keys_to_actions_to_tags', dict )
|
||||
|
||||
service_keys_to_actions_to_tags = {}
|
||||
|
||||
for ( service_key, parsed_actions_to_tags ) in parsed_service_keys_to_actions_to_tags.items():
|
||||
|
||||
service = ClientLocalServerCore.CheckTagService( service_key )
|
||||
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'actions_to_tags', parsed_actions_to_tags, dict )
|
||||
|
||||
actions_to_tags = {}
|
||||
|
||||
for ( parsed_content_action, tags ) in parsed_actions_to_tags.items():
|
||||
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'action in actions_to_tags', parsed_content_action, str )
|
||||
|
||||
try:
|
||||
|
||||
content_action = int( parsed_content_action )
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, got an action, "{}", that was not an integer!'.format( parsed_content_action ) )
|
||||
|
||||
|
||||
if service.GetServiceType() == HC.LOCAL_TAG:
|
||||
|
||||
if content_action not in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you submitted a content action of "{}" for service "{}", but you can only add/delete on a local tag service!'.format( parsed_content_action, service_key.hex() ) )
|
||||
|
||||
|
||||
else:
|
||||
|
||||
if content_action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you submitted a content action of "{}" for service "{}", but you cannot add/delete on a remote tag service!'.format( parsed_content_action, service_key.hex() ) )
|
||||
|
||||
|
||||
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'tags in actions_to_tags', tags, list ) # do not test for str here, it can be reason tuples!
|
||||
|
||||
actions_to_tags[ content_action ] = tags
|
||||
|
||||
|
||||
if len( actions_to_tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
service_keys_to_actions_to_tags[ service_key ] = actions_to_tags
|
||||
|
||||
|
||||
|
||||
if service_keys_to_actions_to_tags is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Need a service_keys_to_tags or service_keys_to_actions_to_tags parameter!' )
|
||||
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage()
|
||||
|
||||
media_results = []
|
||||
|
||||
if not override_previously_deleted_mappings or not create_new_deleted_mappings:
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
|
||||
for ( service_key, actions_to_tags ) in service_keys_to_actions_to_tags.items():
|
||||
|
||||
for ( content_action, tags ) in actions_to_tags.items():
|
||||
|
||||
tags = list( tags )
|
||||
|
||||
content_action = int( content_action )
|
||||
|
||||
content_update_tags = []
|
||||
|
||||
tags_to_reasons = {}
|
||||
|
||||
for tag_item in tags:
|
||||
|
||||
reason = 'Petitioned from API'
|
||||
|
||||
if isinstance( tag_item, str ):
|
||||
|
||||
tag = tag_item
|
||||
|
||||
elif HydrusLists.IsAListLikeCollection( tag_item ) and len( tag_item ) == 2:
|
||||
|
||||
( tag, reason ) = tag_item
|
||||
|
||||
if not ( isinstance( tag, str ) and isinstance( reason, str ) ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
else:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
try:
|
||||
|
||||
tag = HydrusTags.CleanTag( tag )
|
||||
|
||||
except:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
content_update_tags.append( tag )
|
||||
tags_to_reasons[ tag ] = reason
|
||||
|
||||
|
||||
if len( content_update_tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
content_updates = []
|
||||
|
||||
for tag in content_update_tags:
|
||||
|
||||
hashes_for_this_package = hashes
|
||||
|
||||
if content_action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_PEND ) and not override_previously_deleted_mappings:
|
||||
|
||||
hashes_for_this_package = ClientImportOptions.FilterNotPreviouslyDeletedTagHashes( service_key, media_results, tag )
|
||||
|
||||
|
||||
if content_action in ( HC.CONTENT_UPDATE_DELETE, HC.CONTENT_UPDATE_PETITION ) and not create_new_deleted_mappings:
|
||||
|
||||
hashes_for_this_package = ClientImportOptions.FilterCurrentTagHashes( service_key, media_results, tag )
|
||||
|
||||
|
||||
if content_action == HC.CONTENT_UPDATE_PETITION:
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes_for_this_package ), reason = tags_to_reasons[ tag ] )
|
||||
|
||||
else:
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes_for_this_package ) )
|
||||
|
||||
|
||||
content_updates.append( content_update )
|
||||
|
||||
|
||||
content_update_package.AddContentUpdates( service_key, content_updates )
|
||||
|
||||
|
||||
|
||||
if content_update_package.HasContent():
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
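A sketch of the simple 'service_keys_to_tags' form of this handler, assuming /add_tags/add_tags; the action (add vs pend) is picked automatically per the local/remote branch above. Both keys below are placeholders.

import requests

payload = {
    'hashes' : [ '0123456789abcdef' * 4 ],  # placeholder sha256 hex
    'service_keys_to_tags' : {
        'abcd' * 16 : [ 'character:samus aran', 'blue eyes' ]  # placeholder tag service key
    }
}

r = requests.post(
    'http://127.0.0.1:45869/add_tags/add_tags',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)
r.raise_for_status()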
class HydrusResourceClientAPIRestrictedAddTagsSearchTags( HydrusResourceClientAPIRestrictedAddTags ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
# this doesn't need 'add tags' atm. I was going to add it, but I'm not sure it is actually appropriate
|
||||
# this thing probably should have been in search files space, but whatever
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES )
|
||||
|
||||
|
||||
def _GetParsedAutocompleteText( self, search, tag_service_key ) -> ClientSearchAutocomplete.ParsedAutocompleteText:
|
||||
|
||||
tag_autocomplete_options = CG.client_controller.tag_display_manager.GetTagAutocompleteOptions( tag_service_key )
|
||||
|
||||
collapse_search_characters = True
|
||||
|
||||
parsed_autocomplete_text = ClientSearchAutocomplete.ParsedAutocompleteText( search, tag_autocomplete_options, collapse_search_characters )
|
||||
|
||||
parsed_autocomplete_text.SetInclusive( True )
|
||||
|
||||
return parsed_autocomplete_text
|
||||
|
||||
|
||||
def _GetTagMatches( self, request: HydrusServerRequest.HydrusRequest, tag_display_type: int, tag_service_key: bytes, parsed_autocomplete_text: ClientSearchAutocomplete.ParsedAutocompleteText ) -> typing.List[ ClientSearchPredicate.Predicate ]:
|
||||
|
||||
matches = []
|
||||
|
||||
if parsed_autocomplete_text.IsAcceptableForTagSearches():
|
||||
|
||||
tag_context = ClientSearchTagContext.TagContext( service_key = tag_service_key )
|
||||
|
||||
autocomplete_search_text = parsed_autocomplete_text.GetSearchText( True )
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )
|
||||
|
||||
file_search_context = ClientSearchFileSearchContext.FileSearchContext( location_context = location_context, tag_context = tag_context )
|
||||
|
||||
job_status = ClientThreading.JobStatus( cancellable = True )
|
||||
|
||||
request.disconnect_callables.append( job_status.Cancel )
|
||||
|
||||
search_namespaces_into_full_tags = parsed_autocomplete_text.GetTagAutocompleteOptions().SearchNamespacesIntoFullTags()
|
||||
|
||||
predicates = CG.client_controller.Read( 'autocomplete_predicates', tag_display_type, file_search_context, search_text = autocomplete_search_text, job_status = job_status, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
|
||||
|
||||
display_tag_service_key = tag_context.display_service_key
|
||||
|
||||
matches = ClientSearchAutocomplete.FilterPredicatesBySearchText( display_tag_service_key, autocomplete_search_text, predicates )
|
||||
|
||||
matches = ClientSearchPredicate.SortPredicates( matches )
|
||||
|
||||
|
||||
return matches
|
||||
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
search = request.parsed_request_args.GetValue( 'search', str )
|
||||
|
||||
tag_display_type_str = request.parsed_request_args.GetValue( 'tag_display_type', str, default_value = 'storage' )
|
||||
|
||||
tag_display_type = ClientTags.TAG_DISPLAY_STORAGE if tag_display_type_str == 'storage' else ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL
|
||||
|
||||
tag_service_key = ClientLocalServerCore.ParseTagServiceKey( request )
|
||||
|
||||
parsed_autocomplete_text = self._GetParsedAutocompleteText( search, tag_service_key )
|
||||
|
||||
matches = self._GetTagMatches( request, tag_display_type, tag_service_key, parsed_autocomplete_text )
|
||||
|
||||
matches = request.client_api_permissions.FilterTagPredicateResponse( matches )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
# TODO: Ok so we could add sibling/parent info here if the tag display type is storage, or in both cases. probably only if client asks for it
|
||||
|
||||
tags = [ { 'value' : match.GetValue(), 'count' : match.GetCount().GetMinCount() } for match in matches ]
|
||||
|
||||
body_dict[ 'tags' ] = tags
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
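A sketch of the autocomplete-style search this handler exposes, assuming /add_tags/search_tags; 'search' and 'tag_display_type' come from the parsing above, and the response is the 'tags' list of value/count pairs.

import requests

r = requests.get(
    'http://127.0.0.1:45869/add_tags/search_tags',  # assumed path and port
    params = { 'search' : 'blue', 'tag_display_type' : 'storage' },
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' }  # placeholder key
)

for tag in r.json()[ 'tags' ]:
    print( tag[ 'value' ], tag[ 'count' ] )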
class HydrusResourceClientAPIRestrictedAddTagsGetTagSiblingsParents( HydrusResourceClientAPIRestrictedAddTags ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
tags = request.parsed_request_args.GetValue( 'tags', list, expected_list_type = str )
|
||||
|
||||
ClientLocalServerCore.CheckTags( tags )
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
tags_to_service_keys_to_siblings_and_parents = CG.client_controller.Read( 'tag_siblings_and_parents_lookup', tags )
|
||||
|
||||
tags_dict = {}
|
||||
|
||||
for ( tag, service_keys_to_siblings_parents ) in tags_to_service_keys_to_siblings_and_parents.items():
|
||||
|
||||
tag_dict = {}
|
||||
|
||||
for ( service_key, siblings_parents ) in service_keys_to_siblings_parents.items():
|
||||
|
||||
tag_dict[ service_key.hex() ] = {
|
||||
'siblings': list( siblings_parents[0] ),
|
||||
'ideal_tag': siblings_parents[1],
|
||||
'descendants': list( siblings_parents[2] ),
|
||||
'ancestors': list( siblings_parents[3] )
|
||||
}
|
||||
|
||||
|
||||
tags_dict[ tag ] = tag_dict
|
||||
|
||||
|
||||
body_dict = {
|
||||
'tags' : tags_dict,
|
||||
'services' : ClientLocalServerCore.GetServicesDict()
|
||||
}
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
class HydrusResourceClientAPIRestrictedAddTagsCleanTags( HydrusResourceClientAPIRestrictedAddTags ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
tags = request.parsed_request_args.GetValue( 'tags', list, expected_list_type = str )
|
||||
|
||||
tags = list( HydrusTags.CleanTags( tags ) )
|
||||
|
||||
tags = HydrusTags.SortNumericTags( tags )
|
||||
|
||||
body_dict = {
|
||||
'tags' : tags
|
||||
}
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
@@ -0,0 +1,318 @@
import time
|
||||
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusTags
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.importing import ClientImportFiles
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.metadata import ClientTags
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddURLs( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_URLS )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddURLsAssociateURL( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
normalise_urls = request.parsed_request_args.GetValue( 'normalise_urls', bool, default_value = True )
|
||||
|
||||
urls_to_add = []
|
||||
|
||||
if 'url_to_add' in request.parsed_request_args:
|
||||
|
||||
url = request.parsed_request_args.GetValue( 'url_to_add', str )
|
||||
|
||||
urls_to_add.append( url )
|
||||
|
||||
|
||||
if 'urls_to_add' in request.parsed_request_args:
|
||||
|
||||
urls = request.parsed_request_args.GetValue( 'urls_to_add', list, expected_list_type = str )
|
||||
|
||||
urls_to_add.extend( urls )
|
||||
|
||||
|
||||
urls_to_delete = []
|
||||
|
||||
if 'url_to_delete' in request.parsed_request_args:
|
||||
|
||||
url = request.parsed_request_args.GetValue( 'url_to_delete', str )
|
||||
|
||||
urls_to_delete.append( url )
|
||||
|
||||
|
||||
if 'urls_to_delete' in request.parsed_request_args:
|
||||
|
||||
urls = request.parsed_request_args.GetValue( 'urls_to_delete', list, expected_list_type = str )
|
||||
|
||||
urls_to_delete.extend( urls )
|
||||
|
||||
|
||||
domain_manager = CG.client_controller.network_engine.domain_manager
|
||||
|
||||
if normalise_urls:
|
||||
|
||||
try:
|
||||
|
||||
urls_to_add = [ domain_manager.NormaliseURL( url ) for url in urls_to_add ]
|
||||
|
||||
except HydrusExceptions.URLClassException as e:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( e )
|
||||
|
||||
|
||||
|
||||
if len( urls_to_add ) == 0 and len( urls_to_delete ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not find any URLs to add or delete!' )
|
||||
|
||||
|
||||
applicable_hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
if len( applicable_hashes ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not find any hashes to apply the urls to!' )
|
||||
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage()
|
||||
|
||||
if len( urls_to_add ) > 0:
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( urls_to_add, applicable_hashes ) )
|
||||
|
||||
content_update_package.AddContentUpdate( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_update )
|
||||
|
||||
|
||||
if len( urls_to_delete ) > 0:
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( urls_to_delete, applicable_hashes ) )
|
||||
|
||||
content_update_package.AddContentUpdate( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_update )
|
||||
|
||||
|
||||
if content_update_package.HasContent():
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
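A sketch of associating a URL with a file through this handler, assuming /add_urls/associate_url; 'url_to_add', 'normalise_urls' and the hash keys all mirror the parsing above.

import requests

payload = {
    'hashes' : [ '0123456789abcdef' * 4 ],          # placeholder sha256 hex
    'url_to_add' : 'https://example.com/post/123',  # placeholder URL
    'normalise_urls' : True
}

r = requests.post(
    'http://127.0.0.1:45869/add_urls/associate_url',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)
r.raise_for_status()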
class HydrusResourceClientAPIRestrictedAddURLsGetURLFiles( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
url = request.parsed_request_args.GetValue( 'url', str )
|
||||
|
||||
do_file_system_check = request.parsed_request_args.GetValue( 'doublecheck_file_system', bool, default_value = False )
|
||||
|
||||
if url == '':
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Given URL was empty!' )
|
||||
|
||||
|
||||
try:
|
||||
|
||||
normalised_url = CG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
except HydrusExceptions.URLClassException as e:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( e )
|
||||
|
||||
|
||||
url_statuses = CG.client_controller.Read( 'url_statuses', normalised_url )
|
||||
|
||||
json_happy_url_statuses = []
|
||||
|
||||
we_only_saw_successful = True
|
||||
|
||||
for file_import_status in url_statuses:
|
||||
|
||||
if do_file_system_check:
|
||||
|
||||
file_import_status = ClientImportFiles.CheckFileImportStatus( file_import_status )
|
||||
|
||||
|
||||
d = {
|
||||
'status': file_import_status.status,
|
||||
'hash': HydrusData.BytesToNoneOrHex( file_import_status.hash ),
|
||||
'note': file_import_status.note
|
||||
}
|
||||
|
||||
json_happy_url_statuses.append( d )
|
||||
|
||||
if file_import_status.status not in CC.SUCCESSFUL_IMPORT_STATES:
|
||||
|
||||
we_only_saw_successful = False
|
||||
|
||||
|
||||
|
||||
body_dict = { 'normalised_url' : normalised_url, 'url_file_statuses' : json_happy_url_statuses }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
if we_only_saw_successful:
|
||||
|
||||
# not likely to change much, so no worries about reducing overhead here
|
||||
response_context.SetMaxAge( 30 )
|
||||
|
||||
|
||||
return response_context
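A sketch of querying import status for a URL via this handler, assuming /add_urls/get_url_files and assuming booleans go over the query string as 'true'/'false'; the response keys come from the body_dict above.

import requests

r = requests.get(
    'http://127.0.0.1:45869/add_urls/get_url_files',  # assumed path and port
    params = { 'url' : 'https://example.com/post/123', 'doublecheck_file_system' : 'true' },
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' }  # placeholder key
)

for file_status in r.json()[ 'url_file_statuses' ]:
    print( file_status[ 'status' ], file_status[ 'hash' ], file_status[ 'note' ] )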
class HydrusResourceClientAPIRestrictedAddURLsGetURLInfo( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
url = request.parsed_request_args.GetValue( 'url', str )
|
||||
|
||||
if url == '':
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Given URL was empty!' )
|
||||
|
||||
|
||||
try:
|
||||
|
||||
normalised_url = CG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
( url_type, match_name, can_parse, cannot_parse_reason ) = CG.client_controller.network_engine.domain_manager.GetURLParseCapability( normalised_url )
|
||||
|
||||
except HydrusExceptions.URLClassException as e:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( e )
|
||||
|
||||
|
||||
body_dict = { 'normalised_url' : normalised_url, 'url_type' : url_type, 'url_type_string' : HC.url_type_string_lookup[ url_type ], 'match_name' : match_name, 'can_parse' : can_parse }
|
||||
|
||||
if not can_parse:
|
||||
|
||||
body_dict[ 'cannot_parse_reason' ] = cannot_parse_reason
|
||||
|
||||
|
||||
try:
|
||||
|
||||
url_to_fetch = CG.client_controller.network_engine.domain_manager.GetURLToFetch( normalised_url )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( e )
|
||||
|
||||
|
||||
body_dict[ 'request_url' ] = url_to_fetch
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
# max age of ten minutes here
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body, max_age = 600 )
|
||||
|
||||
return response_context
class HydrusResourceClientAPIRestrictedAddURLsImportURL( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
url = request.parsed_request_args.GetValue( 'url', str )
|
||||
|
||||
if url == '':
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Given URL was empty!' )
|
||||
|
||||
|
||||
filterable_tags = set()
|
||||
|
||||
if 'filterable_tags' in request.parsed_request_args:
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
|
||||
|
||||
filterable_tags = request.parsed_request_args.GetValue( 'filterable_tags', list, expected_list_type = str )
|
||||
|
||||
filterable_tags = HydrusTags.CleanTags( filterable_tags )
|
||||
|
||||
|
||||
additional_service_keys_to_tags = ClientTags.ServiceKeysToTags()
|
||||
|
||||
if 'service_keys_to_additional_tags' in request.parsed_request_args:
|
||||
|
||||
service_keys_to_additional_tags = request.parsed_request_args.GetValue( 'service_keys_to_additional_tags', dict )
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
|
||||
|
||||
for ( service_key, tags ) in service_keys_to_additional_tags.items():
|
||||
|
||||
ClientLocalServerCore.CheckTagService( service_key )
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
if len( tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
additional_service_keys_to_tags[ service_key ] = tags
|
||||
|
||||
|
||||
|
||||
destination_page_name = None
|
||||
|
||||
if 'destination_page_name' in request.parsed_request_args:
|
||||
|
||||
destination_page_name = request.parsed_request_args.GetValue( 'destination_page_name', str )
|
||||
|
||||
|
||||
destination_page_key = None
|
||||
|
||||
if 'destination_page_key' in request.parsed_request_args:
|
||||
|
||||
destination_page_key = request.parsed_request_args.GetValue( 'destination_page_key', bytes )
|
||||
|
||||
|
||||
show_destination_page = request.parsed_request_args.GetValue( 'show_destination_page', bool, default_value = False )
|
||||
|
||||
destination_location_context = ClientLocalServerCore.ParseLocalFileDomainLocationContext( request )
|
||||
|
||||
def do_it():
|
||||
|
||||
return CG.client_controller.gui.ImportURLFromAPI( url, filterable_tags, additional_service_keys_to_tags, destination_page_name, destination_page_key, show_destination_page, destination_location_context )
|
||||
|
||||
|
||||
try:
|
||||
|
||||
( normalised_url, result_text ) = CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it )
|
||||
|
||||
except HydrusExceptions.URLClassException as e:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( e )
|
||||
|
||||
|
||||
time.sleep( 0.05 ) # yield and give the ui time to catch up with new URL pubsubs in case this is being spammed
|
||||
|
||||
body_dict = { 'human_result_text' : result_text, 'normalised_url' : normalised_url }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
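A sketch of kicking off a URL import through this handler, assuming /add_urls/add_url; 'url', 'destination_page_name' and 'show_destination_page' mirror the parsing above, and the response carries the human result text and normalised URL.

import requests

payload = {
    'url' : 'https://example.com/post/123',  # placeholder URL
    'destination_page_name' : 'api import',
    'show_destination_page' : False
}

r = requests.post(
    'http://127.0.0.1:45869/add_urls/add_url',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)

result = r.json()

print( result[ 'normalised_url' ], result[ 'human_result_text' ] )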
@@ -0,0 +1,112 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedEditRatings( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_EDIT_RATINGS )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedEditRatingsSetRating( HydrusResourceClientAPIRestrictedEditRatings ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
rating_service_key = request.parsed_request_args.GetValue( 'rating_service_key', bytes )
|
||||
|
||||
applicable_hashes = set( ClientLocalServerCore.ParseHashes( request ) )
|
||||
|
||||
if len( applicable_hashes ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not find any hashes to apply the ratings to!' )
|
||||
|
||||
|
||||
if 'rating' not in request.parsed_request_args:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you need to give a rating to set it to!' )
|
||||
|
||||
|
||||
rating = request.parsed_request_args[ 'rating' ]
|
||||
|
||||
rating_service = CG.client_controller.services_manager.GetService( rating_service_key )
|
||||
|
||||
rating_service_type = rating_service.GetServiceType()
|
||||
|
||||
none_ok = True
|
||||
|
||||
if rating_service_type == HC.LOCAL_RATING_LIKE:
|
||||
|
||||
expecting_type = bool
|
||||
|
||||
elif rating_service_type == HC.LOCAL_RATING_NUMERICAL:
|
||||
|
||||
expecting_type = int
|
||||
|
||||
elif rating_service_type == HC.LOCAL_RATING_INCDEC:
|
||||
|
||||
expecting_type = int
|
||||
|
||||
none_ok = False
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'That service is not a rating service!' )
|
||||
|
||||
|
||||
if rating is None:
|
||||
|
||||
if not none_ok:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, this service does not allow a null rating!' )
|
||||
|
||||
|
||||
elif not isinstance( rating, expecting_type ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, this service expects a "{}" rating!'.format( expecting_type.__name__ ) )
|
||||
|
||||
|
||||
rating_for_content_update = rating
|
||||
|
||||
if rating_service_type == HC.LOCAL_RATING_LIKE:
|
||||
|
||||
if isinstance( rating, bool ):
|
||||
|
||||
rating_for_content_update = 1.0 if rating else 0.0
|
||||
|
||||
|
||||
elif rating_service_type == HC.LOCAL_RATING_NUMERICAL:
|
||||
|
||||
if isinstance( rating, int ):
|
||||
|
||||
rating_for_content_update = rating_service.ConvertStarsToRating( rating )
|
||||
|
||||
|
||||
elif rating_service_type == HC.LOCAL_RATING_INCDEC:
|
||||
|
||||
if rating < 0:
|
||||
|
||||
rating_for_content_update = 0
|
||||
|
||||
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( rating_for_content_update, applicable_hashes ) )
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdate( rating_service_key, content_update )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
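A sketch of setting a rating via this handler, assuming /edit_ratings/set_rating; per the type checks above, 'rating' is a bool for like/dislike services, an int for numerical and inc/dec ones, and null clears it where that is allowed.

import requests

payload = {
    'hashes' : [ '0123456789abcdef' * 4 ],  # placeholder sha256 hex
    'rating_service_key' : 'abcd' * 16,     # placeholder like/dislike service key
    'rating' : True
}

r = requests.post(
    'http://127.0.0.1:45869/edit_ratings/set_rating',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)
r.raise_for_status()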
@@ -0,0 +1,157 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientTime
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
class HydrusResourceClientAPIRestrictedEditTimes( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_EDIT_TIMES )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedEditTimesSetTime( HydrusResourceClientAPIRestrictedEditTimes ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
if len( hashes ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not find any hashes to apply the times to!' )
|
||||
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
if 'timestamp' in request.parsed_request_args:
|
||||
|
||||
timestamp = request.parsed_request_args.GetValueOrNone( 'timestamp', float )
|
||||
|
||||
timestamp_ms = HydrusTime.MillisecondiseS( timestamp )
|
||||
|
||||
elif 'timestamp_ms' in request.parsed_request_args:
|
||||
|
||||
timestamp_ms = request.parsed_request_args.GetValueOrNone( 'timestamp_ms', int )
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you have to specify a timestamp, even if you want to send "null"!' )
|
||||
|
||||
|
||||
location = None
|
||||
|
||||
timestamp_type = request.parsed_request_args.GetValue( 'timestamp_type', int )
|
||||
|
||||
if timestamp_type is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you have to specify the timestamp type!' )
|
||||
|
||||
|
||||
if timestamp_type == HC.TIMESTAMP_TYPE_MODIFIED_DOMAIN:
|
||||
|
||||
domain = request.parsed_request_args.GetValue( 'domain', str )
|
||||
|
||||
if domain == 'local':
|
||||
|
||||
timestamp_type = HC.TIMESTAMP_TYPE_MODIFIED_FILE
|
||||
|
||||
else:
|
||||
|
||||
location = domain
|
||||
|
||||
|
||||
elif timestamp_type == HC.TIMESTAMP_TYPE_LAST_VIEWED:
|
||||
|
||||
canvas_type = request.parsed_request_args.GetValueOrNone( 'canvas_type', int )
|
||||
|
||||
if canvas_type is None:
|
||||
|
||||
canvas_type = CC.CANVAS_MEDIA_VIEWER
|
||||
|
||||
|
||||
if canvas_type not in ( CC.CANVAS_MEDIA_VIEWER, CC.CANVAS_PREVIEW ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, the canvas type needs to be either 0 or 1!' )
|
||||
|
||||
|
||||
location = canvas_type
|
||||
|
||||
elif timestamp_type in ( HC.TIMESTAMP_TYPE_IMPORTED, HC.TIMESTAMP_TYPE_DELETED, HC.TIMESTAMP_TYPE_PREVIOUSLY_IMPORTED ):
|
||||
|
||||
file_service_key = request.parsed_request_args.GetValue( 'file_service_key', bytes )
|
||||
|
||||
if not CG.client_controller.services_manager.ServiceExists( file_service_key ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, do not know that service!' )
|
||||
|
||||
|
||||
if CG.client_controller.services_manager.GetServiceType( file_service_key ) not in HC.REAL_FILE_SERVICES:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you have to specify the service key of a file service!' )
|
||||
|
||||
|
||||
location = file_service_key
|
||||
|
||||
elif timestamp_type in ( HC.TIMESTAMP_TYPE_MODIFIED_FILE, HC.TIMESTAMP_TYPE_ARCHIVED ):
|
||||
|
||||
pass # simple; no additional location data
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( f'Sorry, do not understand that timestamp type "{timestamp_type}"!' )
|
||||
|
||||
|
||||
if timestamp_type != HC.TIMESTAMP_TYPE_MODIFIED_DOMAIN:
|
||||
|
||||
if timestamp_ms is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you can only delete web domain timestamps (type 0) for now!' )
|
||||
|
||||
else:
|
||||
|
||||
timestamp_data_stub = ClientTime.TimestampData( timestamp_type = timestamp_type, location = location )
|
||||
|
||||
for media_result in media_results:
|
||||
|
||||
result = media_result.GetTimesManager().GetTimestampMSFromStub( timestamp_data_stub )
|
||||
|
||||
if result is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( f'Sorry, if the timestamp type is other than 0 (web domain), then you cannot add new timestamps, only edit existing ones. I did not see the given timestamp type on one of the files you sent, specifically: {media_result.GetHash().hex()}' )
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
timestamp_data = ClientTime.TimestampData( timestamp_type = timestamp_type, location = location, timestamp_ms = timestamp_ms )
|
||||
|
||||
if timestamp_ms is None:
|
||||
|
||||
action = HC.CONTENT_UPDATE_DELETE
|
||||
|
||||
else:
|
||||
|
||||
action = HC.CONTENT_UPDATE_SET
|
||||
|
||||
|
||||
content_updates = [ ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_TIMESTAMP, action, ( hashes, timestamp_data ) ) ]
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, content_updates )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'content_updates', content_update_package )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
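A sketch of setting a web-domain modified time via this handler, assuming /edit_times/set_time; type 0 being the web domain modified time follows from the error message above, and sending 'timestamp_ms' as null would delete the timestamp instead.

import requests

payload = {
    'hashes' : [ '0123456789abcdef' * 4 ],  # placeholder sha256 hex
    'timestamp_type' : 0,                   # web domain modified time
    'domain' : 'example.com',
    'timestamp_ms' : 1700000000000
}

r = requests.post(
    'http://127.0.0.1:45869/edit_times/set_time',  # assumed path and port
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' },  # placeholder key
    json = payload
)
r.raise_for_status()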
@@ -0,0 +1,907 @@
import os
|
||||
import time
|
||||
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusTags
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.files import HydrusFileHandling
|
||||
from hydrus.core.files.images import HydrusImageHandling
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientLocation
|
||||
from hydrus.client import ClientRendering
|
||||
from hydrus.client import ClientThreading
|
||||
from hydrus.client.media import ClientMedia
|
||||
from hydrus.client.media import ClientMediaResult
|
||||
from hydrus.client.metadata import ClientTags
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
from hydrus.client.search import ClientSearchFileSearchContext
|
||||
from hydrus.client.search import ClientSearchTagContext
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedGetFiles( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedGetFilesSearchFiles( HydrusResourceClientAPIRestrictedGetFiles ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )
|
||||
|
||||
tag_service_key = ClientLocalServerCore.ParseTagServiceKey( request )
|
||||
|
||||
if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and location_context.IsAllKnownFiles():
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, search for all known tags over all known files is not supported!' )
|
||||
|
||||
|
||||
include_current_tags = request.parsed_request_args.GetValue( 'include_current_tags', bool, default_value = True )
|
||||
include_pending_tags = request.parsed_request_args.GetValue( 'include_pending_tags', bool, default_value = True )
|
||||
|
||||
tag_context = ClientSearchTagContext.TagContext( service_key = tag_service_key, include_current_tags = include_current_tags, include_pending_tags = include_pending_tags )
|
||||
predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( request )
|
||||
|
||||
return_hashes = False
|
||||
return_file_ids = True
|
||||
|
||||
if len( predicates ) == 0:
|
||||
|
||||
hash_ids = []
|
||||
|
||||
else:
|
||||
|
||||
file_search_context = ClientSearchFileSearchContext.FileSearchContext( location_context = location_context, tag_context = tag_context, predicates = predicates )
|
||||
|
||||
file_sort_type = CC.SORT_FILES_BY_IMPORT_TIME
|
||||
|
||||
if 'file_sort_type' in request.parsed_request_args:
|
||||
|
||||
file_sort_type = request.parsed_request_args[ 'file_sort_type' ]
|
||||
|
||||
|
||||
if file_sort_type not in CC.SYSTEM_SORT_TYPES:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, did not understand that sort type!' )
|
||||
|
||||
|
||||
file_sort_asc = False
|
||||
|
||||
if 'file_sort_asc' in request.parsed_request_args:
|
||||
|
||||
file_sort_asc = request.parsed_request_args.GetValue( 'file_sort_asc', bool )
|
||||
|
||||
|
||||
sort_order = CC.SORT_ASC if file_sort_asc else CC.SORT_DESC
|
||||
|
||||
# newest first
|
||||
sort_by = ClientMedia.MediaSort( sort_type = ( 'system', file_sort_type ), sort_order = sort_order )
|
||||
|
||||
if 'return_hashes' in request.parsed_request_args:
|
||||
|
||||
return_hashes = request.parsed_request_args.GetValue( 'return_hashes', bool )
|
||||
|
||||
|
||||
if 'return_file_ids' in request.parsed_request_args:
|
||||
|
||||
return_file_ids = request.parsed_request_args.GetValue( 'return_file_ids', bool )
|
||||
|
||||
|
||||
job_status = ClientThreading.JobStatus( cancellable = True )
|
||||
|
||||
request.disconnect_callables.append( job_status.Cancel )
|
||||
|
||||
hash_ids = CG.client_controller.Read( 'file_query_ids', file_search_context, job_status = job_status, sort_by = sort_by, apply_implicit_limit = False )
|
||||
|
||||
|
||||
request.client_api_permissions.SetLastSearchResults( hash_ids )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
if return_hashes:
|
||||
|
||||
hash_ids_to_hashes = CG.client_controller.Read( 'hash_ids_to_hashes', hash_ids = hash_ids )
|
||||
|
||||
# maintain sort
|
||||
body_dict[ 'hashes' ] = [ hash_ids_to_hashes[ hash_id ].hex() for hash_id in hash_ids ]
|
||||
|
||||
|
||||
if return_file_ids:
|
||||
|
||||
body_dict[ 'file_ids' ] = list( hash_ids )
|
||||
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
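A sketch of a search through this handler, assuming /get_files/search_files and assuming the predicate list is passed as a JSON-encoded 'tags' query parameter (that name is not visible in this hunk, so treat it as an assumption); 'return_hashes', 'file_ids' and 'hashes' mirror the code above.

import json

import requests

r = requests.get(
    'http://127.0.0.1:45869/get_files/search_files',  # assumed path and port
    params = {
        'tags' : json.dumps( [ 'blue eyes', 'skirt' ] ),  # assumed parameter name, JSON-encoded
        'return_hashes' : 'true'
    },
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' }  # placeholder key
)

result = r.json()

print( result[ 'file_ids' ] )
print( result[ 'hashes' ] )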
def ParseAndFetchMediaResult( request: HydrusServerRequest.HydrusRequest ) -> ClientMediaResult.MediaResult:
|
||||
|
||||
try:
|
||||
|
||||
if 'file_id' in request.parsed_request_args:
|
||||
|
||||
file_id = request.parsed_request_args.GetValue( 'file_id', int )
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( ( file_id, ) )
|
||||
|
||||
( media_result, ) = CG.client_controller.Read( 'media_results_from_ids', ( file_id, ) )
|
||||
|
||||
elif 'hash' in request.parsed_request_args:
|
||||
|
||||
request.client_api_permissions.CheckCanSeeAllFiles()
|
||||
|
||||
hash = request.parsed_request_args.GetValue( 'hash', bytes )
|
||||
|
||||
media_result = CG.client_controller.Read( 'media_result', hash )
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Please include a file_id or hash parameter!' )
|
||||
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'One or more of those file identifiers was missing!' )
|
||||
|
||||
|
||||
return media_result
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedGetFilesGetFile( HydrusResourceClientAPIRestrictedGetFiles ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
media_result = ParseAndFetchMediaResult( request )
|
||||
|
||||
if not media_result.GetLocationsManager().IsLocal():
|
||||
|
||||
raise HydrusExceptions.FileMissingException( 'The client does not have this file!' )
|
||||
|
||||
|
||||
try:
|
||||
|
||||
hash = media_result.GetHash()
|
||||
mime = media_result.GetMime()
|
||||
|
||||
path = CG.client_controller.client_files_manager.GetFilePath( hash, mime )
|
||||
|
||||
if not os.path.exists( path ):
|
||||
|
||||
raise HydrusExceptions.FileMissingException()
|
||||
|
||||
|
||||
except HydrusExceptions.FileMissingException:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'That file seems to be missing!' )
|
||||
|
||||
|
||||
is_attachment = request.parsed_request_args.GetValue( 'download', bool, default_value = False )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = mime, path = path, is_attachment = is_attachment )
|
||||
|
||||
return response_context
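A sketch of fetching the raw file via this handler, assuming /get_files/file; 'file_id' (or 'hash') and the optional 'download' flag mirror the parsing above, and the response body is the file itself.

import requests

r = requests.get(
    'http://127.0.0.1:45869/get_files/file',  # assumed path and port
    params = { 'file_id' : 1234 },            # placeholder id from an earlier search
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY' }  # placeholder key
)

with open( 'downloaded_file', 'wb' ) as f:
    f.write( r.content )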
class HydrusResourceClientAPIRestrictedGetFilesGetRenderedFile( HydrusResourceClientAPIRestrictedGetFiles ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
if 'render_format' in request.parsed_request_args:
|
||||
|
||||
format = request.parsed_request_args.GetValue( 'render_format', int )
|
||||
|
||||
if format not in ( HC.IMAGE_PNG, HC.IMAGE_JPEG, HC.IMAGE_WEBP ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Invalid render format!' )
|
||||
|
||||
|
||||
else:
|
||||
|
||||
format = HC.IMAGE_PNG
|
||||
|
||||
try:
|
||||
|
||||
media_result: ClientMedia.MediaSingleton
|
||||
|
||||
if 'file_id' in request.parsed_request_args:
|
||||
|
||||
file_id = request.parsed_request_args.GetValue( 'file_id', int )
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( ( file_id, ) )
|
||||
|
||||
( media_result, ) = CG.client_controller.Read( 'media_results_from_ids', ( file_id, ) )
|
||||
|
||||
elif 'hash' in request.parsed_request_args:
|
||||
|
||||
request.client_api_permissions.CheckCanSeeAllFiles()
|
||||
|
||||
hash = request.parsed_request_args.GetValue( 'hash', bytes )
|
||||
|
||||
media_result = CG.client_controller.Read( 'media_result', hash )
|
||||
|
||||
else:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Please include a file_id or hash parameter!' )
|
||||
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'One or more of those file identifiers was missing!' )
|
||||
|
||||
|
||||
if not media_result.IsStaticImage():
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Requested file is not an image!' )
|
||||
|
||||
|
||||
renderer: ClientRendering.ImageRenderer = CG.client_controller.GetCache( 'images' ).GetImageRenderer( media_result )
|
||||
|
||||
while not renderer.IsReady():
|
||||
|
||||
if request.disconnected:
|
||||
|
||||
return
|
||||
|
||||
|
||||
time.sleep( 0.01 )
|
||||
|
||||
|
||||
numpy_image = renderer.GetNumPyImage()
|
||||
|
||||
if 'width' in request.parsed_request_args and 'height' in request.parsed_request_args:
|
||||
|
||||
width = request.parsed_request_args.GetValue( 'width', int )
|
||||
height = request.parsed_request_args.GetValue( 'height', int )
|
||||
|
||||
if width < 1:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Width must be greater than 0!' )
|
||||
|
||||
|
||||
if height < 1:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Height must be greater than 0!' )
|
||||
|
||||
|
||||
numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, ( width, height ) )
|
||||
|
||||
|
||||
if 'render_quality' in request.parsed_request_args:
|
||||
|
||||
quality = request.parsed_request_args.GetValue( 'render_quality', int )
|
||||
|
||||
else:
|
||||
|
||||
if format == HC.IMAGE_PNG:
|
||||
|
||||
quality = 1 # fastest png compression
|
||||
|
||||
else:
|
||||
|
||||
quality = 80
|
||||
|
||||
|
||||
|
||||
body = HydrusImageHandling.GenerateFileBytesForRenderAPI( numpy_image, format, quality )
|
||||
|
||||
is_attachment = request.parsed_request_args.GetValue( 'download', bool, default_value = False )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = format, body = body, is_attachment = is_attachment, max_age = 86400 * 365 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
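# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Fetching a re-encoded copy of an image via the renderer above. The '/get_files/render'
# path is an assumption based on the Client API docs; width/height map onto the
# parameters parsed in _threadDoGETJob, and leaving render_format out falls back to
# PNG, as the server code above does.

import requests

def fetch_rendered_png( file_id: int, width: int, height: int, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> bytes:

    response = requests.get(
        f'{api_url}/get_files/render',
        params = { 'file_id' : file_id, 'width' : width, 'height' : height },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    return response.content # PNG bytes; cacheable for up to a year per the max_age above
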
class HydrusResourceClientAPIRestrictedGetFilesFileHashes( HydrusResourceClientAPIRestrictedGetFiles ):

    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

        supported_hash_types = ( 'sha256', 'md5', 'sha1', 'sha512' )

        source_hash_type = request.parsed_request_args.GetValue( 'source_hash_type', str, default_value = 'sha256' )

        if source_hash_type not in supported_hash_types:

            raise HydrusExceptions.BadRequestException( 'I do not support that hash type!' )

        desired_hash_type = request.parsed_request_args.GetValue( 'desired_hash_type', str )

        if desired_hash_type not in supported_hash_types:

            raise HydrusExceptions.BadRequestException( 'I do not support that hash type!' )

        source_hashes = set()

        if 'hash' in request.parsed_request_args:

            request_hash = request.parsed_request_args.GetValue( 'hash', bytes )

            source_hashes.add( request_hash )

        if 'hashes' in request.parsed_request_args:

            request_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )

            source_hashes.update( request_hashes )

        if len( source_hashes ) == 0:

            raise HydrusExceptions.BadRequestException( 'You have to specify a hash to look up!' )

        ClientLocalServerCore.CheckHashLength( source_hashes, hash_type = source_hash_type )

        source_to_desired = CG.client_controller.Read( 'file_hashes', source_hashes, source_hash_type, desired_hash_type )

        encoded_source_to_desired = { source_hash.hex() : desired_hash.hex() for ( source_hash, desired_hash ) in source_to_desired.items() }

        body_dict = {
            'hashes' : encoded_source_to_desired
        }

        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )

        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )

        return response_context

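# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Converting an md5 into the matching sha256 with the lookup above. The
# '/get_files/file_hashes' path is assumed from the Client API docs; the parameter
# names match those parsed in _threadDoGETJob.

import requests

def md5_to_sha256( md5_hex: str, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> str:

    response = requests.get(
        f'{api_url}/get_files/file_hashes',
        params = { 'hash' : md5_hex, 'source_hash_type' : 'md5', 'desired_hash_type' : 'sha256' },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    # the response maps source hex to desired hex; unknown hashes are simply absent
    return response.json()[ 'hashes' ][ md5_hex ]
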
def AddMissingHashToFileMetadata( metadata: list, hash: bytes ):

    metadata_row = {
        'file_id' : None,
        'hash' : hash.hex()
    }

    metadata.append( metadata_row )

class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClientAPIRestrictedGetFiles ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
only_return_identifiers = request.parsed_request_args.GetValue( 'only_return_identifiers', bool, default_value = False )
|
||||
only_return_basic_information = request.parsed_request_args.GetValue( 'only_return_basic_information', bool, default_value = False )
|
||||
hide_service_keys_tags = request.parsed_request_args.GetValue( 'hide_service_keys_tags', bool, default_value = True )
|
||||
detailed_url_information = request.parsed_request_args.GetValue( 'detailed_url_information', bool, default_value = False )
|
||||
include_notes = request.parsed_request_args.GetValue( 'include_notes', bool, default_value = False )
|
||||
include_milliseconds = request.parsed_request_args.GetValue( 'include_milliseconds', bool, default_value = False )
|
||||
include_services_object = request.parsed_request_args.GetValue( 'include_services_object', bool, default_value = True )
|
||||
create_new_file_ids = request.parsed_request_args.GetValue( 'create_new_file_ids', bool, default_value = False )
|
||||
include_blurhash = request.parsed_request_args.GetValue( 'include_blurhash', bool, default_value = False )
|
||||
|
||||
if include_milliseconds:
|
||||
|
||||
time_converter = lambda t: t / 1000
|
||||
|
||||
else:
|
||||
|
||||
time_converter = HydrusTime.SecondiseMS
|
||||
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
hash_ids_to_hashes = CG.client_controller.Read( 'hash_ids_to_hashes', hashes = hashes, create_new_hash_ids = create_new_file_ids )
|
||||
|
||||
hashes_to_hash_ids = { hash : hash_id for ( hash_id, hash ) in hash_ids_to_hashes.items() }
|
||||
|
||||
hash_ids = set( hash_ids_to_hashes.keys() )
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( hash_ids )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
metadata = []
|
||||
|
||||
if only_return_identifiers:
|
||||
|
||||
for hash in hashes:
|
||||
|
||||
if hash in hashes_to_hash_ids:
|
||||
|
||||
metadata_row = {
|
||||
'file_id' : hashes_to_hash_ids[ hash ],
|
||||
'hash' : hash.hex()
|
||||
}
|
||||
|
||||
metadata.append( metadata_row )
|
||||
|
||||
else:
|
||||
|
||||
AddMissingHashToFileMetadata( metadata, hash )
|
||||
|
||||
|
||||
|
||||
elif only_return_basic_information:
|
||||
|
||||
file_info_managers = CG.client_controller.Read( 'file_info_managers_from_ids', hash_ids )
|
||||
|
||||
hashes_to_file_info_managers = { file_info_manager.hash : file_info_manager for file_info_manager in file_info_managers }
|
||||
|
||||
for hash in hashes:
|
||||
|
||||
if hash in hashes_to_file_info_managers:
|
||||
|
||||
file_info_manager = hashes_to_file_info_managers[ hash ]
|
||||
|
||||
metadata_row = {
|
||||
'file_id' : file_info_manager.hash_id,
|
||||
'hash' : file_info_manager.hash.hex(),
|
||||
'size' : file_info_manager.size,
|
||||
'mime' : HC.mime_mimetype_string_lookup[ file_info_manager.mime ],
|
||||
'filetype_human' : HC.mime_string_lookup[ file_info_manager.mime ],
|
||||
'filetype_enum' : file_info_manager.mime,
|
||||
'ext' : HC.mime_ext_lookup[ file_info_manager.mime ],
|
||||
'width' : file_info_manager.width,
|
||||
'height' : file_info_manager.height,
|
||||
'duration' : file_info_manager.duration,
|
||||
'num_frames' : file_info_manager.num_frames,
|
||||
'num_words' : file_info_manager.num_words,
|
||||
'has_audio' : file_info_manager.has_audio
|
||||
}
|
||||
|
||||
filetype_forced = file_info_manager.FiletypeIsForced()
|
||||
|
||||
metadata_row[ 'filetype_forced' ] = filetype_forced
|
||||
|
||||
if filetype_forced:
|
||||
|
||||
metadata_row[ 'original_mime' ] = HC.mime_mimetype_string_lookup[ file_info_manager.original_mime ]
|
||||
|
||||
|
||||
if include_blurhash:
|
||||
|
||||
metadata_row[ 'blurhash' ] = file_info_manager.blurhash
|
||||
|
||||
|
||||
metadata.append( metadata_row )
|
||||
|
||||
else:
|
||||
|
||||
AddMissingHashToFileMetadata( metadata, hash )
|
||||
|
||||
|
||||
|
||||
else:
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results_from_ids', hash_ids )
|
||||
|
||||
hashes_to_media_results = { media_result.GetFileInfoManager().hash : media_result for media_result in media_results }
|
||||
|
||||
services_manager = CG.client_controller.services_manager
|
||||
|
||||
rating_service_keys = services_manager.GetServiceKeys( HC.RATINGS_SERVICES )
|
||||
tag_service_keys = services_manager.GetServiceKeys( HC.ALL_TAG_SERVICES )
|
||||
service_keys_to_types = { service.GetServiceKey() : service.GetServiceType() for service in services_manager.GetServices() }
|
||||
service_keys_to_names = services_manager.GetServiceKeysToNames()
|
||||
|
||||
ipfs_service_keys = services_manager.GetServiceKeys( ( HC.IPFS, ) )
|
||||
|
||||
thumbnail_bounding_dimensions = CG.client_controller.options[ 'thumbnail_dimensions' ]
|
||||
thumbnail_scale_type = CG.client_controller.new_options.GetInteger( 'thumbnail_scale_type' )
|
||||
thumbnail_dpr_percent = CG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
|
||||
|
||||
for hash in hashes:
|
||||
|
||||
if hash in hashes_to_media_results:
|
||||
|
||||
media_result = hashes_to_media_results[ hash ]
|
||||
|
||||
file_info_manager = media_result.GetFileInfoManager()
|
||||
|
||||
mime = file_info_manager.mime
|
||||
width = file_info_manager.width
|
||||
height = file_info_manager.height
|
||||
|
||||
metadata_row = {
|
||||
'file_id' : file_info_manager.hash_id,
|
||||
'hash' : file_info_manager.hash.hex(),
|
||||
'size' : file_info_manager.size,
|
||||
'mime' : HC.mime_mimetype_string_lookup[ mime ],
|
||||
'filetype_human' : HC.mime_string_lookup[ file_info_manager.mime ],
|
||||
'filetype_enum' : file_info_manager.mime,
|
||||
'ext' : HC.mime_ext_lookup[ mime ],
|
||||
'width' : width,
|
||||
'height' : height,
|
||||
'duration' : file_info_manager.duration,
|
||||
'num_frames' : file_info_manager.num_frames,
|
||||
'num_words' : file_info_manager.num_words,
|
||||
'has_audio' : file_info_manager.has_audio,
|
||||
'blurhash' : file_info_manager.blurhash,
|
||||
'pixel_hash' : None if file_info_manager.pixel_hash is None else file_info_manager.pixel_hash.hex()
|
||||
}
|
||||
|
||||
filetype_forced = file_info_manager.FiletypeIsForced()
|
||||
|
||||
metadata_row[ 'filetype_forced' ] = filetype_forced
|
||||
|
||||
if filetype_forced:
|
||||
|
||||
metadata_row[ 'original_mime' ] = HC.mime_mimetype_string_lookup[ file_info_manager.original_mime ]
|
||||
|
||||
|
||||
if file_info_manager.mime in HC.MIMES_WITH_THUMBNAILS:
|
||||
|
||||
if width is not None and height is not None and width > 0 and height > 0:
|
||||
|
||||
( expected_thumbnail_width, expected_thumbnail_height ) = HydrusImageHandling.GetThumbnailResolution( ( width, height ), thumbnail_bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
|
||||
|
||||
metadata_row[ 'thumbnail_width' ] = expected_thumbnail_width
|
||||
metadata_row[ 'thumbnail_height' ] = expected_thumbnail_height
|
||||
|
||||
|
||||
|
||||
if include_notes:
|
||||
|
||||
metadata_row[ 'notes' ] = media_result.GetNotesManager().GetNamesToNotes()
|
||||
|
||||
|
||||
locations_manager = media_result.GetLocationsManager()
|
||||
|
||||
metadata_row[ 'file_services' ] = {
|
||||
'current' : {},
|
||||
'deleted' : {}
|
||||
}
|
||||
|
||||
times_manager = locations_manager.GetTimesManager()
|
||||
|
||||
current = locations_manager.GetCurrent()
|
||||
|
||||
for file_service_key in current:
|
||||
|
||||
metadata_row[ 'file_services' ][ 'current' ][ file_service_key.hex() ] = {
|
||||
'name' : service_keys_to_names[ file_service_key ],
|
||||
'type' : service_keys_to_types[ file_service_key ],
|
||||
'type_pretty' : HC.service_string_lookup[ service_keys_to_types[ file_service_key ] ],
|
||||
'time_imported' : time_converter( times_manager.GetImportedTimestampMS( file_service_key ) )
|
||||
}
|
||||
|
||||
|
||||
deleted = locations_manager.GetDeleted()
|
||||
|
||||
for file_service_key in deleted:
|
||||
|
||||
metadata_row[ 'file_services' ][ 'deleted' ][ file_service_key.hex() ] = {
|
||||
'name' : service_keys_to_names[ file_service_key ],
|
||||
'type' : service_keys_to_types[ file_service_key ],
|
||||
'type_pretty' : HC.service_string_lookup[ service_keys_to_types[ file_service_key ] ],
|
||||
'time_deleted' : time_converter( times_manager.GetDeletedTimestampMS( file_service_key ) ),
|
||||
'time_imported' : time_converter( times_manager.GetPreviouslyImportedTimestampMS( file_service_key ) )
|
||||
}
|
||||
|
||||
|
||||
metadata_row[ 'time_modified' ] = time_converter( times_manager.GetAggregateModifiedTimestampMS() )
|
||||
|
||||
domains_to_file_modified_timestamps_ms = times_manager.GetDomainModifiedTimestampsMS()
|
||||
|
||||
local_modified_timestamp_ms = times_manager.GetFileModifiedTimestampMS()
|
||||
|
||||
if local_modified_timestamp_ms is not None:
|
||||
|
||||
domains_to_file_modified_timestamps_ms[ 'local' ] = local_modified_timestamp_ms
|
||||
|
||||
|
||||
metadata_row[ 'time_modified_details' ] = { domain : time_converter( timestamp_ms ) for ( domain, timestamp_ms ) in domains_to_file_modified_timestamps_ms.items() }
|
||||
|
||||
metadata_row[ 'is_inbox' ] = locations_manager.inbox
|
||||
metadata_row[ 'is_local' ] = locations_manager.IsLocal()
|
||||
metadata_row[ 'is_trashed' ] = locations_manager.IsTrashed()
|
||||
metadata_row[ 'is_deleted' ] = CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY in locations_manager.GetDeleted() or locations_manager.IsTrashed()
|
||||
|
||||
metadata_row[ 'has_transparency' ] = file_info_manager.has_transparency
|
||||
metadata_row[ 'has_exif' ] = file_info_manager.has_exif
|
||||
metadata_row[ 'has_human_readable_embedded_metadata' ] = file_info_manager.has_human_readable_embedded_metadata
|
||||
metadata_row[ 'has_icc_profile' ] = file_info_manager.has_icc_profile
|
||||
|
||||
known_urls = sorted( locations_manager.GetURLs() )
|
||||
|
||||
metadata_row[ 'known_urls' ] = known_urls
|
||||
|
||||
metadata_row[ 'ipfs_multihashes' ] = { ipfs_service_key.hex() : multihash for ( ipfs_service_key, multihash ) in locations_manager.GetServiceFilenames().items() if ipfs_service_key in ipfs_service_keys }
|
||||
|
||||
if detailed_url_information:
|
||||
|
||||
detailed_known_urls = []
|
||||
|
||||
for known_url in known_urls:
|
||||
|
||||
try:
|
||||
|
||||
normalised_url = CG.client_controller.network_engine.domain_manager.NormaliseURL( known_url )
|
||||
|
||||
( url_type, match_name, can_parse, cannot_parse_reason ) = CG.client_controller.network_engine.domain_manager.GetURLParseCapability( normalised_url )
|
||||
|
||||
except HydrusExceptions.URLClassException as e:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
detailed_dict = { 'normalised_url' : normalised_url, 'url_type' : url_type, 'url_type_string' : HC.url_type_string_lookup[ url_type ], 'match_name' : match_name, 'can_parse' : can_parse }
|
||||
|
||||
if not can_parse:
|
||||
|
||||
detailed_dict[ 'cannot_parse_reason' ] = cannot_parse_reason
|
||||
|
||||
|
||||
detailed_known_urls.append( detailed_dict )
|
||||
|
||||
|
||||
metadata_row[ 'detailed_known_urls' ] = detailed_known_urls
|
||||
|
||||
|
||||
ratings_manager = media_result.GetRatingsManager()
|
||||
|
||||
ratings_dict = {}
|
||||
|
||||
for rating_service_key in rating_service_keys:
|
||||
|
||||
rating_object = ratings_manager.GetRatingForAPI( rating_service_key )
|
||||
|
||||
ratings_dict[ rating_service_key.hex() ] = rating_object
|
||||
|
||||
|
||||
metadata_row[ 'ratings' ] = ratings_dict
|
||||
|
||||
tags_manager = media_result.GetTagsManager()
|
||||
|
||||
tags_dict = {}
|
||||
|
||||
for tag_service_key in tag_service_keys:
|
||||
|
||||
storage_statuses_to_tags = tags_manager.GetStatusesToTags( tag_service_key, ClientTags.TAG_DISPLAY_STORAGE )
|
||||
|
||||
storage_tags_json_serialisable = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in storage_statuses_to_tags.items() if len( tags ) > 0 }
|
||||
|
||||
display_statuses_to_tags = tags_manager.GetStatusesToTags( tag_service_key, ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL )
|
||||
|
||||
display_tags_json_serialisable = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in display_statuses_to_tags.items() if len( tags ) > 0 }
|
||||
|
||||
tags_dict_object = {
|
||||
'name' : service_keys_to_names[ tag_service_key ],
|
||||
'type' : service_keys_to_types[ tag_service_key ],
|
||||
'type_pretty' : HC.service_string_lookup[ service_keys_to_types[ tag_service_key ] ],
|
||||
'storage_tags' : storage_tags_json_serialisable,
|
||||
'display_tags' : display_tags_json_serialisable
|
||||
}
|
||||
|
||||
tags_dict[ tag_service_key.hex() ] = tags_dict_object
|
||||
|
||||
|
||||
metadata_row[ 'tags' ] = tags_dict
|
||||
|
||||
# Old stuff starts here
|
||||
|
||||
api_service_keys_to_statuses_to_tags = {}
|
||||
|
||||
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_STORAGE )
|
||||
|
||||
for ( service_key, statuses_to_tags ) in service_keys_to_statuses_to_tags.items():
|
||||
|
||||
statuses_to_tags_json_serialisable = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() if len( tags ) > 0 }
|
||||
|
||||
if len( statuses_to_tags_json_serialisable ) > 0:
|
||||
|
||||
api_service_keys_to_statuses_to_tags[ service_key.hex() ] = statuses_to_tags_json_serialisable
|
||||
|
||||
|
||||
|
||||
if not hide_service_keys_tags:
|
||||
|
||||
metadata_row[ 'service_keys_to_statuses_to_tags' ] = api_service_keys_to_statuses_to_tags
|
||||
|
||||
|
||||
#
|
||||
|
||||
api_service_keys_to_statuses_to_tags = {}
|
||||
|
||||
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL )
|
||||
|
||||
for ( service_key, statuses_to_tags ) in service_keys_to_statuses_to_tags.items():
|
||||
|
||||
statuses_to_tags_json_serialisable = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() if len( tags ) > 0 }
|
||||
|
||||
if len( statuses_to_tags_json_serialisable ) > 0:
|
||||
|
||||
api_service_keys_to_statuses_to_tags[ service_key.hex() ] = statuses_to_tags_json_serialisable
|
||||
|
||||
|
||||
|
||||
if not hide_service_keys_tags:
|
||||
|
||||
metadata_row[ 'service_keys_to_statuses_to_display_tags' ] = api_service_keys_to_statuses_to_tags
|
||||
|
||||
|
||||
# old stuff ends here
|
||||
|
||||
#
|
||||
|
||||
metadata.append( metadata_row )
|
||||
|
||||
else:
|
||||
|
||||
AddMissingHashToFileMetadata( metadata, hash )
|
||||
|
||||
|
||||
|
||||
|
||||
body_dict[ 'metadata' ] = metadata
|
||||
|
||||
if include_services_object:
|
||||
|
||||
body_dict[ 'services' ] = ClientLocalServerCore.GetServicesDict()
|
||||
|
||||
|
||||
mime = request.preferred_mime
|
||||
body = ClientLocalServerCore.Dumps( body_dict, mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
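# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Pulling metadata for a couple of files via the FileMetadata resource above. The
# '/get_files/file_metadata' path and the convention of JSON-encoding list parameters
# on GET requests are assumptions from the Client API docs; the flag shown maps onto
# the booleans parsed at the top of its _threadDoGETJob.

import json
import requests

def fetch_basic_metadata( sha256_hexes: list, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> list:

    response = requests.get(
        f'{api_url}/get_files/file_metadata',
        params = {
            'hashes' : json.dumps( sha256_hexes ),
            'only_return_basic_information' : 'true'
        },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    # one row per requested hash; unknown hashes come back with file_id = None,
    # as AddMissingHashToFileMetadata above constructs
    return response.json()[ 'metadata' ]
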
class HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( HydrusResourceClientAPIRestrictedGetFiles ):

    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

        media_result = ParseAndFetchMediaResult( request )

        mime = media_result.GetMime()

        if mime in HC.MIMES_WITH_THUMBNAILS:

            try:

                path = CG.client_controller.client_files_manager.GetThumbnailPath( media_result )

                if not os.path.exists( path ):

                    # not _supposed_ to happen, but it seems in odd situations it can
                    raise HydrusExceptions.FileMissingException()

            except HydrusExceptions.FileMissingException:

                path = HydrusFileHandling.mimes_to_default_thumbnail_paths[ mime ]

        else:

            path = HydrusFileHandling.mimes_to_default_thumbnail_paths[ mime ]

        response_mime = HydrusFileHandling.GetThumbnailMime( path )

        response_context = HydrusServerResources.ResponseContext( 200, mime = response_mime, path = path )

        return response_context

class HydrusResourceClientAPIRestrictedGetFilesGetLocalPath( HydrusResourceClientAPIRestrictedGetFilesSearchFiles ):

    def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):

        request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_SEE_LOCAL_PATHS )

        super()._CheckAPIPermissions( request )

class HydrusResourceClientAPIRestrictedGetFilesGetFilePath( HydrusResourceClientAPIRestrictedGetFilesSearchFiles ):

    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

        media_result = ParseAndFetchMediaResult( request )

        if not media_result.GetLocationsManager().IsLocal():

            raise HydrusExceptions.FileMissingException( 'The client does not have this file!' )

        try:

            hash = media_result.GetHash()
            mime = media_result.GetMime()

            path = CG.client_controller.client_files_manager.GetFilePath( hash, mime )

            if not os.path.exists( path ):

                raise HydrusExceptions.FileMissingException()

        except HydrusExceptions.FileMissingException:

            raise HydrusExceptions.NotFoundException( 'That file seems to be missing!' )

        body_dict = {
            'path' : path
        }

        mime = request.preferred_mime

        body = ClientLocalServerCore.Dumps( body_dict, mime )

        response_context = HydrusServerResources.ResponseContext( 200, mime = mime, body = body )

        return response_context

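# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Asking the client where a file lives on disk, using the path resource above. The
# '/get_files/file_path' path is an assumption based on the Client API docs; the
# request succeeds only if the access key has the appropriate permissions.

import requests

def get_local_file_path( sha256_hex: str, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> str:

    response = requests.get(
        f'{api_url}/get_files/file_path',
        params = { 'hash' : sha256_hex },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    return response.json()[ 'path' ]
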
class HydrusResourceClientAPIRestrictedGetFilesGetThumbnailPath( HydrusResourceClientAPIRestrictedGetFilesSearchFiles ):

    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

        include_thumbnail_filetype = request.parsed_request_args.GetValue( 'include_thumbnail_filetype', bool, default_value = False )

        media_result = ParseAndFetchMediaResult( request )

        mime = media_result.GetMime()

        if mime in HC.MIMES_WITH_THUMBNAILS:

            try:

                path = CG.client_controller.client_files_manager.GetThumbnailPath( media_result )

                if not os.path.exists( path ):

                    # not _supposed_ to happen, but it seems in odd situations it can
                    raise HydrusExceptions.FileMissingException()

            except HydrusExceptions.FileMissingException:

                raise HydrusExceptions.FileMissingException( 'Could not find that thumbnail!' )

        else:

            raise HydrusExceptions.BadRequestException( 'Sorry, this file type does not have a thumbnail!' )

        if include_thumbnail_filetype:

            thumb_mime = HydrusFileHandling.GetThumbnailMime( path )

            body_dict = {
                'path' : path,
                'filetype' : HC.mime_mimetype_string_lookup[ thumb_mime ]
            }

        else:

            body_dict = {
                'path' : path
            }

        mime = request.preferred_mime

        body = ClientLocalServerCore.Dumps( body_dict, mime )

        response_context = HydrusServerResources.ResponseContext( 200, mime = mime, body = body )

        return response_context

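# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Fetching a thumbnail's on-disk path plus its actual filetype via the resource above.
# The '/get_files/thumbnail_path' path is an assumption based on the Client API docs;
# 'include_thumbnail_filetype' is the flag parsed at the top of _threadDoGETJob.

import requests

def get_thumbnail_path( sha256_hex: str, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> dict:

    response = requests.get(
        f'{api_url}/get_files/thumbnail_path',
        params = { 'hash' : sha256_hex, 'include_thumbnail_filetype' : 'true' },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    # e.g. { 'path' : '/path/to/thumbnail', 'filetype' : 'image/jpeg' }
    return response.json()
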
@@ -0,0 +1,390 @@
from hydrus.core import HydrusExceptions
from hydrus.core.networking import HydrusServerRequest
from hydrus.core.networking import HydrusServerResources

from hydrus.client import ClientAPI
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientThreading
from hydrus.client.networking import ClientNetworkingContexts
from hydrus.client.networking import ClientNetworkingDomain
from hydrus.client.networking import ClientNetworkingFunctions
from hydrus.client.networking.api import ClientLocalServerCore
from hydrus.client.networking.api import ClientLocalServerResources


class HydrusResourceClientAPIRestrictedManageCookies( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):

    def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):

        request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_HEADERS )

class HydrusResourceClientAPIRestrictedManageCookiesGetCookies( HydrusResourceClientAPIRestrictedManageCookies ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
domain = request.parsed_request_args.GetValue( 'domain', str )
|
||||
|
||||
if '.' not in domain:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The value "{}" does not seem to be a domain!'.format( domain ) )
|
||||
|
||||
|
||||
network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
|
||||
|
||||
session = CG.client_controller.network_engine.session_manager.GetSession( network_context )
|
||||
|
||||
body_cookies_list = []
|
||||
|
||||
for cookie in session.cookies:
|
||||
|
||||
name = cookie.name
|
||||
value = cookie.value
|
||||
domain = cookie.domain
|
||||
path = cookie.path
|
||||
expires = cookie.expires
|
||||
|
||||
body_cookies_list.append( [ name, value, domain, path, expires ] )
|
||||
|
||||
|
||||
body_dict = { 'cookies' : body_cookies_list }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
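# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Listing the cookies hydrus holds for a domain via the GetCookies resource above. The
# '/manage_cookies/get_cookies' path is an assumption based on the Client API docs;
# each row comes back as [ name, value, domain, path, expires ], matching the list
# built in _threadDoGETJob.

import requests

def get_cookies_for_domain( domain: str, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> list:

    response = requests.get(
        f'{api_url}/manage_cookies/get_cookies',
        params = { 'domain' : domain },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    return response.json()[ 'cookies' ]
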
class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceClientAPIRestrictedManageCookies ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
cookie_rows = request.parsed_request_args.GetValue( 'cookies', list )
|
||||
|
||||
domains_cleared = set()
|
||||
domains_set = set()
|
||||
|
||||
# TODO: This all sucks. replace the rows in this and the _set_ with an Object, and the domains_cleared/set stuff should say more, like count removed from each etc...
|
||||
# refer to get/set_headers for example
|
||||
|
||||
for cookie_row in cookie_rows:
|
||||
|
||||
if len( cookie_row ) != 5:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The cookie "{}" did not come in the format [ name, value, domain, path, expires ]!'.format( cookie_row ) )
|
||||
|
||||
|
||||
( name, value, domain, path, expires ) = cookie_row
|
||||
|
||||
ndp_bad = True in ( not isinstance( var, str ) for var in ( name, domain, path ) )
|
||||
v_bad = value is not None and not isinstance( value, str )
|
||||
e_bad = expires is not None and not isinstance( expires, int )
|
||||
|
||||
if ndp_bad or v_bad or e_bad:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'In the row [ name, value, domain, path, expires ], which I received as "{}", name, domain, and path need to be strings, value needs to be null or a string, and expires needs to be null or an integer!'.format( cookie_row ) )
|
||||
|
||||
|
||||
network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
|
||||
|
||||
session = CG.client_controller.network_engine.session_manager.GetSession( network_context )
|
||||
|
||||
if value is None:
|
||||
|
||||
domains_cleared.add( domain )
|
||||
|
||||
session.cookies.clear( domain, path, name )
|
||||
|
||||
else:
|
||||
|
||||
domains_set.add( domain )
|
||||
|
||||
ClientNetworkingFunctions.AddCookieToSession( session, name, value, domain, path, expires )
|
||||
|
||||
|
||||
CG.client_controller.network_engine.session_manager.SetSessionDirty( network_context )
|
||||
|
||||
|
||||
if CG.client_controller.new_options.GetBoolean( 'notify_client_api_cookies' ) and len( domains_cleared ) + len( domains_set ) > 0:
|
||||
|
||||
domains_cleared = sorted( domains_cleared )
|
||||
domains_set = sorted( domains_set )
|
||||
|
||||
message = 'Cookies sent from API:'
|
||||
|
||||
if len( domains_cleared ) > 0:
|
||||
|
||||
message = '{} ({} cleared)'.format( message, ', '.join( domains_cleared ) )
|
||||
|
||||
|
||||
if len( domains_set ) > 0:
|
||||
|
||||
message = '{} ({} set)'.format( message, ', '.join( domains_set ) )
|
||||
|
||||
|
||||
job_status = ClientThreading.JobStatus()
|
||||
|
||||
job_status.SetStatusText( message )
|
||||
|
||||
job_status.FinishAndDismiss( 5 )
|
||||
|
||||
CG.client_controller.pub( 'message', job_status )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( HydrusResourceClientAPIRestrictedManageCookies ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
user_agent = request.parsed_request_args.GetValue( 'user-agent', str )
|
||||
|
||||
if user_agent == '':
|
||||
|
||||
from hydrus.client import ClientDefaults
|
||||
|
||||
user_agent = ClientDefaults.DEFAULT_USER_AGENT
|
||||
|
||||
|
||||
CG.client_controller.network_engine.domain_manager.SetCustomHeader( ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT, 'User-Agent', value = user_agent )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
def GenerateNetworkContextFromRequest( request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
domain = request.parsed_request_args.GetValueOrNone( 'domain', str )
|
||||
|
||||
if domain is None:
|
||||
|
||||
network_context = ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT
|
||||
|
||||
else:
|
||||
|
||||
if '.' not in domain:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The value "{}" does not seem to be a domain!'.format( domain ) )
|
||||
|
||||
|
||||
network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
|
||||
|
||||
|
||||
return network_context
|
||||
|
||||
|
||||
def RenderNetworkContextToJSONObject( network_context: ClientNetworkingContexts.NetworkContext ) -> dict:
|
||||
|
||||
result = {
|
||||
'type': network_context.context_type
|
||||
}
|
||||
|
||||
if isinstance( network_context.context_data, bytes ):
|
||||
|
||||
result[ 'data' ] = network_context.context_data.hex()
|
||||
|
||||
elif network_context.context_data is None or isinstance( network_context.context_data, str ):
|
||||
|
||||
result[ 'data' ] = network_context.context_data
|
||||
|
||||
else:
|
||||
|
||||
result[ 'data' ] = repr( network_context.context_data )
|
||||
|
||||
|
||||
return result
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageCookiesGetHeaders( HydrusResourceClientAPIRestrictedManageCookies ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
network_context = GenerateNetworkContextFromRequest( request )
|
||||
|
||||
ncs_to_header_dicts = CG.client_controller.network_engine.domain_manager.GetNetworkContextsToCustomHeaderDicts()
|
||||
|
||||
body_dict = {
|
||||
'network_context': RenderNetworkContextToJSONObject( network_context )
|
||||
}
|
||||
|
||||
headers_dict = ncs_to_header_dicts.get( network_context, {} )
|
||||
|
||||
body_headers_dict = {}
|
||||
|
||||
for ( key, ( value, approved, reason ) ) in headers_dict.items():
|
||||
|
||||
body_headers_dict[ key ] = {
|
||||
'value' : value,
|
||||
'approved' : ClientNetworkingDomain.valid_str_lookup[ approved ],
|
||||
'reason' : reason
|
||||
}
|
||||
|
||||
|
||||
body_dict[ 'headers' ] = body_headers_dict
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( HydrusResourceClientAPIRestrictedManageCookies ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
network_context = GenerateNetworkContextFromRequest( request )
|
||||
http_header_objects = request.parsed_request_args.GetValue( 'headers', dict )
|
||||
|
||||
headers_cleared = set()
|
||||
headers_set = set()
|
||||
headers_altered = set()
|
||||
|
||||
for ( key, info_dict ) in http_header_objects.items():
|
||||
|
||||
ncs_to_header_dicts = CG.client_controller.network_engine.domain_manager.GetNetworkContextsToCustomHeaderDicts()
|
||||
|
||||
if network_context in ncs_to_header_dicts:
|
||||
|
||||
headers_dict = ncs_to_header_dicts[ network_context ]
|
||||
|
||||
else:
|
||||
|
||||
headers_dict = {}
|
||||
|
||||
|
||||
approved = None
|
||||
reason = None
|
||||
|
||||
if 'approved' in info_dict:
|
||||
|
||||
approved_str = info_dict[ 'approved' ]
|
||||
|
||||
approved = ClientNetworkingDomain.valid_enum_lookup.get( approved_str, None )
|
||||
|
||||
if approved is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The value "{}" was not in the permitted list!'.format( approved_str ) )
|
||||
|
||||
|
||||
|
||||
if 'reason' in info_dict:
|
||||
|
||||
reason = info_dict[ 'reason' ]
|
||||
|
||||
if not isinstance( reason, str ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The reason "{}" was not a string!'.format( reason ) )
|
||||
|
||||
|
||||
|
||||
if 'value' in info_dict:
|
||||
|
||||
value = info_dict[ 'value' ]
|
||||
|
||||
if value is None:
|
||||
|
||||
if key in headers_dict:
|
||||
|
||||
CG.client_controller.network_engine.domain_manager.DeleteCustomHeader( network_context, key )
|
||||
|
||||
headers_cleared.add( key )
|
||||
|
||||
|
||||
else:
|
||||
|
||||
if not isinstance( value, str ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The value "{}" was not a string!'.format( value ) )
|
||||
|
||||
|
||||
do_it = True
|
||||
|
||||
if key in headers_dict:
|
||||
|
||||
old_value = headers_dict[ key ][0]
|
||||
|
||||
if old_value == value:
|
||||
|
||||
do_it = False
|
||||
|
||||
else:
|
||||
|
||||
headers_altered.add( key )
|
||||
|
||||
|
||||
else:
|
||||
|
||||
headers_set.add( key )
|
||||
|
||||
|
||||
if do_it:
|
||||
|
||||
CG.client_controller.network_engine.domain_manager.SetCustomHeader( network_context, key, value = value, approved = approved, reason = reason )
|
||||
|
||||
|
||||
|
||||
else:
|
||||
|
||||
if approved is None and reason is None:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you have to set a value, approved, or reason parameter!' )
|
||||
|
||||
|
||||
if key not in headers_dict:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, you tried to set approved/reason on "{}" for "{}", but that entry does not exist, so there is no value to set them to! Please give a value!'.format( key, network_context ) )
|
||||
|
||||
|
||||
headers_altered.add( key )
|
||||
|
||||
CG.client_controller.network_engine.domain_manager.SetCustomHeader( network_context, key, approved = approved, reason = reason )
|
||||
|
||||
|
||||
|
||||
if CG.client_controller.new_options.GetBoolean( 'notify_client_api_cookies' ) and len( headers_cleared ) + len( headers_set ) + len( headers_altered ) > 0:
|
||||
|
||||
message_lines = [ 'Headers sent from API:' ]
|
||||
|
||||
if len( headers_cleared ) > 0:
|
||||
|
||||
message_lines.extend( [ 'Cleared: {}'.format( key ) for key in sorted( headers_cleared ) ] )
|
||||
|
||||
|
||||
if len( headers_set ) > 0:
|
||||
|
||||
message_lines.extend( [ 'Set: {}'.format( key ) for key in sorted( headers_set ) ] )
|
||||
|
||||
|
||||
if len( headers_altered ) > 0:
|
||||
|
||||
message_lines.extend( [ 'Altered: {}'.format( key ) for key in sorted( headers_altered ) ] )
|
||||
|
||||
|
||||
message = '\n'.join( message_lines )
|
||||
|
||||
job_status = ClientThreading.JobStatus()
|
||||
|
||||
job_status.SetStatusText( message )
|
||||
|
||||
job_status.FinishAndDismiss( 5 )
|
||||
|
||||
CG.client_controller.pub( 'message', job_status )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
|
@@ -0,0 +1,166 @@
import threading
|
||||
import time
|
||||
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientLocation
|
||||
from hydrus.client import ClientOptions
|
||||
from hydrus.client import ClientThreading
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
from hydrus.client.search import ClientSearchFileSearchContext
|
||||
from hydrus.client.search import ClientSearchTagContext
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageDatabase( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_DATABASE )
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageDatabaseLockOff( HydrusResourceClientAPIRestrictedManageDatabase ):
|
||||
|
||||
BLOCKED_WHEN_BUSY = False
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
try:
|
||||
|
||||
HG.client_busy.release()
|
||||
|
||||
except threading.ThreadError:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The server is not busy!' )
|
||||
|
||||
|
||||
CG.client_controller.db.PauseAndDisconnect( False )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageDatabaseLockOn( HydrusResourceClientAPIRestrictedManageDatabase ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
locked = HG.client_busy.acquire( False ) # pylint: disable=E1111
|
||||
|
||||
if not locked:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The client was already locked!' )
|
||||
|
||||
|
||||
CG.client_controller.db.PauseAndDisconnect( True )
|
||||
|
||||
TIME_BLOCK = 0.25
|
||||
|
||||
for i in range( int( 5 / TIME_BLOCK ) ):
|
||||
|
||||
if not CG.client_controller.db.IsConnected():
|
||||
|
||||
break
|
||||
|
||||
|
||||
time.sleep( TIME_BLOCK )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
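# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Pausing the database before copying the client files/db and resuming afterwards,
# using the lock resources above. The '/manage_database/lock_on' and
# '/manage_database/lock_off' paths are assumptions based on the Client API docs.

import requests

def with_db_locked( api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ):

    headers = { 'Hydrus-Client-API-Access-Key' : access_key }

    requests.post( f'{api_url}/manage_database/lock_on', headers = headers ).raise_for_status()

    try:

        pass # do your backup/copy work here while the db is disconnected

    finally:

        requests.post( f'{api_url}/manage_database/lock_off', headers = headers ).raise_for_status()
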
class HydrusResourceClientAPIRestrictedManageDatabaseMrBones( HydrusResourceClientAPIRestrictedManageDatabase ):

    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

        location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )

        tag_service_key = ClientLocalServerCore.ParseTagServiceKey( request )

        if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and location_context.IsAllKnownFiles():

            raise HydrusExceptions.BadRequestException( 'Sorry, search for all known tags over all known files is not supported!' )

        tag_context = ClientSearchTagContext.TagContext( service_key = tag_service_key )
        predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( request )

        file_search_context = ClientSearchFileSearchContext.FileSearchContext( location_context = location_context, tag_context = tag_context, predicates = predicates )

        job_status = ClientThreading.JobStatus( cancellable = True )

        request.disconnect_callables.append( job_status.Cancel )

        boned_stats = CG.client_controller.Read( 'boned_stats', file_search_context = file_search_context, job_status = job_status )

        body_dict = { 'boned_stats' : boned_stats }

        mime = request.preferred_mime
        body = ClientLocalServerCore.Dumps( body_dict, mime )

        response_context = HydrusServerResources.ResponseContext( 200, mime = mime, body = body )

        return response_context

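# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Fetching the 'Mr Bones' review stats over a tag search, via the resource above. The
# '/manage_database/mr_bones' path and the JSON-encoded 'tags' parameter are
# assumptions based on the Client API docs; the search parameters mirror what
# ParseClientAPISearchPredicates handles server-side.

import json
import requests

def get_boned_stats( tags: list, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ) -> dict:

    response = requests.get(
        f'{api_url}/manage_database/mr_bones',
        params = { 'tags' : json.dumps( tags ) },
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()

    return response.json()[ 'boned_stats' ]
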
class HydrusResourceClientAPIRestrictedManageDatabaseGetClientOptions( HydrusResourceClientAPIRestrictedManageDatabase ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
from hydrus.client import ClientDefaults
|
||||
|
||||
OLD_OPTIONS_DEFAULT = ClientDefaults.GetClientDefaultOptions()
|
||||
|
||||
old_options = CG.client_controller.options
|
||||
|
||||
old_options = { key : value for ( key, value ) in old_options.items() if key in OLD_OPTIONS_DEFAULT }
|
||||
|
||||
new_options: ClientOptions.ClientOptions = CG.client_controller.new_options
|
||||
|
||||
options_dict = {
|
||||
'booleans' : new_options.GetAllBooleans(),
|
||||
'strings' : new_options.GetAllStrings(),
|
||||
'noneable_strings' : new_options.GetAllNoneableStrings(),
|
||||
'integers' : new_options.GetAllIntegers(),
|
||||
'noneable_integers' : new_options.GetAllNoneableIntegers(),
|
||||
'keys' : new_options.GetAllKeysHex(),
|
||||
'colors' : new_options.GetAllColours(),
|
||||
'media_zooms' : new_options.GetMediaZooms(),
|
||||
'slideshow_durations' : new_options.GetSlideshowDurations(),
|
||||
'default_file_import_options' : {
|
||||
'loud' : new_options.GetDefaultFileImportOptions('loud').GetSummary(),
|
||||
'quiet' : new_options.GetDefaultFileImportOptions('quiet').GetSummary()
|
||||
},
|
||||
'default_namespace_sorts' : [ sort.ToDictForAPI() for sort in new_options.GetDefaultNamespaceSorts() ],
|
||||
'default_sort' : new_options.GetDefaultSort().ToDictForAPI(),
|
||||
'default_tag_sort' : new_options.GetDefaultTagSort( CC.TAG_PRESENTATION_SEARCH_PAGE ).ToDictForAPI(),
|
||||
'default_tag_sort_search_page' : new_options.GetDefaultTagSort( CC.TAG_PRESENTATION_SEARCH_PAGE ).ToDictForAPI(),
|
||||
'default_tag_sort_search_page_manage_tags' : new_options.GetDefaultTagSort( CC.TAG_PRESENTATION_SEARCH_PAGE_MANAGE_TAGS ).ToDictForAPI(),
|
||||
'default_tag_sort_media_viewer' : new_options.GetDefaultTagSort( CC.TAG_PRESENTATION_MEDIA_VIEWER ).ToDictForAPI(),
|
||||
'default_tag_sort_media_vewier_manage_tags' : new_options.GetDefaultTagSort( CC.TAG_PRESENTATION_MEDIA_VIEWER_MANAGE_TAGS ).ToDictForAPI(),
|
||||
'fallback_sort' : new_options.GetFallbackSort().ToDictForAPI(),
|
||||
'suggested_tags_favourites' : new_options.GetAllSuggestedTagsFavourites(),
|
||||
'default_local_location_context' : new_options.GetDefaultLocalLocationContext().ToDictForAPI()
|
||||
}
|
||||
|
||||
body_dict = {
|
||||
'old_options' : old_options,
|
||||
'options' : options_dict,
|
||||
'services' : ClientLocalServerCore.GetServicesDict()
|
||||
}
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
|
@@ -0,0 +1,299 @@
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core.networking import HydrusNetworkVariableHandling
|
||||
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client import ClientLocation
|
||||
from hydrus.client.media import ClientMedia
|
||||
from hydrus.client.media import ClientMediaFileFilter
|
||||
from hydrus.client.metadata import ClientContentUpdates
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationships( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRelationships( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
location_context = ClientLocalServerCore.ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
# maybe in future we'll just get the media results and dump the dict from there, but whatever
|
||||
hashes_to_file_duplicates = CG.client_controller.Read( 'file_relationships_for_api', location_context, hashes )
|
||||
|
||||
body_dict = { 'file_relationships' : hashes_to_file_duplicates }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialsCount( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
(
|
||||
file_search_context_1,
|
||||
file_search_context_2,
|
||||
dupe_search_type,
|
||||
pixel_dupes_preference,
|
||||
max_hamming_distance
|
||||
) = ClientLocalServerCore.ParseDuplicateSearch( request )
|
||||
|
||||
count = CG.client_controller.Read( 'potential_duplicates_count', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
body_dict = { 'potential_duplicates_count' : count }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsGetPotentialPairs( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
(
|
||||
file_search_context_1,
|
||||
file_search_context_2,
|
||||
dupe_search_type,
|
||||
pixel_dupes_preference,
|
||||
max_hamming_distance
|
||||
) = ClientLocalServerCore.ParseDuplicateSearch( request )
|
||||
|
||||
max_num_pairs = request.parsed_request_args.GetValue( 'max_num_pairs', int, default_value = CG.client_controller.new_options.GetInteger( 'duplicate_filter_max_batch_size' ) )
|
||||
|
||||
filtering_pairs_media_results = CG.client_controller.Read( 'duplicate_pairs_for_filtering', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance, max_num_pairs = max_num_pairs )
|
||||
|
||||
filtering_pairs_hashes = [ ( m1.GetHash().hex(), m2.GetHash().hex() ) for ( m1, m2 ) in filtering_pairs_media_results ]
|
||||
|
||||
body_dict = { 'potential_duplicate_pairs' : filtering_pairs_hashes }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsGetRandomPotentials( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
(
|
||||
file_search_context_1,
|
||||
file_search_context_2,
|
||||
dupe_search_type,
|
||||
pixel_dupes_preference,
|
||||
max_hamming_distance
|
||||
) = ClientLocalServerCore.ParseDuplicateSearch( request )
|
||||
|
||||
hashes = CG.client_controller.Read( 'random_potential_duplicate_hashes', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
body_dict = { 'random_potential_duplicate_hashes' : [ hash.hex() for hash in hashes ] }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsRemovePotentials( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'remove_potential_pairs', hashes )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsSetKings( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
for hash in hashes:
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'duplicate_set_king', hash )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships( HydrusResourceClientAPIRestrictedManageFileRelationships ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
database_write_rows = []
|
||||
|
||||
raw_rows = []
|
||||
|
||||
# TODO: now I rewangled this to remove the pair_rows parameter, let's get an object or dict bouncing around so we aren't handling a mega-tuple
|
||||
|
||||
raw_relationship_dicts = request.parsed_request_args.GetValue( 'relationships', list, expected_list_type = dict, default_value = [] )
|
||||
|
||||
for raw_relationship_dict in raw_relationship_dicts:
|
||||
|
||||
duplicate_type = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'relationship', int )
|
||||
hash_a_hex = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'hash_a', str )
|
||||
hash_b_hex = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'hash_b', str )
|
||||
do_default_content_merge = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'do_default_content_merge', bool )
|
||||
delete_a = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'delete_a', bool, default_value = False )
|
||||
delete_b = HydrusNetworkVariableHandling.GetValueFromDict( raw_relationship_dict, 'delete_b', bool, default_value = False )
|
||||
|
||||
raw_rows.append( ( duplicate_type, hash_a_hex, hash_b_hex, do_default_content_merge, delete_a, delete_b ) )
|
||||
|
||||
|
||||
allowed_duplicate_types = {
|
||||
HC.DUPLICATE_FALSE_POSITIVE,
|
||||
HC.DUPLICATE_ALTERNATE,
|
||||
HC.DUPLICATE_BETTER,
|
||||
HC.DUPLICATE_WORSE,
|
||||
HC.DUPLICATE_SAME_QUALITY,
|
||||
HC.DUPLICATE_POTENTIAL
|
||||
}
|
||||
|
||||
all_hashes = set()
|
||||
|
||||
# variable type testing
|
||||
for row in raw_rows:
|
||||
|
||||
( duplicate_type, hash_a_hex, hash_b_hex, do_default_content_merge, delete_first, delete_second ) = row
|
||||
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'relationship', duplicate_type, int, allowed_values = allowed_duplicate_types )
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'hash_a', hash_a_hex, str )
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'hash_b', hash_b_hex, str )
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'do_default_content_merge', do_default_content_merge, bool )
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'delete_first', delete_first, bool )
|
||||
HydrusNetworkVariableHandling.TestVariableType( 'delete_second', delete_second, bool )
|
||||
|
||||
try:
|
||||
|
||||
hash_a = bytes.fromhex( hash_a_hex )
|
||||
hash_b = bytes.fromhex( hash_b_hex )
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Sorry, did not understand one of the hashes {} or {}!'.format( hash_a_hex, hash_b_hex ) )
|
||||
|
||||
|
||||
ClientLocalServerCore.CheckHashLength( ( hash_a, hash_b ) )
|
||||
|
||||
all_hashes.update( ( hash_a, hash_b ) )
|
||||
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', all_hashes )
|
||||
|
||||
hashes_to_media_results = { media_result.GetHash() : media_result for media_result in media_results }
|
||||
|
||||
for row in raw_rows:
|
||||
|
||||
( duplicate_type, hash_a_hex, hash_b_hex, do_default_content_merge, delete_first, delete_second ) = row
|
||||
|
||||
hash_a = bytes.fromhex( hash_a_hex )
|
||||
hash_b = bytes.fromhex( hash_b_hex )
|
||||
|
||||
content_update_packages = []
|
||||
|
||||
first_media = ClientMedia.MediaSingleton( hashes_to_media_results[ hash_a ] )
|
||||
second_media = ClientMedia.MediaSingleton( hashes_to_media_results[ hash_b ] )
|
||||
|
||||
file_deletion_reason = 'From Client API (duplicates processing).'
|
||||
|
||||
if do_default_content_merge:
|
||||
|
||||
duplicate_content_merge_options = CG.client_controller.new_options.GetDuplicateContentMergeOptions( duplicate_type )
|
||||
|
||||
content_update_packages.append( duplicate_content_merge_options.ProcessPairIntoContentUpdatePackage( first_media, second_media, file_deletion_reason = file_deletion_reason, delete_first = delete_first, delete_second = delete_second ) )
|
||||
|
||||
elif delete_first or delete_second:
|
||||
|
||||
content_update_package = ClientContentUpdates.ContentUpdatePackage()
|
||||
|
||||
deletee_media = set()
|
||||
|
||||
if delete_first:
|
||||
|
||||
deletee_media.add( first_media )
|
||||
|
||||
|
||||
if delete_second:
|
||||
|
||||
deletee_media.add( second_media )
|
||||
|
||||
|
||||
for media in deletee_media:
|
||||
|
||||
if media.HasDeleteLocked():
|
||||
|
||||
ClientMediaFileFilter.ReportDeleteLockFailures( [ media ] )
|
||||
|
||||
continue
|
||||
|
||||
|
||||
if media.GetLocationsManager().IsTrashed():
|
||||
|
||||
deletee_service_keys = ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, )
|
||||
|
||||
else:
|
||||
|
||||
local_file_service_keys = CG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, ) )
|
||||
|
||||
deletee_service_keys = media.GetLocationsManager().GetCurrent().intersection( local_file_service_keys )
|
||||
|
||||
|
||||
for deletee_service_key in deletee_service_keys:
|
||||
|
||||
content_update = ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, media.GetHashes(), reason = file_deletion_reason )
|
||||
|
||||
content_update_package.AddContentUpdate( deletee_service_key, content_update )
|
||||
|
||||
|
||||
|
||||
content_update_packages.append( content_update_package )
|
||||
|
||||
|
||||
database_write_rows.append( ( duplicate_type, hash_a, hash_b, content_update_packages ) )
|
||||
|
||||
|
||||
if len( database_write_rows ) > 0:
|
||||
|
||||
CG.client_controller.WriteSynchronous( 'duplicate_pair_status', database_write_rows )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
|
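# --- hedged usage sketch (not part of the hydrus source) ---------------------------
# Marking one file as a duplicate of another, with the default duplicate content merge
# and deletion of the worse file, via the SetRelationships resource above. The
# '/manage_file_relationships/set_file_relationships' path is an assumption based on
# the Client API docs; the dict keys mirror those parsed from 'relationships' above,
# and the numeric 'relationship' value must be one of the allowed duplicate types.

import requests

def set_duplicate_relationship( hash_a_hex: str, hash_b_hex: str, relationship: int, api_url: str = 'http://127.0.0.1:45869', access_key: str = 'YOUR_ACCESS_KEY' ):

    body = {
        'relationships' : [
            {
                'relationship' : relationship, # one of the allowed_duplicate_types enum values above
                'hash_a' : hash_a_hex,
                'hash_b' : hash_b_hex,
                'do_default_content_merge' : True,
                'delete_b' : True
            }
        ]
    }

    response = requests.post(
        f'{api_url}/manage_file_relationships/set_file_relationships',
        json = body,
        headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    )

    response.raise_for_status()
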
@@ -0,0 +1,167 @@
from hydrus.core.networking import HydrusServerRequest
|
||||
from hydrus.core.networking import HydrusServerResources
|
||||
|
||||
from hydrus.client import ClientAPI
|
||||
from hydrus.client import ClientGlobals as CG
|
||||
from hydrus.client.networking.api import ClientLocalServerCore
|
||||
from hydrus.client.networking.api import ClientLocalServerResources
|
||||
from hydrus.core import HydrusExceptions
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePages( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES )
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePagesAddFiles( HydrusResourceClientAPIRestrictedManagePages ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
def do_it( page_key, media_results ):
|
||||
|
||||
page = CG.client_controller.gui.GetPageFromPageKey( page_key )
|
||||
|
||||
from hydrus.client.gui.pages import ClientGUIPages
|
||||
|
||||
if page is None:
|
||||
|
||||
raise HydrusExceptions.DataMissing()
|
||||
|
||||
|
||||
if not isinstance( page, ClientGUIPages.Page ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'That page key was not for a normal media page!' )
|
||||
|
||||
|
||||
page.AddMediaResults( media_results )
|
||||
|
||||
|
||||
if 'page_key' not in request.parsed_request_args:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'You need a page key for this request!' )
|
||||
|
||||
|
||||
page_key = request.parsed_request_args.GetValue( 'page_key', bytes )
|
||||
|
||||
hashes = ClientLocalServerCore.ParseHashes( request )
|
||||
|
||||
media_results = CG.client_controller.Read( 'media_results', hashes, sorted = True )
|
||||
|
||||
try:
|
||||
|
||||
CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it, page_key, media_results )
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Could not find that page!' )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePagesFocusPage( HydrusResourceClientAPIRestrictedManagePages ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
def do_it( page_key ):
|
||||
|
||||
return CG.client_controller.gui.ShowPage( page_key )
|
||||
|
||||
|
||||
page_key = request.parsed_request_args.GetValue( 'page_key', bytes )
|
||||
|
||||
try:
|
||||
|
||||
CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it, page_key )
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Could not find that page!' )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePagesGetPages( HydrusResourceClientAPIRestrictedManagePages ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
def do_it():
|
||||
|
||||
return CG.client_controller.gui.GetCurrentSessionPageAPIInfoDict()
|
||||
|
||||
|
||||
page_info_dict = CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it )
|
||||
|
||||
body_dict = { 'pages' : page_info_dict }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( HydrusResourceClientAPIRestrictedManagePages ):
|
||||
|
||||
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
def do_it( page_key, simple ):
|
||||
|
||||
return CG.client_controller.gui.GetPageAPIInfoDict( page_key, simple )
|
||||
|
||||
|
||||
page_key = request.parsed_request_args.GetValue( 'page_key', bytes )
|
||||
|
||||
simple = request.parsed_request_args.GetValue( 'simple', bool, default_value = True )
|
||||
|
||||
page_info_dict = CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it, page_key, simple )
|
||||
|
||||
if page_info_dict is None:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Did not find a page for "{}"!'.format( page_key.hex() ) )
|
||||
|
||||
|
||||
body_dict = { 'page_info' : page_info_dict }
|
||||
|
||||
body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedManagePagesRefreshPage( HydrusResourceClientAPIRestrictedManagePages ):
|
||||
|
||||
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
|
||||
|
||||
def do_it( page_key ):
|
||||
|
||||
return CG.client_controller.gui.RefreshPage( page_key )
|
||||
|
||||
|
||||
page_key = request.parsed_request_args.GetValue( 'page_key', bytes )
|
||||
|
||||
try:
|
||||
|
||||
CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it, page_key )
|
||||
|
||||
except HydrusExceptions.DataMissing as e:
|
||||
|
||||
raise HydrusExceptions.NotFoundException( 'Could not find that page!' )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
|
|
@@ -0,0 +1,361 @@
from hydrus.core import HydrusExceptions
from hydrus.core.networking import HydrusServerRequest
from hydrus.core.networking import HydrusServerResources

from hydrus.client import ClientAPI
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientThreading
from hydrus.client.gui import ClientGUIPopupMessages
from hydrus.client.networking import ClientNetworkingJobs
from hydrus.client.networking.api import ClientLocalServerCore
from hydrus.client.networking.api import ClientLocalServerResources


def JobStatusToDict( job_status: ClientThreading.JobStatus ):
    
    return_dict = {
        'key' : job_status.GetKey().hex(),
        'creation_time' : job_status.GetCreationTime(),
        'status_title' : job_status.GetStatusTitle(),
        'status_text_1' : job_status.GetStatusText( 1 ),
        'status_text_2' : job_status.GetStatusText( 2 ),
        'traceback' : job_status.GetTraceback(),
        'had_error' : job_status.HadError(),
        'is_cancellable' : job_status.IsCancellable(),
        'is_cancelled' : job_status.IsCancelled(),
        'is_done' : job_status.IsDone(),
        'is_pausable' : job_status.IsPausable(),
        'is_paused' : job_status.IsPaused(),
        'nice_string' : job_status.ToString(),
        'popup_gauge_1' : job_status.GetIfHasVariable( 'popup_gauge_1' ),
        'popup_gauge_2' : job_status.GetIfHasVariable( 'popup_gauge_2' ),
        'attached_files_mergable' : job_status.GetIfHasVariable( 'attached_files_mergable' ),
        'api_data' : job_status.GetIfHasVariable( 'api_data' )
    }
    
    files_object = job_status.GetFiles()
    
    if files_object is not None:
        
        ( hashes, label ) = files_object
        
        return_dict[ 'files' ] = {
            'hashes' : [ hash.hex() for hash in hashes ],
            'label' : label
        }
    
    user_callable = job_status.GetUserCallable()
    
    if user_callable is not None:
        return_dict[ 'user_callable_label' ] = user_callable.GetLabel()
    
    network_job: ClientNetworkingJobs.NetworkJob = job_status.GetNetworkJob()
    
    if network_job is not None:
        
        ( status_text, current_speed, bytes_read, bytes_to_read ) = network_job.GetStatus()
        
        network_job_dict = {
            'url' : network_job.GetURL(),
            'waiting_on_connection_error' : network_job.CurrentlyWaitingOnConnectionError(),
            'domain_ok' : network_job.DomainOK(),
            'waiting_on_serverside_bandwidth' : network_job.CurrentlyWaitingOnServersideBandwidth(),
            'no_engine_yet' : network_job.NoEngineYet(),
            'has_error' : network_job.HasError(),
            'total_data_used' : network_job.GetTotalDataUsed(),
            'is_done' : network_job.IsDone(),
            'status_text' : status_text,
            'current_speed' : current_speed,
            'bytes_read' : bytes_read,
            'bytes_to_read' : bytes_to_read
        }
        
        return_dict[ 'network_job' ] = network_job_dict
    
    return { k : v for k, v in return_dict.items() if v is not None }


class HydrusResourceClientAPIRestrictedManagePopups( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
    
    def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
        
        request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_POPUPS )


class HydrusResourceClientAPIRestrictedManagePopupsAddPopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        pausable = request.parsed_request_args.GetValue( 'is_pausable', bool, default_value = False )
        cancellable = request.parsed_request_args.GetValue( 'is_cancellable', bool, default_value = False )
        
        job_status = ClientThreading.JobStatus( pausable = pausable, cancellable = cancellable )
        
        if request.parsed_request_args.GetValue( 'attached_files_mergable', bool, default_value = False ):
            job_status.SetVariable( 'attached_files_mergable', True )
        
        HandlePopupUpdate( job_status, request )
        
        CG.client_controller.pub( 'message', job_status )
        
        body_dict = {
            'job_status' : JobStatusToDict( job_status )
        }
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context


def GetJobStatusFromRequest( request: HydrusServerRequest.HydrusRequest ) -> ClientThreading.JobStatus:
    
    job_status_key = request.parsed_request_args.GetValue( 'job_status_key', bytes )
    
    job_status_queue: ClientGUIPopupMessages.JobStatusPopupQueue = CG.client_controller.job_status_popup_queue
    
    job_status = job_status_queue.GetJobStatus( job_status_key )
    
    if job_status is None:
        raise HydrusExceptions.BadRequestException( 'This job key doesn\'t exist!' )
    
    return job_status


class HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        user_callable = job_status.GetUserCallable()
        
        if user_callable is None:
            raise HydrusExceptions.BadRequestException( 'This job doesn\'t have a user callable!' )
        
        CG.client_controller.CallBlockingToQt( CG.client_controller.gui, user_callable )
        
        response_context = HydrusServerResources.ResponseContext( 200 )
        
        return response_context


class HydrusResourceClientAPIRestrictedManagePopupsCancelPopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        if job_status.IsCancellable():
            job_status.Cancel()
        
        response_context = HydrusServerResources.ResponseContext( 200 )
        
        return response_context


class HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        if job_status.IsDone():
            job_status.FinishAndDismiss()
        
        response_context = HydrusServerResources.ResponseContext( 200 )
        
        return response_context


class HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        job_status.Finish()
        
        response_context = HydrusServerResources.ResponseContext( 200 )
        
        return response_context


class HydrusResourceClientAPIRestrictedManagePopupsFinishAndDismissPopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        seconds = request.parsed_request_args.GetValueOrNone( 'seconds', int )
        
        job_status.FinishAndDismiss( seconds )
        
        response_context = HydrusServerResources.ResponseContext( 200 )
        
        return response_context


class HydrusResourceClientAPIRestrictedManagePopupsGetPopups( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status_queue: ClientGUIPopupMessages.JobStatusPopupQueue = CG.client_controller.job_status_popup_queue
        
        only_in_view = request.parsed_request_args.GetValue( 'only_in_view', bool, default_value = False )
        
        job_statuses = job_status_queue.GetJobStatuses( only_in_view )
        
        body_dict = {
            'job_statuses' : [ JobStatusToDict( job ) for job in job_statuses ]
        }
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context


def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusServerRequest.HydrusRequest ):
    
    def HandleGenericVariable( name: str, type: type ):
        
        if name in request.parsed_request_args:
            
            value = request.parsed_request_args.GetValueOrNone( name, type )
            
            if value is not None:
                job_status.SetVariable( name, value )
            else:
                job_status.DeleteVariable( name )
    
    if 'status_title' in request.parsed_request_args:
        
        status_title = request.parsed_request_args.GetValueOrNone( 'status_title', str )
        
        if status_title is not None:
            job_status.SetStatusTitle( status_title )
        else:
            job_status.DeleteStatusTitle()
    
    if 'status_text_1' in request.parsed_request_args:
        
        status_text = request.parsed_request_args.GetValueOrNone( 'status_text_1', str )
        
        if status_text is not None:
            job_status.SetStatusText( status_text, 1 )
        else:
            job_status.DeleteStatusText()
    
    if 'status_text_2' in request.parsed_request_args:
        
        status_text_2 = request.parsed_request_args.GetValueOrNone( 'status_text_2', str )
        
        if status_text_2 is not None:
            job_status.SetStatusText( status_text_2, 2 )
        else:
            job_status.DeleteStatusText( 2 )
    
    HandleGenericVariable( 'api_data', dict )
    
    for name in [ 'popup_gauge_1', 'popup_gauge_2' ]:
        
        if name in request.parsed_request_args:
            
            value = request.parsed_request_args.GetValueOrNone( name, list, expected_list_type = int )
            
            if value is not None:
                
                if len( value ) != 2:
                    raise HydrusExceptions.BadRequestException( 'The parameter "{}" had an invalid number of items!'.format( name ) )
                
                job_status.SetVariable( name, value )
                
            else:
                job_status.DeleteVariable( name )
    
    files_label = request.parsed_request_args.GetValueOrNone( 'files_label', str )
    
    hashes = ClientLocalServerCore.ParseHashes( request, True )
    
    if hashes is not None:
        
        if len( hashes ) > 0 and files_label is None:
            raise HydrusExceptions.BadRequestException( '"files_label" is required to add files to a popup!' )
        
        job_status.SetFiles( hashes, files_label )


class HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( HydrusResourceClientAPIRestrictedManagePopups ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        job_status = GetJobStatusFromRequest( request )
        
        HandlePopupUpdate( job_status, request )
        
        body_dict = {
            'job_status' : JobStatusToDict( job_status )
        }
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context
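A minimal sketch of creating and then updating a popup from a script. The `/manage_popups/...` route names, port, and access key are assumptions; the parameter names (`status_title`, `status_text_1`, `popup_gauge_1`, `job_status_key`) and the `job_status`/`key` response keys come from the resources and `HandlePopupUpdate` above.

```python
# Illustrative sketch (assumed routes, port, and access key).
import requests

HYDRUS_API = 'http://127.0.0.1:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }

# create a popup; the response contains the serialised job status, including its key
job_status = requests.post(
    f'{HYDRUS_API}/manage_popups/add_popup',
    headers = HEADERS,
    json = { 'status_title' : 'my script', 'status_text_1' : 'starting...', 'is_cancellable' : True }
).json()[ 'job_status' ]

# update the same popup later, using the parameter names HandlePopupUpdate reads
requests.post(
    f'{HYDRUS_API}/manage_popups/update_popup',
    headers = HEADERS,
    json = {
        'job_status_key' : job_status[ 'key' ],
        'status_text_1' : 'halfway there',
        'popup_gauge_1' : [ 50, 100 ]
    }
).raise_for_status()
```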
@@ -0,0 +1,107 @@
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core.networking import HydrusServerRequest
from hydrus.core.networking import HydrusServerResources

from hydrus.client import ClientAPI
from hydrus.client import ClientGlobals as CG
from hydrus.client.networking.api import ClientLocalServerCore
from hydrus.client.networking.api import ClientLocalServerResources


class HydrusResourceClientAPIRestrictedManageServices( ClientLocalServerResources.HydrusResourceClientAPIRestricted ):
    
    pass


class HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs( HydrusResourceClientAPIRestrictedManageServices ):
    
    def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
        
        request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_COMMIT_PENDING )


class HydrusResourceClientAPIRestrictedManageServicesPendingCounts( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
    
    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        info_type_to_str_lookup = {
            HC.SERVICE_INFO_NUM_PENDING_MAPPINGS : 'pending_tag_mappings',
            HC.SERVICE_INFO_NUM_PETITIONED_MAPPINGS : 'petitioned_tag_mappings',
            HC.SERVICE_INFO_NUM_PENDING_TAG_SIBLINGS : 'pending_tag_siblings',
            HC.SERVICE_INFO_NUM_PETITIONED_TAG_SIBLINGS : 'petitioned_tag_siblings',
            HC.SERVICE_INFO_NUM_PENDING_TAG_PARENTS : 'pending_tag_parents',
            HC.SERVICE_INFO_NUM_PETITIONED_TAG_PARENTS : 'petitioned_tag_parents',
            HC.SERVICE_INFO_NUM_PENDING_FILES : 'pending_files',
            HC.SERVICE_INFO_NUM_PETITIONED_FILES : 'petitioned_files',
        }
        
        service_keys_to_info_types_to_counts = CG.client_controller.Read( 'nums_pending' )
        
        body_dict = {
            'pending_counts' : { service_key.hex() : { info_type_to_str_lookup[ info_type ] : count for ( info_type, count ) in info_types_to_counts.items() } for ( service_key, info_types_to_counts ) in service_keys_to_info_types_to_counts.items() },
            'services' : ClientLocalServerCore.GetServicesDict()
        }
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context


class HydrusResourceClientAPIRestrictedManageServicesCommitPending( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        service_key = request.parsed_request_args.GetValue( 'service_key', bytes )
        
        ClientLocalServerCore.CheckUploadableService( service_key )
        
        def do_it():
            
            if CG.client_controller.gui.IsCurrentlyUploadingPending( service_key ):
                raise HydrusExceptions.ConflictException( 'Upload is already running.' )
            
            result = CG.client_controller.gui.UploadPending( service_key )
            
            if not result:
                raise HydrusExceptions.UnprocessableEntity( 'Sorry, could not start for some complex reason--check the client!' )
        
        CG.client_controller.CallBlockingToQt( CG.client_controller.gui, do_it )
        
        body_dict = {}
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context


class HydrusResourceClientAPIRestrictedManageServicesForgetPending( HydrusResourceClientAPIRestrictedManageServicesPendingContentJobs ):
    
    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
        
        service_key = request.parsed_request_args.GetValue( 'service_key', bytes )
        
        ClientLocalServerCore.CheckUploadableService( service_key )
        
        CG.client_controller.WriteSynchronous( 'delete_pending', service_key )
        
        body_dict = {}
        
        body = ClientLocalServerCore.Dumps( body_dict, request.preferred_mime )
        
        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
        
        return response_context
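A minimal sketch of reading the pending counts and committing one service's pending content. The `/manage_services/...` route names, port, and access key are assumptions; the `pending_counts` response key and `service_key` parameter come from the resources above.

```python
# Illustrative sketch (assumed routes, port, and access key).
import requests

HYDRUS_API = 'http://127.0.0.1:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }

# read the counts; values are keyed by service key hex, then by the lookup names above
counts = requests.get( f'{HYDRUS_API}/manage_services/get_pending_counts', headers = HEADERS ).json()

for service_key_hex, info in counts[ 'pending_counts' ].items():
    
    if sum( info.values() ) > 0:
        
        # start an upload for this service; the handler raises a conflict if one is already running
        requests.post(
            f'{HYDRUS_API}/manage_services/commit_pending',
            headers = HEADERS,
            json = { 'service_key' : service_key_hex }
        ).raise_for_status()
```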
@@ -105,8 +105,8 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 590
-CLIENT_API_VERSION = 70
+SOFTWARE_VERSION = 591
+CLIENT_API_VERSION = 71
 
 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1203,7 +1203,7 @@ mime_string_lookup = {
     GENERAL_AUDIO : 'audio',
     GENERAL_IMAGE : 'image',
     GENERAL_VIDEO : 'video',
-    GENERAL_ANIMATION : 'animation',
+    GENERAL_ANIMATION : 'animation'
 }
 
 string_enum_lookup = { s : enum for ( enum, s ) in mime_string_lookup.items() }
@@ -1290,7 +1290,7 @@ mime_mimetype_string_lookup = {
     GENERAL_AUDIO : 'audio',
     GENERAL_IMAGE : 'image',
     GENERAL_VIDEO : 'video',
-    GENERAL_ANIMATION : 'animation',
+    GENERAL_ANIMATION : 'animation'
 }
 
 mime_mimetype_string_lookup[ UNDETERMINED_WM ] = '{} or {}'.format( mime_mimetype_string_lookup[ AUDIO_WMA ], mime_mimetype_string_lookup[ VIDEO_WMV ] )
@@ -9,7 +9,7 @@ class HydrusRequest( Request ):
     
     def __init__( self, *args, **kwargs ):
         
-        Request.__init__( self, *args, **kwargs )
+        super().__init__( *args, **kwargs )
         
         self.start_time = HydrusTime.GetNowPrecise()
         self.parsed_request_args = HydrusNetworkVariableHandling.ParsedRequestArguments()
@@ -33,9 +33,9 @@ from hydrus.client.media import ClientMediaManagers
 from hydrus.client.media import ClientMediaResult
 from hydrus.client.metadata import ClientContentUpdates
 from hydrus.client.metadata import ClientTags
-from hydrus.client.networking import ClientLocalServer
-from hydrus.client.networking import ClientLocalServerResources
 from hydrus.client.networking import ClientNetworkingContexts
+from hydrus.client.networking.api import ClientLocalServer
+from hydrus.client.networking.api import ClientLocalServerCore
 from hydrus.client.search import ClientSearchFileSearchContext
 from hydrus.client.search import ClientSearchPredicate
 from hydrus.client.search import ClientSearchTagContext
@@ -318,16 +318,23 @@ class TestClientAPI( unittest.TestCase ):
         
         # /request_new_permissions
         
-        def format_request_new_permissions_query( name, basic_permissions ):
+        def format_request_new_permissions_query( name, permits_everything, basic_permissions ):
             
-            return '/request_new_permissions?name={}&basic_permissions={}'.format( urllib.parse.quote( name ), urllib.parse.quote( json.dumps( basic_permissions ) ) )
+            if permits_everything:
+                
+                return f'/request_new_permissions?name={urllib.parse.quote( name )}&permits_everything=true'
+                
+            else:
+                
+                return f'/request_new_permissions?name={urllib.parse.quote( name )}&basic_permissions={urllib.parse.quote( json.dumps( basic_permissions ) )}'
+                
             
         
         # fail as dialog not open
         
         ClientAPI.api_request_dialog_open = False
         
-        connection.request( 'GET', format_request_new_permissions_query( 'test', [ ClientAPI.CLIENT_API_PERMISSION_ADD_FILES ] ) )
+        connection.request( 'GET', format_request_new_permissions_query( 'test', False, [ ClientAPI.CLIENT_API_PERMISSION_ADD_FILES ] ) )
         
         response = connection.getresponse()
@@ -343,22 +350,22 @@ class TestClientAPI( unittest.TestCase ):
         
         permissions_to_set_up = []
         
-        permissions_to_set_up.append( ( 'everything', list( ClientAPI.ALLOWED_PERMISSIONS ) ) )
-        permissions_to_set_up.append( ( 'add_files', [ ClientAPI.CLIENT_API_PERMISSION_ADD_FILES ] ) )
-        permissions_to_set_up.append( ( 'add_tags', [ ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS ] ) )
-        permissions_to_set_up.append( ( 'add_urls', [ ClientAPI.CLIENT_API_PERMISSION_ADD_URLS ] ) )
-        permissions_to_set_up.append( ( 'manage_pages', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES ] ) )
-        permissions_to_set_up.append( ( 'manage_headers', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_HEADERS ] ) )
-        permissions_to_set_up.append( ( 'search_all_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
-        permissions_to_set_up.append( ( 'search_green_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
+        permissions_to_set_up.append( ( 'everything', True, [] ) )
+        permissions_to_set_up.append( ( 'add_files', False, [ ClientAPI.CLIENT_API_PERMISSION_ADD_FILES ] ) )
+        permissions_to_set_up.append( ( 'add_tags', False, [ ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS ] ) )
+        permissions_to_set_up.append( ( 'add_urls', False, [ ClientAPI.CLIENT_API_PERMISSION_ADD_URLS ] ) )
+        permissions_to_set_up.append( ( 'manage_pages', False, [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES ] ) )
+        permissions_to_set_up.append( ( 'manage_headers', False, [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_HEADERS ] ) )
+        permissions_to_set_up.append( ( 'search_all_files', False, [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
+        permissions_to_set_up.append( ( 'search_green_files', False, [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
         
         set_up_permissions = {}
         
-        for ( name, basic_permissions ) in permissions_to_set_up:
+        for ( name, permits_everything, basic_permissions ) in permissions_to_set_up:
             
             ClientAPI.api_request_dialog_open = True
             
-            connection.request( 'GET', format_request_new_permissions_query( name, basic_permissions ) )
+            connection.request( 'GET', format_request_new_permissions_query( name, permits_everything, basic_permissions ) )
             
             response = connection.getresponse()
@@ -393,6 +400,15 @@ class TestClientAPI( unittest.TestCase ):
             api_permissions.SetSearchTagFilter( search_tag_filter )
             
+            if 'everything' in name:
+                
+                self.assertTrue( api_permissions.PermitsEverything() )
+                
+            else:
+                
+                self.assertFalse( api_permissions.PermitsEverything() )
+                
+            
             self.assertEqual( bytes.fromhex( access_key_hex ), api_permissions.GetAccessKey() )
             
             set_up_permissions[ name ] = api_permissions
@@ -461,7 +477,18 @@ class TestClientAPI( unittest.TestCase ):
             
             body_dict = json.loads( text )
             
-            self.assertEqual( set( body_dict[ 'basic_permissions' ] ), set( api_permissions.GetBasicPermissions() ) )
             self.assertEqual( body_dict[ 'name' ], api_permissions.GetName() )
+            self.assertEqual( body_dict[ 'permits_everything' ], api_permissions.PermitsEverything() )
+            
+            if api_permissions.PermitsEverything():
+                
+                self.assertEqual( set( body_dict[ 'basic_permissions' ] ), set( ClientAPI.ALLOWED_PERMISSIONS ) )
+                
+            else:
+                
+                self.assertEqual( set( body_dict[ 'basic_permissions' ] ), set( api_permissions.GetBasicPermissions() ) )
+                
+            
             self.assertEqual( body_dict[ 'human_description' ], api_permissions.ToHumanString() )
@@ -4448,7 +4475,7 @@ class TestClientAPI( unittest.TestCase ):
         test_tags_1 = [ 'skirt', 'system:width<400' ]
         
         test_tag_context_1 = ClientSearchTagContext.TagContext( test_tag_service_key_1 )
-        test_predicates_1 = ClientLocalServerResources.ConvertTagListToPredicates( None, test_tags_1, do_permission_check = False )
+        test_predicates_1 = ClientLocalServerCore.ConvertTagListToPredicates( None, test_tags_1, do_permission_check = False )
         
         test_file_search_context_1 = ClientSearchFileSearchContext.FileSearchContext( location_context = default_location_context, tag_context = test_tag_context_1, predicates = test_predicates_1 )
@@ -4456,7 +4483,7 @@ class TestClientAPI( unittest.TestCase ):
         test_tags_2 = [ 'system:untagged' ]
         
         test_tag_context_2 = ClientSearchTagContext.TagContext( test_tag_service_key_2 )
-        test_predicates_2 = ClientLocalServerResources.ConvertTagListToPredicates( None, test_tags_2, do_permission_check = False )
+        test_predicates_2 = ClientLocalServerCore.ConvertTagListToPredicates( None, test_tags_2, do_permission_check = False )
         
         test_file_search_context_2 = ClientSearchFileSearchContext.FileSearchContext( location_context = default_location_context, tag_context = test_tag_context_2, predicates = test_predicates_2 )
@@ -5659,7 +5686,7 @@ class TestClientAPI( unittest.TestCase ):
         pretend_request.parsed_request_args = {}
         pretend_request.client_api_permissions = set_up_permissions[ 'everything' ]
         
-        predicates = ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+        predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
         
         self.assertEqual( predicates, [] )
@@ -5672,7 +5699,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.InsufficientCredentialsException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
             
         
         #
@@ -5684,7 +5711,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.InsufficientCredentialsException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
             
         
         #
@@ -5696,7 +5723,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.InsufficientCredentialsException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
             
         
         #
@@ -5706,7 +5733,7 @@ class TestClientAPI( unittest.TestCase ):
         pretend_request.parsed_request_args = { 'tags' : [ 'green', '-kino' ] }
         pretend_request.client_api_permissions = set_up_permissions[ 'search_green_files' ]
         
-        predicates = ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+        predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
         
         expected_predicates = []
@@ -5722,7 +5749,7 @@ class TestClientAPI( unittest.TestCase ):
         pretend_request.parsed_request_args = { 'tags' : [ 'green', 'system:archive' ] }
         pretend_request.client_api_permissions = set_up_permissions[ 'search_green_files' ]
         
-        predicates = ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+        predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
         
         expected_predicates = []
@@ -5738,7 +5765,7 @@ class TestClientAPI( unittest.TestCase ):
         pretend_request.parsed_request_args = { 'tags' : [ 'green', [ 'red', 'blue' ], 'system:archive' ] }
         pretend_request.client_api_permissions = set_up_permissions[ 'search_green_files' ]
         
-        predicates = ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+        predicates = ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
         
         expected_predicates = []
@@ -5770,7 +5797,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.BadRequestException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
             
         
         # bad negated
@@ -5782,7 +5809,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.BadRequestException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
             
         
         # bad system pred
@@ -5794,7 +5821,7 @@ class TestClientAPI( unittest.TestCase ):
         
         with self.assertRaises( HydrusExceptions.BadRequestException ):
             
-            ClientLocalServerResources.ParseClientAPISearchPredicates( pretend_request )
+            ClientLocalServerCore.ParseClientAPISearchPredicates( pretend_request )
@@ -6688,7 +6715,7 @@ class TestClientAPI( unittest.TestCase ):
         
         times_manager = ClientMediaManagers.TimesManager()
         
-        locations_manager = ClientMediaManagers.LocationsManager( set(), set(), set(), set(), times_manager )
+        locations_manager = ClientMediaManagers.LocationsManager( { CC.COMBINED_LOCAL_FILE_SERVICE_KEY, CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY }, set(), set(), set(), times_manager )
         ratings_manager = ClientMediaManagers.RatingsManager( {} )
         notes_manager = ClientMediaManagers.NotesManager( {} )
         file_viewing_stats_manager = ClientMediaManagers.FileViewingStatsManager.STATICGenerateEmptyManager( times_manager )
@@ -6948,6 +6975,42 @@ class TestClientAPI( unittest.TestCase ):
         
         self.assertEqual( hashlib.sha256( data ).digest(), thumb_hash )
         
+        # file path
+        
+        path = '/get_files/file_path?hash={}'.format( hash_hex )
+        
+        connection.request( 'GET', path, headers = headers )
+        
+        response = connection.getresponse()
+        
+        data = response.read()
+        
+        text = str( data, 'utf-8' )
+        
+        d = json.loads( text )
+        
+        self.assertEqual( response.status, 200 )
+        
+        self.assertEqual( d[ 'path' ], os.path.join( HG.test_controller.db_dir, 'client_files', f'f{hash_hex[:2]}', f'{hash_hex}.png' ) )
+        
+        # thumbnail path
+        
+        path = '/get_files/thumbnail_path?hash={}'.format( hash_hex )
+        
+        connection.request( 'GET', path, headers = headers )
+        
+        response = connection.getresponse()
+        
+        data = response.read()
+        
+        text = str( data, 'utf-8' )
+        
+        d = json.loads( text )
+        
+        self.assertEqual( response.status, 200 )
+        
+        self.assertEqual( d[ 'path' ], os.path.join( HG.test_controller.db_dir, 'client_files', f't{hash_hex[:2]}', f'{hash_hex}.thumbnail' ) )
+        
         # with "sha256:" on the front
         
         path = '/get_files/thumbnail?hash={}{}'.format( 'sha256:', hash_hex )
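Outside the test harness, the same two calls look roughly like the sketch below. The `/get_files/file_path` and `/get_files/thumbnail_path` paths and the `path` response key come from the test above; the port, access key, and example hash are placeholders, and the key used must carry the new 'see local paths' permission.

```python
# Illustrative sketch (assumed port, access key, and example hash).
import requests

HYDRUS_API = 'http://127.0.0.1:45869'  # assumed default Client API address
HEADERS = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }

hash_hex = 'ad6d3599a6c489a575eb19c026face97a9cd6579e74728b0ce94a601d232f3c3'  # example sha256

# ask for the file's path on disk
file_path = requests.get(
    f'{HYDRUS_API}/get_files/file_path',
    params = { 'hash' : hash_hex },
    headers = HEADERS
).json()[ 'path' ]

# ask for the thumbnail's path on disk
thumb_path = requests.get(
    f'{HYDRUS_API}/get_files/thumbnail_path',
    params = { 'hash' : hash_hex },
    headers = HEADERS
).json()[ 'path' ]

print( file_path, thumb_path )
```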
@@ -90,6 +90,26 @@ pair_types_to_pools = {}
 pair_types_to_pools[ HC.CONTENT_TYPE_TAG_PARENTS ] = ( current_parents_pool, pending_parents_pool, to_be_pended_parents_pool, deleted_parents_pool )
 pair_types_to_pools[ HC.CONTENT_TYPE_TAG_SIBLINGS ] = ( current_siblings_pool, pending_siblings_pool, to_be_pended_siblings_pool, deleted_siblings_pool )
 
+count_filter_pairs = {}
+
+count_filter_pairs[ HC.CONTENT_TYPE_TAG_SIBLINGS ] = [
+    ( 'has_count_a_ideal_no', 'no_count_b' ),
+    ( 'no_count_c_ideal_yes', 'has_count_d' ),
+    ( 'no_count_e_ideal_yes', 'has_count_f' ),
+    ( 'has_count_g_ideal_yes', 'has_count_h' ),
+    ( 'has_count_aa_ideal_yes', 'no_count_bb_ideal_yes' ),
+    ( 'no_count_bb_ideal_yes', 'has_count_cc' ),
+    ( 'has_count_dd_ideal_no', 'has_count_ee_ideal_no' ),
+    ( 'has_count_ee_ideal_no', 'no_count_ff' ),
+]
+
+count_filter_pairs[ HC.CONTENT_TYPE_TAG_PARENTS ] = [
+    ( 'has_count_a', 'no_count_b' ),
+    ( 'no_count_c', 'has_count_d' ),
+    ( 'no_count_e', 'has_count_f' ),
+    ( 'has_count_g', 'has_count_h' )
+]
+
 class TestMigration( unittest.TestCase ):
     
     @classmethod
@@ -827,9 +847,13 @@ class TestMigration( unittest.TestCase ):
         test_filters.append( ( free_filter, namespace_filter ) )
         test_filters.append( ( namespace_filter, namespace_filter ) )
         
+        left_side_needs_count = False
+        right_side_needs_count = False
+        needs_count_service_key = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY
+        
         for ( left_tag_filter, right_tag_filter ) in test_filters:
             
-            source = ClientMigration.MigrationSourceHTPA( self, htpa_path, left_tag_filter, right_tag_filter )
+            source = ClientMigration.MigrationSourceHTPA( self, htpa_path, content_type, left_tag_filter, right_tag_filter, left_side_needs_count, right_side_needs_count, needs_count_service_key )
             
             expected_data = [ ( left_tag, right_tag ) for ( left_tag, right_tag ) in current if left_tag_filter.TagOK( left_tag ) and right_tag_filter.TagOK( right_tag ) ]
@@ -916,11 +940,15 @@ class TestMigration( unittest.TestCase ):
         test_filters.append( ( free_filter, namespace_filter ) )
         test_filters.append( ( namespace_filter, namespace_filter ) )
         
+        left_side_needs_count = False
+        right_side_needs_count = False
+        needs_count_service_key = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY
+        
         for ( left_tag_filter, right_tag_filter ) in test_filters:
             
             for ( service_key, content_lists, content_statuses ) in content_source_tests:
                 
-                source = ClientMigration.MigrationSourceTagServicePairs( self, service_key, content_type, left_tag_filter, right_tag_filter, content_statuses )
+                source = ClientMigration.MigrationSourceTagServicePairs( self, service_key, content_type, left_tag_filter, right_tag_filter, content_statuses, left_side_needs_count, right_side_needs_count, needs_count_service_key )
                 
                 expected_data = set()
@@ -1002,10 +1030,181 @@ class TestMigration( unittest.TestCase ):
         
     
-    def test_migration( self ):
+    def _add_count_filter_pairs_to_services( self, content_type ):
+        
+        content_updates = []
+        
+        for pair in count_filter_pairs[ content_type ]:
+            content_updates.append( ClientContentUpdates.ContentUpdate( content_type, HC.CONTENT_UPDATE_ADD, pair ) )
+        
+        content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, content_updates )
+        
+        self.WriteSynchronous( 'content_updates', content_update_package )
+    
+    def _add_count_filter_mappings_to_services( self, content_type ):
+        
+        content_updates = []
+        
+        for ( a, b ) in count_filter_pairs[ content_type ]:
+            
+            for tag in ( a, b ):
+                
+                if 'has_count' in tag:
+                    content_updates.append( ClientContentUpdates.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( tag, ( os.urandom( 32 ), ) ) ) )
+        
+        content_update_package = ClientContentUpdates.ContentUpdatePackage.STATICCreateFromContentUpdates( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, content_updates )
+        
+        self.WriteSynchronous( 'content_updates', content_update_package )
+    
+    def _test_pairs_htpa_to_list_count_filter( self, content_type ):
+        
+        def run_test( source, expected_data ):
+            
+            destination = ClientMigration.MigrationDestinationListPairs( self )
+            
+            job = ClientMigration.MigrationJob( self, 'test', source, destination )
+            
+            job.Run()
+            
+            self.assertEqual( set( destination.GetDataReceived() ), set( expected_data ) )
+        
+        htpa_path = os.path.join( TestController.DB_DIR, 'htpa.db' )
+        
+        htpa = HydrusTagArchive.HydrusTagPairArchive( htpa_path )
+        
+        if content_type == HC.CONTENT_TYPE_TAG_PARENTS:
+            htpa.SetPairType( HydrusTagArchive.TAG_PAIR_TYPE_PARENTS )
+        elif content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:
+            htpa.SetPairType( HydrusTagArchive.TAG_PAIR_TYPE_SIBLINGS )
+        
+        htpa.BeginBigJob()
+        
+        htpa.AddPairs( count_filter_pairs[ content_type ] )
+        
+        htpa.CommitBigJob()
+        
+        htpa.Optimise()
+        
+        htpa.Close()
+        
+        del htpa
+        
+        #
+        
+        repo_service_key = list( self._test_tag_repo_service_keys.values() )[0]
+        
+        # test
+        
+        left_tag_filter = HydrusTags.TagFilter()
+        right_tag_filter = HydrusTags.TagFilter()
+        
+        for left_side_needs_count in ( False, True ):
+            
+            for right_side_needs_count in ( False, True ):
+                
+                for needs_count_service_key in ( repo_service_key, CC.DEFAULT_LOCAL_TAG_SERVICE_KEY ):
+                    
+                    source = ClientMigration.MigrationSourceHTPA( self, htpa_path, content_type, left_tag_filter, right_tag_filter, left_side_needs_count, right_side_needs_count, needs_count_service_key )
+                    
+                    if needs_count_service_key == repo_service_key:
+                        
+                        expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if not left_side_needs_count and not right_side_needs_count ]
+                        
+                    else:
+                        
+                        if content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:
+                            expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if ( not left_side_needs_count or 'has_count' in a ) and ( not right_side_needs_count or 'ideal_yes' in a ) ]
+                        else:
+                            expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if ( not left_side_needs_count or 'has_count' in a ) and ( not right_side_needs_count or 'has_count' in b ) ]
+                    
+                    run_test( source, expected_data )
+        
+        #
+        
+        os.remove( htpa_path )
+    
+    def _test_pairs_service_to_list_count_filter( self, content_type ):
+        
+        def run_test( source, expected_data ):
+            
+            destination = ClientMigration.MigrationDestinationListPairs( self )
+            
+            job = ClientMigration.MigrationJob( self, 'test', source, destination )
+            
+            job.Run()
+            
+            self.assertEqual( set( destination.GetDataReceived() ), set( expected_data ) )
+        
+        # test filters and content statuses
+        
+        repo_service_key = list( self._test_tag_repo_service_keys.values() )[0]
+        
+        # test
+        
+        content_statuses = ( HC.CONTENT_STATUS_CURRENT, )
+        
+        left_tag_filter = HydrusTags.TagFilter()
+        right_tag_filter = HydrusTags.TagFilter()
+        
+        for left_side_needs_count in ( False, True ):
+            
+            for right_side_needs_count in ( False, True ):
+                
+                for needs_count_service_key in ( repo_service_key, CC.DEFAULT_LOCAL_TAG_SERVICE_KEY ):
+                    
+                    source = ClientMigration.MigrationSourceTagServicePairs( self, CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, content_type, left_tag_filter, right_tag_filter, content_statuses, left_side_needs_count, right_side_needs_count, needs_count_service_key )
+                    
+                    if needs_count_service_key == repo_service_key:
+                        
+                        expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if not left_side_needs_count and not right_side_needs_count ]
+                        
+                    else:
+                        
+                        if content_type == HC.CONTENT_TYPE_TAG_SIBLINGS:
+                            expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if ( not left_side_needs_count or 'has_count' in a ) and ( not right_side_needs_count or 'ideal_yes' in a ) ]
+                        else:
+                            expected_data = [ ( a, b ) for ( a, b ) in count_filter_pairs[ content_type ] if ( not left_side_needs_count or 'has_count' in a ) and ( not right_side_needs_count or 'has_count' in b ) ]
+                    
+                    run_test( source, expected_data )
+    
+    def test_migration_mappings( self ):
         
         # mappings
         
         self._clear_db()
         
         self._set_up_services()
         self._do_fake_imports()
         self._add_mappings_to_services()
@@ -1016,14 +1215,56 @@ class TestMigration( unittest.TestCase ):
         self._test_mappings_service_to_list()
         self._test_mappings_list_to_service()
         
-        for content_type in ( HC.CONTENT_TYPE_TAG_PARENTS, HC.CONTENT_TYPE_TAG_SIBLINGS ):
-            
-            self._add_pairs_to_services( content_type )
-            self._test_pairs_list_to_list( content_type )
-            self._test_pairs_htpa_to_list( content_type )
-            self._test_pairs_list_to_htpa( content_type )
-            self._test_pairs_service_to_list( content_type )
-            self._test_pairs_list_to_service( content_type )
-            
+    
+    def test_migration_parents( self ):
+        
+        self._clear_db()
+        
+        self._set_up_services()
+        
+        self._add_pairs_to_services( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_list_to_list( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_htpa_to_list( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_list_to_htpa( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_service_to_list( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_list_to_service( HC.CONTENT_TYPE_TAG_PARENTS )
+    
+    def test_migration_parents_count_filter( self ):
+        
+        self._clear_db()
+        
+        self._set_up_services()
+        
+        self._add_count_filter_pairs_to_services( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._add_count_filter_mappings_to_services( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_htpa_to_list_count_filter( HC.CONTENT_TYPE_TAG_PARENTS )
+        self._test_pairs_service_to_list_count_filter( HC.CONTENT_TYPE_TAG_PARENTS )
+    
+    def test_migration_siblings( self ):
+        
+        self._clear_db()
+        
+        self._set_up_services()
+        
+        self._add_pairs_to_services( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_list_to_list( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_htpa_to_list( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_list_to_htpa( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_service_to_list( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_list_to_service( HC.CONTENT_TYPE_TAG_SIBLINGS )
+    
+    def test_migration_siblings_filter( self ):
+        
+        self._clear_db()
+        
+        self._set_up_services()
+        
+        self._add_count_filter_pairs_to_services( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._add_count_filter_mappings_to_services( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_htpa_to_list_count_filter( HC.CONTENT_TYPE_TAG_SIBLINGS )
+        self._test_pairs_service_to_list_count_filter( HC.CONTENT_TYPE_TAG_SIBLINGS )
@@ -886,10 +886,7 @@ class Controller( object ):
         ]
         
-        module_lookup[ 'metadata_migration' ] = [
-            TestClientMetadataMigration
-        ]
-        
         module_lookup[ 'migration' ] = [
+            TestClientMetadataMigration,
             TestClientMigration
         ]
@@ -23,7 +23,6 @@ from hydrus.client import ClientGlobals as CG
 from hydrus.client import ClientServices
 from hydrus.client.media import ClientMediaManagers
 from hydrus.client.media import ClientMediaResult
-from hydrus.client.networking import ClientLocalServer
 
 from hydrus.server import ServerFiles
 from hydrus.server.networking import ServerServer