Version 536
This commit is contained in:
parent fc8d7d2428
commit 50e5482740
@@ -0,0 +1,12 @@
# EditorConfig is awesome: https://EditorConfig.org

# top-most EditorConfig file
root = true

[*.py]
charset = utf-8
end_of_line = lf
indent_style = space
indent_size = 4
insert_final_newline = true
trim_trailing_whitespace = false
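The new .editorconfig above is plain configuration, but its Python rules are easy to sanity-check mechanically. A minimal sketch (not part of the commit; the rule set is hard-coded here for illustration rather than parsed from the file):

```python
# Check a Python source string against the .editorconfig rules above:
# end_of_line = lf, insert_final_newline = true, indent_style = space.
# (indent_size is not checked here, to keep the sketch short.)

def check_editorconfig_rules( text: str ):
    
    problems = []
    
    # end_of_line = lf
    if '\r' in text:
        
        problems.append( 'found CR line endings' )
        
    
    # insert_final_newline = true
    if len( text ) > 0 and not text.endswith( '\n' ):
        
        problems.append( 'missing final newline' )
        
    
    # indent_style = space
    for ( i, line ) in enumerate( text.split( '\n' ), start = 1 ):
        
        if line.startswith( '\t' ):
            
            problems.append( f'line {i}: tab indentation' )
            
        
    
    return problems

print( check_editorconfig_rules( 'def f():\n\treturn 1' ) )
```

A real checker would read the `[*.py]` section from the file itself; this just mirrors the committed values.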
@@ -7,6 +7,40 @@ title: Changelog
!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 536](https://github.com/hydrusnetwork/hydrus/releases/tag/v536)

### more new filetypes

* thanks to a user, we have XCF and gzip filetype support!
* I rejiggered the new SVG support so there is a firmer server/client split. the new tech needs Qt, which broke the headless Docker server last week at the last minute--now the server has some sensible stubs that safely revert to the default svg thumb and give unknown resolution, and the client patches in full support dynamically
* the new SVG code now supports the 'scale to fill' thumbnail option

### misc

* I fixed the issue that was causing tags to stay in the tag autocomplete lookup despite going to 0 count. it should not happen for new cases, and **on update, a database routine will run to remove all your existing orphans. if you have ever synced with the PTR, it will take several minutes to run!**
* sending the command to set a file as the best in its duplicate group now presents a yes/no dialog to confirm
* hitting the shortcut for 'set the focused file as better than the other(s)' when you only have one file now asks if you just want to set that file as the best of its group
* fixed an erroneous 'cannot show the best quality file of this file's group here' label in the file relationships menu--a count was off
* fixed the 'set up a hydrus.desktop file' setup script to point to the new hydrus_client.sh startup script name
* thanks to a user, a situation where certain unhandled URLs that deliver JSON were being parsed as mpegs by ffmpeg and causing a weird loop is now caught and stopped. more investigation is needed to fix it properly

### boring stuff

* when a problem or file maintenance job causes a new file maintenance job to be queued (e.g. if the client in a metadata scan discovers the resolution of a file was not as expected, let's say it now recognises EXIF rotation, and starts a secondary thumbnail regen job), it now wakes the file maintenance manager immediately, which should help clear out and update for these jobs quickly when you are looking at the problem thumbnails
* if you have an image type set to show as an 'open externally' button in the media viewer, then it is now no longer prefetched in the rendering system!
* I added a very simple .editorconfig file for the project. since we have a variety of weird files in the directory tree, I've made it cautious and python-specific to start with. we'll expand as needed
* I moved the similar files search tree and maintenance tracker from client.caches.db to client.db. while the former table is regeneratable, it isn't a cache or precomputation store, _per se_, so I finally agreed to move it to the main db. if you have a giganto database, it may take an extra minute to update
* added a 'requirements_server.txt' to the advanced requirements.txts directory, just for future reference, and trimmed the Server Dockerfile down to reflect it

### client api

* thanks to a user, fixed a really stupid typo in the Client API when sending the 'file_id' parameter to set the file
* wrote unit tests for file_id and file_ids parameters to stop this sort of mistake in future
* if you attempt to delete a file over the Client API when one of the given files is delete-locked (this is an advanced option that stops deletion of any archived file), the request now returns a 409 Conflict response, saying which hashes were bad, and does not delete anything
* wrote a unit test to catch the new delete lock behaviour
* deleted the old-and-deprecated-in-one-week 'pair_rows' parameter-handling code in the set_file_relationships command
* the client api version is now 49

## [Version 535](https://github.com/hydrusnetwork/hydrus/releases/tag/v535)

### misc
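For Client API consumers, the new 409 Conflict described under 'client api' carries the offending hashes in its body: a sentence, a blank line, then one hash per line. That shape is inferred from this commit's server code and could change, so treat this client-side parsing sketch as a hedged illustration:

```python
def extract_delete_locked_hashes( body: str ):
    
    # Expected shape (per this commit): a 'Sorry, some of the files ... Their hashes are:'
    # sentence, two newlines, then one hex hash per line.
    parts = body.split( '\n\n', 1 )
    
    if len( parts ) < 2:
        
        return []
        
    
    return [ line.strip() for line in parts[1].splitlines() if line.strip() ]

example_body = 'Sorry, some of the files you selected are currently delete locked. Their hashes are:\n\nabc123\ndef456'

print( extract_delete_locked_hashes( example_body ) )
```

A real caller would only attempt this parse when the HTTP status is 409 and would fall back gracefully if the body shape differs.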
@@ -343,35 +377,3 @@ title: Changelog
### misc

* if twisted fails to load, its exact error is saved, and if you try to launch a server, that error is printed to the log along with the notification popup

## [Version 526](https://github.com/hydrusnetwork/hydrus/releases/tag/v526)

### there will be an important update next week

* next week's release will have two important program changes--I will integrate an OpenCV update, which will require 'extract' users to perform a clean install, and the executables are finally changing from 'client' and 'server' to 'hydrus_client' and 'hydrus_server'! be prepared to update your shortcuts and launch scripts

### time

* fixed a stupid logical bug in my new date code, which was throwing errors on system:time predicates that had a month value equal to the current month (e.g. 'x years, 5 months' during May)--sorry! (issue #1362)
* when a subscription dies, the popup note about it now says the death velocity period in the neat '180 days' format, as you set in the UI, rather than converting to a date and stating the number of months and days using the recent calendar calculation updates
* I unified some more 'xxxified date' UI labels to be 'xxxified time'. we're generally moving to the latter format as the ideal while still accepting various combinations for system parsing input

### shortcuts

* added 'media play-pause/previous/next' and 'volume up/down/mute' key recognition to the shortcut system. if your keyboard/headphones have media keys, they _should_ be mappable now. note, however, that, at least on Windows, while these capture in the hydrus UI, they seem to have global OS-level hooks, and as far as I can tell Qt can't stop that event propagating, so these may have limited effectiveness if you also have an mp3 player open, since Windows will also send the 'next' call to that etc... it may be there is a nice way to properly register a Qt app as a media thing for Windows to global-hook these events to, but I'm not sure!
* also added 'mouse task button' to the mappable buttons. this is apparently a common Mouse6 mapping, so if you have it, knock yourself out
* the code in the shortcut system that tries to detect and merge many small scroll wheel events (such as the emulated scroll that a trackpad may generate) now applies to all mouse devices, not just synthesised events. with luck, this will mean that mice that generate like 15 smoothscroll events of one degree instead of one of fifteen degrees for every wheel tick will no longer spam-navigate the media viewer

### misc

* to save you typing/pasting time, the 'enter your reason' prompts in manage tags, tag siblings, and tag parents now remember the last five custom reasons you enter! you can change the number saved using the new option under _options->tags_, including setting it to 0 to disable the system
* fixed pasting tags in the manage tags dialog when the number of tags you are pasting is larger than the number of allowed 'recent tags'. previously it was saying 'did not understand what was in the clipboard', so hooray for the new error reporting
* every multi-column list in the program now has a 'reset column widths' item in its header right-click menu! when these reset events happen, the respective lists also resize themselves immediately, no restart required
* when you set 'try again' on an import object, it now clears all saved hashes from the import object (including the SHA256, which may have been linked from the database in an 'already in db'/'previously deleted' result). this ensures the next attempt is not poisoned by these hashes (which can happen for various reasons). basically, 'try again' resets better now (issue #1353)

### some build stuff

* the main build script now only uses Node16 sub-Actions (Node12 support is deprecated and being dropped in June)
* the main build script no longer uses set-output commands (these are deprecated and being dropped later in the year I think, in favour of some ENV stuff)
* tidied some cruft from the main build script
* I moved the 'new' python-mpv in the requirements.txts from 1.0.1 to 1.0.3. source users might like to rebuild their venvs again, particularly Windows users who updated to the new mpv dll recently
@@ -34,6 +34,35 @@
	<div class="content">
		<h1 id="changelog"><a href="#changelog">changelog</a></h1>
		<ul>
			<li>
				<h2 id="version_536"><a href="#version_536">version 536</a></h2>
				<ul>
					<li><h3>more new filetypes</h3></li>
					<li>thanks to a user, we have XCF and gzip filetype support!</li>
					<li>I rejiggered the new SVG support so there is a firmer server/client split. the new tech needs Qt, which broke the headless Docker server last week at the last minute--now the server has some sensible stubs that safely revert to the default svg thumb and give unknown resolution, and the client patches in full support dynamically</li>
					<li>the new SVG code now supports the 'scale to fill' thumbnail option</li>
					<li><h3>misc</h3></li>
					<li>I fixed the issue that was causing tags to stay in the tag autocomplete lookup despite going to 0 count. it should not happen for new cases, and **on update, a database routine will run to remove all your existing orphans. if you have ever synced with the PTR, it will take several minutes to run!**</li>
					<li>sending the command to set a file as the best in its duplicate group now presents a yes/no dialog to confirm</li>
					<li>hitting the shortcut for 'set the focused file as better than the other(s)' when you only have one file now asks if you just want to set that file as the best of its group</li>
					<li>fixed an erroneous 'cannot show the best quality file of this file's group here' label in the file relationships menu--a count was off</li>
					<li>fixed the 'set up a hydrus.desktop file' setup script to point to the new hydrus_client.sh startup script name</li>
					<li>thanks to a user, a situation where certain unhandled URLs that deliver JSON were being parsed as mpegs by ffmpeg and causing a weird loop is now caught and stopped. more investigation is needed to fix it properly</li>
					<li><h3>boring stuff</h3></li>
					<li>when a problem or file maintenance job causes a new file maintenance job to be queued (e.g. if the client in a metadata scan discovers the resolution of a file was not as expected, let's say it now recognises EXIF rotation, and starts a secondary thumbnail regen job), it now wakes the file maintenance manager immediately, which should help clear out and update for these jobs quickly when you are looking at the problem thumbnails</li>
					<li>if you have an image type set to show as an 'open externally' button in the media viewer, then it is now no longer prefetched in the rendering system!</li>
					<li>I added a very simple .editorconfig file for the project. since we have a variety of weird files in the directory tree, I've made it cautious and python-specific to start with. we'll expand as needed</li>
					<li>I moved the similar files search tree and maintenance tracker from client.caches.db to client.db. while the former table is regeneratable, it isn't a cache or precomputation store, _per se_, so I finally agreed to move it to the main db. if you have a giganto database, it may take an extra minute to update</li>
					<li>added a 'requirements_server.txt' to the advanced requirements.txts directory, just for future reference, and trimmed the Server Dockerfile down to reflect it</li>
					<li><h3>client api</h3></li>
					<li>thanks to a user, fixed a really stupid typo in the Client API when sending the 'file_id' parameter to set the file</li>
					<li>wrote unit tests for file_id and file_ids parameters to stop this sort of mistake in future</li>
					<li>if you attempt to delete a file over the Client API when one of the given files is delete-locked (this is an advanced option that stops deletion of any archived file), the request now returns a 409 Conflict response, saying which hashes were bad, and does not delete anything</li>
					<li>wrote a unit test to catch the new delete lock behaviour</li>
					<li>deleted the old-and-deprecated-in-one-week 'pair_rows' parameter-handling code in the set_file_relationships command</li>
					<li>the client api version is now 49</li>
				</ul>
			</li>
			<li>
				<h2 id="version_535"><a href="#version_535">version 535</a></h2>
				<ul>
@@ -302,6 +302,7 @@ def ToHumanBytes( size ):
    return HydrusData.BaseToHumanBytes( size, sig_figs = sig_figs )
    

HydrusData.ToHumanBytes = ToHumanBytes

class Booru( HydrusData.HydrusYAMLBase ):
@@ -21,6 +21,7 @@ from hydrus.core.networking import HydrusNetworking
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientImageHandling
from hydrus.client import ClientPaths
from hydrus.client import ClientSVGHandling # important to keep this in, despite not being used, since there's initialisation stuff in here
from hydrus.client import ClientThreading
from hydrus.client import ClientTime
from hydrus.client.gui import QtPorting as QP
@@ -2794,3 +2795,8 @@ class FilesMaintenanceManager( object ):
        self._controller.CallToThreadLongRunning( self.MainLoopBackgroundWork )
        
    
    def Wake( self ):
        
        self._wake_background_event.set()
        
    
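The new Wake() above is the standard threading.Event wake-up pattern: a worker sleeps on the event with a timeout, and any caller can cut that sleep short. A generic, self-contained sketch (class and method names here are illustrative, not hydrus's):

```python
import threading

class SleepyWorker( object ):
    
    def __init__( self ):
        
        self._wake_event = threading.Event()
        
    
    def Wake( self ):
        
        # setting the event releases any thread blocked in WaitABit immediately
        self._wake_event.set()
        
    
    def WaitABit( self, timeout = 30.0 ):
        
        # returns True if woken explicitly, False if the timeout elapsed
        woke = self._wake_event.wait( timeout )
        
        # clear so the next wait sleeps again
        self._wake_event.clear()
        
        return woke
        
    

worker = SleepyWorker()

worker.Wake()

print( worker.WaitABit( timeout = 0.1 ) )
```

The benefit over a plain time.sleep loop is exactly what the changelog describes: a newly queued job can wake the maintenance manager immediately instead of waiting out the remainder of its sleep.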
@@ -0,0 +1,96 @@
import typing

from qtpy import QtSvg
from qtpy import QtGui as QG
from qtpy import QtCore as QC

from hydrus.core import HydrusExceptions
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusSVGHandling

from hydrus.client.gui import ClientGUIFunctions

def LoadSVGRenderer( path: str ):
    
    renderer = QtSvg.QSvgRenderer()
    
    try:
        
        renderer.load( path )
        
    except:
        
        raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not load SVG file.' )
        
    
    if not renderer.isValid():
        
        raise HydrusExceptions.DamagedOrUnusualFileException( 'SVG file is invalid!' )
        
    
    return renderer
    

def GenerateThumbnailBytesFromSVGPath( path: str, target_resolution: typing.Tuple[int, int], clip_rect = None ) -> bytes:
    
    # TODO: SVGs have no inherent resolution, so all this is pretty stupid. we should render to exactly the res we want and then clip the result, not beforehand
    
    renderer = LoadSVGRenderer( path )
    
    # Seems to help for some weird floating point dimension SVGs
    renderer.setAspectRatioMode( QC.Qt.AspectRatioMode.KeepAspectRatio )
    
    try:
        
        if clip_rect is None:
            
            ( target_width, target_height ) = target_resolution
            
            qt_image = QG.QImage( target_width, target_height, QG.QImage.Format_RGBA8888 )
            
        else:
            
            qt_image = QG.QImage( renderer.defaultSize(), QG.QImage.Format_RGBA8888 )
            
        
        qt_image.fill( QC.Qt.transparent )
        
        painter = QG.QPainter( qt_image )
        
        renderer.render( painter )
        
        painter.end()
        
        numpy_image = ClientGUIFunctions.ConvertQtImageToNumPy( qt_image )
        
        if clip_rect is None:
            
            thumbnail_numpy_image = numpy_image
            
        else:
            
            numpy_image = HydrusImageHandling.ClipNumPyImage( numpy_image, clip_rect )
            
            thumbnail_numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, target_resolution )
            
        
        return HydrusImageHandling.GenerateThumbnailBytesNumPy( thumbnail_numpy_image )
        
    except:
        
        raise HydrusExceptions.UnsupportedFileException()
        
    

HydrusSVGHandling.GenerateThumbnailBytesFromSVGPath = GenerateThumbnailBytesFromSVGPath

def GetSVGResolution( path: str ):
    
    renderer = LoadSVGRenderer( path )
    
    resolution = renderer.defaultSize().toTuple()
    
    return resolution
    

HydrusSVGHandling.GetSVGResolution = GetSVGResolution
@@ -9518,6 +9518,102 @@ class DB( HydrusDB.HydrusDB ):
        if version == 535:
            
            try:
                
                tag_service_ids = self.modules_services.GetServiceIds( HC.REAL_TAG_SERVICES )
                
                file_service_ids = list( self.modules_services.GetServiceIds( HC.FILE_SERVICES_WITH_SPECIFIC_TAG_LOOKUP_CACHES ) )
                
                file_service_ids.append( self.modules_services.combined_file_service_id )
                
                for ( file_service_id, tag_service_id ) in itertools.product( file_service_ids, tag_service_ids ):
                    
                    if file_service_id == self.modules_services.combined_file_service_id:
                        
                        message = f'cleaning combined tag fast search cache {tag_service_id}'
                        
                    else:
                        
                        message = f'cleaning specific tag fast search cache {file_service_id}_{tag_service_id}'
                        
                    
                    self._controller.frame_splash_status.SetSubtext( f'{message} - setting up' )
                    
                    tags_table_name = self.modules_tag_search.GetTagsTableName( file_service_id, tag_service_id )
                    
                    CHUNK_SIZE = 65536
                    
                    for ( chunk_of_tag_ids, num_done, num_to_do ) in HydrusDB.ReadLargeIdQueryInSeparateChunks( self._c, f'SELECT tag_id FROM {tags_table_name};', CHUNK_SIZE ):
                        
                        num_string = HydrusData.ConvertValueRangeToPrettyString( num_done, num_to_do )
                        
                        self._controller.frame_splash_status.SetSubtext( f'{message} - {num_string}' )
                        
                        with self._MakeTemporaryIntegerTable( chunk_of_tag_ids, 'tag_id' ) as temp_tag_id_table_name:
                            
                            results = self.modules_mappings_counts.GetCountsForTags( ClientTags.TAG_DISPLAY_STORAGE, file_service_id, tag_service_id, temp_tag_id_table_name )
                            
                            good_tag_ids = set()
                            
                            for ( tag_id, current_count, pending_count ) in results:
                                
                                # this should always be true, but w/e
                                if current_count > 0 or pending_count > 0:
                                    
                                    good_tag_ids.add( tag_id )
                                    
                                
                            
                            # this tag does not exist here mate, it is an orphan
                            # ...or it is a countless sibling/parent, in which case deletetags will filter it out a bit later, so don't panic
                            orphan_tag_ids = set( chunk_of_tag_ids ).difference( good_tag_ids )
                            
                            if len( orphan_tag_ids ) > 0:
                                
                                self.modules_tag_search.DeleteTags( file_service_id, tag_service_id, orphan_tag_ids )
                                
                            
                        
                        time.sleep( 0.01 )
                        
                    
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'The update failed to clear out some tag text search orphans! You might like to run _database->regenerate->tag text search cache_ yourself when you have some time. The error was written to the log--hydev would be interested in seeing it.'
                
                self.pub_initial_message( message )
                
            
            try:
                
                self._controller.frame_splash_status.SetSubtext( 'migrating some similar file search data' )
                
                self._Execute( 'CREATE TABLE IF NOT EXISTS main.shape_vptree ( phash_id INTEGER PRIMARY KEY, parent_id INTEGER, radius INTEGER, inner_id INTEGER, inner_population INTEGER, outer_id INTEGER, outer_population INTEGER );' )
                self._Execute( 'CREATE TABLE IF NOT EXISTS main.shape_maintenance_branch_regen ( phash_id INTEGER PRIMARY KEY );' )
                
                self._Execute( 'INSERT OR IGNORE INTO main.shape_vptree SELECT * FROM external_caches.shape_vptree;' )
                self._Execute( 'INSERT OR IGNORE INTO main.shape_maintenance_branch_regen SELECT * FROM external_caches.shape_maintenance_branch_regen;' )
                
                self._CreateIndex( 'main.shape_vptree', [ 'parent_id' ], False )
                
                self._Execute( 'DROP TABLE IF EXISTS external_caches.shape_vptree;' )
                self._Execute( 'DROP TABLE IF EXISTS external_caches.shape_maintenance_branch_regen;' )
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'Migrating the similar files search tree from the caches database to the main database failed. It probably occurred because the table was missing and needed to be regenerated, which may already have happened, or may yet happen, at a different stage of boot. The error has been written to the log, which hydev would be interested in seeing.'
                
                self.pub_initial_message( message )
                
            
        
        self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
        
        self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
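The orphan cleanup above walks tag ids in chunks, keeps the ids that still have a nonzero count, and deletes the rest. The same chunked set-difference pattern can be sketched against a toy schema (the `tags`/`mappings` tables here are hypothetical stand-ins, not hydrus's real tables):

```python
import sqlite3

def delete_orphan_tags( db, chunk_size = 1024 ):
    
    # Toy schema: tags( tag_id ), mappings( tag_id, hash_id ).
    # A tag with no mappings rows is an orphan. As in the update code,
    # we scan tag ids in chunks to bound memory use.
    tag_ids = [ row[0] for row in db.execute( 'SELECT tag_id FROM tags;' ) ]
    
    num_deleted = 0
    
    for i in range( 0, len( tag_ids ), chunk_size ):
        
        chunk = tag_ids[ i : i + chunk_size ]
        
        placeholders = ','.join( '?' * len( chunk ) )
        
        good_tag_ids = { row[0] for row in db.execute( f'SELECT DISTINCT tag_id FROM mappings WHERE tag_id IN ( {placeholders} );', chunk ) }
        
        # anything in the chunk without a mapping is an orphan
        orphan_tag_ids = set( chunk ).difference( good_tag_ids )
        
        db.executemany( 'DELETE FROM tags WHERE tag_id = ?;', ( ( t, ) for t in orphan_tag_ids ) )
        
        num_deleted += len( orphan_tag_ids )
        
    
    return num_deleted

db = sqlite3.connect( ':memory:' )

db.execute( 'CREATE TABLE tags ( tag_id INTEGER PRIMARY KEY );' )
db.execute( 'CREATE TABLE mappings ( tag_id INTEGER, hash_id INTEGER );' )

db.executemany( 'INSERT INTO tags VALUES ( ? );', [ ( i, ) for i in range( 1, 6 ) ] )
db.executemany( 'INSERT INTO mappings VALUES ( ?, ? );', [ ( 1, 10 ), ( 3, 11 ) ] )

# tags 2, 4, 5 have no mappings, so three orphans are removed
print( delete_orphan_tags( db, chunk_size = 2 ) )
```

The real update additionally yields with a small sleep per chunk and reports progress, which matters when the PTR-scale caches make this a multi-minute job.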
@@ -2,7 +2,7 @@ import sqlite3
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime

from hydrus.client import ClientFiles
@@ -42,6 +42,8 @@ class ClientDBFilesMaintenanceQueue( ClientDBModule.ClientDBModule ):
        self._ExecuteMany( 'REPLACE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) VALUES ( ?, ?, ? );', ( ( hash_id, job_type, time_can_start ) for hash_id in hash_ids ) )
        
        HG.client_controller.files_maintenance_manager.Wake()
        
    
    def AddJobsHashes( self, hashes, job_type, time_can_start = 0 ):
@@ -226,8 +226,8 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):
            ( [ 'hash_id' ], False, 451 )
        ]
        
-       index_generation_dict[ 'external_caches.shape_vptree' ] = [
-           ( [ 'parent_id' ], False, 400 )
+       index_generation_dict[ 'main.shape_vptree' ] = [
+           ( [ 'parent_id' ], False, 536 )
        ]
        
        index_generation_dict[ 'main.pixel_hash_map' ] = [
@@ -242,8 +242,8 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):
        return {
            'external_master.shape_perceptual_hashes' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER PRIMARY KEY, phash BLOB_BYTES UNIQUE );', 451 ),
            'external_master.shape_perceptual_hash_map' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER, hash_id INTEGER, PRIMARY KEY ( phash_id, hash_id ) );', 451 ),
-           'external_caches.shape_vptree' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER PRIMARY KEY, parent_id INTEGER, radius INTEGER, inner_id INTEGER, inner_population INTEGER, outer_id INTEGER, outer_population INTEGER );', 400 ),
-           'external_caches.shape_maintenance_branch_regen' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER PRIMARY KEY );', 400 ),
+           'main.shape_vptree' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER PRIMARY KEY, parent_id INTEGER, radius INTEGER, inner_id INTEGER, inner_population INTEGER, outer_id INTEGER, outer_population INTEGER );', 536 ),
+           'main.shape_maintenance_branch_regen' : ( 'CREATE TABLE IF NOT EXISTS {} ( phash_id INTEGER PRIMARY KEY );', 536 ),
            'main.shape_search_cache' : ( 'CREATE TABLE IF NOT EXISTS {} ( hash_id INTEGER PRIMARY KEY, searched_distance INTEGER );', 451 ),
            'main.pixel_hash_map' : ( 'CREATE TABLE IF NOT EXISTS {} ( hash_id INTEGER, pixel_hash_id INTEGER, PRIMARY KEY ( hash_id, pixel_hash_id ) );', 465 )
        }
@@ -514,7 +514,7 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):
    def _RepairRepopulateTables( self, repopulate_table_names, cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper ):
        
-       if 'external_caches.shape_vptree' in repopulate_table_names or 'external_caches.shape_maintenance_branch_regen' in repopulate_table_names:
+       if 'main.shape_vptree' in repopulate_table_names or 'main.shape_maintenance_branch_regen' in repopulate_table_names:
            
            self.RegenerateTree()
            
@@ -345,10 +345,15 @@ class ClientDBTagSearch( ClientDBModule.ClientDBModule ):
        #
        
        # we always include all chained guys regardless of count
-       chained_tag_ids = self.modules_tag_display.GetChainsMembers( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, tag_ids )
+       chained_tag_ids = self.modules_tag_display.FilterChained( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, tag_ids )
+       
+       tag_ids = tag_ids.difference( chained_tag_ids )
+       
+       if len( tag_ids ) == 0:
+           
+           return
+           
        
        #
        
        tags_table_name = self.GetTagsTableName( file_service_id, tag_service_id )
@@ -95,7 +95,9 @@ def AddDuplicatesMenu( win: QW.QWidget, menu: QW.QMenu, location_context: Client
        else:
            
-           if file_duplicate_types_to_counts[ HC.DUPLICATE_MEMBER ] == 1:
+           num_other_dupe_members_in_this_domain = file_duplicate_types_to_counts[ HC.DUPLICATE_MEMBER ]
+           
+           if num_other_dupe_members_in_this_domain == 0:
                
                ClientGUIMenus.AppendMenuLabel( duplicates_menu, 'cannot show the best quality file of this file\'s group here, it is not in this domain', 'The king of this group has probably been deleted from this domain.' )
                
@@ -2792,7 +2792,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
        hash = media.GetHash()
        mime = media.GetMime()
        
-       if media.IsStaticImage():
+       if media.IsStaticImage() and ClientGUICanvasMedia.WeAreExpectingToLoadThisMediaFile( media, self.CANVAS_TYPE ):
            
            if not image_cache.HasImageRenderer( hash ):
@@ -3527,7 +3527,7 @@ class CanvasMediaList( ClientMedia.ListeningMediaList, CanvasWithHovers ):
        hash = media.GetHash()
        mime = media.GetMime()
        
-       if media.IsStaticImage():
+       if media.IsStaticImage() and ClientGUICanvasMedia.WeAreExpectingToLoadThisMediaFile( media, self.CANVAS_TYPE ):
            
            if not image_cache.HasImageRenderer( hash ):
@@ -365,6 +365,18 @@ def UserWantsUsToDisplayMedia( media: ClientMedia.MediaSingleton, canvas_type: i
    return True
    

def WeAreExpectingToLoadThisMediaFile( media: ClientMedia.MediaSingleton, canvas_type: int ) -> bool:
    
    ( media_show_action, media_start_paused, media_start_with_embed ) = GetShowAction( media, canvas_type )
    
    if media_show_action in ( CC.MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE, CC.MEDIA_VIEWER_ACTION_SHOW_WITH_MPV, CC.MEDIA_VIEWER_ACTION_SHOW_WITH_QMEDIAPLAYER ):
        
        return True
        
    
    return False
    

class Animation( QW.QWidget ):
    
    launchMediaViewer = QC.Signal()
@@ -1782,6 +1782,15 @@ class MediaPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.ListeningMed
            if len( worse_flat_media ) == 0:
                
                message = 'Since you only selected one file, would you rather just set this file as the best file of its group?'
                
                result = ClientGUIDialogsQuick.GetYesNo( self, message )
                
                if result == QW.QDialog.Accepted:
                    
                    self._SetDuplicatesFocusedKing( silent = True )
                    
                
                return
                
@@ -1804,7 +1813,7 @@ class MediaPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.ListeningMed
-   def _SetDuplicatesFocusedKing( self ):
+   def _SetDuplicatesFocusedKing( self, silent = False ):
        
        if self._HasFocusSingleton():
@@ -1812,7 +1821,30 @@ class MediaPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.ListeningMed
            focused_hash = media.GetHash()
            
-           HG.client_controller.WriteSynchronous( 'duplicate_set_king', focused_hash )
+           # TODO: when media knows its duplicate gubbins, we can test num dupe files and if it is king already and stuff easier here
+           
+           do_it = False
+           
+           if silent:
+               
+               do_it = True
+               
+           else:
+               
+               message = 'Are you sure you want to set the focused file as the best file of its duplicate group?'
+               
+               result = ClientGUIDialogsQuick.GetYesNo( self, message )
+               
+               if result == QW.QDialog.Accepted:
+                   
+                   do_it = True
+                   
+               
+           
+           if do_it:
+               
+               HG.client_controller.WriteSynchronous( 'duplicate_set_king', focused_hash )
+               
+           
        else:
@@ -1740,10 +1740,24 @@ class HydrusResourceClientAPIRestrictedAddFilesDeleteFiles( HydrusResourceClient
        hashes = set( ParseHashes( request ) )
        
        # expand this to take reason
        
        location_context.LimitToServiceTypes( HG.client_controller.services_manager.GetServiceType, ( HC.COMBINED_LOCAL_FILE, HC.COMBINED_LOCAL_MEDIA, HC.LOCAL_FILE_DOMAIN ) )
        
+       if HG.client_controller.new_options.GetBoolean( 'delete_lock_for_archived_files' ):
+           
+           media_results = HG.client_controller.Read( 'media_results', hashes )
+           
+           undeletable_media_results = [ m for m in media_results if m.IsDeleteLocked() ]
+           
+           if len( undeletable_media_results ) > 0:
+               
+               message = 'Sorry, some of the files you selected are currently delete locked. Their hashes are:'
+               message += '\n' * 2
+               message += '\n'.join( sorted( [ m.GetHash().hex() for m in undeletable_media_results ] ) )
+               
+               raise HydrusExceptions.ConflictException( message )
+               
+           
+       
        content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = reason )
        
        for service_key in location_context.current_service_keys:
@@ -3784,19 +3798,7 @@ class HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships(
        raw_rows = []
        
-       all_hashes = set()
-       
-       pair_rows_old_arg_raw_rows = request.parsed_request_args.GetValue( 'pair_rows', list, expected_list_type = list, default_value = [] )
-       
-       for row in pair_rows_old_arg_raw_rows:
-           
-           if len( row ) != 6:
-               
-               raise HydrusExceptions.BadRequestException( 'One of the pair rows was the wrong length!' )
-               
-           
-           raw_rows.append( row )
-           
+       # TODO: now I rewangled this to remove the pair_rows parameter, let's get an object or dict bouncing around so we aren't handling a mega-tuple
        
        raw_relationship_dicts = request.parsed_request_args.GetValue( 'relationships', list, expected_list_type = dict, default_value = [] )
@@ -3821,6 +3823,8 @@ class HydrusResourceClientAPIRestrictedManageFileRelationshipsSetRelationships(
            HC.DUPLICATE_POTENTIAL
        }
        
+       all_hashes = set()
+       
        # variable type testing
        for row in raw_rows:
@@ -90,6 +90,7 @@ def date_pred_generator( pred_type, o, v ):
    return ClientSearch.Predicate( pred_type, ( o, date_type, tuple( v ) ) )
    

def num_file_relationships_pred_generator( o, v, u ):
    
    u_dict = {
@@ -100,8 +100,8 @@ options = {}
# Misc

NETWORK_VERSION = 20
-SOFTWARE_VERSION = 535
-CLIENT_API_VERSION = 48
+SOFTWARE_VERSION = 536
+CLIENT_API_VERSION = 49

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -721,25 +721,121 @@ APPLICATION_GZIP = 58

 APPLICATION_OCTET_STREAM = 100
 APPLICATION_UNKNOWN = 101

-GENERAL_FILETYPES = { GENERAL_APPLICATION, GENERAL_AUDIO, GENERAL_IMAGE, GENERAL_VIDEO, GENERAL_ANIMATION }
+GENERAL_FILETYPES = {
+    GENERAL_APPLICATION,
+    GENERAL_AUDIO,
+    GENERAL_IMAGE,
+    GENERAL_VIDEO,
+    GENERAL_ANIMATION
+}

-SEARCHABLE_MIMES = { IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, IMAGE_SVG, APPLICATION_FLASH, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_OGV, VIDEO_MPEG, APPLICATION_CLIP, APPLICATION_PSD, APPLICATION_SAI2, APPLICATION_KRITA, APPLICATION_XCF, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z, APPLICATION_GZIP, AUDIO_M4A, AUDIO_MP3, AUDIO_REALMEDIA, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_TRUEAUDIO, AUDIO_WMA, VIDEO_WMV, AUDIO_MKV, AUDIO_MP4, AUDIO_WAVPACK }
+SEARCHABLE_MIMES = {
+    IMAGE_JPEG,
+    IMAGE_PNG,
+    IMAGE_APNG,
+    IMAGE_GIF,
+    IMAGE_WEBP,
+    IMAGE_TIFF,
+    IMAGE_ICON,
+    IMAGE_SVG,
+    APPLICATION_FLASH,
+    VIDEO_AVI,
+    VIDEO_FLV,
+    VIDEO_MOV,
+    VIDEO_MP4,
+    VIDEO_MKV,
+    VIDEO_REALMEDIA,
+    VIDEO_WEBM,
+    VIDEO_OGV,
+    VIDEO_MPEG,
+    APPLICATION_CLIP,
+    APPLICATION_PSD,
+    APPLICATION_SAI2,
+    APPLICATION_KRITA,
+    APPLICATION_XCF,
+    APPLICATION_PDF,
+    APPLICATION_ZIP,
+    APPLICATION_RAR,
+    APPLICATION_7Z,
+    APPLICATION_GZIP,
+    AUDIO_M4A,
+    AUDIO_MP3,
+    AUDIO_REALMEDIA,
+    AUDIO_OGG,
+    AUDIO_FLAC,
+    AUDIO_WAVE,
+    AUDIO_TRUEAUDIO,
+    AUDIO_WMA,
+    VIDEO_WMV,
+    AUDIO_MKV,
+    AUDIO_MP4,
+    AUDIO_WAVPACK
+}

 STORABLE_MIMES = set( SEARCHABLE_MIMES ).union( { APPLICATION_HYDRUS_UPDATE_CONTENT, APPLICATION_HYDRUS_UPDATE_DEFINITIONS } )

 ALLOWED_MIMES = set( STORABLE_MIMES ).union( { IMAGE_BMP } )

-DECOMPRESSION_BOMB_IMAGES = { IMAGE_JPEG, IMAGE_PNG }
+DECOMPRESSION_BOMB_IMAGES = {
+    IMAGE_JPEG,
+    IMAGE_PNG
+}

-IMAGES = { IMAGE_JPEG, IMAGE_PNG, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON }
+IMAGES = {
+    IMAGE_JPEG,
+    IMAGE_PNG,
+    IMAGE_BMP,
+    IMAGE_WEBP,
+    IMAGE_TIFF,
+    IMAGE_ICON
+}

-ANIMATIONS = { IMAGE_GIF, IMAGE_APNG }
+ANIMATIONS = {
+    IMAGE_GIF,
+    IMAGE_APNG
+}

-AUDIO = { AUDIO_M4A, AUDIO_MP3, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_WMA, AUDIO_REALMEDIA, AUDIO_TRUEAUDIO, AUDIO_MKV, AUDIO_MP4, AUDIO_WAVPACK }
+AUDIO = {
+    AUDIO_M4A,
+    AUDIO_MP3,
+    AUDIO_OGG,
+    AUDIO_FLAC,
+    AUDIO_WAVE,
+    AUDIO_WMA,
+    AUDIO_REALMEDIA,
+    AUDIO_TRUEAUDIO,
+    AUDIO_MKV,
+    AUDIO_MP4,
+    AUDIO_WAVPACK
+}

-VIDEO = { VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_OGV, VIDEO_MPEG }
+VIDEO = {
+    VIDEO_AVI,
+    VIDEO_FLV,
+    VIDEO_MOV,
+    VIDEO_MP4,
+    VIDEO_WMV,
+    VIDEO_MKV,
+    VIDEO_REALMEDIA,
+    VIDEO_WEBM,
+    VIDEO_OGV,
+    VIDEO_MPEG
+}

-APPLICATIONS = { IMAGE_SVG, APPLICATION_FLASH, APPLICATION_PSD, APPLICATION_CLIP, APPLICATION_SAI2, APPLICATION_KRITA, APPLICATION_XCF, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z, APPLICATION_ZIP }
+APPLICATIONS = {
+    IMAGE_SVG,
+    APPLICATION_FLASH,
+    APPLICATION_PSD,
+    APPLICATION_CLIP,
+    APPLICATION_SAI2,
+    APPLICATION_KRITA,
+    APPLICATION_XCF,
+    APPLICATION_PDF,
+    APPLICATION_ZIP,
+    APPLICATION_RAR,
+    APPLICATION_7Z,
+    APPLICATION_GZIP
+}

 general_mimetypes_to_mime_groups = {
     GENERAL_APPLICATION : APPLICATIONS,

@@ -79,19 +79,20 @@ def ReadLargeIdQueryInSeparateChunks( cursor, select_statement, chunk_size ):

     num_to_do = 0

-    i = 0
+    num_done = 0

-    while i < num_to_do:
+    while num_done < num_to_do:

-        chunk = [ temp_id for ( temp_id, ) in cursor.execute( 'SELECT temp_id FROM ' + table_name + ' WHERE job_id BETWEEN ? AND ?;', ( i, i + chunk_size - 1 ) ) ]
+        chunk = [ temp_id for ( temp_id, ) in cursor.execute( 'SELECT temp_id FROM ' + table_name + ' WHERE job_id BETWEEN ? AND ?;', ( num_done, num_done + chunk_size - 1 ) ) ]

-        i += len( chunk )
+        num_done += len( chunk )

-        yield ( chunk, i, num_to_do )
+        yield ( chunk, num_done, num_to_do )

     cursor.execute( 'DROP TABLE ' + table_name + ';' )


 def VacuumDB( db_path ):

     db = sqlite3.connect( db_path, isolation_level = None, detect_types = sqlite3.PARSE_DECLTYPES )

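The hunk above renames `i` to `num_done`, making it clear that the loop variable counts completed ids and doubles as the lower bound of the next `BETWEEN` window. As a standalone sketch of the same chunked-iteration pattern (the `temp_ids` table, `job_id`/`temp_id` columns, and row values here are illustrative, not hydrus's real schema):

```python
import sqlite3

def read_ids_in_chunks( cursor, chunk_size ):
    
    # total number of rows we have to fetch
    ( num_to_do, ) = cursor.execute( 'SELECT COUNT( * ) FROM temp_ids;' ).fetchone()
    
    num_done = 0
    
    while num_done < num_to_do:
        
        # job_id is a contiguous 0-based counter, so num_done is always
        # both the progress so far and the start of the next window
        chunk = [ temp_id for ( temp_id, ) in cursor.execute( 'SELECT temp_id FROM temp_ids WHERE job_id BETWEEN ? AND ?;', ( num_done, num_done + chunk_size - 1 ) ) ]
        
        num_done += len( chunk )
        
        yield ( chunk, num_done, num_to_do )
    

db = sqlite3.connect( ':memory:' )
c = db.cursor()

c.execute( 'CREATE TABLE temp_ids ( job_id INTEGER PRIMARY KEY, temp_id INTEGER );' )
c.executemany( 'INSERT INTO temp_ids VALUES ( ?, ? );', [ ( i, 100 + i ) for i in range( 10 ) ] )

results = list( read_ids_in_chunks( c, 4 ) )
```

Note the pattern relies on `job_id` being contiguous; a gap would make a window come back short and the loop would stall, which is why hydrus populates the temp table with its own dense counter.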
@@ -1182,6 +1182,7 @@ def BaseToHumanBytes( size, sig_figs = 3 ):

    return '{} {}B'.format( d, suffix )


ToHumanBytes = BaseToHumanBytes

def ToHumanInt( num ):

@@ -4,30 +4,18 @@ import struct

 from hydrus.core import HydrusAudioHandling
 from hydrus.core import HydrusClipHandling
-from hydrus.core import HydrusKritaHandling
-
-try:
-
-    from hydrus.core import HydrusSVGHandling
-
-    SVG_OK = True
-
-except:
-
-    SVG_OK = False
-
 from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusDocumentHandling
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusFlashHandling
 from hydrus.core import HydrusImageHandling
+from hydrus.core import HydrusKritaHandling
 from hydrus.core import HydrusPaths
 from hydrus.core import HydrusSerialisable
+from hydrus.core import HydrusSVGHandling
 from hydrus.core import HydrusTemp
 from hydrus.core import HydrusText
 from hydrus.core import HydrusTime
 from hydrus.core import HydrusVideoHandling
 from hydrus.core.networking import HydrusNetwork

@@ -170,8 +158,8 @@ def GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames,

    finally:

        HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime == HC.APPLICATION_KRITA:

    ( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

@@ -191,29 +179,27 @@ def GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames,

     finally:

         HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

 elif mime == HC.IMAGE_SVG:

     try:

-        if not SVG_OK:
-
-            raise Exception( 'No SVG thumbs' )
-
         thumbnail_bytes = HydrusSVGHandling.GenerateThumbnailBytesFromSVGPath( path, target_resolution, clip_rect = clip_rect )

     except Exception as e:

-        HydrusData.Print( 'Problem generating thumbnail for "{}":'.format( path ) )
-        HydrusData.PrintException( e )
+        if not isinstance( e, HydrusExceptions.UnsupportedFileException ):
+
+            HydrusData.Print( 'Problem generating thumbnail for "{}":'.format( path ) )
+            HydrusData.PrintException( e )

         thumb_path = os.path.join( HC.STATIC_DIR, 'svg.png' )

         thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG, clip_rect = clip_rect )

 elif mime == HC.APPLICATION_FLASH:

     ( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

@@ -381,10 +367,7 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

 elif mime == HC.IMAGE_SVG:

-    if SVG_OK:
-
-        ( width, height ) = HydrusSVGHandling.GetSVGResolution( path )
+    ( width, height ) = HydrusSVGHandling.GetSVGResolution( path )

 elif mime == HC.APPLICATION_FLASH:

@@ -540,6 +523,19 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):

+    # If the file starts with '{' it is probably JSON
+    # but we can't know for sure so we send it over to be checked
+    if bit_to_check.startswith( b'{' ) or bit_to_check.startswith( b'[' ):
+
+        with open( path, 'rb' ) as f:
+
+            if HydrusText.LooksLikeJSON( f.read() ):
+
+                return HC.APPLICATION_JSON
+
     if HydrusText.LooksLikeHTML( bit_to_check ):

         return HC.TEXT_HTML

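The added hunk sniffs a leading `{` or `[` as a JSON hint and then confirms with a full-content check before committing to the mime. A minimal standalone sketch of that two-step sniff (the function names and the `looks_like_json` stand-in are illustrative; hydrus's real checks live in `HydrusText`):

```python
import json

def looks_like_json( data: bytes ) -> bool:
    
    # cheap stand-in for HydrusText.LooksLikeJSON: attempt a real parse
    try:
        
        json.loads( data )
        
        return True
        
    except ValueError:
        
        return False
    

def sniff_text_mime( header: bytes, whole_file: bytes ) -> str:
    
    # a leading '{' or '[' only suggests JSON, so confirm against the whole file
    if header.startswith( b'{' ) or header.startswith( b'[' ):
        
        if looks_like_json( whole_file ):
            
            return 'application/json'
        
    
    if b'<html' in whole_file.lower():
        
        return 'text/html'
    
    return 'application/octet-stream'
```

The header check alone is not enough because plenty of non-JSON text starts with a brace, which is why the hunk re-reads the whole file before returning `HC.APPLICATION_JSON`.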
@@ -1,4 +1,5 @@
+import typing

-from qtpy import QtSvg
-from qtpy import QtGui as QG
-from qtpy import QtCore as QC

@@ -8,59 +9,15 @@ from hydrus.core import HydrusImageHandling

-from hydrus.client.gui import ClientGUIFunctions
-
-def LoadSVGRenderer( path: str ):
-
-    renderer = QtSvg.QSvgRenderer()
-
-    try:
-
-        renderer.load( path )
-
-    except:
-
-        raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not load SVG file.' )
-
-    if not renderer.isValid():
-
-        raise HydrusExceptions.DamagedOrUnusualFileException( 'SVG file is invalid!' )
-
-    return renderer
-
-def GenerateThumbnailBytesFromSVGPath( path: str, target_resolution: typing.Tuple[int, int], clip_rect = None ) -> bytes:
-
-    # TODO handle clipping
-
-    ( target_width, target_height ) = target_resolution
-
-    renderer = LoadSVGRenderer( path )
-
-    # Seems to help for some weird floating point dimension SVGs
-    renderer.setAspectRatioMode( QC.Qt.AspectRatioMode.KeepAspectRatio )
-
-    try:
-
-        qt_image = QG.QImage( target_width, target_height, QG.QImage.Format_RGBA8888 )
-
-        qt_image.fill( QC.Qt.transparent )
-
-        painter = QG.QPainter( qt_image )
-
-        renderer.render( painter )
-
-        numpy_image = ClientGUIFunctions.ConvertQtImageToNumPy( qt_image )
-
-        painter.end()
-
-        return HydrusImageHandling.GenerateThumbnailBytesNumPy( numpy_image )
-
-    except:
-
-        raise HydrusExceptions.UnsupportedFileException()
-
-def GetSVGResolution( path: str ):
-
-    renderer = LoadSVGRenderer( path )
-
-    resolution = renderer.defaultSize().toTuple()
-
-    return resolution
+def BaseGenerateThumbnailBytesFromSVGPath( path: str, target_resolution: typing.Tuple[int, int], clip_rect = None ) -> bytes:
+
+    raise HydrusExceptions.UnsupportedFileException()
+
+def BaseGetSVGResolution( path: str ):
+
+    return ( None, None )
+
+GenerateThumbnailBytesFromSVGPath = BaseGenerateThumbnailBytesFromSVGPath
+GetSVGResolution = BaseGetSVGResolution

@@ -2,7 +2,7 @@ import os

import requests
import time
import traceback

requests.Request

import twisted.internet.ssl
from twisted.internet import threads, reactor, defer

@@ -1026,9 +1026,13 @@ class TestClientAPI( unittest.TestCase ):

    #

    file_id = random.randint( 10000, 15000 )

    hash = HydrusData.GenerateKey()
    hashes = { HydrusData.GenerateKey() for i in range( 10 ) }

    file_ids_to_hashes = { file_id : hash for ( file_id, hash ) in zip( random.sample( range( 2000 ), 10 ), hashes ) }

    #

    HG.test_controller.ClearWrites( 'content_updates' )

@@ -1053,7 +1057,35 @@ class TestClientAPI( unittest.TestCase ):

    self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )

    # with file_id

    HG.test_controller.ClearWrites( 'content_updates' )

    HG.test_controller.SetRead( 'hash_ids_to_hashes', { file_id : hash } )

    path = '/add_files/delete_files'

    body_dict = { 'file_id' : file_id }

    body = json.dumps( body_dict )

    connection.request( 'POST', path, body = body, headers = headers )

    response = connection.getresponse()

    data = response.read()

    self.assertEqual( response.status, 200 )

    [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )

    expected_service_keys_to_content_updates = { CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { hash }, reason = 'Deleted via Client API.' ) ] }

    self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )

    HG.test_controller.ClearReads( 'hash_ids_to_hashes' )

    # with hashes

    HG.test_controller.ClearWrites( 'content_updates' )

@@ -1077,6 +1109,34 @@ class TestClientAPI( unittest.TestCase ):

    self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )

    # with file_ids

    HG.test_controller.ClearWrites( 'content_updates' )

    HG.test_controller.SetRead( 'hash_ids_to_hashes', file_ids_to_hashes )

    path = '/add_files/delete_files'

    body_dict = { 'file_ids' : list( file_ids_to_hashes.keys() ) }

    body = json.dumps( body_dict )

    connection.request( 'POST', path, body = body, headers = headers )

    response = connection.getresponse()

    data = response.read()

    self.assertEqual( response.status, 200 )

    [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )

    expected_service_keys_to_content_updates = { CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = 'Deleted via Client API.' ) ] }

    self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )

    HG.test_controller.ClearReads( 'hash_ids_to_hashes' )

    # now with a reason

    HG.test_controller.ClearWrites( 'content_updates' )

@@ -1127,6 +1187,42 @@ class TestClientAPI( unittest.TestCase ):

    self.assertIn( not_existing_service_hex, text ) # error message should be complaining about it

    # test file lock, 409 response

    locked_hash = list( hashes )[0]

    media_result = HelperFunctions.GetFakeMediaResult( locked_hash )

    media_result.GetLocationsManager().inbox = False

    HG.test_controller.new_options.SetBoolean( 'delete_lock_for_archived_files', True )

    HG.test_controller.ClearWrites( 'content_updates' )

    HG.test_controller.SetRead( 'media_results', [ media_result ] )

    path = '/add_files/delete_files'

    body_dict = { 'hashes' : [ h.hex() for h in hashes ] }

    body = json.dumps( body_dict )

    connection.request( 'POST', path, body = body, headers = headers )

    response = connection.getresponse()

    data = response.read()

    self.assertEqual( response.status, 409 )

    text = str( data, 'utf-8' )

    self.assertIn( locked_hash.hex(), text ) # error message should be complaining about it

    HG.client_controller.new_options.SetBoolean( 'delete_lock_for_archived_files', False )

    HG.test_controller.ClearReads( 'media_results' )

    #

    HG.test_controller.ClearWrites( 'content_updates' )

@@ -37,7 +37,7 @@ else
     exit 1
 fi

-sed -e "s#Exec=.*#Exec=${INSTALL_DIR}/client.sh#" -e "s#Icon=.*#Icon=${INSTALL_DIR}/static/hydrus.png#" "$DESKTOP_SOURCE_PATH" > "$DESKTOP_DEST_PATH"
+sed -e "s#Exec=.*#Exec=${INSTALL_DIR}/hydrus_client.sh#" -e "s#Icon=.*#Icon=${INSTALL_DIR}/static/hydrus.png#" "$DESKTOP_SOURCE_PATH" > "$DESKTOP_DEST_PATH"

 echo "Done!"

@@ -3,7 +3,7 @@ FROM alpine:3.16

 ARG UID
 ARG GID

-RUN apk --no-cache add py3-beautifulsoup4 py3-psutil py3-pysocks py3-requests py3-twisted py3-yaml py3-lz4 ffmpeg py3-pillow py3-numpy py3-openssl py3-cryptography py3-service_identity py3-opencv py3-lxml py3-chardet py3-dateutil py3-pip openssl su-exec
+RUN apk --no-cache add py3-psutil py3-requests py3-twisted py3-yaml py3-lz4 ffmpeg py3-pillow py3-numpy py3-openssl py3-cryptography py3-service_identity py3-opencv py3-pip openssl su-exec
 RUN pip install Send2Trash twisted

 RUN set -xe \

@@ -0,0 +1,21 @@
cryptography

cloudscraper>=1.2.33
html5lib>=1.0.1
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
Pillow>=6.0.0
psutil>=5.0.0
pyOpenSSL>=19.1.0
PyYAML>=5.0.0
Send2Trash>=1.5.0
service-identity>=18.1.0
six>=1.14.0
Twisted>=20.3.0

opencv-python-headless==4.5.5.64
python-mpv==1.0.3
requests==2.31.0

setuptools==65.5.1