Compare commits

...

114 Commits

Author SHA1 Message Date
mertalev
f039672a2a android improvements 2025-08-13 22:16:43 -04:00
mertalev
3100702e93 better scrolling 2025-08-13 18:04:50 -04:00
mertalev
c988342de1 platform thumbhash 2025-08-13 18:04:50 -04:00
mertalev
84462560e3 cancel in image stream completer 2025-08-13 00:15:27 -04:00
mertalev
f931060670 image provider improvements 2025-08-13 00:10:55 -04:00
Brandon Wees
0d60199514 fix(mobile): newest/oldest album sort (#20743)
* fix(mobile): newest/oldest album sort

* chore: use sqlite to determine album asset timestamps

* Fix missing future

Co-authored-by: Alex <alex.tran1502@gmail.com>

* fix: async handling of sort

* chore: tests

* chore: code review changes

* fix: use created at for newest asset

* fix: use localDateTime for sorting

* chore: cleanup

* chore: use final

* feat: loading indicator

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-12 14:46:50 -05:00
Alex
54960157c0 chore: backup info card styling tweak (#20799)
* chore: backup info card styling tweak

* pr feedback
2025-08-12 16:08:31 +00:00
waclaw66
244d097d01 fix(mobile): enable person age pluralization (#20881)
Enable person age pluralization
2025-08-12 14:55:47 +00:00
renovate[bot]
adb55f3726 fix(deps): update machine-learning (#19803)
* fix(deps): update machine-learning

* typing fixes

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: mertalev <101130780+mertalev@users.noreply.github.com>
2025-08-11 18:07:49 -04:00
Mirek
5d2777a5c6 feat: format date and time in /admin/users/ -> Profile section (#20811)
Matches the format used in the user settings page.

Added a formatting function in utils.
2025-08-11 16:50:34 -05:00
Alex
24db881c14 feat: swipe to delete album (#20765) 2025-08-11 16:49:53 -05:00
Alex
f09bed9ad2 fix: age info cut off (#20872) 2025-08-11 16:42:16 -05:00
Mert
e29cc66361 docs: vectorchord migration instructions, deprecation log on startup (#20867)
* deprecation log, migration docs

* update tests

* fix info boxes
2025-08-11 16:50:48 -04:00
Brandon Wees
669b765662 feat: edit image in beta timeline (#20709)
* feat: edit image in beta timeline

* delete album notifier pull

* feat: sync local after saving image

* feat: queue asset for manual upload after saving

* chore: clarify PlatformException catch
2025-08-11 15:01:31 -05:00
Gabriel Soldani
e7060dc292 fix(web): fix layout loop with single row grids in explore page (#20833) 2025-08-11 02:31:26 +00:00
Nicholas
03a8b6cb38 feat: add i18n formatting to make translation in mobile makefile (#20807)
add i18n formatting to `make translation` in mobile makefile
2025-08-10 21:26:23 -05:00
Min Idzelis
f317cbe221 fix: devcontainer broken by debian Trixie going stable (#20843) 2025-08-10 21:24:00 -05:00
Nicholas
d6d31c6695 fix: change all download icons to mdiDownload for clarity and consistency (#20821)
change all download icons to `mdiDownload` for clarity and consistency
2025-08-10 21:23:21 -05:00
Alex
4b9019e762 fix: return method correctly (#20831) 2025-08-09 23:01:47 -05:00
Jason Rasmussen
13563fc507 chore: update response codes (#20770)
* chore: update response codes

* chore: skip problematic test
2025-08-08 15:56:37 -04:00
Jason Rasmussen
2ce4f8dd3b fix(sql-tools): null default (#20796) 2025-08-08 15:44:39 -04:00
Jason Rasmussen
538d5c81ea feat: reset oauth ids (#20798) 2025-08-08 15:42:38 -04:00
Jason Rasmussen
9ecaa3fa9d feat: more cursed knowledge (#20794) 2025-08-08 10:05:59 -04:00
Alex
b1aacfdbd9 chore: log resume backup process (#20768) 2025-08-07 15:44:49 -05:00
Jason Rasmussen
cfbc24579d feat(web): reset pin code (#20766) 2025-08-07 15:07:31 -04:00
Alex
1d4d8e7a9a chore: bump @immich/ui to 24 (#20767)
chore: bump @ui 24
2025-08-07 14:43:56 -04:00
Alex
7b83b7b2d5 fix: don't show remove from album action from the main timeline (#20757)
* fix: don't show remove from album action from the main timeline

* pr feedback
2025-08-07 23:36:16 +05:30
Jason Rasmussen
a896c5a4dd fix(web): shared-link autocomplete (#20761) 2025-08-07 12:01:05 -05:00
Jason Rasmussen
c74989d304 docs: include openapi.json (#20760) 2025-08-07 12:00:50 -05:00
bo0tzz
1283491cc2 chore: fork PRs can't have previews (#20464)
* chore: fork PRs can't have previews

* chore: fix formatting

* chore: different close message for fork PRs

---------

Co-authored-by: github-actions <41898282+github-actions[bot]@users.noreply.github.com>
2025-08-07 12:14:33 -04:00
Alex
89522daaac fix: invalidate album api on log out (#20756) 2025-08-07 14:19:44 +00:00
mkuehne707
011a667314 feat: batch change date and time relatively (#17717)
Co-authored-by: marcel.kuehne <>
Co-authored-by: Zack Pollard <zackpollard@ymail.com>
2025-08-07 13:42:33 +00:00
Nicholas
df2525ee08 feat(docs): add make dev-docs (#20572) 2025-08-07 14:02:13 +01:00
Arpit Singh
01a9f735c8 fix: avoid unnecessary writes to system metadata repository (#20538)
Co-authored-by: Zack Pollard <zackpollard@ymail.com>
2025-08-07 12:43:23 +00:00
Ben McCann
af10c3bc2f chore: upgrade SvelteKit (#20736) 2025-08-07 13:00:42 +01:00
Aamir Azad
395f2e155d docs: remove warning about the removed repair page (#20746) 2025-08-07 04:49:38 +00:00
Alex
10cbed55c4 fix: crash when rendering heatmap on Android (#20740) 2025-08-06 21:41:42 -05:00
Brandon Wees
325d5f7ba9 fix(mobile): person birthday viewing/editing (#20731)
* fix: edit birthday dialog

* chore: convert age to "x years old" format

* fix: lint

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-06 22:13:56 +00:00
Thomas
746252fe39 fix(web): limit max height of search results (#20727)
The height of the search results element was unrestricted, which made the asset
visibility calculations completely incorrect: assets that should not have been
visible were shown anyway. In practical terms, every asset below the viewport
was rendered when it shouldn't have been, which is terrible for performance.
Limiting the height of the viewport fixes that calculation, and assets are
correctly hidden.

The consequence of limiting the viewport height is that the intersector then
incorrectly thought the scroll position was always at the end. This has been
fixed by calculating the position of the sliding window against the calculated
height of the asset layout container.

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-06 17:05:23 -05:00
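As a hedged illustration of the reasoning in the commit message above (not Immich's actual code; all names below are hypothetical), once the results element has a bounded height, end-of-scroll detection has to be computed against the calculated asset layout container height rather than the element's own scroll metrics:

```typescript
// Minimal sketch: detect whether the sliding window of rendered rows has
// reached the end of the computed asset layout. Hypothetical names.
function isAtEnd(
  windowTop: number, // top of the sliding window, in layout coordinates
  windowHeight: number, // height of the sliding window
  layoutContainerHeight: number, // total height of the calculated asset layout
): boolean {
  return windowTop + windowHeight >= layoutContainerHeight;
}

console.log(isAtEnd(4200, 800, 5000)); // true: the window reaches the end of the layout
console.log(isAtEnd(1000, 800, 5000)); // false: more layout remains below
```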
Thomas
f36efd128b fix(web): prevent thumbhashes from covering search bar (#20720)
The thumbhash had a z-index setting, which meant it would cover the search bar
and caused odd animations when scrolling up in search results.

This is fixed by removing the z-index and moving the element in front of the
other elements so it gets a naturally higher stacking order.
2025-08-06 16:57:51 -05:00
shenlong
f1c494ef97 fix: use create if not exists clause for indexes (#20728)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-06 16:57:33 -05:00
shenlong
9c8c52874a fix: cleanup logger DB in isolates (#20730)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-06 16:57:17 -05:00
shenlong
68b617130a chore: disable android auto backup (#20734)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-06 16:52:42 -05:00
Thomas
89292fecb4 fix(web): use correct sliding window offset for search results (#20726)
The contents of the search results are slightly offset by the search bar, search
terms, and spacing (margins/padding), and this offset needs to be factored in
when calculating whether an asset is visible. The offset was 0, which meant that
assets were removed from view too early.
2025-08-06 16:34:24 -05:00
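A hedged sketch of the offset fix described above (illustrative only; the names are hypothetical, not the actual component code): the search bar, search terms, and surrounding spacing push the grid down, so the visibility window has to be shifted by that offset before testing each asset row.

```typescript
// Minimal sketch: an asset row is visible when it intersects the scroll window
// after shifting by the fixed top offset. With an offset of 0, rows near the
// top were culled while still on screen. Hypothetical names.
type Row = { top: number; height: number };

function isRowVisible(row: Row, scrollTop: number, viewportHeight: number, topOffset: number): boolean {
  const windowTop = scrollTop - topOffset;
  const windowBottom = windowTop + viewportHeight;
  return row.top + row.height > windowTop && row.top < windowBottom;
}

const firstRow = { top: 0, height: 240 };
console.log(isRowVisible(firstRow, 300, 800, 120)); // true: still partially on screen
console.log(isRowVisible(firstRow, 300, 800, 0));   // false: culled too early with no offset
```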
Thomas
1193a23282 feat(web): don't scroll to visible assets (#20729)
The timeline has been quite aggressive with scrolling to assets, even if they
were right in the middle of the page. If the asset is visible, then we
shouldn't scroll to it. It's really confusing when assets jump around after
being viewed.
2025-08-06 16:31:37 -05:00
Xantin
bbfff64927 docs: update TrueNAS docs (#19990)
Co-authored-by: bo0tzz <git@bo0tzz.me>
Co-authored-by: Nicholas Flamy <30300649+NicholasFlamy@users.noreply.github.com>
2025-08-06 21:16:28 +02:00
shenlong
c5c9a522c1 fix: remove drift map scrubber (#20723)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-06 13:40:06 -05:00
Brandon Wees
3cd7f5ab90 feat: use sqlite for logging (#20414)
* feat: use drift for logging

* fix: tests

* feat: use the truncate limit from constants.ts as default

* chore: move setupAll to top level and restructure

* chore: code review changes

* fix: inherits

* feat: raise log line limit to 2000

* limit getAll to 250 lines

* delete DLog and make LogRepository not a singleton

* fix: drift build settings and `make migration`

* fix: tests

* remove sensitive log

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-06 10:49:29 -05:00
Lauritz Tieste
f2067221c5 fix: disk info is cleared when profile picture is uploaded (#20411)
fix: update disk info on user profile image upload
2025-08-06 10:36:07 -05:00
Thomas
89598cf0be chore(web): remove arbitrary search result limit (#20719)
The search results page can become unstable with large amounts of assets, and
has therefore been limited to displaying just 5000 assets. This limit is
arbitrary and may be too restrictive.
2025-08-06 10:12:52 -05:00
Mert
0121043d7d refactor(mobile): sqlite-based map view (#20665)
* feat(mobile): drift map page

* refactor: map query

* perf: do not filter markers

* fix: refresh timeline by key

* chore: rename

* remove ref listen and global key

* clean code

* remove locked and favorite

* temporary change for stress test

* optimizations

* fix bottom sheet

* cleaner bounds check

* cleanup

* feat: back button

---------

Co-authored-by: wuzihao051119 <wuzihao051119@outlook.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-06 10:05:49 -05:00
patrickgoering
1ca46fbd98 fix: video thumbnail generation for short videos (#20629)
fix video thumbnail generation for short videos

ffmpeg reports "conversion failed" (error 234) for short mp4 files (less than
10 s) that were converted from m2ts. Longer videos work fine.

It looks like ffmpeg has no frames left to use for generating a thumbnail.

This change fixes the issue and does not appear to change the behaviour for
other mp4 files (same thumbnail before and after the change).

This might also fix thumbnail generation for all mts files.
2025-08-06 13:10:49 +00:00
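As a hedged sketch of the failure mode described above (not the actual Immich change), the usual workaround is to make sure the thumbnail seek offset never lands past the end of a short clip; `-ss`, `-vf thumbnail`, and `-frames:v` are standard ffmpeg options, while the surrounding logic and names are hypothetical:

```typescript
// Minimal sketch: build ffmpeg arguments that extract one representative frame,
// clamping the seek offset so very short clips still have frames to decode.
function thumbnailArgs(input: string, output: string, durationSec: number): string[] {
  const seek = Math.min(3, Math.max(0, durationSec / 2)); // never seek past a short clip
  return [
    '-ss', seek.toFixed(2), // safe seek offset within the clip
    '-i', input,
    '-vf', 'thumbnail', // let ffmpeg pick a representative frame
    '-frames:v', '1',   // emit a single frame
    output,
  ];
}

// e.g. a 4.2 s clip converted from m2ts
console.log(['ffmpeg', ...thumbnailArgs('short.mp4', 'thumb.jpg', 4.2)].join(' '));
```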
shenlong
6ddef3a7e4 fix: server version not fetched after auto login (#20713)
* fix: server version not fetched after auto login

* wrap get info with a try catch

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-06 08:01:16 -05:00
Brandon Wees
0d9ebdc46a fix(mobile): show video controls when in locked view (#20707)
* fix(mobile): show video controls when in locked view

* const constructor
2025-08-06 07:58:54 -05:00
Zack Pollard
fa26d0de33 refactor: new helper methods that work for all sync queries (#20690)
refactor: new helper methods that work for all sync queries
2025-08-06 08:34:12 -04:00
Jason Rasmussen
a5760129f0 fix: custom-url ssr (#20704) 2025-08-05 23:29:01 +02:00
Gaurav Yadav
d430b869ac fix: shared link custom URL photo access authentication (#20534) 2025-08-05 23:22:19 +02:00
Brandon Wees
4179c8a17d fix(mobile): filter people that have less than 3 faces (#20705) 2025-08-05 21:16:13 +00:00
Zack Pollard
0a9cbf01d2 feat: ack sync reset (#20703) 2025-08-05 20:30:19 +00:00
Alex
9567a2a560 fix: delete local asset show twice (#20700)
* chore: better button width

* fix: delete local action show twice
2025-08-05 19:18:57 +00:00
renovate[bot]
58dd6f094c chore(deps): update dependency @types/bcrypt to v6 (#20669)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 19:58:08 +02:00
Zack Pollard
02381343ff fix: album asset sync must sync new assets in a shared album (#20655) 2025-08-05 17:53:51 +01:00
Mert
09a5963eee fix(mobile): catch thumbnail cache miss (#20694)
catch error
2025-08-05 11:32:06 -05:00
Alex
a573a23c83 fix: empty custom header prevent logging in (#20693) 2025-08-05 16:14:21 +00:00
Brandon Wees
7118dca559 feat(mobile): album shared user editing (#20671)
* feat(mobile): album shared user editing

* fix: album leaving

* i18n and options button

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-05 15:31:58 +00:00
Brandon Wees
13d43e193e feat(mobile): use custom headers when connecting in widget (#20666)
* feat(mobile): use custom headers when connecting in widget

* delete log in android widget

* chore: code review changes
2025-08-05 10:29:27 -05:00
Brandon Wees
7a7843467c feat(mobile): remove from album in asset viewer bar (#20672)
* feat: remove from album in asset viewer bar

* chore: move button to bottom bar instead of bottom sheet

* move back to bottom sheet

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-05 15:20:55 +00:00
Mert
9e6fee4064 fix(mobile): use cached thumbnail in full size image provider (#20637) 2025-08-05 10:20:25 -04:00
shenlong
9680f1290d fix: exclude assets that haven't been hashed yet from uploads (#20684)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-05 08:35:25 -05:00
renovate[bot]
ce2ea98926 fix(deps): update typescript-projects (#20396)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Zack Pollard <zackpollard@ymail.com>
2025-08-05 12:45:47 +00:00
renovate[bot]
5c76cc34e1 chore(deps): update node.js to v22.18.0 (#20662)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 11:01:15 +00:00
renovate[bot]
eb2f4c866e chore(deps): update dependency eslint-plugin-unicorn to v60 (#20677)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Zack Pollard <zackpollard@ymail.com>
2025-08-05 10:58:13 +00:00
renovate[bot]
2a370087e8 chore(deps): update dependency @types/node to ^22.17.0 (#20657)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 11:56:53 +01:00
renovate[bot]
272c8a5812 chore(deps): update grafana/grafana docker tag to v12.1.0 (#20661)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 11:56:05 +01:00
renovate[bot]
08fe549ed8 chore(deps): update base-image to v202507291116 (major) (#20668)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 11:54:37 +01:00
renovate[bot]
ae15efdf2a chore(deps): update dependency pigeon to v26 (#20678)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Zack Pollard <zackpollard@ymail.com>
2025-08-05 10:52:03 +00:00
renovate[bot]
8e003f95db chore(deps): update github/codeql-action action to v3.29.5 (#20656)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-05 11:44:50 +01:00
Brandon Wees
3e92e837f1 feat(mobile): create shared link for albums (#20652)
* feat(mobile): create shared link for albums

* translation

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-05 01:51:45 +00:00
Brandon Wees
081307ced2 fix: expand sheet when album search is focused (#20651)
* fix: expand sheet when album search is focused

* convert GeneralBottomSheet to ConsumerStatefulWidget

* fix: cleaning up

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-04 20:35:57 -05:00
Michael
a91bb399f0 feat: add server.versionCheck permission (#20555)
* add server.versionCheck permission

* getVersionCheck is no admin-route
2025-08-04 17:39:05 -05:00
Brandon Wees
42b78c59b5 fix(mobile): disable memory lane when memories are disabled (#20642)
* fix(mobile): disable memory lane when memories are disabled

* Update main_timeline.page.dart

* fix: formatting

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-08-04 22:34:28 +00:00
Brandon Wees
750d21aeba fix(mobile): use storageIndicator setting for beta timeline (#20639)
* fix: use storageIndicator setting for beta timeline

* fix: reactively update the storage indicator icons when setting is changed

* Update drift_trash.page.dart

* override to bool for storageIndicator
2025-08-04 17:25:58 -05:00
Paweł Wojtaszko
990d9ba9a8 fix: adjust margin and gap for trailing elements in control app bar (#20645) 2025-08-04 17:24:19 -05:00
Brandon Wees
4d0c9172e5 fix: not clearing local data when logging out while sync is running (#20646) 2025-08-04 17:14:26 -05:00
Brandon Wees
094e3a2757 fix(mobile): cleanly handle logout when no host is set (#20521)
* fix: cleanly handle logging out when no host is set on API

* move conditional to auth_api repo
2025-08-05 03:11:58 +05:30
Zack Pollard
278668b8c5 fix: improvements to sync and upload when resuming app (#20524)
- The app now kicks off hashing after local sync if the lifecycle is in a resumed or active state
- We now wait for hashing to complete before kicking off the upload process
2025-08-05 03:11:44 +05:30
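A hedged sketch of the ordering described in the two bullets above (illustrative only; the function and type names are hypothetical, not the app's actual services):

```typescript
// Minimal sketch: after a local sync, only start hashing if the app is still in
// a resumed/active lifecycle state, and only start uploads once hashing is done.
type Lifecycle = 'resumed' | 'active' | 'paused' | 'detached';

async function onAppResumed(
  lifecycle: () => Lifecycle,
  syncLocalAssets: () => Promise<void>,
  hashAssets: () => Promise<void>,
  uploadAssets: () => Promise<void>,
): Promise<void> {
  await syncLocalAssets();
  const state = lifecycle();
  if (state !== 'resumed' && state !== 'active') {
    return; // app went to the background; skip the heavy work for now
  }
  await hashAssets();   // wait for hashing to finish...
  await uploadAssets(); // ...before kicking off the upload queue
}
```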
cford256
10141504a2 fix: exif rating rounding (#20457)
* fix: round EXIF metadata rating to an integer

Images can have non-integer values for the rating metadata in EXIF, but the database expects an integer. Uploading an image with a non-integer rating caused metadata parsing to fail, and the asset defaulted to showing a corrupted-file icon.

Rather than changing the database type, the rating is rounded to the nearest integer so that Immich works with images that have such ratings in their metadata.

* Change Metadata validateRange to always round.

* Update server/src/services/metadata.service.ts

Co-authored-by: Daniel Dietzler <36593685+danieldietzler@users.noreply.github.com>

---------

Co-authored-by: Daniel Dietzler <36593685+danieldietzler@users.noreply.github.com>
2025-08-04 14:29:51 -05:00
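A hedged sketch of the rounding approach described above (illustrative only; the helper name and the assumed 0–5 rating range are hypothetical, not the exact server code):

```typescript
// Minimal sketch: accept fractional EXIF ratings by rounding to the nearest
// integer before storing them in an integer column; reject unusable values.
function parseRating(value: unknown, min = 0, max = 5): number | null {
  const num = typeof value === 'number' ? value : Number(value);
  if (!Number.isFinite(num)) {
    return null; // not a usable rating
  }
  const rounded = Math.round(num);
  return rounded >= min && rounded <= max ? rounded : null;
}

console.log(parseRating(4.5)); // 5
console.log(parseRating('3')); // 3
console.log(parseRating(9));   // null: outside the assumed range
```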
Brandon Wees
67736c8fce fix(mobile): fetch serverConfig before building shared link (#20638)
fix(mobile): fetch serverConfig before trying to pull externalDomain for new shared link
2025-08-04 14:28:43 -05:00
Paweł Wojtaszko
b56a272f64 fix: adjust search bar padding and visibility based on input state (#20598) 2025-08-04 17:46:46 +00:00
shenlong
5901c2e963 fix: hide navigation bar in search page during multi-selection (#20616)
fix: hide navigation bar in search page during multiselect

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-04 12:39:40 -05:00
Alex
be85832b20 fix: add assets to album (#20626)
* fix: add assets to album

* always navigate back to the albums view from album page
2025-08-04 12:25:11 -05:00
bo0tzz
c8f9a72d3e feat: close likely duplicates (#20556)
Co-authored-by: github-actions <41898282+github-actions[bot]@users.noreply.github.com>
2025-08-04 18:15:15 +02:00
Alexandre Garnier
3d633a81c4 fix(mobile): use right translation function for pluralized ICU message format (#20404) 2025-08-04 11:53:11 +05:30
shenlong
4efbf36d82 chore: log asset name on hash failures (#20608) 2025-08-04 06:07:50 +00:00
Alden Bansemer
e2c3c39597 chore: tweak photo sphere fov and zoom speed constants (#20595) 2025-08-04 01:07:11 -05:00
github-actions
007ba1d9ef chore: version v1.137.3 2025-08-01 14:52:24 +00:00
Daniel Dietzler
4d5cd1a6b5 fix: migration if media location is set (#20532) 2025-08-01 14:49:51 +00:00
shenlong
8108f50c4e fix: guard IS_FAVORITE column with SDK check (#20511)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-08-01 05:39:59 -05:00
Alex
1b8354ed36 chore: post release tasks (#20497) 2025-08-01 05:38:52 -05:00
github-actions
9242afb4b0 chore: version v1.137.2 2025-08-01 02:45:16 +00:00
Alex
c5f14adff0 feat: drag to select beta timeline (#20456) 2025-07-31 21:29:01 -05:00
Alex
1378f22368 fix: add to album render empty app bar (#20480)
* fix: add to album render empty app bar

* set current album
2025-07-31 21:28:33 -05:00
Alex
4bd465e752 feat: change grid size with gesture (#20455) 2025-07-31 21:02:28 -05:00
github-actions
a07531be3b chore: version v1.137.1 2025-07-31 23:05:34 +00:00
Daniel Dietzler
3cdc6844a1 fix: automatic media location migration without internal assets (#20489) 2025-07-31 22:58:35 +00:00
github-actions
c3263e50fc chore: version v1.137.0 2025-07-31 20:19:26 +00:00
Alex
7391ea6ff9 chore: large file size grid view styling (#20472)
* chore: large file grid styles

* chore: large file grid styles
2025-07-31 12:52:19 -04:00
Daniel Dietzler
f972b8d514 fix: modal race conditions (#20460) 2025-07-31 07:28:45 -05:00
Jason Rasmussen
6b50d958f4 fix: incorrect next/previous action after folder view refresh (#20447) 2025-07-30 14:50:52 -05:00
Alex
27c456eb75 fix: people navigation (#20450) 2025-07-30 14:47:47 -05:00
Brandon Wees
e7d051db3c feat: drift edit time and date action (#20377)
* feat: drift edit time and date action

* feat: add edit button on asset viewer bottom sheet

* update localDateTime column in addition to createdAt to keep consistency

* fix: don't update local dateTime

The server calculates this anyway, and it will be synced when the change is applied. We don't use localDateTime on mobile, so there is no reason to update this value.

* fix: padding around edit icon in ListTile

Co-authored-by: shenlong <139912620+shenlong-tanwen@users.noreply.github.com>

* chore: format

* fix: hide date edit control when asset does not have a remote

* fix: pull timezones correctly from image

---------

Co-authored-by: shenlong <139912620+shenlong-tanwen@users.noreply.github.com>
2025-07-30 14:40:13 -05:00
Peter Ombodi
86d31d7d29 fix(download): handle completed downloads and refresh albums (#20241)
* fix(download): handle completed downloads and refresh albums

* fix(download): remove use of outdated AlbumService

---------

Co-authored-by: Peter Ombodi <peter.ombodi@gmail.com>
2025-07-30 14:33:55 -05:00
shenlong
f416342eff fix: clear local file cache before upload (#20448)
* clear local file cache before upload

* clear cache during hashing

* fix test

* add button to clear cache manually

* add button to clear cache manually

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-07-30 14:32:38 -05:00
Cédric
d73335ecbc docs: add config example for Authelia (#20223) 2025-07-30 19:13:19 +00:00
363 changed files with 18955 additions and 6164 deletions

2
.github/.nvmrc vendored

@@ -1 +1 @@
22.17.1
22.18.0

96
.github/workflows/close-duplicates.yml vendored Normal file

@@ -0,0 +1,96 @@
on:
issues:
types: [opened]
discussion:
types: [created]
name: Close likely duplicates
permissions: {}
jobs:
get_body:
runs-on: ubuntu-latest
env:
EVENT: ${{ toJSON(github.event) }}
outputs:
body: ${{ steps.get_body.outputs.body }}
steps:
- id: get_body
run: |
BODY=$(echo """$EVENT""" | jq -r '.issue // .discussion | .body' | base64 -w 0)
echo "body=$BODY" >> $GITHUB_OUTPUT
get_checkbox_json:
runs-on: ubuntu-latest
needs: get_body
container:
image: yshavit/mdq:0.7.2
outputs:
json: ${{ steps.get_checkbox.outputs.json }}
steps:
- id: get_checkbox
env:
BODY: ${{ needs.get_body.outputs.body }}
run: |
JSON=$(echo "$BODY" | base64 -d | /mdq --output json '# I have searched | - [?] Yes')
echo "json=$JSON" >> $GITHUB_OUTPUT
close_and_comment:
runs-on: ubuntu-latest
needs: get_checkbox_json
if: ${{ !fromJSON(needs.get_checkbox_json.outputs.json).items[0].list[0].checked }}
permissions:
issues: write
discussions: write
steps:
- name: Close issue
if: ${{ github.event_name == 'issues' }}
env:
GH_TOKEN: ${{ github.token }}
NODE_ID: ${{ github.event.issue.node_id }}
run: |
gh api graphql \
-f issueId="$NODE_ID" \
-f body="This issue has automatically been closed as it is likely a duplicate. We get a lot of duplicate threads each day, which is why we ask you in the template to confirm that you searched for duplicates before opening one." \
-f query='
mutation CommentAndCloseIssue($issueId: ID!, $body: String!) {
addComment(input: {
subjectId: $issueId,
body: $body
}) {
__typename
}
closeIssue(input: {
issueId: $issueId,
stateReason: DUPLICATE
}) {
__typename
}
}'
- name: Close discussion
if: ${{ github.event_name == 'discussion' && github.event.discussion.category.name == 'Feature Request' }}
env:
GH_TOKEN: ${{ github.token }}
NODE_ID: ${{ github.event.discussion.node_id }}
run: |
gh api graphql \
-f discussionId="$NODE_ID" \
-f body="This discussion has automatically been closed as it is likely a duplicate. We get a lot of duplicate threads each day, which is why we ask you in the template to confirm that you searched for duplicates before opening one." \
-f query='
mutation CommentAndCloseDiscussion($discussionId: ID!, $body: String!) {
addDiscussionComment(input: {
discussionId: $discussionId,
body: $body
}) {
__typename
}
closeDiscussion(input: {
discussionId: $discussionId,
reason: DUPLICATE
}) {
__typename
}
}'


@@ -50,7 +50,7 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@4e828ff8d448a8a6e532957b1811f387a63867e8 # v3.29.4
uses: github/codeql-action/init@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@@ -63,7 +63,7 @@ jobs:
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@4e828ff8d448a8a6e532957b1811f387a63867e8 # v3.29.4
uses: github/codeql-action/autobuild@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
@@ -76,6 +76,6 @@ jobs:
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@4e828ff8d448a8a6e532957b1811f387a63867e8 # v3.29.4
uses: github/codeql-action/analyze@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
with:
category: '/language:${{matrix.language}}'


@@ -20,7 +20,7 @@ jobs:
remove-label:
runs-on: ubuntu-latest
if: ${{ github.event.action == 'closed' && contains(github.event.pull_request.labels.*.name, 'preview') }}
if: ${{ (github.event.action == 'closed' || github.event.pull_request.head.repo.fork) && contains(github.event.pull_request.labels.*.name, 'preview') }}
permissions:
pull-requests: write
steps:
@@ -33,3 +33,15 @@ jobs:
repo: context.repo.repo,
name: 'preview'
})
- uses: mshick/add-pr-comment@b8f338c590a895d50bcbfa6c5859251edc8952fc # v2.8.2
if: ${{ github.event.pull_request.head.repo.fork }}
with:
message-id: 'preview-status'
message: 'PRs from forks cannot have preview environments.'
- uses: mshick/add-pr-comment@b8f338c590a895d50bcbfa6c5859251edc8952fc # v2.8.2
if: ${{ !github.event.pull_request.head.repo.fork }}
with:
message-id: 'preview-status'
message: 'Preview environment has been removed.'


@@ -129,7 +129,7 @@ jobs:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload SARIF file
uses: github/codeql-action/upload-sarif@4e828ff8d448a8a6e532957b1811f387a63867e8 # v3.29.4
uses: github/codeql-action/upload-sarif@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
with:
sarif_file: results.sarif
category: zizmor


@@ -10,6 +10,9 @@ dev-update:
dev-scale:
@trap 'make dev-down' EXIT; COMPOSE_BAKE=true docker compose -f ./docker/docker-compose.dev.yml up --build -V --scale immich-server=3 --remove-orphans
dev-docs:
npm --prefix docs run start
.PHONY: e2e
e2e:
@trap 'make e2e-down' EXIT; COMPOSE_BAKE=true docker compose -f ./e2e/docker-compose.yml up --build -V --remove-orphans


@@ -1 +1 @@
22.17.1
22.18.0

178
cli/package-lock.json generated

@@ -1,12 +1,12 @@
{
"name": "@immich/cli",
"version": "2.2.73",
"version": "2.2.77",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@immich/cli",
"version": "2.2.73",
"version": "2.2.77",
"license": "GNU Affero General Public License version 3",
"dependencies": {
"chokidar": "^4.0.3",
@@ -27,7 +27,7 @@
"@types/lodash-es": "^4.17.12",
"@types/micromatch": "^4.0.9",
"@types/mock-fs": "^4.13.1",
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"@vitest/coverage-v8": "^3.0.0",
"byte-size": "^9.0.0",
"cli-progress": "^3.12.0",
@@ -35,7 +35,7 @@
"eslint": "^9.14.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.1.3",
"eslint-plugin-unicorn": "^59.0.0",
"eslint-plugin-unicorn": "^60.0.0",
"globals": "^16.0.0",
"mock-fs": "^5.2.0",
"prettier": "^3.2.5",
@@ -54,14 +54,14 @@
},
"../open-api/typescript-sdk": {
"name": "@immich/sdk",
"version": "1.136.0",
"version": "1.137.3",
"dev": true,
"license": "GNU Affero General Public License version 3",
"dependencies": {
"@oazapfts/runtime": "^1.0.2"
},
"devDependencies": {
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"typescript": "^5.3.3"
}
},
@@ -90,9 +90,9 @@
}
},
"node_modules/@babel/helper-validator-identifier": {
"version": "7.25.9",
"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.25.9.tgz",
"integrity": "sha512-Ed61U6XJc3CVRfkERJWDz4dJwKe7iLmmJsbOGu9wSloNSFttHV0I8g6UAgb7qnK5ly5bGLPd4oXZlxCdANBOWQ==",
"version": "7.27.1",
"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.27.1.tgz",
"integrity": "sha512-D2hP9eA+Sqx1kBZgzxZh0y1trbuU+JoDkiEwqhQ36nodYqJwyEIhPSdMNd7lOm/4io72luTPWH20Yda0xOuUow==",
"dev": true,
"license": "MIT",
"engines": {
@@ -632,9 +632,9 @@
}
},
"node_modules/@eslint/core": {
"version": "0.14.0",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.14.0.tgz",
"integrity": "sha512-qIbV0/JZr7iSDjqAc60IqbLdsj9GDt16xQtWD+B78d/HAlvysGdZZ6rpJHGAc2T0FQx1X6thsSPdnoiGKdNtdg==",
"version": "0.15.1",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.15.1.tgz",
"integrity": "sha512-bkOp+iumZCCbt1K1CmWf0R9pM5yKpDv+ZXtvSyQpudrI9kuFLp+bM2WOPXImuD/ceQuaa8f5pj93Y7zyECIGNA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
@@ -682,9 +682,9 @@
}
},
"node_modules/@eslint/js": {
"version": "9.31.0",
"resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.31.0.tgz",
"integrity": "sha512-LOm5OVt7D4qiKCqoiPbA7LWmI+tbw1VbTUowBcUMgQSuM6poJufkFkYDcQpo5KfgD39TnNySV26QjOh7VFpSyw==",
"version": "9.32.0",
"resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.32.0.tgz",
"integrity": "sha512-BBpRFZK3eX6uMLKz8WxFOBIFFcGFJ/g8XuwjTHCqHROSIsopI+ddn/d5Cfh36+7+e5edVS8dbSHnBNhrLEX0zg==",
"dev": true,
"license": "MIT",
"engines": {
@@ -705,13 +705,13 @@
}
},
"node_modules/@eslint/plugin-kit": {
"version": "0.3.1",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.3.1.tgz",
"integrity": "sha512-0J+zgWxHN+xXONWIyPWKFMgVuJoZuGiIFu8yxk7RJjxkzpGmyja5wRFqZIVtjDVOQpV+Rw0iOAjYPE2eQyjr0w==",
"version": "0.3.4",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.3.4.tgz",
"integrity": "sha512-Ul5l+lHEcw3L5+k8POx6r74mxEYKG5kOb6Xpy2gCRW6zweT6TEhAf8vhxGgjhqrd/VO/Dirhsb+1hNpD1ue9hw==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@eslint/core": "^0.14.0",
"@eslint/core": "^0.15.1",
"levn": "^0.4.1"
},
"engines": {
@@ -1355,9 +1355,9 @@
}
},
"node_modules/@types/node": {
"version": "22.16.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.16.5.tgz",
"integrity": "sha512-bJFoMATwIGaxxx8VJPeM8TonI8t579oRvgAuT8zFugJsJZgzqv0Fu8Mhp68iecjzG7cnN3mO2dJQ5uUM2EFrgQ==",
"version": "22.17.0",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.17.0.tgz",
"integrity": "sha512-bbAKTCqX5aNVryi7qXVMi+OkB3w/OyblodicMbvE38blyAz7GxXf6XYhklokijuPwwVg9sDLKRxt0ZHXQwZVfQ==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -1897,9 +1897,9 @@
}
},
"node_modules/browserslist": {
"version": "4.24.4",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.24.4.tgz",
"integrity": "sha512-KDi1Ny1gSePi1vm0q4oxSF8b4DR44GF4BbmS2YdhPLOEqd8pDviZOGH/GsmRwoWJ2+5Lr085X7naowMwKHDG1A==",
"version": "4.25.1",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.25.1.tgz",
"integrity": "sha512-KGj0KoOMXLpSNkkEI6Z6mShmQy0bc1I+T7K9N81k4WWMrfz+6fQ6es80B/YLAeRoKvjYE1YSHHOW1qe9xIVzHw==",
"dev": true,
"funding": [
{
@@ -1917,10 +1917,10 @@
],
"license": "MIT",
"dependencies": {
"caniuse-lite": "^1.0.30001688",
"electron-to-chromium": "^1.5.73",
"caniuse-lite": "^1.0.30001726",
"electron-to-chromium": "^1.5.173",
"node-releases": "^2.0.19",
"update-browserslist-db": "^1.1.1"
"update-browserslist-db": "^1.1.3"
},
"bin": {
"browserslist": "cli.js"
@@ -1981,9 +1981,9 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001713",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001713.tgz",
"integrity": "sha512-wCIWIg+A4Xr7NfhTuHdX+/FKh3+Op3LBbSp2N5Pfx6T/LhdQy3GTyoTg48BReaW/MyMNZAkTadsBtai3ldWK0Q==",
"version": "1.0.30001731",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001731.tgz",
"integrity": "sha512-lDdp2/wrOmTRWuoB5DpfNkC0rJDU8DqRa6nYL6HK6sytw70QMopt/NIc/9SM7ylItlBWfACXk0tEn37UWM/+mg==",
"dev": true,
"funding": [
{
@@ -2035,6 +2035,13 @@
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/change-case": {
"version": "5.4.4",
"resolved": "https://registry.npmjs.org/change-case/-/change-case-5.4.4.tgz",
"integrity": "sha512-HRQyTk2/YPEkt9TnUPbOpr64Uw3KOicFWPVBb+xiHvd6eBx/qPr9xqfBFDT8P2vWsvvz4jbEkfDe71W3VyNu2w==",
"dev": true,
"license": "MIT"
},
"node_modules/check-error": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/check-error/-/check-error-2.1.1.tgz",
@@ -2061,9 +2068,9 @@
}
},
"node_modules/ci-info": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/ci-info/-/ci-info-4.2.0.tgz",
"integrity": "sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==",
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/ci-info/-/ci-info-4.3.0.tgz",
"integrity": "sha512-l+2bNRMiQgcfILUi33labAZYIWlH1kWDp+ecNo5iisRKrbm0xcRyCww71/YU0Fkw0mAFpz9bJayXPjey6vkmaQ==",
"dev": true,
"funding": [
{
@@ -2150,13 +2157,13 @@
"license": "MIT"
},
"node_modules/core-js-compat": {
"version": "3.41.0",
"resolved": "https://registry.npmjs.org/core-js-compat/-/core-js-compat-3.41.0.tgz",
"integrity": "sha512-RFsU9LySVue9RTwdDVX/T0e2Y6jRYWXERKElIjpuEOEnxaXffI0X7RUwVzfYLfzuLXSNJDYoRYUAmRUcyln20A==",
"version": "3.45.0",
"resolved": "https://registry.npmjs.org/core-js-compat/-/core-js-compat-3.45.0.tgz",
"integrity": "sha512-gRoVMBawZg0OnxaVv3zpqLLxaHmsubEGyTnqdpI/CEBvX4JadI1dMSHxagThprYRtSVbuQxvi6iUatdPxohHpA==",
"dev": true,
"license": "MIT",
"dependencies": {
"browserslist": "^4.24.4"
"browserslist": "^4.25.1"
},
"funding": {
"type": "opencollective",
@@ -2221,9 +2228,9 @@
"license": "MIT"
},
"node_modules/electron-to-chromium": {
"version": "1.5.137",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.137.tgz",
"integrity": "sha512-/QSJaU2JyIuTbbABAo/crOs+SuAZLS+fVVS10PVrIT9hrRkmZl8Hb0xPSkKRUUWHQtYzXHpQUW3Dy5hwMzGZkA==",
"version": "1.5.195",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.195.tgz",
"integrity": "sha512-URclP0iIaDUzqcAyV1v2PgduJ9N0IdXmWsnPzPfelvBmjmZzEy6xJcjb1cXj+TbYqXgtLrjHEoaSIdTYhw4ezg==",
"dev": true,
"license": "ISC"
},
@@ -2306,9 +2313,9 @@
}
},
"node_modules/eslint": {
"version": "9.31.0",
"resolved": "https://registry.npmjs.org/eslint/-/eslint-9.31.0.tgz",
"integrity": "sha512-QldCVh/ztyKJJZLr4jXNUByx3gR+TDYZCRXEktiZoUR3PGy4qCmSbkxcIle8GEwGpb5JBZazlaJ/CxLidXdEbQ==",
"version": "9.32.0",
"resolved": "https://registry.npmjs.org/eslint/-/eslint-9.32.0.tgz",
"integrity": "sha512-LSehfdpgMeWcTZkWZVIJl+tkZ2nuSkyyB9C27MZqFWXuph7DvaowgcTvKqxvpLW1JZIk8PN7hFY3Rj9LQ7m7lg==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -2318,8 +2325,8 @@
"@eslint/config-helpers": "^0.3.0",
"@eslint/core": "^0.15.0",
"@eslint/eslintrc": "^3.3.1",
"@eslint/js": "9.31.0",
"@eslint/plugin-kit": "^0.3.1",
"@eslint/js": "9.32.0",
"@eslint/plugin-kit": "^0.3.4",
"@humanfs/node": "^0.16.6",
"@humanwhocodes/module-importer": "^1.0.1",
"@humanwhocodes/retry": "^0.4.2",
@@ -2414,65 +2421,39 @@
}
},
"node_modules/eslint-plugin-unicorn": {
"version": "59.0.1",
"resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-59.0.1.tgz",
"integrity": "sha512-EtNXYuWPUmkgSU2E7Ttn57LbRREQesIP1BiLn7OZLKodopKfDXfBUkC/0j6mpw2JExwf43Uf3qLSvrSvppgy8Q==",
"version": "60.0.0",
"resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-60.0.0.tgz",
"integrity": "sha512-QUzTefvP8stfSXsqKQ+vBQSEsXIlAiCduS/V1Em+FKgL9c21U/IIm20/e3MFy1jyCf14tHAhqC1sX8OTy6VUCg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@babel/helper-validator-identifier": "^7.25.9",
"@eslint-community/eslint-utils": "^4.5.1",
"@eslint/plugin-kit": "^0.2.7",
"ci-info": "^4.2.0",
"@babel/helper-validator-identifier": "^7.27.1",
"@eslint-community/eslint-utils": "^4.7.0",
"@eslint/plugin-kit": "^0.3.3",
"change-case": "^5.4.4",
"ci-info": "^4.3.0",
"clean-regexp": "^1.0.0",
"core-js-compat": "^3.41.0",
"core-js-compat": "^3.44.0",
"esquery": "^1.6.0",
"find-up-simple": "^1.0.1",
"globals": "^16.0.0",
"globals": "^16.3.0",
"indent-string": "^5.0.0",
"is-builtin-module": "^5.0.0",
"jsesc": "^3.1.0",
"pluralize": "^8.0.0",
"regexp-tree": "^0.1.27",
"regjsparser": "^0.12.0",
"semver": "^7.7.1",
"semver": "^7.7.2",
"strip-indent": "^4.0.0"
},
"engines": {
"node": "^18.20.0 || ^20.10.0 || >=21.0.0"
"node": "^20.10.0 || >=21.0.0"
},
"funding": {
"url": "https://github.com/sindresorhus/eslint-plugin-unicorn?sponsor=1"
},
"peerDependencies": {
"eslint": ">=9.22.0"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/@eslint/core": {
"version": "0.13.0",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.13.0.tgz",
"integrity": "sha512-yfkgDw1KR66rkT5A8ci4irzDysN7FRpq3ttJolR88OqQikAWqwA8j5VZyas+vjyBNFIJ7MfybJ9plMILI2UrCw==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@types/json-schema": "^7.0.15"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/@eslint/plugin-kit": {
"version": "0.2.8",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.2.8.tgz",
"integrity": "sha512-ZAoA40rNMPwSm+AeHpCq8STiNAwzWLJuP8Xv4CHIc9wv/PSuExjMrmjfYNj682vW0OOiZ1HKxzvjQr9XZIisQA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@eslint/core": "^0.13.0",
"levn": "^0.4.1"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
"eslint": ">=9.29.0"
}
},
"node_modules/eslint-scope": {
@@ -2505,19 +2486,6 @@
"url": "https://opencollective.com/eslint"
}
},
"node_modules/eslint/node_modules/@eslint/core": {
"version": "0.15.1",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.15.1.tgz",
"integrity": "sha512-bkOp+iumZCCbt1K1CmWf0R9pM5yKpDv+ZXtvSyQpudrI9kuFLp+bM2WOPXImuD/ceQuaa8f5pj93Y7zyECIGNA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@types/json-schema": "^7.0.15"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
}
},
"node_modules/espree": {
"version": "10.4.0",
"resolved": "https://registry.npmjs.org/espree/-/espree-10.4.0.tgz",
@@ -3727,9 +3695,9 @@
}
},
"node_modules/semver": {
"version": "7.7.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.1.tgz",
"integrity": "sha512-hlq8tAfn0m/61p4BVRcPzIGr6LKiMwo4VM6dGi6pt4qcRkmNzTcWq6eCEjEh+qXjkMDvPlOFFSGwQjoEa6gyMA==",
"version": "7.7.2",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.2.tgz",
"integrity": "sha512-RF0Fw+rO5AMf9MAyaRXI4AV0Ulj5lMHqVxxdSgiVbixSCXoEmmX/jk0CuJw4+3SqroYO9VoUh+HcuJivvtJemA==",
"dev": true,
"license": "ISC",
"bin": {
@@ -4211,15 +4179,15 @@
}
},
"node_modules/vite": {
"version": "7.0.5",
"resolved": "https://registry.npmjs.org/vite/-/vite-7.0.5.tgz",
"integrity": "sha512-1mncVwJxy2C9ThLwz0+2GKZyEXuC3MyWtAAlNftlZZXZDP3AJt5FmwcMit/IGGaNZ8ZOB2BNO/HFUB+CpN0NQw==",
"version": "7.0.6",
"resolved": "https://registry.npmjs.org/vite/-/vite-7.0.6.tgz",
"integrity": "sha512-MHFiOENNBd+Bd9uvc8GEsIzdkn1JxMmEeYX35tI3fv0sJBUTfW5tQsoaOwuY4KhBI09A3dUJ/DXf2yxPVPUceg==",
"dev": true,
"license": "MIT",
"dependencies": {
"esbuild": "^0.25.0",
"fdir": "^6.4.6",
"picomatch": "^4.0.2",
"picomatch": "^4.0.3",
"postcss": "^8.5.6",
"rollup": "^4.40.0",
"tinyglobby": "^0.2.14"


@@ -1,6 +1,6 @@
{
"name": "@immich/cli",
"version": "2.2.73",
"version": "2.2.77",
"description": "Command Line Interface (CLI) for Immich",
"type": "module",
"exports": "./dist/index.js",
@@ -21,7 +21,7 @@
"@types/lodash-es": "^4.17.12",
"@types/micromatch": "^4.0.9",
"@types/mock-fs": "^4.13.1",
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"@vitest/coverage-v8": "^3.0.0",
"byte-size": "^9.0.0",
"cli-progress": "^3.12.0",
@@ -29,7 +29,7 @@
"eslint": "^9.14.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.1.3",
"eslint-plugin-unicorn": "^59.0.0",
"eslint-plugin-unicorn": "^60.0.0",
"globals": "^16.0.0",
"mock-fs": "^5.2.0",
"prettier": "^3.2.5",
@@ -69,6 +69,6 @@
"micromatch": "^4.0.8"
},
"volta": {
"node": "22.17.1"
"node": "22.18.0"
}
}


@@ -95,7 +95,7 @@ services:
command: ['./run.sh', '-disable-reporting']
ports:
- 3000:3000
image: grafana/grafana:12.0.2-ubuntu@sha256:0512d81cdeaaff0e370a9aa66027b465d1f1f04379c3a9c801a905fabbdbc7a5
image: grafana/grafana:12.1.0-ubuntu@sha256:397aa30dd1af16cb6c5c9879498e467973a7f87eacf949f6d5a29407a3843809
volumes:
- grafana-data:/var/lib/grafana

4
docs/.gitignore vendored

@@ -18,4 +18,6 @@
npm-debug.log*
yarn-debug.log*
yarn-error.log*
yarn.lock
yarn.lock
/static/openapi.json


@@ -1 +1 @@
22.17.1
22.18.0


@@ -64,7 +64,7 @@ Once you have a new OAuth client application configured, Immich can be configure
| Storage Label Claim | string | preferred_username | Claim mapping for the user's storage label**¹** |
| Role Claim | string | immich_role | Claim mapping for the user's role. (should return "user" or "admin")**¹** |
| Storage Quota Claim | string | immich_quota | Claim mapping for the user's storage**¹** |
| Default Storage Quota (GiB) | number | 0 | Default quota for user without storage quota claim (Enter 0 for unlimited quota) |
| Default Storage Quota (GiB) | number | 0 | Default quota for user without storage quota claim (empty for unlimited quota) |
| Button Text | string | Login with OAuth | Text for the OAuth button on the web |
| Auto Register | boolean | true | When true, will automatically register a user the first time they sign in |
| [Auto Launch](#auto-launch) | boolean | false | When true, will skip the login page and automatically start the OAuth login process |
@@ -106,6 +106,89 @@ Immich has a route (`/api/oauth/mobile-redirect`) that is already configured to
## Example Configuration
<details>
<summary>Authelia Example</summary>
### Authelia Example
Here's an example of OAuth configured for Authelia:
This assumes an attribute `immichquota` exists in the user schema, which is used to set the user's storage quota in Immich.
The configuration concerning the quota is optional.
```yaml
authentication_backend:
  ldap:
    # The LDAP server configuration goes here.
    # See: https://www.authelia.com/c/ldap
    attributes:
      extra:
        immichquota: # The attribute name from LDAP
          name: 'immich_quota'
          multi_valued: false
          value_type: 'integer'

identity_providers:
  oidc:
    ## The other portions of the mandatory OpenID Connect 1.0 configuration go here.
    ## See: https://www.authelia.com/c/oidc
    claims_policies:
      immich_policy:
        custom_claims:
          immich_quota:
            attribute: 'immich_quota'
    scopes:
      immich_scope:
        claims:
          - 'immich_quota'
    clients:
      - client_id: 'immich'
        client_name: 'Immich'
        # https://www.authelia.com/integration/openid-connect/frequently-asked-questions/#how-do-i-generate-a-client-identifier-or-client-secret
        client_secret: '$pbkdf2-sha512$310000$c8p78n7pUMln0jzvd4aK4Q$JNRBzwAo0ek5qKn50cFzzvE9RXV88h1wJn5KGiHrD0YKtZaR/nCb2CJPOsKaPK0hjf.9yHxzQGZziziccp6Yng'
        public: false
        require_pkce: false
        redirect_uris:
          - 'https://example.immich.app/auth/login'
          - 'https://example.immich.app/user-settings'
          - 'app.immich:///oauth-callback'
        scopes:
          - 'openid'
          - 'profile'
          - 'email'
          - 'immich_scope'
        claims_policy: 'immich_policy'
        response_types:
          - 'code'
        grant_types:
          - 'authorization_code'
        id_token_signed_response_alg: 'RS256'
        userinfo_signed_response_alg: 'RS256'
        token_endpoint_auth_method: 'client_secret_post'
```
Configuration of OAuth in Immich System Settings
| Setting | Value |
| ---------------------------------- | ------------------------------------------------------------------- |
| Issuer URL | `https://example.immich.app/.well-known/openid-configuration` |
| Client ID | immich |
| Client Secret | 0v89FXkQOWO\***\*\*\*\*\***\*\*\***\*\*\*\*\***mprbvXD549HH6s1iw... |
| Token Endpoint Auth Method | client_secret_post |
| Scope | openid email profile immich_scope |
| ID Token Signed Response Algorithm | RS256 |
| Userinfo Signed Response Algorithm | RS256 |
| Storage Label Claim | uid |
| Storage Quota Claim | immich_quota |
| Default Storage Quota (GiB) | 0 (empty for unlimited quota) |
| Button Text | Sign in with Authelia (optional) |
| Auto Register | Enabled (optional) |
| Auto Launch | Enabled (optional) |
| Mobile Redirect URI Override | Disable |
| Mobile Redirect URI | |
</details>
<details>
<summary>Authentik Example</summary>
@@ -128,7 +211,7 @@ Configuration of OAuth in Immich System Settings
| Signing Algorithm | RS256 |
| Storage Label Claim | preferred_username |
| Storage Quota Claim | immich_quota |
| Default Storage Quota (GiB) | 0 (0 for unlimited quota) |
| Default Storage Quota (GiB) | 0 (empty for unlimited quota) |
| Button Text | Sign in with Authentik (optional) |
| Auto Register | Enabled (optional) |
| Auto Launch | Enabled (optional) |
@@ -159,7 +242,7 @@ Configuration of OAuth in Immich System Settings
| Signing Algorithm | RS256 |
| Storage Label Claim | preferred_username |
| Storage Quota Claim | immich_quota |
| Default Storage Quota (GiB) | 0 (0 for unlimited quota) |
| Default Storage Quota (GiB) | 0 (empty for unlimited quota) |
| Button Text | Sign in with Google (optional) |
| Auto Register | Enabled (optional) |
| Auto Launch | Enabled |


@@ -2,10 +2,6 @@
Users can deploy a custom reverse proxy that forwards requests to Immich. This way, the reverse proxy can handle TLS termination, load balancing, or other advanced features. All reverse proxies between Immich and the user must forward all headers and set the `Host`, `X-Real-IP`, `X-Forwarded-Proto` and `X-Forwarded-For` headers to their appropriate values. Additionally, your reverse proxy should allow for big enough uploads. By following these practices, you ensure that all custom reverse proxies are fully compatible with Immich.
:::note
The Repair page can take a long time to load. To avoid server timeouts or errors, we recommend specifying a timeout of at least 10 minutes on your proxy server.
:::
:::caution
Immich does not support being served on a sub-path such as `location /immich {`. It has to be served on the root path of a (sub)domain.
:::

Several binary image files were added, removed, or updated (image diffs not shown).

@@ -2,6 +2,9 @@
sidebar_position: 80
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# TrueNAS [Community]
:::note
@@ -9,211 +12,324 @@ This is a community contribution and not officially supported by the Immich team
Community support can be found in the dedicated channel on the [Discord Server](https://discord.immich.app/).
**Please report app issues to the corresponding [Github Repository](https://github.com/truenas/apps/tree/master/trains/community/immich).**
**Please report app issues to the corresponding [GitHub Repository](https://github.com/truenas/apps/tree/master/trains/community/immich).**
:::
:::warning
This guide covers the installation of Immich on TrueNAS Community Edition 24.10.2.2 (Electric Eel) and later.
We recommend keeping TrueNAS Community Edition and Immich relatively up to date with the latest versions to avoid any issues.
If you are using an older version of TrueNAS, we ask that you upgrade to the latest version before installing Immich. Check the [TrueNAS Community Edition Release Notes](https://www.truenas.com/docs/softwarereleases/) for more information on breaking changes, new features, and how to upgrade your system.
:::
Immich can easily be installed on TrueNAS Community Edition via the **Community** train application.
Consider reviewing the TrueNAS [Apps resources](https://apps.truenas.com/getting-started/) if you have not previously configured applications on your system.
TrueNAS Community Edition makes installing and updating Immich easy, but you must use the Immich web portal and mobile app to configure accounts and access libraries.
## First Steps
The Immich app in TrueNAS Community Edition installs, completes the initial configuration, then starts the Immich web portal.
When updates become available, TrueNAS alerts and provides easy updates.
Before installing the Immich app in TrueNAS, review the [Environment Variables](#environment-variables) documentation to see if you want to configure any during installation.
You may also configure environment variables at any time after deploying the application.
### Setting up Storage Datasets
Before beginning app installation, [create the datasets](https://www.truenas.com/docs/scale/scaletutorials/storage/datasets/datasetsscale/) to use in the **Storage Configuration** section during installation.
Immich requires seven datasets: `library`, `upload`, `thumbs`, `profile`, `video`, `backups`, and `pgData`.
You can organize these as one parent with seven child datasets, for example `/mnt/tank/immich/library`, `/mnt/tank/immich/upload`, and so on.
In TrueNAS, Immich requires 2 datasets for the application to function correctly: `data` and `pgData`. You can set the datasets to any names to match your naming conventions or preferences.
You can organize these as one parent with two child datasets, for example `/mnt/tank/immich/data` and `/mnt/tank/immich/pgData`.
<img
src={require('./img/truenas12.webp').default}
width="30%"
src={require('./img/truenas/truenas00.webp').default}
width="40%"
alt="Immich App Widget"
className="border rounded-xl"
/>
:::info Permissions
The **pgData** dataset must be owned by the user `netdata` (UID 999) for postgres to start. The other datasets must be owned by the user `root` (UID 0) or a group that includes the user `root` (UID 0) for immich to have the necessary permissions.
:::info Datasets Permissions
If the **library** dataset uses ACL it must have [ACL mode](https://www.truenas.com/docs/core/coretutorials/storage/pools/permissions/#access-control-lists) set to `Passthrough` if you plan on using a [storage template](/docs/administration/storage-template.mdx) and the dataset is configured for network sharing (its ACL type is set to `SMB/NFSv4`). When the template is applied and files need to be moved from **upload** to **library**, Immich performs `chmod` internally and needs to be allowed to execute the command. [More info.](https://github.com/immich-app/immich/pull/13017)
The **pgData** dataset must be owned by the user `netdata` (UID 999) for Postgres to start.
The `data` dataset must grant the **_modify_** permission to the user that will run Immich.
As of TrueNAS Community Edition 24.10.2.2, Immich can run as any user and group; the default user is `apps` (UID 568) and the default group is `apps` (GID 568). This user, whether `apps` or another user you choose, must have **_modify_** permissions on the **data** dataset.
For an easy setup:
- Create the parent dataset `immich` keeping the default **Generic** preset.
- Select `Dataset Preset` **Apps** instead of **Generic** when creating the `data` dataset. This will automatically give the correct permissions to the dataset. If you want to use another user for Immich, you can keep the **Generic** preset, but you will need to give the **_modify_** permission to that other user.
- For the `pgData` dataset, you can keep the default preset **Generic** as permissions can be set during the installation of the Immich app (See [Storage Configuration](#storage-configuration) section).
:::
:::tip
To improve performance, Immich recommends using SSDs for the database. If you have a pool made of SSDs, you can create the `pgData` dataset on that pool.
Thumbnails can also be stored on the SSDs for faster access. This is an advanced option and not required for Immich to run. More information on how you can use multiple datasets to manage Immich storage in a finer-grained manner can be found in the [Advanced: Multiple Datasets for Immich Storage](#advanced-multiple-datasets-for-immich-storage) section below.
:::
:::warning
If you just created the datasets using the **Apps** preset, you can skip this warning section.
If the **data** dataset uses ACL it must have [ACL mode](https://www.truenas.com/docs/scale/scaletutorials/datasets/permissionsscale/) set to `Passthrough` if you plan on using a [storage template](/docs/administration/storage-template.mdx) and the dataset is configured for network sharing (its ACL type is set to `SMB/NFSv4`). When the template is applied and files need to be moved from **upload** to **library** (internal folder created by Immich within the **data** dataset), Immich performs `chmod` internally and must be allowed to execute the command. [More info.](https://github.com/immich-app/immich/pull/13017)
To change or verify the ACL mode, go to the **Datasets** screen, select the **library** dataset, click on the **Edit** button next to **Dataset Details**, then click on the **Advanced Options** tab, scroll down to the **ACL Mode** section, and select `Passthrough` from the dropdown menu. Click **Save** to apply the changes. If the option is greyed out, set the **ACL Type** to `SMB/NFSv4` first, then you can change the **ACL Mode** to `Passthrough`.
:::
## Installing the Immich Application
To install the **Immich** application, go to **Apps**, click **Discover Apps**, either begin typing Immich into the search field or scroll down to locate the **Immich** application widget.
To install the **Immich** application, go to **Apps**, click **Discover Apps**, and either begin typing Immich into the search field or scroll down to locate the **Immich** application widget.
<div style={{ marginBottom: '2rem', border: '1px solid #ccc', padding: '1rem', borderRadius: '8px' }}>
Click on the widget to open the **Immich** application details screen.
<img
src={require('./img/truenas01.webp').default}
src={require('./img/truenas/truenas01.webp').default}
width="50%"
alt="Immich App Widget"
className="border rounded-xl"
/>
Click on the widget to open the **Immich** application details screen.
</div>
<br/><br/>
<div style={{ marginBottom: '2rem', border: '1px solid #ccc', padding: '1rem', borderRadius: '8px' }}>
Click **Install** to open the Immich application configuration screen.
<img
src={require('./img/truenas02.webp').default}
src={require('./img/truenas/truenas02.webp').default}
width="100%"
alt="Immich App Details Screen"
className="border rounded-xl"
/>
<br/><br/>
</div>
Application configuration settings are presented in several sections, each explained below.
To find specific fields, click in the **Search Input Fields** search field, scroll down to a particular section, or click on the section heading on the navigation area in the upper-right corner.
### Application Name and Version
<img
src={require('./img/truenas/truenas03.webp').default}
width="100%"
alt="Install Immich Screen"
className="border rounded-xl"
className="border rounded-xl mb-4"
/>
Keep the default value or enter a name in the **Application Name** field.
Change it if you're deploying a second instance.
Accept the default version number in **Version**. This is the Immich app version within the TrueNAS catalog (different from the Immich release version).
When a new version becomes available, the application shows an update badge, and the **Installed Applications** screen offers the option to update it.
### Immich Configuration
<img
src={require('./img/truenas/truenas04.webp').default}
width="40%"
alt="Configuration Settings"
className="border rounded-xl mb-4"
/>
The **Timezone** is set to the system default, which usually matches your local timezone. You can change it to another timezone if you prefer. It is only used by Immich's `exiftool` microservice when a timezone cannot be determined from the image metadata.
**Enable Machine Learning** is enabled by default. It allows Immich to use machine learning features such as face recognition, image search, and smart duplicate detection. Untick this option if you do not want to use these features.
Select the **Machine Learning Image Type** based on the hardware you have. More details here: [Hardware-Accelerated Machine Learning](/docs/features/ml-hardware-acceleration.md)
**Database Password** should be set to a custom value using only the characters `A-Za-z0-9`. This password is used to secure the Postgres database.
**Redis Password** should be set to a custom value using only the characters `A-Za-z0-9`. Preferably, use a different password from the database password.
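If you need a quick way to generate passwords restricted to those characters, a one-liner like the following works in most Linux shells (the length of 32 is arbitrary):
```bash
# Print a random 32-character password using only A-Za-z0-9
tr -dc 'A-Za-z0-9' </dev/urandom | head -c 32; echo
```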
Keep the **Log Level** to the default `Log` value.
Leave **Hugging Face Endpoint** blank. (This is used to download ML models from a different source.)
Set **Database Storage Type** to the type of storage (**HDD** or **SSD**) that the pool where the **pgData** dataset is located uses.
**Additional Environment Variables** can be left blank.
<details>
<summary>Advanced users: Adding Environment Variables</summary>
Environment variables can be set by clicking the **Add** button and filling in the **Name** and **Value** fields.
<img
src={require('./img/truenas/truenas05.webp').default}
width="40%"
alt="Environment Variables"
className="border rounded-xl"
/>
These are used to add custom configuration options or to enable specific features.
More information on available environment variables can be found in the **[environment variables documentation](/docs/install/environment-variables/)**.
:::info
Some environment variables are not available for the TrueNAS Community Edition app as they can be configured through GUI options in the [Edit Immich screen](#edit-app-settings).
Some examples are: `IMMICH_VERSION`, `UPLOAD_LOCATION`, `DB_DATA_LOCATION`, `TZ`, `IMMICH_LOG_LEVEL`, `DB_PASSWORD`, `REDIS_PASSWORD`.
:::
</details>
### User and Group Configuration
Applications in TrueNAS run as a specific user and group. Immich uses the default user `apps` (UID 568) and the default group `apps` (GID 568).
<img
src={require('./img/truenas/truenas06.webp').default}
width="40%"
alt="User and Group Configuration"
className="border rounded-xl"
/>
- **User ID**: Keep the default value `apps` (UID 568) or define a different one if needed.
- **Group ID**: Keep the default value `apps` (GID 568) or define a different one if needed.
:::warning
If you change the user or group, make sure that the datasets you created for Immich data storage have the correct permissions set for that user and group as specified in the [Setting up Storage Datasets](#setting-up-storage-datasets) section above.
:::
### Network Configuration
<img
src={require('./img/truenas/truenas07.webp').default}
width="40%"
alt="Networking Settings"
className="border rounded-xl"
/>
- **Port Bind Mode**: This lets you expose the port to the host system, allowing you to access Immich from outside the TrueNAS system. Keep the default **_Publish port on the host for external access_** value unless you have a specific reason to change it.
- **Port Number**: Keep the default port `30041` or enter a custom port number.
- **Host IPs**: Leave the default blank value.
:::info Allowed Port Numbers
On TrueNAS versions below TrueNAS Community Edition 24.10 Electric Eel, only port numbers within the range 9000-65535 may be used.
Regardless of version, to avoid port conflicts, don't use [ports on this list](https://www.truenas.com/docs/solutions/optimizations/security/#truenas-default-ports).
:::
### Storage Configuration
:::danger Default Settings (Not recommended)
The default setting for datasets is **ixVolume (dataset created automatically by the system)**. This is not recommended, as it makes your data harder to access manually and can result in data loss if you delete the Immich app. It is also harder to manage snapshots and replication tasks. We recommend using the **Host Path (Path that already exists on the system)** option instead.
:::
The storage configuration section allows you to set up the storage locations for Immich data. You can select the datasets created in the previous step.
<img
src={require('./img/truenas/truenas08.webp').default}
width="40%"
alt="Configure Storage Host Paths"
alt="Configure Storage Volumes"
className="border rounded-xl"
/>
The image above has example values.
<br/>
For the Data Storage, select **Host Path (Path that already exists on the system)** and then select the dataset you created for Immich data storage, for example, `data`.
The Machine Learning cache can be left with the default _Temporary_ setting.
For the Postgres Data Storage, select **Host Path (Path that already exists on the system)** and then select the dataset you created for Postgres data storage, for example, `pgData`.
:::info
**Postgres Data Storage**
Once **Host Path** is selected, an **_Automatic Permissions_** checkbox appears. If you have not set the ownership of the **pgData** dataset to `netdata` (UID 999), tick this box: it will automatically set the user ownership to `netdata` (UID 999) and the group ownership to `docker` (GID 999). If you have already set the ownership of the **pgData** dataset to `netdata` (UID 999), you can leave this box unticked.
:::
### Additional Storage (Advanced Users)
<details>
<summary>External Libraries</summary>
:::danger Advanced Users Only
This feature should only be used by advanced users. If this is your first time installing Immich, then DO NOT mount an external library until you have a working setup.
:::
<img
src={require('./img/truenas/truenas09.webp').default}
width="40%"
alt="Configure Storage Host Paths"
alt="Add External Libraries with Additional Storage"
className="border rounded-xl"
/>
You may configure [external libraries](/docs/features/libraries) by mounting them using **Additional Storage**.
The dataset that contains your external library files must at least give **read** access to the user running Immich (Default: `apps` (UID 568), `apps` (GID 568)).
If you want to be able to delete files or edit metadata in the external library using Immich, you will need to give the **modify** permission to the user running Immich.
- **Mount Path** is the location you will need to copy and paste into the external library settings within Immich.
- **Host Path** is the location on the TrueNAS Community Edition server where your external library is located.
- **Read Only** is a checkbox that you can tick if you want to prevent Immich from modifying the files in the external library. This is useful if you want to use Immich to view and search your external library without modifying it.
:::warning
Each mount path MUST be something unique and should NOT be your library or upload location or a Linux directory like `/lib`.
A general recommendation is to mount any external libraries to a path beginning with `/mnt` or `/media` followed by a unique name, such as `/mnt/external-libraries` or `/media/my-external-libraries`. If you plan to mount multiple external libraries, you can use paths like `/mnt/external-libraries/library1`, `/mnt/external-libraries/library2`, etc.
:::
</details>
<details>
<summary>Multiple Datasets for Immich Storage</summary>
:::danger Advanced Users Only
This feature should only be used by advanced users.
:::
Immich can use multiple datasets for its storage, allowing you to manage your data more granularly, similar to the old storage configuration. This is useful if you want to separate your data into different datasets for performance or organizational reasons. There is a general guide for this [here](/docs/guides/custom-locations), but read on for the TrueNAS guide.
Each additional dataset must give the **_modify_** permission to the user who will run Immich (default: `apps` (UID 568), `apps` (GID 568)).
As described in the [Setting up Storage Datasets](#setting-up-storage-datasets) section above, you have to create the datasets with the **Apps** preset to ensure the correct permissions are set, or you can set the permissions manually after creating the datasets.
Immich uses 6 folders for its storage: `library`, `upload`, `thumbs`, `profile`, `encoded-video`, and `backups`. You can create a dataset for each of these folders or only for some of them.
To mount these datasets:
1. Add an **Additional Storage** entry for each dataset you want to use.
2. Select **Type** as **Host Path (Path that already exists on the system)**.
3. Enter the **Mount Path** with `/data/<folder-name>`. The `<folder-name>` is the name of the folder you want to mount, for example, `library`, `upload`, `thumbs`, `profile`, `encoded-video`, or `backups`.
:::danger Important
You have to write the full path, including `/data/`, as Immich expects the data to be in that location.
If you do not include this path, Immich will not be able to find the data and will not write the data to the location you specified.
:::
4. Select the **Host Path** as the dataset you created for that folder, for example, `/mnt/tank/immich/library`, `/mnt/tank/immich/upload`, etc.
<img
src={require('./img/truenas/truenas10.webp').default}
width="40%"
alt="Use Multiple Datasets for Immich Storage with Additional Storage"
className="border rounded-xl"
/>
</details>
<!-- A section for Labels could be added, but I don't think it is needed as they are of no use for Immich. -->
### Resources Configuration
<img
src={require('./img/truenas/truenas11.webp').default}
width="40%"
alt="Resource Limits"
className="border rounded-xl"
/>
- **CPU**: Depending on your system resources, you can keep the default value of `2` threads or specify a different number (CPUs with multi-/hyper-threading have 2 threads per core). Immich recommends at least `8` threads.
- **Memory**: Limit in MB of RAM. Immich recommends at least 6000 MB (6GB). If you selected **Enable Machine Learning** in **Immich Configuration**, you should probably set this above 8000 MB.
- Enable **GPU Configuration** options if you have a GPU or CPU with integrated graphics that you will use for [Hardware Transcoding](/docs/features/hardware-transcoding) and/or [Hardware-Accelerated Machine Learning](/docs/features/ml-hardware-acceleration.md).
The process for NVIDIA GPU passthrough requires additional steps.
More details here: [GPU Passthrough Docs for TrueNAS Apps](https://apps.truenas.com/managing-apps/installing-apps/#gpu-passthrough)
Both **CPU** and **Memory** are limits, not reservations. This means that Immich can use up to the specified amount of CPU threads and RAM, but it will not reserve that amount of resources at all times. The system will allocate resources as needed, and Immich will use less than the specified amount most of the time.
:::info Older TrueNAS Versions
Before TrueNAS Community Edition version 24.10 Electric Eel:
The **CPU** value was specified in a different format with a default of `4000m`, which is 4 threads.
The **Memory** value was specified in a different format with a default of `8Gi`, which is 8 GiB of RAM. The value was specified in bytes or as a number with a measurement suffix. Examples: `129M`, `123Mi`, `1000000000`.
:::
### Install
Finally, click **Install**.
The system opens the **Installed Applications** screen with the Immich app in the **Deploying** state.
When the installation completes, it changes to **Running**.
<img
src={require('./img/truenas/truenas12.webp').default}
width="100%"
alt="Immich Installed"
className="border rounded-xl"
/>
Click **Web Portal** on the **Application Info** widget, or go to the URL `http://<your-truenas-ip>:30041` in your web browser to open the Immich web interface. This will show you the onboarding process to set up your first user account, which will be an administrator account.
After that, you can start using Immich to upload and manage your photos and videos.
:::tip
For more information on how to use the application once installed, please refer to the [Post Install](/docs/install/post-install.mdx) guide.
@@ -228,23 +344,6 @@ For more information on how to use the application once installed, please refer
- Click **Update** at the very bottom of the page to save changes.
- TrueNAS automatically updates, recreates, and redeploys the Immich container with the updated settings.
## Updating the App
:::danger
@@ -261,3 +360,116 @@ To update the app to the latest version:
- You may view the Changelog.
- Click **Upgrade** to begin the process and open a counter dialog that shows the upgrade progress.
- When complete, the update badge and buttons disappear and the application Update state on the Installed screen changes from Update Available to Up to date.
## Migration
:::danger
Perform a backup of your Immich data before proceeding with the migration steps below. This is crucial to prevent any data loss if something goes wrong during the migration process.
The migration should also be performed when the Immich app is not running to ensure no data is being written while you are copying the data.
:::
### Migration from Old Storage Configuration
There are two ways to migrate from the old storage configuration to the new one, depending on whether you want to keep the old multiple datasets or if you want to move to a double dataset configuration with a single dataset for Immich data storage and a single dataset for Postgres data storage.
:::note Old TrueNAS Versions Permissions
If you were using an older version of TrueNAS (before 24.10.2.2), the datasets, except the one for **pgData**, only had to be owned by the `root` user (UID 0). You might therefore have to grant the **modify** permission to the `apps` user (UID 568), or to whichever user you run Immich as, on all of them except **pgData**. The steps to add or change ACL permissions are described in the [TrueNAS documentation](https://www.truenas.com/docs/scale/scaletutorials/datasets/permissionsscale/).
:::
<Tabs groupId="truenas-migration-tabs">
<TabItem value="migrate-new-dataset" label="Migrate data to a new dataset (recommended)" default>
To migrate from the old storage configuration to the new one, you will need to create a new dataset for the Immich data storage and copy the data from the old datasets to the new ones. The steps are as follows:
1. **Stop the Immich app** from the TrueNAS web interface to ensure no data is being written while you are copying the data.
2. **Create a new dataset** for the Immich data storage, for example, `data`. As described in the [Setting up Storage Datasets](#setting-up-storage-datasets) section above, create the dataset with the **Apps** preset to ensure the correct permissions are set.
3. **Copy the data** from the old datasets to the new dataset. We advise using the `rsync` command to copy the data, as it will preserve the permissions and ownership of the files. The following commands are examples:
```bash
rsync -av /mnt/tank/immich/library/ /mnt/tank/immich/data/library/
rsync -av /mnt/tank/immich/upload/ /mnt/tank/immich/data/upload/
rsync -av /mnt/tank/immich/thumbs/ /mnt/tank/immich/data/thumbs/
rsync -av /mnt/tank/immich/profile/ /mnt/tank/immich/data/profile/
rsync -av /mnt/tank/immich/video/ /mnt/tank/immich/data/encoded-video/
rsync -av /mnt/tank/immich/backups/ /mnt/tank/immich/data/backups/
```
Make sure to replace `/mnt/tank/immich/` with the correct path to your old datasets and `/mnt/tank/immich/data/` with the correct path to your new dataset.
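To double-check that a folder was copied completely before moving on, you can run a checksum-based dry run with the same example paths; if nothing is listed for transfer, the source and destination match:
```bash
# -n = dry run, --checksum = compare file contents rather than size/mtime
rsync -avn --checksum /mnt/tank/immich/library/ /mnt/tank/immich/data/library/
```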
:::tip
If you were using **ixVolume (dataset created automatically by the system)** for Immich data storage, the path to the data should be `/mnt/.ix-apps/app_mounts/immich/`. You have to use this path instead of `/mnt/tank/immich/` in the `rsync` command above, for example:
```bash
rsync -av /mnt/.ix-apps/app_mounts/immich/library/ /mnt/tank/immich/data/library/
```
If you were also using an ixVolume for Postgres data storage, first create the `pgData` dataset as described in the [Setting up Storage Datasets](#setting-up-storage-datasets) section above, and then use the following command to copy the Postgres data:
```bash
rsync -av /mnt/.ix-apps/app_mounts/immich/pgData/ /mnt/tank/immich/pgData/
```
:::
:::warning
Make sure that for each folder, the `.immich` file is copied as well, as it contains important metadata for Immich. If for some reason the `.immich` file is not copied, you can copy it manually with the `rsync` command, for example:
```bash
rsync -av /mnt/tank/immich/library/.immich /mnt/tank/immich/data/library/
```
Replace `library` with the name of the folder where you are copying the file.
:::
4. **Update the permissions**: the copy preserved the original permissions, so make sure the `apps` user (UID 568) has the correct permissions on all the copied data. If you just created the dataset with the **Apps** preset, go to the **Datasets** screen in the TrueNAS web interface, select the **data** dataset, click on the **Edit** button next to **Permissions**, tick the "Apply permissions recursively" checkbox, and click **Save**. This will apply the correct permissions to all the copied data.
5. **Update the Immich app** to use the new dataset:
- Go to the **Installed Applications** screen and select Immich from the list of installed applications.
- Click **Edit** on the **Application Info** widget.
- In the **Storage Configuration** section, untick the **Use Old Storage Configuration (Deprecated)** checkbox.
- For the **Data Storage**, select **Host Path (Path that already exists on the system)** and then select the new dataset you created for Immich data storage, for example, `data`.
- For the **Postgres Data Storage**, verify that it is still set to the dataset you created for Postgres data storage, for example, `pgData`.
- Click **Update** at the bottom of the page to save changes.
6. **Start the Immich app** from the TrueNAS web interface.
This will recreate the Immich container with the new storage configuration and start the app.
If everything went well, you should now be able to access Immich with the new storage configuration. You can verify that the data has been copied correctly by checking the Immich web interface and ensuring that all your photos and videos are still available. You may delete the old datasets, if you no longer need them, using the TrueNAS web interface.
If you were using **ixVolume (dataset created automatically by the system)** or folders for Immich data storage, you can delete the old datasets using the following commands:
```bash
rm -r /mnt/.ix-apps/app_mounts/immich/library
rm -r /mnt/.ix-apps/app_mounts/immich/uploads
rm -r /mnt/.ix-apps/app_mounts/immich/thumbs
rm -r /mnt/.ix-apps/app_mounts/immich/profile
rm -r /mnt/.ix-apps/app_mounts/immich/video
rm -r /mnt/.ix-apps/app_mounts/immich/backups
```
</TabItem>
<TabItem value="migrate-old-dataset" label="Keep the existing datasets">
To migrate from the old storage configuration to the new one without creating new datasets, follow these steps:
1. **Stop the Immich app** from the TrueNAS web interface to ensure no data is being written while you are updating the app.
2. **Update the datasets permissions**: Ensure that the datasets used for Immich data storage (`library`, `upload`, `thumbs`, `profile`, `video`, `backups`) have the correct permissions set for the user who will run Immich. The user should have ***modify*** permissions on these datasets. The default user for Immich is `apps` (UID 568) and the default group is `apps` (GID 568). If you are using a different user, make sure to set the permissions accordingly. You can do this from the TrueNAS web interface by going to the **Datasets** screen, selecting each dataset, clicking on the **Edit** button next to **Permissions**, and adding the user with ***modify*** permissions.
3. **Update the Immich app** to use the existing datasets:
- Go to the **Installed Applications** screen and select Immich from the list of installed applications.
- Click **Edit** on the **Application Info** widget.
- In the **Storage Configuration** section, untick the **Use Old Storage Configuration (Deprecated)** checkbox.
- For the **Data Storage**, you can keep the **ixVolume (dataset created automatically by the system)**, as no data will be directly written to it. However, we recommend selecting **Host Path (Path that already exists on the system)** and then selecting a **new** dataset you created for Immich data storage, for example, `data`.
- For the **Postgres Data Storage**, keep **Host Path (Path that already exists on the system)** and then select the existing dataset you used for Postgres data storage, for example, `pgData`.
- Following the instructions in the [Multiple Datasets for Immich Storage](#additional-storage-advanced-users) section, you can add, **for each old dataset**, a new Additional Storage with the following settings:
- **Type**: `Host Path (Path that already exists on the system)`
- **Mount Path**: `/data/<folder-name>` (e.g. `/data/library`)
- **Host Path**: `/mnt/<your-pool-name>/<dataset-name>` (e.g. `/mnt/tank/immich/library`)
:::danger Ensure you use the correct path names
Make sure to replace `<folder-name>` with the actual name of the folder used by Immich: `library`, `upload`, `thumbs`, `profile`, `encoded-video`, and `backups`. Also, replace `<your-pool-name>` and `<dataset-name>` with the actual names of your pool and dataset.
:::
- **Read Only**: Keep it unticked as Immich needs to write to these datasets.
- Click **Update** at the bottom of the page to save changes.
4. **Start the Immich app** from the TrueNAS web interface. This will recreate the Immich container with the new storage configuration and start the app. If everything went well, you should now be able to access Immich with the new storage configuration. You can verify that the data is still available by checking the Immich web interface and ensuring that all your photos and videos are still accessible.
</TabItem>
</Tabs>

View File

@@ -27,3 +27,102 @@ docker image prune
[watchtower]: https://containrrr.dev/watchtower/
[breaking]: https://github.com/immich-app/immich/discussions?discussions_q=label%3Achangelog%3Abreaking-change+sort%3Adate_created
[releases]: https://github.com/immich-app/immich/releases
## Migrating to VectorChord
:::info
If you deploy Immich using Docker Compose, your `docker-compose.yml` file already uses `ghcr.io/immich-app/postgres`, and you have not explicitly set the `DB_VECTOR_EXTENSION` environment variable, your Immich database is already using VectorChord and this section does not apply to you.
:::
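If you are unsure which case applies to you, one quick way to check (assuming `docker-compose.yml` and `.env` are in your current directory) is to grep for the database image and the variable:
```bash
# Shows the database image in use and any DB_VECTOR_EXTENSION override
grep -nE 'image:.*postgres|DB_VECTOR_EXTENSION' docker-compose.yml .env
```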
:::important
If you do not deploy Immich using Docker Compose and see a deprecation warning for pgvecto.rs on server startup, you should refer to the maintainers of the Immich distribution for guidance (if using a turnkey solution) or adapt the instructions for your specific setup.
:::
Immich has migrated off of the deprecated pgvecto.rs database extension to its successor, [VectorChord](https://github.com/tensorchord/VectorChord), which comes with performance improvements in almost every aspect. This section will guide you on how to make this change in a Docker Compose setup.
Before making any changes, please [back up your database](/docs/administration/backup-and-restore). While every effort has been made to make this migration as smooth as possible, there's always a chance that something can go wrong.
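For a typical Docker Compose deployment, the backup can be as simple as the sketch below; it assumes the default `immich_postgres` container name and the `postgres` superuser, so adjust it to your setup (the linked backup guide covers the full procedure):
```bash
# Dump the entire database cluster to a compressed file before migrating
docker exec -t immich_postgres pg_dumpall --clean --if-exists --username=postgres | gzip > immich-db-backup.sql.gz
```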
After making a backup, please modify your `docker-compose.yml` file with the following information.
```diff
[...]
database:
container_name: immich_postgres
- image: docker.io/tensorchord/pgvecto-rs:pg14-v0.2.0@sha256:739cdd626151ff1f796dc95a6591b55a714f341c737e27f045019ceabf8e8c52
+ image: ghcr.io/immich-app/postgres:14-vectorchord0.4.3-pgvectors0.2.0
environment:
POSTGRES_PASSWORD: ${DB_PASSWORD}
POSTGRES_USER: ${DB_USERNAME}
POSTGRES_DB: ${DB_DATABASE_NAME}
POSTGRES_INITDB_ARGS: '--data-checksums'
+ # Uncomment the DB_STORAGE_TYPE: 'HDD' var if your database isn't stored on SSDs
+ # DB_STORAGE_TYPE: 'HDD'
volumes:
# Do not edit the next line. If you want to change the database storage location on your system, edit the value of DB_DATA_LOCATION in the .env file
- ${DB_DATA_LOCATION}:/var/lib/postgresql/data
- healthcheck:
- test: >-
- pg_isready --dbname="$${POSTGRES_DB}" --username="$${POSTGRES_USER}" || exit 1;
- Chksum="$$(psql --dbname="$${POSTGRES_DB}" --username="$${POSTGRES_USER}" --tuples-only --no-align
- --command='SELECT COALESCE(SUM(checksum_failures), 0) FROM pg_stat_database')";
- echo "checksum failure count is $$Chksum";
- [ "$$Chksum" = '0' ] || exit 1
- interval: 5m
- start_interval: 30s
- start_period: 5m
- command: >-
- postgres
- -c shared_preload_libraries=vectors.so
- -c 'search_path="$$user", public, vectors'
- -c logging_collector=on
- -c max_wal_size=2GB
- -c shared_buffers=512MB
- -c wal_compression=on
+ shm_size: 128mb
restart: always
[...]
```
:::important
If you deviated from the defaults of pg14 or pgvectors0.2.0, you must adjust the pg major version and pgvecto.rs version. If you are still using the default `docker.io/tensorchord/pgvecto-rs:pg14-v0.2.0` image, you can just follow the changes above. For example, if the previous image is `docker.io/tensorchord/pgvecto-rs:pg16-v0.3.0`, the new image should be `ghcr.io/immich-app/postgres:16-vectorchord0.3.0-pgvectors0.3.0` instead of the image specified in the diff.
:::
After making these changes, you can start Immich as normal. Immich will make some changes to the DB during startup, which can take seconds to minutes to finish, depending on hardware and library size. In particular, it's normal for the server logs to be seemingly stuck at `Reindexing clip_index` and `Reindexing face_index` for some time if you have over 100k assets in Immich and/or Immich is on a relatively weak server. If you see these logs and there are no errors, just give it time.
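If you want to follow the reindexing progress, you can watch the server logs while it runs (assuming the default `immich_server` container name):
```bash
# Stream the server logs; look for the "Reindexing clip_index" / "Reindexing face_index" messages
docker logs --follow immich_server
```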
:::danger
After switching to VectorChord, you should not downgrade Immich below 1.133.0.
:::
Please don't hesitate to contact us on [GitHub](https://github.com/immich-app/immich/discussions) or [Discord](https://discord.immich.app/) if you encounter migration issues.
### VectorChord FAQ
#### I have a separate PostgreSQL instance shared with multiple services. How can I switch to VectorChord?
Please see the [standalone PostgreSQL documentation](/docs/administration/postgres-standalone#migrating-to-vectorchord) for migration instructions. The migration path will be different depending on whether you're currently using pgvecto.rs or pgvector, as well as whether Immich has superuser DB permissions.
#### Why are so many lines removed from the `docker-compose.yml` file? Does this mean the health check is removed?
These lines are now incorporated into the image itself along with some additional tuning.
#### What does this change mean for my existing DB backups?
The new DB image includes pgvector and pgvecto.rs in addition to VectorChord, so you can use this image to restore from existing backups that used either of these extensions. However, backups made after switching to VectorChord require an image containing VectorChord to restore successfully.
#### Do I still need pgvecto.rs installed after migrating to VectorChord?
pgvecto.rs only needs to be available during the migration, or if you need to restore from a backup that used pgvecto.rs. For a leaner DB and a smaller image, you can optionally switch to an image variant that doesn't have pgvecto.rs installed after you've performed the migration and started Immich: `ghcr.io/immich-app/postgres:14-vectorchord0.4.3`, changing the PostgreSQL version as appropriate.
#### Why does it matter whether my database is on an SSD or an HDD?
These storage mediums have different performance characteristics. As a result, the optimal settings for an SSD are not the same as those for an HDD. Either configuration is compatible with SSD and HDD, but using the right configuration will make Immich snappier. As a general tip, we recommend users store the database on an SSD whenever possible.
#### Can I use the new database image as a general PostgreSQL image outside of Immich?
It's a standard PostgreSQL container image that additionally contains the VectorChord, pgvector, and (optionally) pgvecto.rs extensions. If you were using the previous pgvecto.rs image for other purposes, you can similarly do so with this image.
#### If pgvecto.rs and pgvector still work, why should I switch to VectorChord?
VectorChord is faster, more stable, uses less RAM, and (with the settings Immich uses) offers higher-quality results than pgvector and pgvecto.rs. This translates to better search and facial recognition experiences. In addition, pgvecto.rs support will be dropped in the future, so changing it sooner will avoid disruption.

View File

@@ -7,7 +7,8 @@
"format": "prettier --check .",
"format:fix": "prettier --write .",
"start": "docusaurus start --port 3005",
"build": "docusaurus build",
"copy:openapi": "jq -c < ../open-api/immich-openapi-specs.json > ./static/openapi.json || exit 0",
"build": "npm run copy:openapi && docusaurus build",
"swizzle": "docusaurus swizzle",
"deploy": "docusaurus deploy",
"clear": "docusaurus clear",
@@ -59,6 +60,6 @@
"node": ">=20"
},
"volta": {
"node": "22.17.1"
"node": "22.18.0"
}
}

View File

@@ -16,6 +16,9 @@ import {
mdiCloudKeyOutline,
mdiRegex,
mdiCodeJson,
mdiClockOutline,
mdiAccountOutline,
mdiRestart,
} from '@mdi/js';
import Layout from '@theme/Layout';
import React from 'react';
@@ -26,6 +29,42 @@ const withLanguage = (date: Date) => (language: string) => date.toLocaleDateStri
type Item = Omit<TimelineItem, 'done' | 'getDateLabel'> & { date: Date };
const items: Item[] = [
{
icon: mdiClockOutline,
iconColor: 'gray',
title: 'setTimeout is cursed',
description:
'The setTimeout method in JavaScript is cursed when used with small values because the implementation may or may not actually wait the specified time.',
link: {
url: 'https://github.com/immich-app/immich/pull/20655',
text: '#20655',
},
date: new Date(2025, 7, 4),
},
{
icon: mdiAccountOutline,
iconColor: '#DAB1DA',
title: 'PostgreSQL USER is cursed',
description:
'The USER keyword in PostgreSQL is cursed because you can select from it like a table, which leads to confusion if you have a table name user as well.',
link: {
url: 'https://github.com/immich-app/immich/pull/19891',
text: '#19891',
},
date: new Date(2025, 7, 4),
},
{
icon: mdiRestart,
iconColor: '#8395e3',
title: 'PostgreSQL RESET is cursed',
description:
'PostgreSQL RESET is cursed because it is impossible to RESET a PostgreSQL extension parameter if the extension has been uninstalled.',
link: {
url: 'https://github.com/immich-app/immich/pull/19363',
text: '#19363',
},
date: new Date(2025, 5, 20),
},
{
icon: mdiRegex,
iconColor: 'purple',

View File

@@ -1,4 +1,20 @@
[
{
"label": "v1.137.3",
"url": "https://v1.137.3.archive.immich.app"
},
{
"label": "v1.137.2",
"url": "https://v1.137.2.archive.immich.app"
},
{
"label": "v1.137.1",
"url": "https://v1.137.1.archive.immich.app"
},
{
"label": "v1.137.0",
"url": "https://v1.137.0.archive.immich.app"
},
{
"label": "v1.136.0",
"url": "https://v1.136.0.archive.immich.app"

View File

@@ -1 +1 @@
22.17.1
22.18.0

188
e2e/package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "immich-e2e",
"version": "1.136.0",
"version": "1.137.3",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "immich-e2e",
"version": "1.136.0",
"version": "1.137.3",
"license": "GNU Affero General Public License version 3",
"devDependencies": {
"@eslint/eslintrc": "^3.1.0",
@@ -16,7 +16,7 @@
"@playwright/test": "^1.44.1",
"@socket.io/component-emitter": "^3.1.2",
"@types/luxon": "^3.4.2",
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"@types/oidc-provider": "^9.0.0",
"@types/pg": "^8.15.1",
"@types/pngjs": "^6.0.4",
@@ -25,7 +25,7 @@
"eslint": "^9.14.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.1.3",
"eslint-plugin-unicorn": "^59.0.0",
"eslint-plugin-unicorn": "^60.0.0",
"exiftool-vendored": "^28.3.1",
"globals": "^16.0.0",
"jose": "^5.6.3",
@@ -46,7 +46,7 @@
},
"../cli": {
"name": "@immich/cli",
"version": "2.2.73",
"version": "2.2.77",
"dev": true,
"license": "GNU Affero General Public License version 3",
"dependencies": {
@@ -68,7 +68,7 @@
"@types/lodash-es": "^4.17.12",
"@types/micromatch": "^4.0.9",
"@types/mock-fs": "^4.13.1",
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"@vitest/coverage-v8": "^3.0.0",
"byte-size": "^9.0.0",
"cli-progress": "^3.12.0",
@@ -76,7 +76,7 @@
"eslint": "^9.14.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.1.3",
"eslint-plugin-unicorn": "^59.0.0",
"eslint-plugin-unicorn": "^60.0.0",
"globals": "^16.0.0",
"mock-fs": "^5.2.0",
"prettier": "^3.2.5",
@@ -95,14 +95,14 @@
},
"../open-api/typescript-sdk": {
"name": "@immich/sdk",
"version": "1.136.0",
"version": "1.137.3",
"dev": true,
"license": "GNU Affero General Public License version 3",
"dependencies": {
"@oazapfts/runtime": "^1.0.2"
},
"devDependencies": {
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"typescript": "^5.3.3"
}
},
@@ -131,9 +131,9 @@
}
},
"node_modules/@babel/helper-validator-identifier": {
"version": "7.25.9",
"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.25.9.tgz",
"integrity": "sha512-Ed61U6XJc3CVRfkERJWDz4dJwKe7iLmmJsbOGu9wSloNSFttHV0I8g6UAgb7qnK5ly5bGLPd4oXZlxCdANBOWQ==",
"version": "7.27.1",
"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.27.1.tgz",
"integrity": "sha512-D2hP9eA+Sqx1kBZgzxZh0y1trbuU+JoDkiEwqhQ36nodYqJwyEIhPSdMNd7lOm/4io72luTPWH20Yda0xOuUow==",
"dev": true,
"license": "MIT",
"engines": {
@@ -684,9 +684,9 @@
}
},
"node_modules/@eslint/core": {
"version": "0.14.0",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.14.0.tgz",
"integrity": "sha512-qIbV0/JZr7iSDjqAc60IqbLdsj9GDt16xQtWD+B78d/HAlvysGdZZ6rpJHGAc2T0FQx1X6thsSPdnoiGKdNtdg==",
"version": "0.15.1",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.15.1.tgz",
"integrity": "sha512-bkOp+iumZCCbt1K1CmWf0R9pM5yKpDv+ZXtvSyQpudrI9kuFLp+bM2WOPXImuD/ceQuaa8f5pj93Y7zyECIGNA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
@@ -734,9 +734,9 @@
}
},
"node_modules/@eslint/js": {
"version": "9.31.0",
"resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.31.0.tgz",
"integrity": "sha512-LOm5OVt7D4qiKCqoiPbA7LWmI+tbw1VbTUowBcUMgQSuM6poJufkFkYDcQpo5KfgD39TnNySV26QjOh7VFpSyw==",
"version": "9.32.0",
"resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.32.0.tgz",
"integrity": "sha512-BBpRFZK3eX6uMLKz8WxFOBIFFcGFJ/g8XuwjTHCqHROSIsopI+ddn/d5Cfh36+7+e5edVS8dbSHnBNhrLEX0zg==",
"dev": true,
"license": "MIT",
"engines": {
@@ -757,13 +757,13 @@
}
},
"node_modules/@eslint/plugin-kit": {
"version": "0.3.1",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.3.1.tgz",
"integrity": "sha512-0J+zgWxHN+xXONWIyPWKFMgVuJoZuGiIFu8yxk7RJjxkzpGmyja5wRFqZIVtjDVOQpV+Rw0iOAjYPE2eQyjr0w==",
"version": "0.3.4",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.3.4.tgz",
"integrity": "sha512-Ul5l+lHEcw3L5+k8POx6r74mxEYKG5kOb6Xpy2gCRW6zweT6TEhAf8vhxGgjhqrd/VO/Dirhsb+1hNpD1ue9hw==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@eslint/core": "^0.14.0",
"@eslint/core": "^0.15.1",
"levn": "^0.4.1"
},
"engines": {
@@ -1999,9 +1999,9 @@
}
},
"node_modules/@types/luxon": {
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/@types/luxon/-/luxon-3.6.2.tgz",
"integrity": "sha512-R/BdP7OxEMc44l2Ex5lSXHoIXTB2JLNa3y2QISIbr58U/YcsffyQrYW//hZSdrfxrjRZj3GcUoxMPGdO8gSYuw==",
"version": "3.7.1",
"resolved": "https://registry.npmjs.org/@types/luxon/-/luxon-3.7.1.tgz",
"integrity": "sha512-H3iskjFIAn5SlJU7OuxUmTEpebK6TKB8rxZShDslBMZJ5u9S//KM1sbdAisiSrqwLQncVjnpi2OK2J51h+4lsg==",
"dev": true,
"license": "MIT"
},
@@ -2020,9 +2020,9 @@
"license": "MIT"
},
"node_modules/@types/node": {
"version": "22.16.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.16.5.tgz",
"integrity": "sha512-bJFoMATwIGaxxx8VJPeM8TonI8t579oRvgAuT8zFugJsJZgzqv0Fu8Mhp68iecjzG7cnN3mO2dJQ5uUM2EFrgQ==",
"version": "22.17.0",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.17.0.tgz",
"integrity": "sha512-bbAKTCqX5aNVryi7qXVMi+OkB3w/OyblodicMbvE38blyAz7GxXf6XYhklokijuPwwVg9sDLKRxt0ZHXQwZVfQ==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -2030,9 +2030,9 @@
}
},
"node_modules/@types/oidc-provider": {
"version": "9.1.1",
"resolved": "https://registry.npmjs.org/@types/oidc-provider/-/oidc-provider-9.1.1.tgz",
"integrity": "sha512-sG4UcE4AbUwAsEpyrcyoqZ383wJiQObZU+gTa1Iv288+l09HwSr88hBZE2IBLlXS+RKmLId0i4B430PBFO/XRA==",
"version": "9.1.2",
"resolved": "https://registry.npmjs.org/@types/oidc-provider/-/oidc-provider-9.1.2.tgz",
"integrity": "sha512-JAreXkbWsZR72Gt3eigG652wq1qBcjhuy421PXU2a8PS0mM00XlG+UdXbM/QPihM3ko0YF8cwvt0H2kacXGcsg==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -2042,9 +2042,9 @@
}
},
"node_modules/@types/pg": {
"version": "8.15.4",
"resolved": "https://registry.npmjs.org/@types/pg/-/pg-8.15.4.tgz",
"integrity": "sha512-I6UNVBAoYbvuWkkU3oosC8yxqH21f4/Jc4DK71JLG3dT2mdlGe1z+ep/LQGXaKaOgcvUrsQoPRqfgtMcvZiJhg==",
"version": "8.15.5",
"resolved": "https://registry.npmjs.org/@types/pg/-/pg-8.15.5.tgz",
"integrity": "sha512-LF7lF6zWEKxuT3/OR8wAZGzkg4ENGXFNyiV/JeOt9z5B+0ZVwbql9McqX5c/WStFq1GaGso7H1AzP/qSzmlCKQ==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -2741,9 +2741,9 @@
}
},
"node_modules/browserslist": {
"version": "4.24.4",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.24.4.tgz",
"integrity": "sha512-KDi1Ny1gSePi1vm0q4oxSF8b4DR44GF4BbmS2YdhPLOEqd8pDviZOGH/GsmRwoWJ2+5Lr085X7naowMwKHDG1A==",
"version": "4.25.1",
"resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.25.1.tgz",
"integrity": "sha512-KGj0KoOMXLpSNkkEI6Z6mShmQy0bc1I+T7K9N81k4WWMrfz+6fQ6es80B/YLAeRoKvjYE1YSHHOW1qe9xIVzHw==",
"dev": true,
"funding": [
{
@@ -2761,10 +2761,10 @@
],
"license": "MIT",
"dependencies": {
"caniuse-lite": "^1.0.30001688",
"electron-to-chromium": "^1.5.73",
"caniuse-lite": "^1.0.30001726",
"electron-to-chromium": "^1.5.173",
"node-releases": "^2.0.19",
"update-browserslist-db": "^1.1.1"
"update-browserslist-db": "^1.1.3"
},
"bin": {
"browserslist": "cli.js"
@@ -2862,9 +2862,9 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001713",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001713.tgz",
"integrity": "sha512-wCIWIg+A4Xr7NfhTuHdX+/FKh3+Op3LBbSp2N5Pfx6T/LhdQy3GTyoTg48BReaW/MyMNZAkTadsBtai3ldWK0Q==",
"version": "1.0.30001731",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001731.tgz",
"integrity": "sha512-lDdp2/wrOmTRWuoB5DpfNkC0rJDU8DqRa6nYL6HK6sytw70QMopt/NIc/9SM7ylItlBWfACXk0tEn37UWM/+mg==",
"dev": true,
"funding": [
{
@@ -2916,6 +2916,13 @@
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/change-case": {
"version": "5.4.4",
"resolved": "https://registry.npmjs.org/change-case/-/change-case-5.4.4.tgz",
"integrity": "sha512-HRQyTk2/YPEkt9TnUPbOpr64Uw3KOicFWPVBb+xiHvd6eBx/qPr9xqfBFDT8P2vWsvvz4jbEkfDe71W3VyNu2w==",
"dev": true,
"license": "MIT"
},
"node_modules/check-error": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/check-error/-/check-error-2.1.1.tgz",
@@ -2937,9 +2944,9 @@
}
},
"node_modules/ci-info": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/ci-info/-/ci-info-4.2.0.tgz",
"integrity": "sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==",
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/ci-info/-/ci-info-4.3.0.tgz",
"integrity": "sha512-l+2bNRMiQgcfILUi33labAZYIWlH1kWDp+ecNo5iisRKrbm0xcRyCww71/YU0Fkw0mAFpz9bJayXPjey6vkmaQ==",
"dev": true,
"funding": [
{
@@ -3112,13 +3119,13 @@
}
},
"node_modules/core-js-compat": {
"version": "3.41.0",
"resolved": "https://registry.npmjs.org/core-js-compat/-/core-js-compat-3.41.0.tgz",
"integrity": "sha512-RFsU9LySVue9RTwdDVX/T0e2Y6jRYWXERKElIjpuEOEnxaXffI0X7RUwVzfYLfzuLXSNJDYoRYUAmRUcyln20A==",
"version": "3.45.0",
"resolved": "https://registry.npmjs.org/core-js-compat/-/core-js-compat-3.45.0.tgz",
"integrity": "sha512-gRoVMBawZg0OnxaVv3zpqLLxaHmsubEGyTnqdpI/CEBvX4JadI1dMSHxagThprYRtSVbuQxvi6iUatdPxohHpA==",
"dev": true,
"license": "MIT",
"dependencies": {
"browserslist": "^4.24.4"
"browserslist": "^4.25.1"
},
"funding": {
"type": "opencollective",
@@ -3271,9 +3278,9 @@
"license": "MIT"
},
"node_modules/electron-to-chromium": {
"version": "1.5.137",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.137.tgz",
"integrity": "sha512-/QSJaU2JyIuTbbABAo/crOs+SuAZLS+fVVS10PVrIT9hrRkmZl8Hb0xPSkKRUUWHQtYzXHpQUW3Dy5hwMzGZkA==",
"version": "1.5.195",
"resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.195.tgz",
"integrity": "sha512-URclP0iIaDUzqcAyV1v2PgduJ9N0IdXmWsnPzPfelvBmjmZzEy6xJcjb1cXj+TbYqXgtLrjHEoaSIdTYhw4ezg==",
"dev": true,
"license": "ISC"
},
@@ -3464,9 +3471,9 @@
}
},
"node_modules/eslint": {
"version": "9.31.0",
"resolved": "https://registry.npmjs.org/eslint/-/eslint-9.31.0.tgz",
"integrity": "sha512-QldCVh/ztyKJJZLr4jXNUByx3gR+TDYZCRXEktiZoUR3PGy4qCmSbkxcIle8GEwGpb5JBZazlaJ/CxLidXdEbQ==",
"version": "9.32.0",
"resolved": "https://registry.npmjs.org/eslint/-/eslint-9.32.0.tgz",
"integrity": "sha512-LSehfdpgMeWcTZkWZVIJl+tkZ2nuSkyyB9C27MZqFWXuph7DvaowgcTvKqxvpLW1JZIk8PN7hFY3Rj9LQ7m7lg==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -3476,8 +3483,8 @@
"@eslint/config-helpers": "^0.3.0",
"@eslint/core": "^0.15.0",
"@eslint/eslintrc": "^3.3.1",
"@eslint/js": "9.31.0",
"@eslint/plugin-kit": "^0.3.1",
"@eslint/js": "9.32.0",
"@eslint/plugin-kit": "^0.3.4",
"@humanfs/node": "^0.16.6",
"@humanwhocodes/module-importer": "^1.0.1",
"@humanwhocodes/retry": "^0.4.2",
@@ -3572,65 +3579,39 @@
}
},
"node_modules/eslint-plugin-unicorn": {
"version": "59.0.1",
"resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-59.0.1.tgz",
"integrity": "sha512-EtNXYuWPUmkgSU2E7Ttn57LbRREQesIP1BiLn7OZLKodopKfDXfBUkC/0j6mpw2JExwf43Uf3qLSvrSvppgy8Q==",
"version": "60.0.0",
"resolved": "https://registry.npmjs.org/eslint-plugin-unicorn/-/eslint-plugin-unicorn-60.0.0.tgz",
"integrity": "sha512-QUzTefvP8stfSXsqKQ+vBQSEsXIlAiCduS/V1Em+FKgL9c21U/IIm20/e3MFy1jyCf14tHAhqC1sX8OTy6VUCg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@babel/helper-validator-identifier": "^7.25.9",
"@eslint-community/eslint-utils": "^4.5.1",
"@eslint/plugin-kit": "^0.2.7",
"ci-info": "^4.2.0",
"@babel/helper-validator-identifier": "^7.27.1",
"@eslint-community/eslint-utils": "^4.7.0",
"@eslint/plugin-kit": "^0.3.3",
"change-case": "^5.4.4",
"ci-info": "^4.3.0",
"clean-regexp": "^1.0.0",
"core-js-compat": "^3.41.0",
"core-js-compat": "^3.44.0",
"esquery": "^1.6.0",
"find-up-simple": "^1.0.1",
"globals": "^16.0.0",
"globals": "^16.3.0",
"indent-string": "^5.0.0",
"is-builtin-module": "^5.0.0",
"jsesc": "^3.1.0",
"pluralize": "^8.0.0",
"regexp-tree": "^0.1.27",
"regjsparser": "^0.12.0",
"semver": "^7.7.1",
"semver": "^7.7.2",
"strip-indent": "^4.0.0"
},
"engines": {
"node": "^18.20.0 || ^20.10.0 || >=21.0.0"
"node": "^20.10.0 || >=21.0.0"
},
"funding": {
"url": "https://github.com/sindresorhus/eslint-plugin-unicorn?sponsor=1"
},
"peerDependencies": {
"eslint": ">=9.22.0"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/@eslint/core": {
"version": "0.13.0",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.13.0.tgz",
"integrity": "sha512-yfkgDw1KR66rkT5A8ci4irzDysN7FRpq3ttJolR88OqQikAWqwA8j5VZyas+vjyBNFIJ7MfybJ9plMILI2UrCw==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@types/json-schema": "^7.0.15"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
}
},
"node_modules/eslint-plugin-unicorn/node_modules/@eslint/plugin-kit": {
"version": "0.2.8",
"resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.2.8.tgz",
"integrity": "sha512-ZAoA40rNMPwSm+AeHpCq8STiNAwzWLJuP8Xv4CHIc9wv/PSuExjMrmjfYNj682vW0OOiZ1HKxzvjQr9XZIisQA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@eslint/core": "^0.13.0",
"levn": "^0.4.1"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
"eslint": ">=9.29.0"
}
},
"node_modules/eslint-scope": {
@@ -3663,19 +3644,6 @@
"url": "https://opencollective.com/eslint"
}
},
"node_modules/eslint/node_modules/@eslint/core": {
"version": "0.15.1",
"resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.15.1.tgz",
"integrity": "sha512-bkOp+iumZCCbt1K1CmWf0R9pM5yKpDv+ZXtvSyQpudrI9kuFLp+bM2WOPXImuD/ceQuaa8f5pj93Y7zyECIGNA==",
"dev": true,
"license": "Apache-2.0",
"dependencies": {
"@types/json-schema": "^7.0.15"
},
"engines": {
"node": "^18.18.0 || ^20.9.0 || >=21.1.0"
}
},
"node_modules/espree": {
"version": "10.4.0",
"resolved": "https://registry.npmjs.org/espree/-/espree-10.4.0.tgz",

View File

@@ -1,6 +1,6 @@
{
"name": "immich-e2e",
"version": "1.136.0",
"version": "1.137.3",
"description": "",
"main": "index.js",
"type": "module",
@@ -26,7 +26,7 @@
"@playwright/test": "^1.44.1",
"@socket.io/component-emitter": "^3.1.2",
"@types/luxon": "^3.4.2",
"@types/node": "^22.16.5",
"@types/node": "^22.17.0",
"@types/oidc-provider": "^9.0.0",
"@types/pg": "^8.15.1",
"@types/pngjs": "^6.0.4",
@@ -35,7 +35,7 @@
"eslint": "^9.14.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.1.3",
"eslint-plugin-unicorn": "^59.0.0",
"eslint-plugin-unicorn": "^60.0.0",
"exiftool-vendored": "^28.3.1",
"globals": "^16.0.0",
"jose": "^5.6.3",
@@ -54,6 +54,6 @@
"vitest": "^3.0.0"
},
"volta": {
"node": "22.17.1"
"node": "22.18.0"
}
}

View File

@@ -683,7 +683,7 @@ describe('/albums', () => {
.set('Authorization', `Bearer ${user1.accessToken}`)
.send({ role: AlbumUserRole.Editor });
expect(status).toBe(200);
expect(status).toBe(204);
// Get album to verify the role change
const { body } = await request(app)

View File

@@ -555,7 +555,7 @@ describe('/asset', () => {
expect(body).toMatchObject({ id: user1Assets[0].id, livePhotoVideoId: null });
});
it('should update date time original when sidecar file contains DateTimeOriginal', async () => {
it.skip('should update date time original when sidecar file contains DateTimeOriginal', async () => {
const sidecarData = `<?xpacket begin='?' id='W5M0MpCehiHzreSzNTczkc9d'?>
<x:xmpmeta xmlns:x='adobe:ns:meta/' x:xmptk='Image::ExifTool 12.40'>
<rdf:RDF xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#'>
@@ -854,6 +854,30 @@ describe('/asset', () => {
});
});
describe('PUT /assets', () => {
it('should update date time original relatively', async () => {
const { status, body } = await request(app)
.put(`/assets/`)
.set('Authorization', `Bearer ${user1.accessToken}`)
.send({ ids: [user1Assets[0].id], dateTimeRelative: -1441 });
expect(body).toEqual({});
expect(status).toEqual(204);
const result = await request(app)
.get(`/assets/${user1Assets[0].id}`)
.set('Authorization', `Bearer ${user1.accessToken}`)
.send();
expect(result.body).toMatchObject({
id: user1Assets[0].id,
exifInfo: expect.objectContaining({
dateTimeOriginal: '2023-11-19T01:10:00+00:00',
}),
});
});
});
describe('POST /assets', () => {
beforeAll(setupTests, 30_000);

View File

@@ -116,7 +116,7 @@ describe('/partners', () => {
.delete(`/partners/${user3.userId}`)
.set('Authorization', `Bearer ${user1.accessToken}`);
expect(status).toBe(200);
expect(status).toBe(204);
});
it('should throw a bad request if partner not found', async () => {

View File

@@ -9,7 +9,7 @@ import {
} from '@immich/sdk';
import { createUserDto, uuidDto } from 'src/fixtures';
import { errorDto } from 'src/responses';
import { app, asBearerAuth, shareUrl, utils } from 'src/utils';
import { app, asBearerAuth, baseUrl, shareUrl, utils } from 'src/utils';
import request from 'supertest';
import { beforeAll, describe, expect, it } from 'vitest';
@@ -78,6 +78,7 @@ describe('/shared-links', () => {
type: SharedLinkType.Album,
albumId: metadataAlbum.id,
showMetadata: true,
slug: 'metadata-album',
}),
utils.createSharedLink(user1.accessToken, {
type: SharedLinkType.Album,
@@ -138,6 +139,17 @@ describe('/shared-links', () => {
});
});
describe('GET /s/:slug', () => {
it('should work for slug auth', async () => {
const resp = await request(baseUrl).get(`/s/${linkWithMetadata.slug}`);
expect(resp.status).toBe(200);
expect(resp.header['content-type']).toContain('text/html');
expect(resp.text).toContain(
`<meta name="description" content="${metadataAlbum.assets.length} shared photos &amp; videos" />`,
);
});
});
describe('GET /shared-links', () => {
it('should require authentication', async () => {
const { status, body } = await request(app).get('/shared-links');
@@ -473,7 +485,7 @@ describe('/shared-links', () => {
.delete(`/shared-links/${linkWithAlbum.id}`)
.set('Authorization', `Bearer ${user1.accessToken}`);
expect(status).toBe(200);
expect(status).toBe(204);
});
});
});

View File

@@ -304,7 +304,7 @@ describe('/users', () => {
const { status } = await request(app)
.delete(`/users/me/license`)
.set('Authorization', `Bearer ${nonAdmin.accessToken}`);
expect(status).toBe(200);
expect(status).toBe(204);
});
});
});

View File

@@ -355,6 +355,9 @@
"trash_number_of_days_description": "Number of days to keep the assets in trash before permanently removing them",
"trash_settings": "Trash Settings",
"trash_settings_description": "Manage trash settings",
"unlink_all_oauth_accounts": "Unlink all OAuth accounts",
"unlink_all_oauth_accounts_description": "Remember to unlink all OAuth accounts before migrating to a new provider.",
"unlink_all_oauth_accounts_prompt": "Are you sure you want to unlink all OAuth accounts? This will reset the OAuth ID for each user and cannot be undone.",
"user_cleanup_job": "User cleanup",
"user_delete_delay": "<b>{user}</b>'s account and assets will be scheduled for permanent deletion in {delay, plural, one {# day} other {# days}}.",
"user_delete_delay_settings": "Delete delay",
@@ -653,6 +656,7 @@
"clear": "Clear",
"clear_all": "Clear all",
"clear_all_recent_searches": "Clear all recent searches",
"clear_file_cache": "Clear File Cache",
"clear_message": "Clear message",
"clear_value": "Clear value",
"client_cert_dialog_msg_confirm": "OK",
@@ -723,6 +727,7 @@
"create_new_user": "Create new user",
"create_shared_album_page_share_add_assets": "ADD ASSETS",
"create_shared_album_page_share_select_photos": "Select Photos",
"create_shared_link": "Create shared link",
"create_tag": "Create tag",
"create_tag_description": "Create a new tag. For nested tags, please enter the full path of the tag including forward slashes.",
"create_user": "Create user",
@@ -747,6 +752,7 @@
"date_of_birth_saved": "Date of birth saved successfully",
"date_range": "Date range",
"day": "Day",
"days": "Days",
"deduplicate_all": "Deduplicate All",
"deduplication_criteria_1": "Image size in bytes",
"deduplication_criteria_2": "Count of EXIF data",
@@ -831,9 +837,12 @@
"edit": "Edit",
"edit_album": "Edit album",
"edit_avatar": "Edit avatar",
"edit_birthday": "Edit Birthday",
"edit_birthday": "Edit birthday",
"edit_date": "Edit date",
"edit_date_and_time": "Edit date and time",
"edit_date_and_time_action_prompt": "{count} date and time edited",
"edit_date_and_time_by_offset": "Change date by offset",
"edit_date_and_time_by_offset_interval": "New date range: {from} - {to}",
"edit_description": "Edit description",
"edit_description_prompt": "Please select a new description:",
"edit_exclusion_pattern": "Edit exclusion pattern",
@@ -906,6 +915,7 @@
"failed_to_load_notifications": "Failed to load notifications",
"failed_to_load_people": "Failed to load people",
"failed_to_remove_product_key": "Failed to remove product key",
"failed_to_reset_pin_code": "Failed to reset PIN code",
"failed_to_stack_assets": "Failed to stack assets",
"failed_to_unstack_assets": "Failed to un-stack assets",
"failed_to_update_notification_status": "Failed to update notification status",
@@ -914,6 +924,7 @@
"paths_validation_failed": "{paths, plural, one {# path} other {# paths}} failed validation",
"profile_picture_transparent_pixels": "Profile pictures cannot have transparent pixels. Please zoom in and/or move the image.",
"quota_higher_than_disk_size": "You set a quota higher than the disk size",
"something_went_wrong": "Something went wrong",
"unable_to_add_album_users": "Unable to add users to album",
"unable_to_add_assets_to_shared_link": "Unable to add assets to shared link",
"unable_to_add_comment": "Unable to add comment",
@@ -1004,9 +1015,6 @@
"exif_bottom_sheet_location": "LOCATION",
"exif_bottom_sheet_people": "PEOPLE",
"exif_bottom_sheet_person_add_person": "Add name",
"exif_bottom_sheet_person_age_months": "Age {months} months",
"exif_bottom_sheet_person_age_year_months": "Age 1 year, {months} months",
"exif_bottom_sheet_person_age_years": "Age {years}",
"exit_slideshow": "Exit Slideshow",
"expand_all": "Expand all",
"experimental_settings_new_asset_list_subtitle": "Work in progress",
@@ -1053,6 +1061,7 @@
"folder_not_found": "Folder not found",
"folders": "Folders",
"folders_feature_description": "Browsing the folder view for the photos and videos on the file system",
"forgot_pin_code_question": "Forgot your PIN?",
"forward": "Forward",
"gcast_enabled": "Google Cast",
"gcast_enabled_description": "This feature loads external resources from Google in order to work.",
@@ -1107,6 +1116,7 @@
"home_page_upload_err_limit": "Can only upload a maximum of 30 assets at a time, skipping",
"host": "Host",
"hour": "Hour",
"hours": "Hours",
"id": "ID",
"idle": "Idle",
"ignore_icloud_photos": "Ignore iCloud photos",
@@ -1171,6 +1181,7 @@
"latest_version": "Latest Version",
"latitude": "Latitude",
"leave": "Leave",
"leave_album": "Leave album",
"lens_model": "Lens model",
"let_others_respond": "Let others respond",
"level": "Level",
@@ -1250,7 +1261,7 @@
"manage_your_devices": "Manage your logged-in devices",
"manage_your_oauth_connection": "Manage your OAuth connection",
"map": "Map",
"map_assets_in_bounds": "{count, plural, one {# photo} other {# photos}}",
"map_assets_in_bounds": "{count, plural, =0 {No photos in this area} one {# photo} other {# photos}}",
"map_cannot_get_user_location": "Cannot get user's location",
"map_location_dialog_yes": "Yes",
"map_location_picker_page_use_location": "Use this location",
@@ -1258,7 +1269,6 @@
"map_location_service_disabled_title": "Location Service disabled",
"map_marker_for_images": "Map marker for images taken in {city}, {country}",
"map_marker_with_image": "Map marker with image",
"map_no_assets_in_bounds": "No photos in this area",
"map_no_location_permission_content": "Location permission is needed to display assets from your current location. Do you want to allow it now?",
"map_no_location_permission_title": "Location Permission denied",
"map_settings": "Map settings",
@@ -1295,6 +1305,7 @@
"merged_people_count": "Merged {count, plural, one {# person} other {# people}}",
"minimize": "Minimize",
"minute": "Minute",
"minutes": "Minutes",
"missing": "Missing",
"model": "Model",
"month": "Month",
@@ -1368,6 +1379,7 @@
"oauth": "OAuth",
"official_immich_resources": "Official Immich Resources",
"offline": "Offline",
"offset": "Offset",
"ok": "Ok",
"oldest_first": "Oldest first",
"on_this_device": "On this device",
@@ -1445,6 +1457,9 @@
"permission_onboarding_permission_limited": "Permission limited. To let Immich backup and manage your entire gallery collection, grant photo and video permissions in Settings.",
"permission_onboarding_request": "Immich requires permission to view your photos and videos.",
"person": "Person",
"person_age_months": "{months, plural, one {# month} other {# months}} old",
"person_age_year_months": "1 year, {months, plural, one {# month} other {# months}} old",
"person_age_years": "{years, plural, other {# years}} old",
"person_birthdate": "Born on {date}",
"person_hidden": "{name}{hidden, select, true { (hidden)} other {}}",
"photo_shared_all_users": "Looks like you shared your photos with all users or you don't have any user to share with.",
@@ -1590,6 +1605,9 @@
"reset_password": "Reset password",
"reset_people_visibility": "Reset people visibility",
"reset_pin_code": "Reset PIN code",
"reset_pin_code_description": "If you forgot your PIN code, you can contact the server administrator to reset it",
"reset_pin_code_success": "Successfully reset PIN code",
"reset_pin_code_with_password": "You can always reset your PIN code with your password",
"reset_sqlite": "Reset SQLite Database",
"reset_sqlite_confirmation": "Are you sure you want to reset the SQLite database? You will need to log out and log in again to resync the data",
"reset_sqlite_success": "Successfully reset the SQLite database",
@@ -1838,6 +1856,7 @@
"sort_created": "Date created",
"sort_items": "Number of items",
"sort_modified": "Date modified",
"sort_newest": "Newest photo",
"sort_oldest": "Oldest photo",
"sort_people_by_similarity": "Sort people by similarity",
"sort_recent": "Most recent photo",

View File

@@ -1,6 +1,6 @@
ARG DEVICE=cpu
FROM python:3.11-bookworm@sha256:ce3b954c9285a7a145cba620bae03db836ab890b6b9e0d05a3ca522ea00dfbc9 AS builder-cpu
FROM python:3.11-bookworm@sha256:c1239cb82bf08176c4c90421ab425a1696257b098d9ce21e68de9319c255a47d AS builder-cpu
FROM builder-cpu AS builder-openvino
@@ -59,7 +59,7 @@ ENV PYTHONDONTWRITEBYTECODE=1 \
RUN apt-get update && apt-get install -y --no-install-recommends g++
COPY --from=ghcr.io/astral-sh/uv:latest@sha256:9653efd4380d5a0e5511e337dcfc3b8ba5bc4e6ea7fa3be7716598261d5503fa /uv /uvx /bin/
COPY --from=ghcr.io/astral-sh/uv:latest@sha256:67b2bcccdc103d608727d1b577e58008ef810f751ed324715eb60b3f0c040d30 /uv /uvx /bin/
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
@@ -68,11 +68,11 @@ RUN if [ "$DEVICE" = "rocm" ]; then \
uv pip install /opt/onnxruntime_rocm-*.whl; \
fi
FROM python:3.11-slim-bookworm@sha256:9e1912aab0a30bbd9488eb79063f68f42a68ab0946cbe98fecf197fe5b085506 AS prod-cpu
FROM python:3.11-slim-bookworm@sha256:0ce77749ac83174a31d5e107ce0cfa6b28a2fd6b0615e029d9d84b39c48976ee AS prod-cpu
ENV LD_PRELOAD=/usr/lib/libmimalloc.so.2
FROM python:3.11-slim-bookworm@sha256:9e1912aab0a30bbd9488eb79063f68f42a68ab0946cbe98fecf197fe5b085506 AS prod-openvino
FROM python:3.11-slim-bookworm@sha256:0ce77749ac83174a31d5e107ce0cfa6b28a2fd6b0615e029d9d84b39c48976ee AS prod-openvino
RUN apt-get update && \
apt-get install --no-install-recommends -yqq ocl-icd-libopencl1 wget && \

View File

@@ -36,7 +36,7 @@ def to_numpy(img: Image.Image) -> NDArray[np.float32]:
def normalize(
img: NDArray[np.float32], mean: float | NDArray[np.float32], std: float | NDArray[np.float32]
) -> NDArray[np.float32]:
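# note (editorial, hedged): with float32 inputs, plain "/" already yields a float32 result, so the explicit dtype in np.divide appears redundant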
return np.divide(img - mean, std, dtype=np.float32)
return (img - mean) / std
def get_pil_resampling(resample: str) -> Image.Resampling:
@@ -58,11 +58,13 @@ def decode_pil(image_bytes: bytes | IO[bytes] | Image.Image) -> Image.Image:
def decode_cv2(image_bytes: NDArray[np.uint8] | bytes | Image.Image) -> NDArray[np.uint8]:
if isinstance(image_bytes, bytes):
image_bytes = decode_pil(image_bytes) # pillow is much faster than cv2
if isinstance(image_bytes, Image.Image):
return pil_to_cv2(image_bytes)
return image_bytes
match image_bytes:
case bytes() | memoryview() | bytearray():
return pil_to_cv2(decode_pil(image_bytes)) # pillow is much faster than cv2
case Image.Image():
return pil_to_cv2(image_bytes)
case _:
return image_bytes
def clean_text(text: str, canonicalize: bool = False) -> str:

View File

@@ -112,8 +112,4 @@ def has_profiling(obj: Any) -> TypeGuard[HasProfiling]:
return hasattr(obj, "profiling") and isinstance(obj.profiling, dict)
def is_ndarray(obj: Any, dtype: "type[np._DTypeScalar_co]") -> "TypeGuard[npt.NDArray[np._DTypeScalar_co]]":
return isinstance(obj, np.ndarray) and obj.dtype == dtype
T = TypeVar("T")

View File

@@ -12,6 +12,7 @@ dependencies = [
"gunicorn>=21.1.0",
"huggingface-hub>=0.20.1,<1.0",
"insightface>=0.7.3,<1.0",
"numpy<2",
"opencv-python-headless>=4.7.0.72,<5.0",
"orjson>=3.9.5",
"pillow>=9.5.0,<11.0",

machine-learning/uv.lock (generated, 3247 changed lines)

File diff suppressed because it is too large

View File

@@ -0,0 +1,9 @@
cmake_minimum_required(VERSION 3.10.2)
project("native_buffer")
add_library(native_buffer SHARED
src/main/cpp/native_buffer.c)
find_library(log-lib log)
target_link_libraries(native_buffer ${log-lib})

View File

@@ -83,6 +83,12 @@ android {
}
}
namespace 'app.alextran.immich'
externalNativeBuild {
cmake {
path "CMakeLists.txt"
}
}
}
flutter {

View File

@@ -27,7 +27,7 @@
<application android:label="Immich" android:name=".ImmichApp" android:usesCleartextTraffic="true"
android:icon="@mipmap/ic_launcher" android:requestLegacyExternalStorage="true"
android:largeHeap="true" android:enableOnBackInvokedCallback="false">
android:largeHeap="true" android:enableOnBackInvokedCallback="false" android:allowBackup="false">
<service
android:name="androidx.work.impl.foreground.SystemForegroundService"

View File

@@ -0,0 +1,52 @@
#include <jni.h>
#include <stdlib.h>
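/* Note (editorial, hedged): each function is defined twice, once with the "_00024Companion"
   suffix ("00024" is the JNI escape for '$') and once on the plain class name, presumably so
   the JNI symbol lookup succeeds whether the Kotlin `external` declarations bind through the
   companion object or through @JvmStatic bridging. */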
JNIEXPORT jlong JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_00024Companion_allocateNative(
JNIEnv *env, jclass clazz, jint size)
{
void *ptr = malloc(size);
return (jlong)ptr;
}
JNIEXPORT jlong JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_allocateNative(
JNIEnv *env, jclass clazz, jint size)
{
void *ptr = malloc(size);
return (jlong)ptr;
}
JNIEXPORT void JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_00024Companion_freeNative(
JNIEnv *env, jclass clazz, jlong address)
{
if (address != 0)
{
free((void *)address);
}
}
JNIEXPORT void JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_freeNative(
JNIEnv *env, jclass clazz, jlong address)
{
if (address != 0)
{
free((void *)address);
}
}
JNIEXPORT jobject JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_00024Companion_wrapAsBuffer(
JNIEnv *env, jclass clazz, jlong address, jint capacity)
{
return (*env)->NewDirectByteBuffer(env, (void*)address, capacity);
}
JNIEXPORT jobject JNICALL
Java_app_alextran_immich_images_ThumbnailsImpl_wrapAsBuffer(
JNIEnv *env, jclass clazz, jlong address, jint capacity)
{
return (*env)->NewDirectByteBuffer(env, (void*)address, capacity);
}

View File

@@ -1,7 +1,20 @@
package app.alextran.immich
import android.content.Context
import com.bumptech.glide.GlideBuilder
import com.bumptech.glide.annotation.GlideModule
import com.bumptech.glide.load.engine.bitmap_recycle.BitmapPoolAdapter
import com.bumptech.glide.load.engine.cache.DiskCacheAdapter
import com.bumptech.glide.load.engine.cache.MemoryCacheAdapter
import com.bumptech.glide.module.AppGlideModule
@GlideModule
class AppGlideModule : AppGlideModule()
class AppGlideModule : AppGlideModule() {
override fun applyOptions(context: Context, builder: GlideBuilder) {
super.applyOptions(context, builder)
// disable caching as this is already done on the Flutter side
builder.setMemoryCache(MemoryCacheAdapter())
builder.setDiskCache(DiskCacheAdapter.Factory())
builder.setBitmapPool(BitmapPoolAdapter())
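// (MemoryCacheAdapter, DiskCacheAdapter, and BitmapPoolAdapter are Glide's built-in no-op implementations)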
}
}

View File

@@ -3,6 +3,8 @@ package app.alextran.immich
import android.os.Build
import android.os.ext.SdkExtensions
import androidx.annotation.NonNull
import app.alextran.immich.images.ThumbnailApi
import app.alextran.immich.images.ThumbnailsImpl
import app.alextran.immich.sync.NativeSyncApi
import app.alextran.immich.sync.NativeSyncApiImpl26
import app.alextran.immich.sync.NativeSyncApiImpl30
@@ -22,6 +24,7 @@ class MainActivity : FlutterFragmentActivity() {
} else {
NativeSyncApiImpl30(this)
}
NativeSyncApi.setUp(flutterEngine.dartExecutor.binaryMessenger, nativeSyncApiImpl)
ThumbnailApi.setUp(flutterEngine.dartExecutor.binaryMessenger, ThumbnailsImpl(this))
}
}

View File

@@ -0,0 +1,164 @@
package app.alextran.immich.images;
// Copyright (c) 2023 Evan Wallace
// Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
import java.nio.ByteBuffer;
// modified to use native allocations
public final class ThumbHash {
/**
* Decodes a ThumbHash to an RGBA image. RGB is not premultiplied by A.
*
* @param hash The bytes of the ThumbHash.
* @return The width, height, and pixels of the rendered placeholder image.
*/
public static Image thumbHashToRGBA(byte[] hash) {
// Read the constants
int header24 = (hash[0] & 255) | ((hash[1] & 255) << 8) | ((hash[2] & 255) << 16);
int header16 = (hash[3] & 255) | ((hash[4] & 255) << 8);
float l_dc = (float) (header24 & 63) / 63.0f;
float p_dc = (float) ((header24 >> 6) & 63) / 31.5f - 1.0f;
float q_dc = (float) ((header24 >> 12) & 63) / 31.5f - 1.0f;
float l_scale = (float) ((header24 >> 18) & 31) / 31.0f;
boolean hasAlpha = (header24 >> 23) != 0;
float p_scale = (float) ((header16 >> 3) & 63) / 63.0f;
float q_scale = (float) ((header16 >> 9) & 63) / 63.0f;
boolean isLandscape = (header16 >> 15) != 0;
int lx = Math.max(3, isLandscape ? hasAlpha ? 5 : 7 : header16 & 7);
int ly = Math.max(3, isLandscape ? header16 & 7 : hasAlpha ? 5 : 7);
float a_dc = hasAlpha ? (float) (hash[5] & 15) / 15.0f : 1.0f;
float a_scale = (float) ((hash[5] >> 4) & 15) / 15.0f;
// Read the varying factors (boost saturation by 1.25x to compensate for quantization)
int ac_start = hasAlpha ? 6 : 5;
int ac_index = 0;
Channel l_channel = new Channel(lx, ly);
Channel p_channel = new Channel(3, 3);
Channel q_channel = new Channel(3, 3);
Channel a_channel = null;
ac_index = l_channel.decode(hash, ac_start, ac_index, l_scale);
ac_index = p_channel.decode(hash, ac_start, ac_index, p_scale * 1.25f);
ac_index = q_channel.decode(hash, ac_start, ac_index, q_scale * 1.25f);
if (hasAlpha) {
a_channel = new Channel(5, 5);
a_channel.decode(hash, ac_start, ac_index, a_scale);
}
float[] l_ac = l_channel.ac;
float[] p_ac = p_channel.ac;
float[] q_ac = q_channel.ac;
float[] a_ac = hasAlpha ? a_channel.ac : null;
// Decode using the DCT into RGB
float ratio = thumbHashToApproximateAspectRatio(hash);
int w = Math.round(ratio > 1.0f ? 32.0f : 32.0f * ratio);
int h = Math.round(ratio > 1.0f ? 32.0f / ratio : 32.0f);
int size = w * h * 4;
long pointer = ThumbnailsImpl.allocateNative(size);
ByteBuffer rgba = ThumbnailsImpl.wrapAsBuffer(pointer, size);
int cx_stop = Math.max(lx, hasAlpha ? 5 : 3);
int cy_stop = Math.max(ly, hasAlpha ? 5 : 3);
float[] fx = new float[cx_stop];
float[] fy = new float[cy_stop];
for (int y = 0, i = 0; y < h; y++) {
for (int x = 0; x < w; x++, i += 4) {
float l = l_dc, p = p_dc, q = q_dc, a = a_dc;
// Precompute the coefficients
for (int cx = 0; cx < cx_stop; cx++)
fx[cx] = (float) Math.cos(Math.PI / w * (x + 0.5f) * cx);
for (int cy = 0; cy < cy_stop; cy++)
fy[cy] = (float) Math.cos(Math.PI / h * (y + 0.5f) * cy);
// Decode L
for (int cy = 0, j = 0; cy < ly; cy++) {
float fy2 = fy[cy] * 2.0f;
for (int cx = cy > 0 ? 0 : 1; cx * ly < lx * (ly - cy); cx++, j++)
l += l_ac[j] * fx[cx] * fy2;
}
// Decode P and Q
for (int cy = 0, j = 0; cy < 3; cy++) {
float fy2 = fy[cy] * 2.0f;
for (int cx = cy > 0 ? 0 : 1; cx < 3 - cy; cx++, j++) {
float f = fx[cx] * fy2;
p += p_ac[j] * f;
q += q_ac[j] * f;
}
}
// Decode A
if (hasAlpha)
for (int cy = 0, j = 0; cy < 5; cy++) {
float fy2 = fy[cy] * 2.0f;
for (int cx = cy > 0 ? 0 : 1; cx < 5 - cy; cx++, j++)
a += a_ac[j] * fx[cx] * fy2;
}
// Convert to RGB
float b = l - 2.0f / 3.0f * p;
float r = (3.0f * l - b + q) / 2.0f;
float g = r - q;
rgba.put(i, (byte) Math.max(0, Math.round(255.0f * Math.min(1, r))));
rgba.put(i + 1, (byte) Math.max(0, Math.round(255.0f * Math.min(1, g))));
rgba.put(i + 2, (byte) Math.max(0, Math.round(255.0f * Math.min(1, b))));
rgba.put(i + 3, (byte) Math.max(0, Math.round(255.0f * Math.min(1, a))));
}
}
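// Note (editorial, hedged): the pixel data lives in a native allocation; the returned Image only carries the raw pointer, which the caller presumably releases later via ThumbnailsImpl.freeNative.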
return new Image(w, h, pointer);
}
/**
* Extracts the approximate aspect ratio of the original image.
*
* @param hash The bytes of the ThumbHash.
* @return The approximate aspect ratio (i.e. width / height).
*/
public static float thumbHashToApproximateAspectRatio(byte[] hash) {
byte header = hash[3];
boolean hasAlpha = (hash[2] & 0x80) != 0;
boolean isLandscape = (hash[4] & 0x80) != 0;
int lx = isLandscape ? hasAlpha ? 5 : 7 : header & 7;
int ly = isLandscape ? header & 7 : hasAlpha ? 5 : 7;
return (float) lx / (float) ly;
}
public static final class Image {
public int width;
public int height;
public long pointer;
public Image(int width, int height, long pointer) {
this.width = width;
this.height = height;
this.pointer = pointer;
}
}
private static final class Channel {
int nx;
int ny;
float[] ac;
Channel(int nx, int ny) {
this.nx = nx;
this.ny = ny;
int n = 0;
for (int cy = 0; cy < ny; cy++)
for (int cx = cy > 0 ? 0 : 1; cx * ny < nx * (ny - cy); cx++)
n++;
ac = new float[n];
}
int decode(byte[] hash, int start, int index, float scale) {
for (int i = 0; i < ac.length; i++) {
int data = hash[start + (index >> 1)] >> ((index & 1) << 2);
ac[i] = ((float) (data & 15) / 7.5f - 1.0f) * scale;
index++;
}
return index;
}
}
}

View File

@@ -0,0 +1,139 @@
// Autogenerated from Pigeon (v26.0.0), do not edit directly.
// See also: https://pub.dev/packages/pigeon
@file:Suppress("UNCHECKED_CAST", "ArrayInDataClass")
package app.alextran.immich.images
import android.util.Log
import io.flutter.plugin.common.BasicMessageChannel
import io.flutter.plugin.common.BinaryMessenger
import io.flutter.plugin.common.EventChannel
import io.flutter.plugin.common.MessageCodec
import io.flutter.plugin.common.StandardMethodCodec
import io.flutter.plugin.common.StandardMessageCodec
import java.io.ByteArrayOutputStream
import java.nio.ByteBuffer
private object ThumbnailsPigeonUtils {
fun wrapResult(result: Any?): List<Any?> {
return listOf(result)
}
fun wrapError(exception: Throwable): List<Any?> {
return if (exception is FlutterError) {
listOf(
exception.code,
exception.message,
exception.details
)
} else {
listOf(
exception.javaClass.simpleName,
exception.toString(),
"Cause: " + exception.cause + ", Stacktrace: " + Log.getStackTraceString(exception)
)
}
}
}
/**
* Error class for passing custom error details to Flutter via a thrown PlatformException.
* @property code The error code.
* @property message The error message.
* @property details The error details. Must be a datatype supported by the api codec.
*/
class FlutterError (
val code: String,
override val message: String? = null,
val details: Any? = null
) : Throwable()
private open class ThumbnailsPigeonCodec : StandardMessageCodec() {
override fun readValueOfType(type: Byte, buffer: ByteBuffer): Any? {
return super.readValueOfType(type, buffer)
}
override fun writeValue(stream: ByteArrayOutputStream, value: Any?) {
super.writeValue(stream, value)
}
}
/** Generated interface from Pigeon that represents a handler of messages from Flutter. */
interface ThumbnailApi {
fun requestImage(assetId: String, requestId: Long, width: Long, height: Long, isVideo: Boolean, callback: (Result<Map<String, Long>>) -> Unit)
fun cancelImageRequest(requestId: Long)
fun getThumbhash(thumbhash: String, callback: (Result<Map<String, Long>>) -> Unit)
companion object {
/** The codec used by ThumbnailApi. */
val codec: MessageCodec<Any?> by lazy {
ThumbnailsPigeonCodec()
}
/** Sets up an instance of `ThumbnailApi` to handle messages through the `binaryMessenger`. */
@JvmOverloads
fun setUp(binaryMessenger: BinaryMessenger, api: ThumbnailApi?, messageChannelSuffix: String = "") {
val separatedMessageChannelSuffix = if (messageChannelSuffix.isNotEmpty()) ".$messageChannelSuffix" else ""
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.ThumbnailApi.requestImage$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { message, reply ->
val args = message as List<Any?>
val assetIdArg = args[0] as String
val requestIdArg = args[1] as Long
val widthArg = args[2] as Long
val heightArg = args[3] as Long
val isVideoArg = args[4] as Boolean
api.requestImage(assetIdArg, requestIdArg, widthArg, heightArg, isVideoArg) { result: Result<Map<String, Long>> ->
val error = result.exceptionOrNull()
if (error != null) {
reply.reply(ThumbnailsPigeonUtils.wrapError(error))
} else {
val data = result.getOrNull()
reply.reply(ThumbnailsPigeonUtils.wrapResult(data))
}
}
}
} else {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.ThumbnailApi.cancelImageRequest$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { message, reply ->
val args = message as List<Any?>
val requestIdArg = args[0] as Long
val wrapped: List<Any?> = try {
api.cancelImageRequest(requestIdArg)
listOf(null)
} catch (exception: Throwable) {
ThumbnailsPigeonUtils.wrapError(exception)
}
reply.reply(wrapped)
}
} else {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.ThumbnailApi.getThumbhash$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { message, reply ->
val args = message as List<Any?>
val thumbhashArg = args[0] as String
api.getThumbhash(thumbhashArg) { result: Result<Map<String, Long>> ->
val error = result.exceptionOrNull()
if (error != null) {
reply.reply(ThumbnailsPigeonUtils.wrapError(error))
} else {
val data = result.getOrNull()
reply.reply(ThumbnailsPigeonUtils.wrapResult(data))
}
}
}
} else {
channel.setMessageHandler(null)
}
}
}
}
}

View File

@@ -0,0 +1,255 @@
package app.alextran.immich.images
import android.content.ContentResolver
import android.content.ContentUris
import android.content.Context
import android.graphics.*
import android.net.Uri
import android.os.Build
import android.os.CancellationSignal
import android.os.OperationCanceledException
import android.provider.MediaStore
import android.provider.MediaStore.Images
import android.provider.MediaStore.Video
import android.util.Size
import java.nio.ByteBuffer
import kotlin.math.*
import java.util.concurrent.Executors
import com.bumptech.glide.Glide
import com.bumptech.glide.Priority
import com.bumptech.glide.load.DecodeFormat
import java.util.Base64
import java.util.HashMap
import java.util.concurrent.CancellationException
import java.util.concurrent.ConcurrentHashMap
import java.util.concurrent.Future
data class Request(
val requestId: Long,
val taskFuture: Future<*>,
val cancellationSignal: CancellationSignal,
val callback: (Result<Map<String, Long>>) -> Unit
)
class ThumbnailsImpl(context: Context) : ThumbnailApi {
private val ctx: Context = context.applicationContext
private val resolver: ContentResolver = ctx.contentResolver
private val requestThread = Executors.newSingleThreadExecutor()
private val threadPool =
Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() / 2 + 1)
private val requestMap = ConcurrentHashMap<Long, Request>()
companion object {
val PROJECTION = arrayOf(
MediaStore.MediaColumns.DATE_MODIFIED,
MediaStore.Files.FileColumns.MEDIA_TYPE,
)
const val SELECTION = "${MediaStore.MediaColumns._ID} = ?"
val URI: Uri = MediaStore.Files.getContentUri("external")
const val MEDIA_TYPE_IMAGE = MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE
const val MEDIA_TYPE_VIDEO = MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO
val CANCELLED = Result.success<Map<String, Long>>(mapOf())
val OPTIONS = BitmapFactory.Options().apply { inPreferredConfig = Bitmap.Config.ARGB_8888 }
init {
System.loadLibrary("native_buffer")
}
@JvmStatic
external fun allocateNative(size: Int): Long
@JvmStatic
external fun freeNative(pointer: Long)
@JvmStatic
external fun wrapAsBuffer(address: Long, capacity: Int): ByteBuffer
}
override fun getThumbhash(thumbhash: String, callback: (Result<Map<String, Long>>) -> Unit) {
threadPool.execute {
try {
val bytes = Base64.getDecoder().decode(thumbhash)
val image = ThumbHash.thumbHashToRGBA(bytes)
val res = mapOf(
"pointer" to image.pointer,
"width" to image.width.toLong(),
"height" to image.height.toLong()
)
callback(Result.success(res))
} catch (e: Exception) {
callback(Result.failure(e))
}
}
}
override fun requestImage(
assetId: String,
requestId: Long,
width: Long,
height: Long,
isVideo: Boolean,
callback: (Result<Map<String, Long>>) -> Unit
) {
val signal = CancellationSignal()
val task = threadPool.submit {
try {
getThumbnailBufferInternal(assetId, width, height, isVideo, callback, signal)
} catch (e: Exception) {
when (e) {
is OperationCanceledException -> callback(CANCELLED)
is CancellationException -> callback(CANCELLED)
else -> callback(Result.failure(e))
}
} finally {
requestMap.remove(requestId)
}
}
val request = Request(requestId, task, signal, callback)
requestMap[requestId] = request
}
override fun cancelImageRequest(requestId: Long) {
val request = requestMap.remove(requestId) ?: return
request.taskFuture.cancel(false)
request.cancellationSignal.cancel()
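// Note (editorial, hedged): cancel(false) does not interrupt a task that has already started (the CancellationSignal covers in-flight work); if the future was cancelled before it ever ran, its callback never fires, so CANCELLED is reported from here instead.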
if (request.taskFuture.isCancelled) {
requestThread.execute {
try {
request.callback(CANCELLED)
} catch (_: Exception) {
}
}
}
}
private fun getThumbnailBufferInternal(
assetId: String,
width: Long,
height: Long,
isVideo: Boolean,
callback: (Result<Map<String, Long>>) -> Unit,
signal: CancellationSignal
) {
signal.throwIfCanceled()
val targetWidth = width.toInt()
val targetHeight = height.toInt()
val id = assetId.toLong()
signal.throwIfCanceled()
val bitmap = if (isVideo) {
decodeVideoThumbnail(id, targetWidth, targetHeight, signal)
} else {
decodeImage(id, targetWidth, targetHeight, signal)
}
processBitmap(bitmap, callback, signal)
}
private fun processBitmap(
bitmap: Bitmap,
callback: (Result<Map<String, Long>>) -> Unit,
signal: CancellationSignal
) {
signal.throwIfCanceled()
val actualWidth = bitmap.width
val actualHeight = bitmap.height
val size = actualWidth * actualHeight * 4
val pointer = allocateNative(size)
try {
signal.throwIfCanceled()
val buffer = wrapAsBuffer(pointer, size)
bitmap.copyPixelsToBuffer(buffer)
bitmap.recycle()
signal.throwIfCanceled()
val res = mapOf(
"pointer" to pointer,
"width" to actualWidth.toLong(),
"height" to actualHeight.toLong()
)
callback(Result.success(res))
} catch (e: Exception) {
freeNative(pointer)
callback(if (e is OperationCanceledException) CANCELLED else Result.failure(e))
}
}
private fun decodeImage(
id: Long,
targetWidth: Int,
targetHeight: Int,
signal: CancellationSignal
): Bitmap {
signal.throwIfCanceled()
val uri = ContentUris.withAppendedId(Images.Media.EXTERNAL_CONTENT_URI, id)
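// Note (editorial, hedged): for large targets, the original file is decoded with subsampling instead of the MediaStore thumbnail, which is presumably too small at these sizes.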
if (targetHeight > 768 || targetWidth > 768) {
return decodeSource(uri, targetWidth, targetHeight, signal)
}
return if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
resolver.loadThumbnail(uri, Size(targetWidth, targetHeight), signal)
} else {
signal.setOnCancelListener { Images.Thumbnails.cancelThumbnailRequest(resolver, id) }
Images.Thumbnails.getThumbnail(resolver, id, Images.Thumbnails.MINI_KIND, OPTIONS)
}
}
private fun decodeVideoThumbnail(
id: Long,
targetWidth: Int,
targetHeight: Int,
signal: CancellationSignal
): Bitmap {
signal.throwIfCanceled()
return if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
val uri = ContentUris.withAppendedId(Video.Media.EXTERNAL_CONTENT_URI, id)
resolver.loadThumbnail(uri, Size(targetWidth, targetHeight), signal)
} else {
signal.setOnCancelListener { Video.Thumbnails.cancelThumbnailRequest(resolver, id) }
Video.Thumbnails.getThumbnail(resolver, id, Video.Thumbnails.MINI_KIND, OPTIONS)
}
}
private fun decodeSource(
uri: Uri,
targetWidth: Int,
targetHeight: Int,
signal: CancellationSignal
): Bitmap {
signal.throwIfCanceled()
return if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
val source = ImageDecoder.createSource(resolver, uri)
signal.throwIfCanceled()
ImageDecoder.decodeBitmap(source) { decoder, info, _ ->
val sampleSize =
getSampleSize(info.size.width, info.size.height, targetWidth, targetHeight)
decoder.setTargetSampleSize(sampleSize)
decoder.allocator = ImageDecoder.ALLOCATOR_SOFTWARE
decoder.setTargetColorSpace(ColorSpace.get(ColorSpace.Named.SRGB))
}
} else {
val ref = Glide.with(ctx)
.asBitmap()
.priority(Priority.IMMEDIATE)
.load(uri)
.disallowHardwareConfig()
.format(DecodeFormat.PREFER_ARGB_8888)
.submit(targetWidth, targetHeight)
signal.setOnCancelListener { Glide.with(ctx).clear(ref) }
ref.get()
}
}
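// Note (editorial, hedged): picks the largest power-of-two sample size that still keeps both decoded dimensions at roughly 2x the requested size (per the log2/floor math below).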
private fun getSampleSize(fullWidth: Int, fullHeight: Int, reqWidth: Int, reqHeight: Int): Int {
return 1 shl max(
0, floor(
min(
log2(fullWidth / (2.0 * reqWidth)),
log2(fullHeight / (2.0 * reqHeight)),
)
).toInt()
)
}
}

View File

@@ -1,4 +1,4 @@
// Autogenerated from Pigeon (v25.3.2), do not edit directly.
// Autogenerated from Pigeon (v26.0.0), do not edit directly.
// See also: https://pub.dev/packages/pigeon
@file:Suppress("UNCHECKED_CAST", "ArrayInDataClass")

View File

@@ -29,21 +29,24 @@ open class NativeSyncApiImplBase(context: Context) {
MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO.toString()
)
const val BUCKET_SELECTION = "(${MediaStore.Files.FileColumns.BUCKET_ID} = ?)"
val ASSET_PROJECTION = arrayOf(
MediaStore.MediaColumns._ID,
MediaStore.MediaColumns.DATA,
MediaStore.MediaColumns.DISPLAY_NAME,
MediaStore.MediaColumns.DATE_TAKEN,
MediaStore.MediaColumns.DATE_ADDED,
MediaStore.MediaColumns.DATE_MODIFIED,
MediaStore.Files.FileColumns.MEDIA_TYPE,
MediaStore.MediaColumns.BUCKET_ID,
MediaStore.MediaColumns.WIDTH,
MediaStore.MediaColumns.HEIGHT,
MediaStore.MediaColumns.DURATION,
MediaStore.MediaColumns.ORIENTATION,
MediaStore.MediaColumns.IS_FAVORITE,
)
val ASSET_PROJECTION = buildList {
add(MediaStore.MediaColumns._ID)
add(MediaStore.MediaColumns.DATA)
add(MediaStore.MediaColumns.DISPLAY_NAME)
add(MediaStore.MediaColumns.DATE_TAKEN)
add(MediaStore.MediaColumns.DATE_ADDED)
add(MediaStore.MediaColumns.DATE_MODIFIED)
add(MediaStore.Files.FileColumns.MEDIA_TYPE)
add(MediaStore.MediaColumns.BUCKET_ID)
add(MediaStore.MediaColumns.WIDTH)
add(MediaStore.MediaColumns.HEIGHT)
add(MediaStore.MediaColumns.DURATION)
add(MediaStore.MediaColumns.ORIENTATION)
// IS_FAVORITE is only available on Android 11 and above
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.R) {
add(MediaStore.MediaColumns.IS_FAVORITE)
}
}.toTypedArray()
const val HASH_BUFFER_SIZE = 2 * 1024 * 1024
}
@@ -78,7 +81,7 @@ open class NativeSyncApiImplBase(context: Context) {
val durationColumn = c.getColumnIndexOrThrow(MediaStore.MediaColumns.DURATION)
val orientationColumn =
c.getColumnIndexOrThrow(MediaStore.MediaColumns.ORIENTATION)
val favoriteColumn = c.getColumnIndexOrThrow(MediaStore.MediaColumns.IS_FAVORITE)
val favoriteColumn = c.getColumnIndex(MediaStore.MediaColumns.IS_FAVORITE)
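// Note: unlike getColumnIndexOrThrow, getColumnIndex returns -1 instead of throwing when IS_FAVORITE is not part of the projection (pre-Android 11 devices).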
while (c.moveToNext()) {
val id = c.getLong(idColumn).toString()
@@ -107,7 +110,7 @@ open class NativeSyncApiImplBase(context: Context) {
else c.getLong(durationColumn) / 1000
val bucketId = c.getString(bucketIdColumn)
val orientation = c.getInt(orientationColumn)
val isFavorite = c.getInt(favoriteColumn) != 0;
val isFavorite = if (favoriteColumn == -1) false else c.getInt(favoriteColumn) != 0
val asset = PlatformAsset(
id,

View File

@@ -24,14 +24,23 @@ class ImmichAPI(cfg: ServerConfig) {
val serverURL = prefs.getString("widget_server_url", "") ?: ""
val sessionKey = prefs.getString("widget_auth_token", "") ?: ""
val customHeadersJSON = prefs.getString("widget_custom_headers", "") ?: ""
if (serverURL.isBlank() || sessionKey.isBlank()) {
return null
}
var customHeaders: Map<String, String> = HashMap<String, String>()
if (customHeadersJSON.isNotBlank()) {
val stringMapType = object : TypeToken<Map<String, String>>() {}.type
customHeaders = Gson().fromJson(customHeadersJSON, stringMapType)
}
return ServerConfig(
serverURL,
sessionKey
sessionKey,
customHeaders
)
}
}
@@ -50,11 +59,19 @@ class ImmichAPI(cfg: ServerConfig) {
return URL(urlString.toString())
}
private fun HttpURLConnection.applyCustomHeaders() {
serverConfig.customHeaders.forEach { (key, value) ->
setRequestProperty(key, value)
}
}
suspend fun fetchSearchResults(filters: SearchFilters): List<Asset> = withContext(Dispatchers.IO) {
val url = buildRequestURL("/search/random")
val connection = (url.openConnection() as HttpURLConnection).apply {
requestMethod = "POST"
setRequestProperty("Content-Type", "application/json")
applyCustomHeaders()
doOutput = true
}
@@ -75,6 +92,7 @@ class ImmichAPI(cfg: ServerConfig) {
val url = buildRequestURL("/memories", listOf("for" to iso8601))
val connection = (url.openConnection() as HttpURLConnection).apply {
requestMethod = "GET"
applyCustomHeaders()
}
val response = connection.inputStream.bufferedReader().readText()
@@ -94,6 +112,7 @@ class ImmichAPI(cfg: ServerConfig) {
val url = buildRequestURL("/albums")
val connection = (url.openConnection() as HttpURLConnection).apply {
requestMethod = "GET"
applyCustomHeaders()
}
val response = connection.inputStream.bufferedReader().readText()

View File

@@ -55,7 +55,11 @@ data class WidgetEntry (
val deeplink: String?
)
data class ServerConfig(val serverEndpoint: String, val sessionKey: String)
data class ServerConfig(
val serverEndpoint: String,
val sessionKey: String,
val customHeaders: Map<String, String>
)
// MARK: Widget State Keys
val kImageUUID = stringPreferencesKey("uuid")

View File

@@ -35,8 +35,8 @@ platform :android do
task: 'bundle',
build_type: 'Release',
properties: {
"android.injected.version.code" => 205,
"android.injected.version.name" => "1.136.0",
"android.injected.version.code" => 3002,
"android.injected.version.name" => "1.137.3",
}
)
upload_to_play_store(skip_upload_apk: true, skip_upload_images: true, skip_upload_screenshots: true, aab: '../build/app/outputs/bundle/release/app-release.aab')

View File

@@ -19,6 +19,7 @@ targets:
- lib/infrastructure/entities/*.dart
- lib/infrastructure/entities/*.drift
- lib/infrastructure/repositories/db.repository.dart
- lib/infrastructure/repositories/logger_db.repository.dart
drift_dev:modular:
enabled: true
options: *drift_options

File diff suppressed because one or more lines are too long

View File

@@ -4,6 +4,7 @@ import 'package:easy_localization/easy_localization.dart';
import 'package:flutter_test/flutter_test.dart';
import 'package:hooks_riverpod/hooks_riverpod.dart';
import 'package:immich_mobile/entities/store.entity.dart';
import 'package:immich_mobile/infrastructure/repositories/logger_db.repository.dart';
import 'package:immich_mobile/main.dart' as app;
import 'package:immich_mobile/providers/db.provider.dart';
import 'package:immich_mobile/providers/infrastructure/db.provider.dart';
@@ -40,16 +41,14 @@ class ImmichTestHelper {
await EasyLocalization.ensureInitialized();
// Clear all data from Isar (reuse existing instance if available)
final db = await Bootstrap.initIsar();
await Bootstrap.initDomain(db);
final logDb = DriftLogger();
await Bootstrap.initDomain(db, logDb);
await Store.clear();
await db.writeTxn(() => db.clear());
// Load main Widget
await tester.pumpWidget(
ProviderScope(
overrides: [
dbProvider.overrideWithValue(db),
isarProvider.overrideWithValue(db),
],
overrides: [dbProvider.overrideWithValue(db), isarProvider.overrideWithValue(db)],
child: const app.MainWidget(),
),
);
@@ -59,18 +58,11 @@ class ImmichTestHelper {
}
@isTest
void immichWidgetTest(
String description,
Future<void> Function(WidgetTester, ImmichTestHelper) test,
) {
testWidgets(
description,
(widgetTester) async {
await ImmichTestHelper.loadApp(widgetTester);
await test(widgetTester, ImmichTestHelper(widgetTester));
},
semanticsEnabled: false,
);
void immichWidgetTest(String description, Future<void> Function(WidgetTester, ImmichTestHelper) test) {
testWidgets(description, (widgetTester) async {
await ImmichTestHelper.loadApp(widgetTester);
await test(widgetTester, ImmichTestHelper(widgetTester));
}, semanticsEnabled: false);
}
Future<void> pumpUntilFound(
@@ -79,8 +71,7 @@ Future<void> pumpUntilFound(
Duration timeout = const Duration(seconds: 120),
}) async {
bool found = false;
final timer =
Timer(timeout, () => throw TimeoutException("Pump until has timed out"));
final timer = Timer(timeout, () => throw TimeoutException("Pump until has timed out"));
while (found != true) {
await tester.pump();
found = tester.any(finder);

View File

@@ -3,7 +3,7 @@
archiveVersion = 1;
classes = {
};
objectVersion = 54;
objectVersion = 77;
objects = {
/* Begin PBXBuildFile section */
@@ -24,6 +24,9 @@
FAC6F89B2D287C890078CB2F /* ShareExtension.appex in Embed Foundation Extensions */ = {isa = PBXBuildFile; fileRef = FAC6F8902D287C890078CB2F /* ShareExtension.appex */; settings = {ATTRIBUTES = (RemoveHeadersOnCopy, ); }; };
FAC6F8B72D287F120078CB2F /* ShareViewController.swift in Sources */ = {isa = PBXBuildFile; fileRef = FAC6F8B52D287F120078CB2F /* ShareViewController.swift */; };
FAC6F8B92D287F120078CB2F /* MainInterface.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = FAC6F8B32D287F120078CB2F /* MainInterface.storyboard */; };
FEAFA8732E4D42F4001E47FE /* Thumbhash.swift in Sources */ = {isa = PBXBuildFile; fileRef = FEAFA8722E4D42F4001E47FE /* Thumbhash.swift */; };
FED3B1962E253E9B0030FD97 /* ThumbnailsImpl.swift in Sources */ = {isa = PBXBuildFile; fileRef = FED3B1942E253E9B0030FD97 /* ThumbnailsImpl.swift */; };
FED3B1972E253E9B0030FD97 /* Thumbnails.g.swift in Sources */ = {isa = PBXBuildFile; fileRef = FED3B1932E253E9B0030FD97 /* Thumbnails.g.swift */; };
/* End PBXBuildFile section */
/* Begin PBXContainerItemProxy section */
@@ -102,6 +105,9 @@
FAC6F8B42D287F120078CB2F /* ShareExtension.entitlements */ = {isa = PBXFileReference; lastKnownFileType = text.plist.entitlements; path = ShareExtension.entitlements; sourceTree = "<group>"; };
FAC6F8B52D287F120078CB2F /* ShareViewController.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ShareViewController.swift; sourceTree = "<group>"; };
FAC7416727DB9F5500C668D8 /* RunnerProfile.entitlements */ = {isa = PBXFileReference; lastKnownFileType = text.plist.entitlements; path = RunnerProfile.entitlements; sourceTree = "<group>"; };
FEAFA8722E4D42F4001E47FE /* Thumbhash.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Thumbhash.swift; sourceTree = "<group>"; };
FED3B1932E253E9B0030FD97 /* Thumbnails.g.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Thumbnails.g.swift; sourceTree = "<group>"; };
FED3B1942E253E9B0030FD97 /* ThumbnailsImpl.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ThumbnailsImpl.swift; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFileSystemSynchronizedBuildFileExceptionSet section */
@@ -117,8 +123,6 @@
/* Begin PBXFileSystemSynchronizedRootGroup section */
B2CF7F8C2DDE4EBB00744BF6 /* Sync */ = {
isa = PBXFileSystemSynchronizedRootGroup;
exceptions = (
);
path = Sync;
sourceTree = "<group>";
};
@@ -243,6 +247,7 @@
1498D2331E8E89220040F4C2 /* GeneratedPluginRegistrant.m */,
74858FAE1ED2DC5600515810 /* AppDelegate.swift */,
74858FAD1ED2DC5600515810 /* Runner-Bridging-Header.h */,
FED3B1952E253E9B0030FD97 /* Images */,
);
path = Runner;
sourceTree = "<group>";
@@ -258,6 +263,16 @@
path = ShareExtension;
sourceTree = "<group>";
};
FED3B1952E253E9B0030FD97 /* Images */ = {
isa = PBXGroup;
children = (
FEAFA8722E4D42F4001E47FE /* Thumbhash.swift */,
FED3B1932E253E9B0030FD97 /* Thumbnails.g.swift */,
FED3B1942E253E9B0030FD97 /* ThumbnailsImpl.swift */,
);
path = Images;
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
@@ -523,6 +538,9 @@
files = (
65F32F31299BD2F800CE9261 /* BackgroundServicePlugin.swift in Sources */,
74858FAF1ED2DC5600515810 /* AppDelegate.swift in Sources */,
FEAFA8732E4D42F4001E47FE /* Thumbhash.swift in Sources */,
FED3B1962E253E9B0030FD97 /* ThumbnailsImpl.swift in Sources */,
FED3B1972E253E9B0030FD97 /* Thumbnails.g.swift in Sources */,
1498D2341E8E89220040F4C2 /* GeneratedPluginRegistrant.m in Sources */,
65F32F33299D349D00CE9261 /* BackgroundSyncWorker.swift in Sources */,
);
@@ -649,7 +667,7 @@
CODE_SIGN_ENTITLEMENTS = Runner/RunnerProfile.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_BITCODE = NO;
@@ -793,7 +811,7 @@
CODE_SIGN_ENTITLEMENTS = Runner/Runner.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_BITCODE = NO;
@@ -823,7 +841,7 @@
CODE_SIGN_ENTITLEMENTS = Runner/Runner.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_BITCODE = NO;
@@ -857,7 +875,7 @@
CODE_SIGN_ENTITLEMENTS = WidgetExtension/WidgetExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
GCC_C_LANGUAGE_STANDARD = gnu17;
@@ -900,7 +918,7 @@
CODE_SIGN_ENTITLEMENTS = WidgetExtension/WidgetExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
GCC_C_LANGUAGE_STANDARD = gnu17;
@@ -940,7 +958,7 @@
CODE_SIGN_ENTITLEMENTS = WidgetExtension/WidgetExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
GCC_C_LANGUAGE_STANDARD = gnu17;
@@ -979,7 +997,7 @@
CODE_SIGN_ENTITLEMENTS = ShareExtension/ShareExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
@@ -1023,7 +1041,7 @@
CODE_SIGN_ENTITLEMENTS = ShareExtension/ShareExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
@@ -1064,7 +1082,7 @@
CODE_SIGN_ENTITLEMENTS = ShareExtension/ShareExtension.entitlements;
CODE_SIGN_IDENTITY = "Apple Development";
CODE_SIGN_STYLE = Automatic;
CURRENT_PROJECT_VERSION = 210;
CURRENT_PROJECT_VERSION = 213;
CUSTOM_GROUP_ID = group.app.immich.share;
DEVELOPMENT_TEAM = 2F67MQ8R79;
ENABLE_USER_SCRIPT_SANDBOXING = YES;

View File

@@ -25,6 +25,7 @@ import UIKit
let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
NativeSyncApiSetup.setUp(binaryMessenger: controller.binaryMessenger, api: NativeSyncApiImpl())
ThumbnailApiSetup.setUp(binaryMessenger: controller.binaryMessenger, api: ThumbnailApiImpl())
BackgroundServicePlugin.setPluginRegistrantCallback { registry in
if !registry.hasPlugin("org.cocoapods.path-provider-foundation") {

View File

@@ -0,0 +1,225 @@
// Copyright (c) 2023 Evan Wallace
// Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
import Foundation
// NOTE: Swift has an exponential-time type checker and compiling very simple
// expressions can easily take many seconds, especially when expressions involve
// numeric type constructors.
//
// This file deliberately breaks compound expressions up into separate variables
// to improve compile time even though this comes at the expense of readability.
// This is a known workaround for this deficiency in the Swift compiler.
//
// The following command is helpful when debugging Swift compile time issues:
//
// swiftc ThumbHash.swift -Xfrontend -debug-time-function-bodies
//
// These optimizations brought the compile time for this file from around 2.5
// seconds to around 250ms (10x faster).
// NOTE: Swift's debug-build performance of for-in loops over numeric ranges is
// really awful. Debug builds compile a very generic indexing iterator thing
// that makes many nested calls for every iteration, which makes debug-build
// performance crawl.
//
// This file deliberately avoids for-in loops that loop for more than a few
// times to improve debug-build run time even though this comes at the expense
// of readability. Similarly unsafe pointers are used instead of array getters
// to avoid unnecessary bounds checks, which have extra overhead in debug builds.
//
// These optimizations brought the run time to encode and decode 10 ThumbHashes
// in debug mode from 700ms to 70ms (10x faster).
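// Illustration (editorial, not part of the original source): the workaround splits a
// compound expression such as
//     let p_dc = Float32((header24 >> 6) & 63) / 31.5 - 1
// into separate statements with explicit intermediates, as the code below does:
//     let ip_dc = (header24 >> 6) & 63
//     var p_dc = Float32(ip_dc)
//     p_dc = p_dc / 31.5 - 1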
// changed signature and allocation method to avoid automatic GC
func thumbHashToRGBA(hash: Data) -> (Int, Int, UnsafeMutableRawBufferPointer) {
// Read the constants
let h0 = UInt32(hash[0])
let h1 = UInt32(hash[1])
let h2 = UInt32(hash[2])
let h3 = UInt16(hash[3])
let h4 = UInt16(hash[4])
let header24 = h0 | (h1 << 8) | (h2 << 16)
let header16 = h3 | (h4 << 8)
let il_dc = header24 & 63
let ip_dc = (header24 >> 6) & 63
let iq_dc = (header24 >> 12) & 63
var l_dc = Float32(il_dc)
var p_dc = Float32(ip_dc)
var q_dc = Float32(iq_dc)
l_dc = l_dc / 63
p_dc = p_dc / 31.5 - 1
q_dc = q_dc / 31.5 - 1
let il_scale = (header24 >> 18) & 31
var l_scale = Float32(il_scale)
l_scale = l_scale / 31
let hasAlpha = (header24 >> 23) != 0
let ip_scale = (header16 >> 3) & 63
let iq_scale = (header16 >> 9) & 63
var p_scale = Float32(ip_scale)
var q_scale = Float32(iq_scale)
p_scale = p_scale / 63
q_scale = q_scale / 63
let isLandscape = (header16 >> 15) != 0
let lx16 = max(3, isLandscape ? hasAlpha ? 5 : 7 : header16 & 7)
let ly16 = max(3, isLandscape ? header16 & 7 : hasAlpha ? 5 : 7)
let lx = Int(lx16)
let ly = Int(ly16)
var a_dc = Float32(1)
var a_scale = Float32(1)
if hasAlpha {
let ia_dc = hash[5] & 15
let ia_scale = hash[5] >> 4
a_dc = Float32(ia_dc)
a_scale = Float32(ia_scale)
a_dc /= 15
a_scale /= 15
}
// Read the varying factors (boost saturation by 1.25x to compensate for quantization)
let ac_start = hasAlpha ? 6 : 5
var ac_index = 0
let decodeChannel = { (nx: Int, ny: Int, scale: Float32) -> [Float32] in
var ac: [Float32] = []
for cy in 0 ..< ny {
var cx = cy > 0 ? 0 : 1
while cx * ny < nx * (ny - cy) {
let iac = (hash[ac_start + (ac_index >> 1)] >> ((ac_index & 1) << 2)) & 15;
var fac = Float32(iac)
fac = (fac / 7.5 - 1) * scale
ac.append(fac)
ac_index += 1
cx += 1
}
}
return ac
}
let l_ac = decodeChannel(lx, ly, l_scale)
let p_ac = decodeChannel(3, 3, p_scale * 1.25)
let q_ac = decodeChannel(3, 3, q_scale * 1.25)
let a_ac = hasAlpha ? decodeChannel(5, 5, a_scale) : []
// Decode using the DCT into RGB
let ratio = thumbHashToApproximateAspectRatio(hash: hash)
let fw = round(ratio > 1 ? 32 : 32 * ratio)
let fh = round(ratio > 1 ? 32 / ratio : 32)
let w = Int(fw)
let h = Int(fh)
let pointer = UnsafeMutableRawBufferPointer.allocate(
byteCount: w * h * 4,
alignment: MemoryLayout<UInt8>.alignment
)
var rgba = pointer.baseAddress!.assumingMemoryBound(to: UInt8.self)
let cx_stop = max(lx, hasAlpha ? 5 : 3)
let cy_stop = max(ly, hasAlpha ? 5 : 3)
var fx = [Float32](repeating: 0, count: cx_stop)
var fy = [Float32](repeating: 0, count: cy_stop)
fx.withUnsafeMutableBytes { fx in
let fx = fx.baseAddress!.bindMemory(to: Float32.self, capacity: fx.count)
fy.withUnsafeMutableBytes { fy in
let fy = fy.baseAddress!.bindMemory(to: Float32.self, capacity: fy.count)
var y = 0
while y < h {
var x = 0
while x < w {
var l = l_dc
var p = p_dc
var q = q_dc
var a = a_dc
// Precompute the coefficients
var cx = 0
while cx < cx_stop {
let fw = Float32(w)
let fxx = Float32(x)
let fcx = Float32(cx)
fx[cx] = cos(Float32.pi / fw * (fxx + 0.5) * fcx)
cx += 1
}
var cy = 0
while cy < cy_stop {
let fh = Float32(h)
let fyy = Float32(y)
let fcy = Float32(cy)
fy[cy] = cos(Float32.pi / fh * (fyy + 0.5) * fcy)
cy += 1
}
// Decode L
var j = 0
cy = 0
while cy < ly {
var cx = cy > 0 ? 0 : 1
let fy2 = fy[cy] * 2
while cx * ly < lx * (ly - cy) {
l += l_ac[j] * fx[cx] * fy2
j += 1
cx += 1
}
cy += 1
}
// Decode P and Q
j = 0
cy = 0
while cy < 3 {
var cx = cy > 0 ? 0 : 1
let fy2 = fy[cy] * 2
while cx < 3 - cy {
let f = fx[cx] * fy2
p += p_ac[j] * f
q += q_ac[j] * f
j += 1
cx += 1
}
cy += 1
}
// Decode A
if hasAlpha {
j = 0
cy = 0
while cy < 5 {
var cx = cy > 0 ? 0 : 1
let fy2 = fy[cy] * 2
while cx < 5 - cy {
a += a_ac[j] * fx[cx] * fy2
j += 1
cx += 1
}
cy += 1
}
}
// Convert to RGB
var b = l - 2 / 3 * p
var r = (3 * l - b + q) / 2
var g = r - q
r = max(0, 255 * min(1, r))
g = max(0, 255 * min(1, g))
b = max(0, 255 * min(1, b))
a = max(0, 255 * min(1, a))
rgba[0] = UInt8(r)
rgba[1] = UInt8(g)
rgba[2] = UInt8(b)
rgba[3] = UInt8(a)
rgba = rgba.advanced(by: 4)
x += 1
}
y += 1
}
}
}
return (w, h, pointer)
}
func thumbHashToApproximateAspectRatio(hash: Data) -> Float32 {
let header = hash[3]
let hasAlpha = (hash[2] & 0x80) != 0
let isLandscape = (hash[4] & 0x80) != 0
let lx = isLandscape ? hasAlpha ? 5 : 7 : header & 7
let ly = isLandscape ? header & 7 : hasAlpha ? 5 : 7
return Float32(lx) / Float32(ly)
}

View File

@@ -0,0 +1,138 @@
// Autogenerated from Pigeon (v26.0.0), do not edit directly.
// See also: https://pub.dev/packages/pigeon
import Foundation
#if os(iOS)
import Flutter
#elseif os(macOS)
import FlutterMacOS
#else
#error("Unsupported platform.")
#endif
private func wrapResult(_ result: Any?) -> [Any?] {
return [result]
}
private func wrapError(_ error: Any) -> [Any?] {
if let pigeonError = error as? PigeonError {
return [
pigeonError.code,
pigeonError.message,
pigeonError.details,
]
}
if let flutterError = error as? FlutterError {
return [
flutterError.code,
flutterError.message,
flutterError.details,
]
}
return [
"\(error)",
"\(type(of: error))",
"Stacktrace: \(Thread.callStackSymbols)",
]
}
private func isNullish(_ value: Any?) -> Bool {
return value is NSNull || value == nil
}
private func nilOrValue<T>(_ value: Any?) -> T? {
if value is NSNull { return nil }
return value as! T?
}
private class ThumbnailsPigeonCodecReader: FlutterStandardReader {
}
private class ThumbnailsPigeonCodecWriter: FlutterStandardWriter {
}
private class ThumbnailsPigeonCodecReaderWriter: FlutterStandardReaderWriter {
override func reader(with data: Data) -> FlutterStandardReader {
return ThumbnailsPigeonCodecReader(data: data)
}
override func writer(with data: NSMutableData) -> FlutterStandardWriter {
return ThumbnailsPigeonCodecWriter(data: data)
}
}
class ThumbnailsPigeonCodec: FlutterStandardMessageCodec, @unchecked Sendable {
static let shared = ThumbnailsPigeonCodec(readerWriter: ThumbnailsPigeonCodecReaderWriter())
}
/// Generated protocol from Pigeon that represents a handler of messages from Flutter.
protocol ThumbnailApi {
func requestImage(assetId: String, requestId: Int64, width: Int64, height: Int64, isVideo: Bool, completion: @escaping (Result<[String: Int64], Error>) -> Void)
func cancelImageRequest(requestId: Int64) throws
func getThumbhash(thumbhash: String, completion: @escaping (Result<[String: Int64], Error>) -> Void)
}
/// Generated setup class from Pigeon to handle messages through the `binaryMessenger`.
class ThumbnailApiSetup {
static var codec: FlutterStandardMessageCodec { ThumbnailsPigeonCodec.shared }
/// Sets up an instance of `ThumbnailApi` to handle messages through the `binaryMessenger`.
static func setUp(binaryMessenger: FlutterBinaryMessenger, api: ThumbnailApi?, messageChannelSuffix: String = "") {
let channelSuffix = messageChannelSuffix.count > 0 ? ".\(messageChannelSuffix)" : ""
let requestImageChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.ThumbnailApi.requestImage\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
requestImageChannel.setMessageHandler { message, reply in
let args = message as! [Any?]
let assetIdArg = args[0] as! String
let requestIdArg = args[1] as! Int64
let widthArg = args[2] as! Int64
let heightArg = args[3] as! Int64
let isVideoArg = args[4] as! Bool
api.requestImage(assetId: assetIdArg, requestId: requestIdArg, width: widthArg, height: heightArg, isVideo: isVideoArg) { result in
switch result {
case .success(let res):
reply(wrapResult(res))
case .failure(let error):
reply(wrapError(error))
}
}
}
} else {
requestImageChannel.setMessageHandler(nil)
}
let cancelImageRequestChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.ThumbnailApi.cancelImageRequest\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
cancelImageRequestChannel.setMessageHandler { message, reply in
let args = message as! [Any?]
let requestIdArg = args[0] as! Int64
do {
try api.cancelImageRequest(requestId: requestIdArg)
reply(wrapResult(nil))
} catch {
reply(wrapError(error))
}
}
} else {
cancelImageRequestChannel.setMessageHandler(nil)
}
let getThumbhashChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.ThumbnailApi.getThumbhash\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
getThumbhashChannel.setMessageHandler { message, reply in
let args = message as! [Any?]
let thumbhashArg = args[0] as! String
api.getThumbhash(thumbhash: thumbhashArg) { result in
switch result {
case .success(let res):
reply(wrapResult(res))
case .failure(let error):
reply(wrapError(error))
}
}
}
} else {
getThumbhashChannel.setMessageHandler(nil)
}
}
}

View File

@@ -0,0 +1,187 @@
import CryptoKit
import Flutter
import MobileCoreServices
import Photos
class Request {
weak var workItem: DispatchWorkItem?
var isCancelled = false
let callback: (Result<[String: Int64], any Error>) -> Void
init(callback: @escaping (Result<[String: Int64], any Error>) -> Void) {
self.callback = callback
}
}
class ThumbnailApiImpl: ThumbnailApi {
private static let imageManager = PHImageManager.default()
private static let fetchOptions = {
let fetchOptions = PHFetchOptions()
fetchOptions.fetchLimit = 1
fetchOptions.wantsIncrementalChangeDetails = false
return fetchOptions
}()
private static let requestOptions = {
let requestOptions = PHImageRequestOptions()
requestOptions.isNetworkAccessAllowed = true
requestOptions.deliveryMode = .highQualityFormat
requestOptions.resizeMode = .fast
requestOptions.isSynchronous = true
requestOptions.version = .current
return requestOptions
}()
private static let assetQueue = DispatchQueue(label: "thumbnail.assets", qos: .userInitiated)
private static let requestQueue = DispatchQueue(label: "thumbnail.requests", qos: .userInitiated)
private static let cancelQueue = DispatchQueue(label: "thumbnail.cancellation", qos: .default)
private static let processingQueue = DispatchQueue(label: "thumbnail.processing", qos: .userInteractive, attributes: .concurrent)
private static let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
private static let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue).rawValue
private static var requests = [Int64: Request]()
private static let cancelledResult = Result<[String: Int64], any Error>.success([:])
private static let concurrencySemaphore = DispatchSemaphore(value: ProcessInfo.processInfo.activeProcessorCount * 2)
private static let assetCache = {
let assetCache = NSCache<NSString, PHAsset>()
assetCache.countLimit = 10000
return assetCache
}()
func getThumbhash(thumbhash: String, completion: @escaping (Result<[String : Int64], any Error>) -> Void) {
Self.processingQueue.async {
guard let data = Data(base64Encoded: thumbhash)
else { return completion(.failure(PigeonError(code: "", message: "Invalid base64 string: \(thumbhash)", details: nil)))}
let (width, height, pointer) = thumbHashToRGBA(hash: data)
completion(.success(["pointer": Int64(Int(bitPattern: pointer.baseAddress)), "width": Int64(width), "height": Int64(height)]))
}
}
func requestImage(assetId: String, requestId: Int64, width: Int64, height: Int64, isVideo: Bool, completion: @escaping (Result<[String: Int64], any Error>) -> Void) {
let request = Request(callback: completion)
let item = DispatchWorkItem {
if request.isCancelled {
return completion(Self.cancelledResult)
}
Self.concurrencySemaphore.wait()
defer {
Self.concurrencySemaphore.signal()
}
if request.isCancelled {
return completion(Self.cancelledResult)
}
guard let asset = Self.requestAsset(assetId: assetId)
else {
Self.removeRequest(requestId: requestId)
completion(.failure(PigeonError(code: "", message: "Could not get asset data for \(assetId)", details: nil)))
return
}
if request.isCancelled {
return completion(Self.cancelledResult)
}
var image: UIImage?
Self.imageManager.requestImage(
for: asset,
targetSize: CGSize(width: Double(width), height: Double(height)),
contentMode: .aspectFit,
options: Self.requestOptions,
resultHandler: { (_image, info) -> Void in
image = _image
}
)
if request.isCancelled {
return completion(Self.cancelledResult)
}
guard let image = image,
let cgImage = image.cgImage else {
Self.removeRequest(requestId: requestId)
return completion(.failure(PigeonError(code: "", message: "Could not get pixel data for \(assetId)", details: nil)))
}
let pointer = UnsafeMutableRawPointer.allocate(
byteCount: Int(cgImage.width) * Int(cgImage.height) * 4,
alignment: MemoryLayout<UInt8>.alignment
)
if request.isCancelled {
pointer.deallocate()
return completion(Self.cancelledResult)
}
guard let context = CGContext(
data: pointer,
width: cgImage.width,
height: cgImage.height,
bitsPerComponent: 8,
bytesPerRow: cgImage.width * 4,
space: Self.rgbColorSpace,
bitmapInfo: Self.bitmapInfo
) else {
pointer.deallocate()
Self.removeRequest(requestId: requestId)
return completion(.failure(PigeonError(code: "", message: "Could not create context for \(assetId)", details: nil)))
}
if request.isCancelled {
pointer.deallocate()
return completion(Self.cancelledResult)
}
context.interpolationQuality = .none
context.draw(cgImage, in: CGRect(x: 0, y: 0, width: cgImage.width, height: cgImage.height))
if request.isCancelled {
pointer.deallocate()
return completion(Self.cancelledResult)
}
completion(.success(["pointer": Int64(Int(bitPattern: pointer)), "width": Int64(cgImage.width), "height": Int64(cgImage.height)]))
Self.removeRequest(requestId: requestId)
}
request.workItem = item
Self.addRequest(requestId: requestId, request: request)
Self.processingQueue.async(execute: item)
}
func cancelImageRequest(requestId: Int64) {
Self.cancelRequest(requestId: requestId)
}
private static func addRequest(requestId: Int64, request: Request) -> Void {
requestQueue.sync { requests[requestId] = request }
}
private static func removeRequest(requestId: Int64) -> Void {
requestQueue.sync { requests[requestId] = nil }
}
private static func cancelRequest(requestId: Int64) -> Void {
requestQueue.async {
guard let request = requests.removeValue(forKey: requestId) else { return }
request.isCancelled = true
guard let item = request.workItem else { return }
if item.isCancelled {
request.callback(Self.cancelledResult)
}
}
}
private static func requestAsset(assetId: String) -> PHAsset? {
var asset: PHAsset?
assetQueue.sync { asset = assetCache.object(forKey: assetId as NSString) }
if asset != nil { return asset }
guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [assetId], options: Self.fetchOptions).firstObject
else { return nil }
assetQueue.async { assetCache.setObject(asset, forKey: assetId as NSString) }
return asset
}
}
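A minimal sketch of how the Dart side might consume the {pointer, width, height} map returned by requestImage, assuming the Pigeon-generated Dart ThumbnailApi and that the native RGBA buffer is released by a separate platform call after decoding; this is illustrative, not the app's actual image provider:
import 'dart:async';
import 'dart:ffi';
import 'dart:ui' as ui;
// Hypothetical decoder: views the native pixel buffer without copying and
// hands it to the engine as an RGBA8888 image.
Future<ui.Image> decodeThumbnail(Map<String, int> result) {
  final width = result['width']!;
  final height = result['height']!;
  final pixels = Pointer<Uint8>.fromAddress(result['pointer']!).asTypedList(width * height * 4);
  final completer = Completer<ui.Image>();
  ui.decodeImageFromPixels(pixels, width, height, ui.PixelFormat.rgba8888, completer.complete);
  return completer.future;
}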

View File

@@ -78,7 +78,7 @@
<key>CFBundlePackageType</key>
<string>APPL</string>
<key>CFBundleShortVersionString</key>
<string>1.135.1</string>
<string>1.137.2</string>
<key>CFBundleSignature</key>
<string>????</string>
<key>CFBundleURLTypes</key>
@@ -105,7 +105,7 @@
</dict>
</array>
<key>CFBundleVersion</key>
<string>210</string>
<string>213</string>
<key>FLTEnableImpeller</key>
<true />
<key>ITSAppUsesNonExemptEncryption</key>
@@ -184,4 +184,4 @@
<string>We need local network permission to connect to the local server using IP address and
allow the casting feature to work</string>
</dict>
</plist>
</plist>

View File

@@ -1,4 +1,4 @@
// Autogenerated from Pigeon (v25.3.2), do not edit directly.
// Autogenerated from Pigeon (v26.0.0), do not edit directly.
// See also: https://pub.dev/packages/pigeon
import Foundation

View File

@@ -104,10 +104,13 @@ struct Album: Codable, Equatable {
// MARK: API
class ImmichAPI {
typealias CustomHeaders = [String:String]
struct ServerConfig {
let serverEndpoint: String
let sessionKey: String
let customHeaders: CustomHeaders
}
let serverConfig: ServerConfig
init() async throws {
@@ -122,10 +125,20 @@ class ImmichAPI {
if serverURL == "" || sessionKey == "" {
throw WidgetError.noLogin
}
// custom headers come in the form of KV pairs in JSON
var customHeadersJSON = (defaults.string(forKey: "widget_custom_headers") ?? "")
var customHeaders: CustomHeaders = [:]
if customHeadersJSON != "",
let parsedHeaders = try? JSONDecoder().decode(CustomHeaders.self, from: customHeadersJSON.data(using: .utf8)!) {
customHeaders = parsedHeaders
}
serverConfig = ServerConfig(
serverEndpoint: serverURL,
sessionKey: sessionKey
sessionKey: sessionKey,
customHeaders: customHeaders
)
}
@@ -155,6 +168,12 @@ class ImmichAPI {
return components?.url
}
func applyCustomHeaders(for request: inout URLRequest) {
for (header, value) in serverConfig.customHeaders {
request.addValue(value, forHTTPHeaderField: header)
}
}
func fetchSearchResults(with filters: SearchFilter = Album.NONE.filter)
async throws
@@ -174,7 +193,8 @@ class ImmichAPI {
request.httpMethod = "POST"
request.httpBody = try JSONEncoder().encode(filters)
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
applyCustomHeaders(for: &request)
let (data, _) = try await URLSession.shared.data(for: request)
// decode data
@@ -196,6 +216,7 @@ class ImmichAPI {
var request = URLRequest(url: searchURL)
request.httpMethod = "GET"
applyCustomHeaders(for: &request)
let (data, _) = try await URLSession.shared.data(for: request)
@@ -254,7 +275,8 @@ class ImmichAPI {
var request = URLRequest(url: searchURL)
request.httpMethod = "GET"
applyCustomHeaders(for: &request)
let (data, _) = try await URLSession.shared.data(for: request)
// decode data
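The widget parses widget_custom_headers as a flat JSON object of header/value string pairs. A minimal sketch of producing that value on the Dart side, assuming the app's existing app-group storage (not shown) persists it under kWidgetCustomHeaders:
import 'dart:convert';
// Hypothetical helper: {'X-API-Key': 'secret'} becomes '{"X-API-Key":"secret"}'.
// How the value reaches the shared defaults is left to the app's existing mechanism.
String encodeWidgetCustomHeaders(Map<String, String> headers) => jsonEncode(headers);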

View File

@@ -22,7 +22,7 @@ platform :ios do
path: "./Runner.xcodeproj",
)
increment_version_number(
version_number: "1.136.0"
version_number: "1.137.3"
)
increment_build_number(
build_number: latest_testflight_build_number + 1,

View File

@@ -3,7 +3,7 @@ const double downloadCompleted = -1;
const double downloadFailed = -2;
// Number of log entries to retain on app start
const int kLogTruncateLimit = 250;
const int kLogTruncateLimit = 2000;
// Sync
const int kSyncEventBatchSize = 5000;
@@ -26,13 +26,14 @@ const String kDownloadGroupLivePhoto = 'group_livephoto';
// Timeline constants
const int kTimelineNoneSegmentSize = 120;
const int kTimelineAssetLoadBatchSize = 256;
const int kTimelineAssetLoadBatchSize = 1024;
const int kTimelineAssetLoadOppositeSize = 64;
// Widget keys
const String appShareGroupId = "group.app.immich.share";
const String kWidgetAuthToken = "widget_auth_token";
const String kWidgetServerEndpoint = "widget_server_url";
const String appShareGroupId = "group.app.immich.share";
const String kWidgetCustomHeaders = "widget_custom_headers";
// add widget identifiers here for new widgets
// these are used to force a widget refresh

View File

@@ -0,0 +1,18 @@
import 'package:maplibre_gl/maplibre_gl.dart';
class Marker {
final LatLng location;
final String assetId;
const Marker({required this.location, required this.assetId});
@override
bool operator ==(covariant Marker other) {
if (identical(this, other)) return true;
return other.location == location && other.assetId == assetId;
}
@override
int get hashCode => location.hashCode ^ assetId.hashCode;
}

View File

@@ -6,7 +6,6 @@ import 'package:immich_mobile/infrastructure/repositories/local_album.repository
import 'package:immich_mobile/infrastructure/repositories/local_asset.repository.dart';
import 'package:immich_mobile/infrastructure/repositories/storage.repository.dart';
import 'package:immich_mobile/platform/native_sync_api.g.dart';
import 'package:immich_mobile/presentation/pages/dev/dev_logger.dart';
import 'package:logging/logging.dart';
class HashService {
@@ -46,7 +45,6 @@ class HashService {
stopwatch.stop();
_log.info("Hashing took - ${stopwatch.elapsedMilliseconds}ms");
DLog.log("Hashing took - ${stopwatch.elapsedMilliseconds}ms");
}
/// Processes a list of [LocalAsset]s, storing their hash and updating the assets in the DB
@@ -96,14 +94,14 @@ class HashService {
if (hash?.length == 20) {
hashed.add(asset.copyWith(checksum: base64.encode(hash!)));
} else {
_log.warning("Failed to hash file for ${asset.id}");
_log.warning("Failed to hash file for ${asset.id}: ${asset.name} created at ${asset.createdAt}");
}
}
_log.fine("Hashed ${hashed.length}/${toHash.length} assets");
DLog.log("Hashed ${hashed.length}/${toHash.length} assets");
await _localAssetRepository.updateHashes(hashed);
await _storageRepository.clearCache();
}
}

View File

@@ -6,7 +6,6 @@ import 'package:immich_mobile/domain/models/album/local_album.model.dart';
import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
import 'package:immich_mobile/infrastructure/repositories/local_album.repository.dart';
import 'package:immich_mobile/platform/native_sync_api.g.dart';
import 'package:immich_mobile/presentation/pages/dev/dev_logger.dart';
import 'package:immich_mobile/utils/diff.dart';
import 'package:logging/logging.dart';
import 'package:platform/platform.dart';
@@ -30,19 +29,17 @@ class LocalSyncService {
try {
if (full || await _nativeSyncApi.shouldFullSync()) {
_log.fine("Full sync request from ${full ? "user" : "native"}");
DLog.log("Full sync request from ${full ? "user" : "native"}");
return await fullSync();
}
final delta = await _nativeSyncApi.getMediaChanges();
if (!delta.hasChanges) {
_log.fine("No media changes detected. Skipping sync");
DLog.log("No media changes detected. Skipping sync");
return;
}
DLog.log("Delta updated: ${delta.updates.length}");
DLog.log("Delta deleted: ${delta.deletes.length}");
_log.fine("Delta updated: ${delta.updates.length}");
_log.fine("Delta deleted: ${delta.deletes.length}");
final deviceAlbums = await _nativeSyncApi.getAlbums();
await _localAlbumRepository.updateAll(deviceAlbums.toLocalAlbums());
@@ -83,7 +80,6 @@ class LocalSyncService {
} finally {
stopwatch.stop();
_log.info("Device sync took - ${stopwatch.elapsedMilliseconds}ms");
DLog.log("Device sync took - ${stopwatch.elapsedMilliseconds}ms");
}
}
@@ -106,7 +102,6 @@ class LocalSyncService {
await _nativeSyncApi.checkpointSync();
stopwatch.stop();
_log.info("Full device sync took - ${stopwatch.elapsedMilliseconds}ms");
DLog.log("Full device sync took - ${stopwatch.elapsedMilliseconds}ms");
} catch (e, s) {
_log.severe("Error performing full device sync", e, s);
}
@@ -150,7 +145,6 @@ class LocalSyncService {
// Faster path - only new assets added
if (await checkAddition(dbAlbum, deviceAlbum)) {
_log.fine("Fast synced device album ${dbAlbum.name}");
DLog.log("Fast synced device album ${dbAlbum.name}");
return true;
}

View File

@@ -14,7 +14,7 @@ import 'package:logging/logging.dart';
/// writes them to a persistent [ILogRepository], and manages log levels
/// via [IStoreRepository]
class LogService {
final IsarLogRepository _logRepository;
final LogRepository _logRepository;
final IsarStoreRepository _storeRepository;
final List<LogMessage> _msgBuffer = [];
@@ -37,7 +37,7 @@ class LogService {
}
static Future<LogService> init({
required IsarLogRepository logRepository,
required LogRepository logRepository,
required IsarStoreRepository storeRepository,
bool shouldBuffer = true,
}) async {
@@ -50,7 +50,7 @@ class LogService {
}
static Future<LogService> create({
required IsarLogRepository logRepository,
required LogRepository logRepository,
required IsarStoreRepository storeRepository,
bool shouldBuffer = true,
}) async {
@@ -85,7 +85,7 @@ class LogService {
if (_shouldBuffer) {
_msgBuffer.add(record);
_flushTimer ??= Timer(const Duration(seconds: 5), () => unawaited(flushBuffer()));
_flushTimer ??= Timer(const Duration(seconds: 5), () => unawaited(_flushBuffer()));
} else {
unawaited(_logRepository.insert(record));
}
@@ -108,20 +108,18 @@ class LogService {
await _logRepository.deleteAll();
}
void flush() {
Future<void> flush() {
_flushTimer?.cancel();
// TODO: Rename enable this after moving to sqlite - #16504
// await _flushBufferToDatabase();
return _flushBuffer();
}
Future<void> dispose() {
_flushTimer?.cancel();
_logSubscription.cancel();
return flushBuffer();
return _flushBuffer();
}
// TODO: Move this to private once Isar is removed
Future<void> flushBuffer() async {
Future<void> _flushBuffer() async {
_flushTimer = null;
final buffer = [..._msgBuffer];
_msgBuffer.clear();

View File

@@ -0,0 +1,23 @@
import 'package:immich_mobile/domain/models/map.model.dart';
import 'package:immich_mobile/infrastructure/repositories/map.repository.dart';
import 'package:maplibre_gl/maplibre_gl.dart';
typedef MapMarkerSource = Future<List<Marker>> Function(LatLngBounds? bounds);
typedef MapQuery = ({MapMarkerSource markerSource});
class MapFactory {
final DriftMapRepository _mapRepository;
const MapFactory({required DriftMapRepository mapRepository}) : _mapRepository = mapRepository;
MapService remote(String ownerId) => MapService(_mapRepository.remote(ownerId));
}
class MapService {
final MapMarkerSource _markerSource;
MapService(MapQuery query) : _markerSource = query.markerSource;
Future<List<Marker>> Function(LatLngBounds? bounds) get getMarkers => _markerSource;
}
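getMarkers simply exposes the bound marker source. A minimal sketch of fetching markers for the visible viewport, written as if it sat alongside the factory above; the MapFactory instance and bounds would come from the app's providers and map controller:
// Hypothetical caller: resolves one user's markers within the current bounds.
Future<List<Marker>> loadVisibleMarkers(MapFactory mapFactory, String ownerId, LatLngBounds bounds) {
  // remote(ownerId) binds the drift-backed marker source for that user.
  return mapFactory.remote(ownerId).getMarkers(bounds);
}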

View File

@@ -1,12 +1,12 @@
import 'dart:async';
import 'package:collection/collection.dart';
import 'package:immich_mobile/domain/models/album/album.model.dart';
import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
import 'package:immich_mobile/domain/models/user.model.dart';
import 'package:immich_mobile/infrastructure/repositories/remote_album.repository.dart';
import 'package:immich_mobile/models/albums/album_search.model.dart';
import 'package:immich_mobile/repositories/drift_album_api_repository.dart';
import 'package:immich_mobile/utils/remote_album.utils.dart';
class RemoteAlbumService {
final DriftRemoteAlbumRepository _repository;
@@ -26,8 +26,21 @@ class RemoteAlbumService {
return _repository.get(albumId);
}
List<RemoteAlbum> sortAlbums(List<RemoteAlbum> albums, RemoteAlbumSortMode sortMode, {bool isReverse = false}) {
return sortMode.sortFn(albums, isReverse);
Future<List<RemoteAlbum>> sortAlbums(
List<RemoteAlbum> albums,
RemoteAlbumSortMode sortMode, {
bool isReverse = false,
}) async {
final List<RemoteAlbum> sorted = switch (sortMode) {
RemoteAlbumSortMode.created => albums.sortedBy((album) => album.createdAt),
RemoteAlbumSortMode.title => albums.sortedBy((album) => album.name),
RemoteAlbumSortMode.lastModified => albums.sortedBy((album) => album.updatedAt),
RemoteAlbumSortMode.assetCount => albums.sortedBy((album) => album.assetCount),
RemoteAlbumSortMode.mostRecent => await _sortByNewestAsset(albums),
RemoteAlbumSortMode.mostOldest => await _sortByOldestAsset(albums),
};
return (isReverse ? sorted.reversed : sorted).toList();
}
List<RemoteAlbum> searchAlbums(
@@ -128,7 +141,75 @@ class RemoteAlbumService {
return _repository.addUsers(albumId, userIds);
}
Future<void> removeUser(String albumId, {required String userId}) async {
await _albumApiRepository.removeUser(albumId, userId: userId);
return _repository.removeUser(albumId, userId: userId);
}
Future<void> setActivityStatus(String albumId, bool enabled) async {
await _albumApiRepository.setActivityStatus(albumId, enabled);
return _repository.setActivityStatus(albumId, enabled);
}
Future<int> getCount() {
return _repository.getCount();
}
Future<List<RemoteAlbum>> _sortByNewestAsset(List<RemoteAlbum> albums) async {
// map album IDs to their newest asset dates
final Map<String, Future<DateTime?>> assetTimestampFutures = {};
for (final album in albums) {
assetTimestampFutures[album.id] = _repository.getNewestAssetTimestamp(album.id);
}
// await all database queries
final entries = await Future.wait(
assetTimestampFutures.entries.map((entry) async => MapEntry(entry.key, await entry.value)),
);
final assetTimestamps = Map.fromEntries(entries);
final sorted = albums.sorted((a, b) {
final aDate = assetTimestamps[a.id] ?? DateTime.fromMillisecondsSinceEpoch(0);
final bDate = assetTimestamps[b.id] ?? DateTime.fromMillisecondsSinceEpoch(0);
return aDate.compareTo(bDate);
});
return sorted;
}
Future<List<RemoteAlbum>> _sortByOldestAsset(List<RemoteAlbum> albums) async {
// map album IDs to their oldest asset dates
final Map<String, Future<DateTime?>> assetTimestampFutures = {
for (final album in albums) album.id: _repository.getOldestAssetTimestamp(album.id),
};
// await all database queries
final entries = await Future.wait(
assetTimestampFutures.entries.map((entry) async => MapEntry(entry.key, await entry.value)),
);
final assetTimestamps = Map.fromEntries(entries);
final sorted = albums.sorted((a, b) {
final aDate = assetTimestamps[a.id] ?? DateTime.fromMillisecondsSinceEpoch(0);
final bDate = assetTimestamps[b.id] ?? DateTime.fromMillisecondsSinceEpoch(0);
return aDate.compareTo(bDate);
});
return sorted.reversed.toList();
}
}
enum RemoteAlbumSortMode {
title("library_page_sort_title"),
assetCount("library_page_sort_asset_count"),
lastModified("library_page_sort_last_modified"),
created("library_page_sort_created"),
mostRecent("sort_newest"),
mostOldest("sort_oldest");
final String key;
const RemoteAlbumSortMode(this.key);
}
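Because the newest/oldest modes now resolve per-album asset timestamps from the database, sortAlbums is asynchronous and must be awaited. A minimal caller sketch, written as if it sat alongside the service above; the RemoteAlbumService instance comes from the app's providers:
// Hypothetical caller: sorts albums using each album's newest asset timestamp.
Future<List<RemoteAlbum>> sortedByMostRecent(RemoteAlbumService albumService, List<RemoteAlbum> albums) {
  return albumService.sortAlbums(albums, RemoteAlbumSortMode.mostRecent);
}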

View File

@@ -3,7 +3,6 @@ import 'dart:async';
import 'package:immich_mobile/domain/models/sync_event.model.dart';
import 'package:immich_mobile/infrastructure/repositories/sync_api.repository.dart';
import 'package:immich_mobile/infrastructure/repositories/sync_stream.repository.dart';
import 'package:immich_mobile/presentation/pages/dev/dev_logger.dart';
import 'package:logging/logging.dart';
import 'package:openapi/api.dart';
@@ -26,7 +25,6 @@ class SyncStreamService {
Future<void> sync() {
_logger.info("Remote sync request for user");
DLog.log("Remote sync request for user");
// Start the sync stream and handle events
return _syncApiRepository.streamChanges(_handleEvents);
}
@@ -139,14 +137,18 @@ class SyncStreamService {
return _syncStreamRepository.updateAlbumUsersV1(data.cast(), debugLabel: 'backfill');
case SyncEntityType.albumUserDeleteV1:
return _syncStreamRepository.deleteAlbumUsersV1(data.cast());
case SyncEntityType.albumAssetV1:
return _syncStreamRepository.updateAssetsV1(data.cast(), debugLabel: 'album');
case SyncEntityType.albumAssetCreateV1:
return _syncStreamRepository.updateAssetsV1(data.cast(), debugLabel: 'album asset create');
case SyncEntityType.albumAssetUpdateV1:
return _syncStreamRepository.updateAssetsV1(data.cast(), debugLabel: 'album asset update');
case SyncEntityType.albumAssetBackfillV1:
return _syncStreamRepository.updateAssetsV1(data.cast(), debugLabel: 'album backfill');
case SyncEntityType.albumAssetExifV1:
return _syncStreamRepository.updateAssetsExifV1(data.cast(), debugLabel: 'album');
return _syncStreamRepository.updateAssetsV1(data.cast(), debugLabel: 'album asset backfill');
case SyncEntityType.albumAssetExifCreateV1:
return _syncStreamRepository.updateAssetsExifV1(data.cast(), debugLabel: 'album asset exif create');
case SyncEntityType.albumAssetExifUpdateV1:
return _syncStreamRepository.updateAssetsExifV1(data.cast(), debugLabel: 'album asset exif update');
case SyncEntityType.albumAssetExifBackfillV1:
return _syncStreamRepository.updateAssetsExifV1(data.cast(), debugLabel: 'album backfill');
return _syncStreamRepository.updateAssetsExifV1(data.cast(), debugLabel: 'album asset exif backfill');
case SyncEntityType.albumToAssetV1:
return _syncStreamRepository.updateAlbumToAssetsV1(data.cast());
case SyncEntityType.albumToAssetBackfillV1:

View File

@@ -10,6 +10,7 @@ import 'package:immich_mobile/domain/services/setting.service.dart';
import 'package:immich_mobile/domain/utils/event_stream.dart';
import 'package:immich_mobile/infrastructure/repositories/timeline.repository.dart';
import 'package:immich_mobile/utils/async_mutex.dart';
import 'package:maplibre_gl/maplibre_gl.dart';
typedef TimelineAssetSource = Future<List<BaseAsset>> Function(int index, int count);
@@ -57,6 +58,8 @@ class TimelineFactory {
TimelineService(_timelineRepository.person(userId, personId, groupBy));
TimelineService fromAssets(List<BaseAsset> assets) => TimelineService(_timelineRepository.fromAssets(assets));
TimelineService map(LatLngBounds bounds) => TimelineService(_timelineRepository.map(bounds, groupBy));
}
class TimelineService {

View File

@@ -37,7 +37,7 @@ class BackgroundSyncManager {
this.onHashingError,
});
Future<void> cancel() {
Future<void> cancel() async {
final futures = <Future>[];
if (_syncTask != null) {
@@ -52,7 +52,11 @@ class BackgroundSyncManager {
_syncWebsocketTask?.cancel();
_syncWebsocketTask = null;
return Future.wait(futures);
try {
await Future.wait(futures);
} on CanceledError {
// Ignore cancellation errors
}
}
// No need to cancel the task, as it can also be run when the user logs out

View File

@@ -13,6 +13,9 @@ extension ContextHelper on BuildContext {
// Returns the current height from MediaQuery
double get height => MediaQuery.sizeOf(this).height;
// Returns the current size from MediaQuery
Size get sizeData => MediaQuery.sizeOf(this);
// Returns true if the app is running on a mobile device (!tablets)
bool get isMobile => width < 550;

View File

@@ -0,0 +1,10 @@
import 'dart:ui';
import 'package:flutter/painting.dart';
extension CodecImageInfoExtension on Codec {
Future<ImageInfo> getImageInfo({double scale = 1.0}) async {
final frame = await getNextFrame();
return ImageInfo(image: frame.image, scale: scale);
}
}

View File

@@ -85,3 +85,13 @@ extension DateRangeFormatting on DateTime {
}
}
}
extension IsSameExtension on DateTime {
bool isSameDay(DateTime other) {
return day == other.day && month == other.month && year == other.year;
}
bool isSameMonth(DateTime other) {
return month == other.month && year == other.year;
}
}

View File

@@ -32,3 +32,31 @@ class FastClampingScrollPhysics extends ClampingScrollPhysics {
damping: 80,
);
}
class ScrollUnawareScrollPhysics extends ScrollPhysics {
const ScrollUnawareScrollPhysics({super.parent});
@override
ScrollUnawareScrollPhysics applyTo(ScrollPhysics? ancestor) {
return ScrollUnawareScrollPhysics(parent: buildParent(ancestor));
}
@override
bool recommendDeferredLoading(double velocity, ScrollMetrics metrics, BuildContext context) {
return false;
}
}
class ScrollUnawareClampingScrollPhysics extends ClampingScrollPhysics {
const ScrollUnawareClampingScrollPhysics({super.parent});
@override
ScrollUnawareClampingScrollPhysics applyTo(ScrollPhysics? ancestor) {
return ScrollUnawareClampingScrollPhysics(parent: buildParent(ancestor));
}
@override
bool recommendDeferredLoading(double velocity, ScrollMetrics metrics, BuildContext context) {
return false;
}
}
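Both physics report recommendDeferredLoading as false, so image decoding is not deferred during fast flings. A minimal usage sketch, assuming a plain ListView rather than the app's actual timeline scrollable (import of the physics file above omitted):
import 'package:flutter/material.dart';
// Hypothetical usage: opts the list out of deferred loading while keeping
// Android-style clamping overscroll.
Widget buildFeed(List<Widget> tiles) {
  return ListView(physics: const ScrollUnawareClampingScrollPhysics(), children: tiles);
}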

View File

@@ -95,6 +95,7 @@ class ExifInfo {
);
}
@TableIndex.sql('CREATE INDEX IF NOT EXISTS idx_lat_lng ON remote_exif_entity (latitude, longitude)')
class RemoteExifEntity extends Table with DriftDefaultsMixin {
const RemoteExifEntity();

View File

@@ -699,6 +699,10 @@ typedef $$RemoteExifEntityTableProcessedTableManager =
i1.RemoteExifEntityData,
i0.PrefetchHooks Function({bool assetId})
>;
i0.Index get idxLatLng => i0.Index(
'idx_lat_lng',
'CREATE INDEX IF NOT EXISTS idx_lat_lng ON remote_exif_entity (latitude, longitude)',
);
class $RemoteExifEntityTable extends i2.RemoteExifEntity
with i0.TableInfo<$RemoteExifEntityTable, i1.RemoteExifEntityData> {

View File

@@ -4,7 +4,7 @@ import 'package:immich_mobile/infrastructure/entities/local_asset.entity.drift.d
import 'package:immich_mobile/infrastructure/utils/asset.mixin.dart';
import 'package:immich_mobile/infrastructure/utils/drift_default.mixin.dart';
@TableIndex(name: 'idx_local_asset_checksum', columns: {#checksum})
@TableIndex.sql('CREATE INDEX IF NOT EXISTS idx_local_asset_checksum ON local_asset_entity (checksum)')
class LocalAssetEntity extends Table with DriftDefaultsMixin, AssetEntityMixin {
const LocalAssetEntity();

View File

@@ -338,7 +338,7 @@ typedef $$LocalAssetEntityTableProcessedTableManager =
>;
i0.Index get idxLocalAssetChecksum => i0.Index(
'idx_local_asset_checksum',
'CREATE INDEX idx_local_asset_checksum ON local_asset_entity (checksum)',
'CREATE INDEX IF NOT EXISTS idx_local_asset_checksum ON local_asset_entity (checksum)',
);
class $LocalAssetEntityTable extends i3.LocalAssetEntity

View File

@@ -1,47 +1,29 @@
import 'package:immich_mobile/domain/models/log.model.dart';
import 'package:isar/isar.dart';
import 'package:drift/drift.dart';
import 'package:immich_mobile/infrastructure/entities/log.entity.drift.dart';
import 'package:immich_mobile/domain/models/log.model.dart' as domain;
part 'log.entity.g.dart';
class LogMessageEntity extends Table {
const LogMessageEntity();
@Collection(inheritance: false)
class LoggerMessage {
final Id id = Isar.autoIncrement;
final String message;
final String? details;
@Enumerated(EnumType.ordinal)
final LogLevel level;
final DateTime createdAt;
final String? context1;
final String? context2;
@override
String get tableName => 'logger_messages';
const LoggerMessage({
required this.message,
required this.details,
this.level = LogLevel.info,
required this.createdAt,
required this.context1,
required this.context2,
});
LogMessage toDto() {
return LogMessage(
message: message,
level: level,
createdAt: createdAt,
logger: context1,
error: details,
stack: context2,
);
}
static LoggerMessage fromDto(LogMessage log) {
return LoggerMessage(
message: log.message,
details: log.error,
level: log.level,
createdAt: log.createdAt,
context1: log.logger,
context2: log.stack,
);
}
IntColumn get id => integer().autoIncrement()();
TextColumn get message => text()();
TextColumn get details => text().nullable()();
IntColumn get level => intEnum<domain.LogLevel>()();
DateTimeColumn get createdAt => dateTime()();
TextColumn get logger => text().nullable()();
TextColumn get stack => text().nullable()();
}
extension LogMessageEntityDataDomainEx on LogMessageEntityData {
domain.LogMessage toDto() => domain.LogMessage(
message: message,
level: level,
createdAt: createdAt,
logger: logger,
error: details,
stack: stack,
);
}

View File

@@ -0,0 +1,697 @@
// dart format width=80
// ignore_for_file: type=lint
import 'package:drift/drift.dart' as i0;
import 'package:immich_mobile/infrastructure/entities/log.entity.drift.dart'
as i1;
import 'package:immich_mobile/domain/models/log.model.dart' as i2;
import 'package:immich_mobile/infrastructure/entities/log.entity.dart' as i3;
typedef $$LogMessageEntityTableCreateCompanionBuilder =
i1.LogMessageEntityCompanion Function({
i0.Value<int> id,
required String message,
i0.Value<String?> details,
required i2.LogLevel level,
required DateTime createdAt,
i0.Value<String?> logger,
i0.Value<String?> stack,
});
typedef $$LogMessageEntityTableUpdateCompanionBuilder =
i1.LogMessageEntityCompanion Function({
i0.Value<int> id,
i0.Value<String> message,
i0.Value<String?> details,
i0.Value<i2.LogLevel> level,
i0.Value<DateTime> createdAt,
i0.Value<String?> logger,
i0.Value<String?> stack,
});
class $$LogMessageEntityTableFilterComposer
extends i0.Composer<i0.GeneratedDatabase, i1.$LogMessageEntityTable> {
$$LogMessageEntityTableFilterComposer({
required super.$db,
required super.$table,
super.joinBuilder,
super.$addJoinBuilderToRootComposer,
super.$removeJoinBuilderFromRootComposer,
});
i0.ColumnFilters<int> get id => $composableBuilder(
column: $table.id,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnFilters<String> get message => $composableBuilder(
column: $table.message,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnFilters<String> get details => $composableBuilder(
column: $table.details,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnWithTypeConverterFilters<i2.LogLevel, i2.LogLevel, int> get level =>
$composableBuilder(
column: $table.level,
builder: (column) => i0.ColumnWithTypeConverterFilters(column),
);
i0.ColumnFilters<DateTime> get createdAt => $composableBuilder(
column: $table.createdAt,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnFilters<String> get logger => $composableBuilder(
column: $table.logger,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnFilters<String> get stack => $composableBuilder(
column: $table.stack,
builder: (column) => i0.ColumnFilters(column),
);
}
class $$LogMessageEntityTableOrderingComposer
extends i0.Composer<i0.GeneratedDatabase, i1.$LogMessageEntityTable> {
$$LogMessageEntityTableOrderingComposer({
required super.$db,
required super.$table,
super.joinBuilder,
super.$addJoinBuilderToRootComposer,
super.$removeJoinBuilderFromRootComposer,
});
i0.ColumnOrderings<int> get id => $composableBuilder(
column: $table.id,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<String> get message => $composableBuilder(
column: $table.message,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<String> get details => $composableBuilder(
column: $table.details,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<int> get level => $composableBuilder(
column: $table.level,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<DateTime> get createdAt => $composableBuilder(
column: $table.createdAt,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<String> get logger => $composableBuilder(
column: $table.logger,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<String> get stack => $composableBuilder(
column: $table.stack,
builder: (column) => i0.ColumnOrderings(column),
);
}
class $$LogMessageEntityTableAnnotationComposer
extends i0.Composer<i0.GeneratedDatabase, i1.$LogMessageEntityTable> {
$$LogMessageEntityTableAnnotationComposer({
required super.$db,
required super.$table,
super.joinBuilder,
super.$addJoinBuilderToRootComposer,
super.$removeJoinBuilderFromRootComposer,
});
i0.GeneratedColumn<int> get id =>
$composableBuilder(column: $table.id, builder: (column) => column);
i0.GeneratedColumn<String> get message =>
$composableBuilder(column: $table.message, builder: (column) => column);
i0.GeneratedColumn<String> get details =>
$composableBuilder(column: $table.details, builder: (column) => column);
i0.GeneratedColumnWithTypeConverter<i2.LogLevel, int> get level =>
$composableBuilder(column: $table.level, builder: (column) => column);
i0.GeneratedColumn<DateTime> get createdAt =>
$composableBuilder(column: $table.createdAt, builder: (column) => column);
i0.GeneratedColumn<String> get logger =>
$composableBuilder(column: $table.logger, builder: (column) => column);
i0.GeneratedColumn<String> get stack =>
$composableBuilder(column: $table.stack, builder: (column) => column);
}
class $$LogMessageEntityTableTableManager
extends
i0.RootTableManager<
i0.GeneratedDatabase,
i1.$LogMessageEntityTable,
i1.LogMessageEntityData,
i1.$$LogMessageEntityTableFilterComposer,
i1.$$LogMessageEntityTableOrderingComposer,
i1.$$LogMessageEntityTableAnnotationComposer,
$$LogMessageEntityTableCreateCompanionBuilder,
$$LogMessageEntityTableUpdateCompanionBuilder,
(
i1.LogMessageEntityData,
i0.BaseReferences<
i0.GeneratedDatabase,
i1.$LogMessageEntityTable,
i1.LogMessageEntityData
>,
),
i1.LogMessageEntityData,
i0.PrefetchHooks Function()
> {
$$LogMessageEntityTableTableManager(
i0.GeneratedDatabase db,
i1.$LogMessageEntityTable table,
) : super(
i0.TableManagerState(
db: db,
table: table,
createFilteringComposer: () =>
i1.$$LogMessageEntityTableFilterComposer($db: db, $table: table),
createOrderingComposer: () => i1
.$$LogMessageEntityTableOrderingComposer($db: db, $table: table),
createComputedFieldComposer: () =>
i1.$$LogMessageEntityTableAnnotationComposer(
$db: db,
$table: table,
),
updateCompanionCallback:
({
i0.Value<int> id = const i0.Value.absent(),
i0.Value<String> message = const i0.Value.absent(),
i0.Value<String?> details = const i0.Value.absent(),
i0.Value<i2.LogLevel> level = const i0.Value.absent(),
i0.Value<DateTime> createdAt = const i0.Value.absent(),
i0.Value<String?> logger = const i0.Value.absent(),
i0.Value<String?> stack = const i0.Value.absent(),
}) => i1.LogMessageEntityCompanion(
id: id,
message: message,
details: details,
level: level,
createdAt: createdAt,
logger: logger,
stack: stack,
),
createCompanionCallback:
({
i0.Value<int> id = const i0.Value.absent(),
required String message,
i0.Value<String?> details = const i0.Value.absent(),
required i2.LogLevel level,
required DateTime createdAt,
i0.Value<String?> logger = const i0.Value.absent(),
i0.Value<String?> stack = const i0.Value.absent(),
}) => i1.LogMessageEntityCompanion.insert(
id: id,
message: message,
details: details,
level: level,
createdAt: createdAt,
logger: logger,
stack: stack,
),
withReferenceMapper: (p0) => p0
.map((e) => (e.readTable(table), i0.BaseReferences(db, table, e)))
.toList(),
prefetchHooksCallback: null,
),
);
}
typedef $$LogMessageEntityTableProcessedTableManager =
i0.ProcessedTableManager<
i0.GeneratedDatabase,
i1.$LogMessageEntityTable,
i1.LogMessageEntityData,
i1.$$LogMessageEntityTableFilterComposer,
i1.$$LogMessageEntityTableOrderingComposer,
i1.$$LogMessageEntityTableAnnotationComposer,
$$LogMessageEntityTableCreateCompanionBuilder,
$$LogMessageEntityTableUpdateCompanionBuilder,
(
i1.LogMessageEntityData,
i0.BaseReferences<
i0.GeneratedDatabase,
i1.$LogMessageEntityTable,
i1.LogMessageEntityData
>,
),
i1.LogMessageEntityData,
i0.PrefetchHooks Function()
>;
class $LogMessageEntityTable extends i3.LogMessageEntity
with i0.TableInfo<$LogMessageEntityTable, i1.LogMessageEntityData> {
@override
final i0.GeneratedDatabase attachedDatabase;
final String? _alias;
$LogMessageEntityTable(this.attachedDatabase, [this._alias]);
static const i0.VerificationMeta _idMeta = const i0.VerificationMeta('id');
@override
late final i0.GeneratedColumn<int> id = i0.GeneratedColumn<int>(
'id',
aliasedName,
false,
hasAutoIncrement: true,
type: i0.DriftSqlType.int,
requiredDuringInsert: false,
defaultConstraints: i0.GeneratedColumn.constraintIsAlways(
'PRIMARY KEY AUTOINCREMENT',
),
);
static const i0.VerificationMeta _messageMeta = const i0.VerificationMeta(
'message',
);
@override
late final i0.GeneratedColumn<String> message = i0.GeneratedColumn<String>(
'message',
aliasedName,
false,
type: i0.DriftSqlType.string,
requiredDuringInsert: true,
);
static const i0.VerificationMeta _detailsMeta = const i0.VerificationMeta(
'details',
);
@override
late final i0.GeneratedColumn<String> details = i0.GeneratedColumn<String>(
'details',
aliasedName,
true,
type: i0.DriftSqlType.string,
requiredDuringInsert: false,
);
@override
late final i0.GeneratedColumnWithTypeConverter<i2.LogLevel, int> level =
i0.GeneratedColumn<int>(
'level',
aliasedName,
false,
type: i0.DriftSqlType.int,
requiredDuringInsert: true,
).withConverter<i2.LogLevel>(i1.$LogMessageEntityTable.$converterlevel);
static const i0.VerificationMeta _createdAtMeta = const i0.VerificationMeta(
'createdAt',
);
@override
late final i0.GeneratedColumn<DateTime> createdAt =
i0.GeneratedColumn<DateTime>(
'created_at',
aliasedName,
false,
type: i0.DriftSqlType.dateTime,
requiredDuringInsert: true,
);
static const i0.VerificationMeta _loggerMeta = const i0.VerificationMeta(
'logger',
);
@override
late final i0.GeneratedColumn<String> logger = i0.GeneratedColumn<String>(
'logger',
aliasedName,
true,
type: i0.DriftSqlType.string,
requiredDuringInsert: false,
);
static const i0.VerificationMeta _stackMeta = const i0.VerificationMeta(
'stack',
);
@override
late final i0.GeneratedColumn<String> stack = i0.GeneratedColumn<String>(
'stack',
aliasedName,
true,
type: i0.DriftSqlType.string,
requiredDuringInsert: false,
);
@override
List<i0.GeneratedColumn> get $columns => [
id,
message,
details,
level,
createdAt,
logger,
stack,
];
@override
String get aliasedName => _alias ?? actualTableName;
@override
String get actualTableName => $name;
static const String $name = 'logger_messages';
@override
i0.VerificationContext validateIntegrity(
i0.Insertable<i1.LogMessageEntityData> instance, {
bool isInserting = false,
}) {
final context = i0.VerificationContext();
final data = instance.toColumns(true);
if (data.containsKey('id')) {
context.handle(_idMeta, id.isAcceptableOrUnknown(data['id']!, _idMeta));
}
if (data.containsKey('message')) {
context.handle(
_messageMeta,
message.isAcceptableOrUnknown(data['message']!, _messageMeta),
);
} else if (isInserting) {
context.missing(_messageMeta);
}
if (data.containsKey('details')) {
context.handle(
_detailsMeta,
details.isAcceptableOrUnknown(data['details']!, _detailsMeta),
);
}
if (data.containsKey('created_at')) {
context.handle(
_createdAtMeta,
createdAt.isAcceptableOrUnknown(data['created_at']!, _createdAtMeta),
);
} else if (isInserting) {
context.missing(_createdAtMeta);
}
if (data.containsKey('logger')) {
context.handle(
_loggerMeta,
logger.isAcceptableOrUnknown(data['logger']!, _loggerMeta),
);
}
if (data.containsKey('stack')) {
context.handle(
_stackMeta,
stack.isAcceptableOrUnknown(data['stack']!, _stackMeta),
);
}
return context;
}
@override
Set<i0.GeneratedColumn> get $primaryKey => {id};
@override
i1.LogMessageEntityData map(
Map<String, dynamic> data, {
String? tablePrefix,
}) {
final effectivePrefix = tablePrefix != null ? '$tablePrefix.' : '';
return i1.LogMessageEntityData(
id: attachedDatabase.typeMapping.read(
i0.DriftSqlType.int,
data['${effectivePrefix}id'],
)!,
message: attachedDatabase.typeMapping.read(
i0.DriftSqlType.string,
data['${effectivePrefix}message'],
)!,
details: attachedDatabase.typeMapping.read(
i0.DriftSqlType.string,
data['${effectivePrefix}details'],
),
level: i1.$LogMessageEntityTable.$converterlevel.fromSql(
attachedDatabase.typeMapping.read(
i0.DriftSqlType.int,
data['${effectivePrefix}level'],
)!,
),
createdAt: attachedDatabase.typeMapping.read(
i0.DriftSqlType.dateTime,
data['${effectivePrefix}created_at'],
)!,
logger: attachedDatabase.typeMapping.read(
i0.DriftSqlType.string,
data['${effectivePrefix}logger'],
),
stack: attachedDatabase.typeMapping.read(
i0.DriftSqlType.string,
data['${effectivePrefix}stack'],
),
);
}
@override
$LogMessageEntityTable createAlias(String alias) {
return $LogMessageEntityTable(attachedDatabase, alias);
}
static i0.JsonTypeConverter2<i2.LogLevel, int, int> $converterlevel =
const i0.EnumIndexConverter<i2.LogLevel>(i2.LogLevel.values);
}
class LogMessageEntityData extends i0.DataClass
implements i0.Insertable<i1.LogMessageEntityData> {
final int id;
final String message;
final String? details;
final i2.LogLevel level;
final DateTime createdAt;
final String? logger;
final String? stack;
const LogMessageEntityData({
required this.id,
required this.message,
this.details,
required this.level,
required this.createdAt,
this.logger,
this.stack,
});
@override
Map<String, i0.Expression> toColumns(bool nullToAbsent) {
final map = <String, i0.Expression>{};
map['id'] = i0.Variable<int>(id);
map['message'] = i0.Variable<String>(message);
if (!nullToAbsent || details != null) {
map['details'] = i0.Variable<String>(details);
}
{
map['level'] = i0.Variable<int>(
i1.$LogMessageEntityTable.$converterlevel.toSql(level),
);
}
map['created_at'] = i0.Variable<DateTime>(createdAt);
if (!nullToAbsent || logger != null) {
map['logger'] = i0.Variable<String>(logger);
}
if (!nullToAbsent || stack != null) {
map['stack'] = i0.Variable<String>(stack);
}
return map;
}
factory LogMessageEntityData.fromJson(
Map<String, dynamic> json, {
i0.ValueSerializer? serializer,
}) {
serializer ??= i0.driftRuntimeOptions.defaultSerializer;
return LogMessageEntityData(
id: serializer.fromJson<int>(json['id']),
message: serializer.fromJson<String>(json['message']),
details: serializer.fromJson<String?>(json['details']),
level: i1.$LogMessageEntityTable.$converterlevel.fromJson(
serializer.fromJson<int>(json['level']),
),
createdAt: serializer.fromJson<DateTime>(json['createdAt']),
logger: serializer.fromJson<String?>(json['logger']),
stack: serializer.fromJson<String?>(json['stack']),
);
}
@override
Map<String, dynamic> toJson({i0.ValueSerializer? serializer}) {
serializer ??= i0.driftRuntimeOptions.defaultSerializer;
return <String, dynamic>{
'id': serializer.toJson<int>(id),
'message': serializer.toJson<String>(message),
'details': serializer.toJson<String?>(details),
'level': serializer.toJson<int>(
i1.$LogMessageEntityTable.$converterlevel.toJson(level),
),
'createdAt': serializer.toJson<DateTime>(createdAt),
'logger': serializer.toJson<String?>(logger),
'stack': serializer.toJson<String?>(stack),
};
}
i1.LogMessageEntityData copyWith({
int? id,
String? message,
i0.Value<String?> details = const i0.Value.absent(),
i2.LogLevel? level,
DateTime? createdAt,
i0.Value<String?> logger = const i0.Value.absent(),
i0.Value<String?> stack = const i0.Value.absent(),
}) => i1.LogMessageEntityData(
id: id ?? this.id,
message: message ?? this.message,
details: details.present ? details.value : this.details,
level: level ?? this.level,
createdAt: createdAt ?? this.createdAt,
logger: logger.present ? logger.value : this.logger,
stack: stack.present ? stack.value : this.stack,
);
LogMessageEntityData copyWithCompanion(i1.LogMessageEntityCompanion data) {
return LogMessageEntityData(
id: data.id.present ? data.id.value : this.id,
message: data.message.present ? data.message.value : this.message,
details: data.details.present ? data.details.value : this.details,
level: data.level.present ? data.level.value : this.level,
createdAt: data.createdAt.present ? data.createdAt.value : this.createdAt,
logger: data.logger.present ? data.logger.value : this.logger,
stack: data.stack.present ? data.stack.value : this.stack,
);
}
@override
String toString() {
return (StringBuffer('LogMessageEntityData(')
..write('id: $id, ')
..write('message: $message, ')
..write('details: $details, ')
..write('level: $level, ')
..write('createdAt: $createdAt, ')
..write('logger: $logger, ')
..write('stack: $stack')
..write(')'))
.toString();
}
@override
int get hashCode =>
Object.hash(id, message, details, level, createdAt, logger, stack);
@override
bool operator ==(Object other) =>
identical(this, other) ||
(other is i1.LogMessageEntityData &&
other.id == this.id &&
other.message == this.message &&
other.details == this.details &&
other.level == this.level &&
other.createdAt == this.createdAt &&
other.logger == this.logger &&
other.stack == this.stack);
}
class LogMessageEntityCompanion
extends i0.UpdateCompanion<i1.LogMessageEntityData> {
final i0.Value<int> id;
final i0.Value<String> message;
final i0.Value<String?> details;
final i0.Value<i2.LogLevel> level;
final i0.Value<DateTime> createdAt;
final i0.Value<String?> logger;
final i0.Value<String?> stack;
const LogMessageEntityCompanion({
this.id = const i0.Value.absent(),
this.message = const i0.Value.absent(),
this.details = const i0.Value.absent(),
this.level = const i0.Value.absent(),
this.createdAt = const i0.Value.absent(),
this.logger = const i0.Value.absent(),
this.stack = const i0.Value.absent(),
});
LogMessageEntityCompanion.insert({
this.id = const i0.Value.absent(),
required String message,
this.details = const i0.Value.absent(),
required i2.LogLevel level,
required DateTime createdAt,
this.logger = const i0.Value.absent(),
this.stack = const i0.Value.absent(),
}) : message = i0.Value(message),
level = i0.Value(level),
createdAt = i0.Value(createdAt);
static i0.Insertable<i1.LogMessageEntityData> custom({
i0.Expression<int>? id,
i0.Expression<String>? message,
i0.Expression<String>? details,
i0.Expression<int>? level,
i0.Expression<DateTime>? createdAt,
i0.Expression<String>? logger,
i0.Expression<String>? stack,
}) {
return i0.RawValuesInsertable({
if (id != null) 'id': id,
if (message != null) 'message': message,
if (details != null) 'details': details,
if (level != null) 'level': level,
if (createdAt != null) 'created_at': createdAt,
if (logger != null) 'logger': logger,
if (stack != null) 'stack': stack,
});
}
i1.LogMessageEntityCompanion copyWith({
i0.Value<int>? id,
i0.Value<String>? message,
i0.Value<String?>? details,
i0.Value<i2.LogLevel>? level,
i0.Value<DateTime>? createdAt,
i0.Value<String?>? logger,
i0.Value<String?>? stack,
}) {
return i1.LogMessageEntityCompanion(
id: id ?? this.id,
message: message ?? this.message,
details: details ?? this.details,
level: level ?? this.level,
createdAt: createdAt ?? this.createdAt,
logger: logger ?? this.logger,
stack: stack ?? this.stack,
);
}
@override
Map<String, i0.Expression> toColumns(bool nullToAbsent) {
final map = <String, i0.Expression>{};
if (id.present) {
map['id'] = i0.Variable<int>(id.value);
}
if (message.present) {
map['message'] = i0.Variable<String>(message.value);
}
if (details.present) {
map['details'] = i0.Variable<String>(details.value);
}
if (level.present) {
map['level'] = i0.Variable<int>(
i1.$LogMessageEntityTable.$converterlevel.toSql(level.value),
);
}
if (createdAt.present) {
map['created_at'] = i0.Variable<DateTime>(createdAt.value);
}
if (logger.present) {
map['logger'] = i0.Variable<String>(logger.value);
}
if (stack.present) {
map['stack'] = i0.Variable<String>(stack.value);
}
return map;
}
@override
String toString() {
return (StringBuffer('LogMessageEntityCompanion(')
..write('id: $id, ')
..write('message: $message, ')
..write('details: $details, ')
..write('level: $level, ')
..write('createdAt: $createdAt, ')
..write('logger: $logger, ')
..write('stack: $stack')
..write(')'))
.toString();
}
}

File diff suppressed because it is too large

View File

@@ -5,7 +5,9 @@ import 'package:immich_mobile/infrastructure/entities/user.entity.dart';
import 'package:immich_mobile/infrastructure/utils/asset.mixin.dart';
import 'package:immich_mobile/infrastructure/utils/drift_default.mixin.dart';
@TableIndex(name: 'idx_remote_asset_owner_checksum', columns: {#ownerId, #checksum})
@TableIndex.sql(
'CREATE INDEX IF NOT EXISTS idx_remote_asset_owner_checksum ON remote_asset_entity (owner_id, checksum)',
)
@TableIndex.sql('''
CREATE UNIQUE INDEX IF NOT EXISTS UQ_remote_assets_owner_checksum
ON remote_asset_entity (owner_id, checksum)
@@ -16,7 +18,7 @@ CREATE UNIQUE INDEX IF NOT EXISTS UQ_remote_assets_owner_library_checksum
ON remote_asset_entity (owner_id, library_id, checksum)
WHERE (library_id IS NOT NULL);
''')
@TableIndex(name: 'idx_remote_asset_checksum', columns: {#checksum})
@TableIndex.sql('CREATE INDEX IF NOT EXISTS idx_remote_asset_checksum ON remote_asset_entity (checksum)')
class RemoteAssetEntity extends Table with DriftDefaultsMixin, AssetEntityMixin {
const RemoteAssetEntity();

Some files were not shown because too many files have changed in this diff