Compare commits

...

106 Commits

Author SHA1 Message Date
mertalev
b59c4ddc9c upload asset button 2025-10-13 08:35:49 -04:00
mertalev
8128876472 remove upload-length from conventional upload e2e 2025-10-12 19:44:42 -04:00
mertalev
e9af3bf2fe linting 2025-10-12 19:39:56 -04:00
mertalev
87ca5e7b1d redundant check 2025-10-12 19:33:22 -04:00
mertalev
0d3cc89ba0 update api 2025-10-12 19:29:11 -04:00
mertalev
3e6a4f2417 lint 2025-10-12 19:21:41 -04:00
mertalev
63ff2e22d0 support conventional uploads 2025-10-12 19:18:52 -04:00
mertalev
7a32eb699c require header for incomplete uploads 2025-10-10 21:10:26 -04:00
mertalev
85a3854208 reject empty file 2025-10-10 21:08:16 -04:00
mertalev
504d8dc96c infer upload length when possible 2025-10-10 20:59:53 -04:00
mertalev
b0aa68d83a update api 2025-10-10 19:35:18 -04:00
mertalev
0ad983135c set max-age limit 2025-10-10 19:26:22 -04:00
mertalev
da52b3ebf4 add live photo e2e 2025-10-09 20:03:06 -04:00
mertalev
0be3b06a2a better abort check 2025-10-09 20:03:06 -04:00
mertalev
325f30815c unnecessary change 2025-10-09 20:03:06 -04:00
mertalev
ae2abb3cfe configurable cleanup 2025-10-09 20:03:06 -04:00
mertalev
883eb15ecb handle live photos 2025-10-09 20:03:06 -04:00
mertalev
a3d10ff46a tweak types 2025-10-09 20:03:06 -04:00
mertalev
38d2a03836 MUST NOT validation 2025-10-09 20:03:06 -04:00
mertalev
b1a2e7708e remove log 2025-10-09 20:03:06 -04:00
mertalev
57ea75bfc9 lint 2025-10-09 20:03:06 -04:00
mertalev
c295a48061 test interruption + abort 2025-10-09 20:03:06 -04:00
mertalev
ebda00fcf0 more content length test inputs 2025-10-09 20:03:06 -04:00
mertalev
4d04f80425 fix abortion return 2025-10-09 20:03:06 -04:00
mertalev
b68f70f28b typo 2025-10-09 20:03:06 -04:00
mertalev
758553672a proactive abortion 2025-10-09 20:03:06 -04:00
mertalev
1915e3ceb2 better content length handling 2025-10-09 20:03:06 -04:00
mertalev
0db8c10601 add timeout 2025-10-09 20:03:06 -04:00
mertalev
12b1a319e9 tidying 2025-10-09 20:03:06 -04:00
mertalev
6dbcf8b876 listen to upload event in e2e
test resume with real image
2025-10-09 20:03:06 -04:00
mertalev
484b73eb60 add service tests 2025-10-09 20:03:06 -04:00
mertalev
d4f3d9d6a5 add controller tests, move validation testing from e2e
revert unnecessary change

update mocks

add structured-headers to e2e deps
2025-10-09 20:03:06 -04:00
mertalev
597382a25f add note about RFC 9651
authdto

remove excess logs

use structured dictionary
2025-10-09 20:03:06 -04:00
mertalev
0105c9e2b6 clean up stale uploads
stale upload cleanup

try/catch file check
2025-10-09 20:03:06 -04:00
mertalev
071dbc1c50 unnecessary quota check 2025-10-09 20:03:06 -04:00
mertalev
97185964cb interim+500
interim+500

interim+500
2025-10-09 20:03:06 -04:00
mertalev
9f3a9030c7 more e2e tests
consistent e2e sections

decrement quota on cancel
2025-10-09 20:03:06 -04:00
mertalev
0a955e21b6 tweaks
shared pipe method

shared pipe method

require size upfront

make length optional for patch requests
2025-10-09 20:03:06 -04:00
mertalev
fb192bd310 ensure stream is closed before releasing lock 2025-10-09 20:03:06 -04:00
mertalev
a39f3f765d dto refactor
add logging

handle metadata
2025-10-09 20:03:06 -04:00
mertalev
35d3802219 backward compatibility 2025-10-09 20:03:06 -04:00
mertalev
026e367609 working e2e 2025-10-09 20:03:06 -04:00
mertalev
b3e5a381a8 interop v8 compliance 2025-10-09 20:03:06 -04:00
mertalev
7f50f268a5 chunked upload controller 2025-10-09 20:03:05 -04:00
renovate[bot]
ea610760ee fix(deps): update typescript-projects (#22809)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Daniel Dietzler <mail@ddietzler.dev>
2025-10-10 00:50:48 +02:00
renovate[bot]
a5e0d83d9f chore(deps): update base-image to v202510092146 (major) (#22818)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-09 22:40:23 +00:00
Daniel Dietzler
9793828dc7 chore: don't enforce runes (#22813) 2025-10-09 19:17:42 +00:00
bo0tzz
aed7bb53aa fix: revert terragrunt-action bump (#22812) 2025-10-09 21:11:39 +02:00
renovate[bot]
1fdbe2c6b8 chore(deps): update github-actions (major) (#22810)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-09 20:51:51 +02:00
bo0tzz
84302dc14c fix: remove postgres exclude datasource match (#22811) 2025-10-09 18:44:20 +00:00
bo0tzz
f7250f24fe chore: ignore renovate major updates for postgres image (#22764) 2025-10-09 18:34:54 +00:00
Peter Dave Hello
53680d9643 feat(docs): add zh_TW Traditional Chinese version README (#22703)
docs: add zh_TW Traditional Chinese version README
2025-10-08 15:10:58 -04:00
Tushar Harsora
b2d00405f1 feat(server): add immich.users.total metric (#21780)
* Add immich.users.total metric

* Fix tests & one lint error

* Lint

* Fix SQL Schema checks

* Fix nit

* Use workers argument in OnEvent hook and remove condition from method body
2025-10-08 13:24:11 -04:00
Sebastian Schneider
cf60f4cdcd feat(web): Add upload to stack action (#19842)
* feat(web): Add upload to stack action

* Event handling and translation

* Update asset viewer instead

* lint, improve upload return type

* Add suggestions from code review

* Resolve merge conflicts

* Apply suggestions from code review
2025-10-08 13:22:33 -04:00
Qhilm
d764a59011 docs: add Immich-Stack to community-projects (#21563)
docs: add Immich Stack community project

Co-authored-by: Jason Rasmussen <jason@rasm.me>
2025-10-08 17:21:44 +00:00
Jorge Montejo
7ee1b977c1 feat(cli): add debug development config (#22712)
* add debug and change ts-node with tsx

* update pr changes

* update pnpm-lock

* remove ts-node from readme

* typo

* resolve conflicts

* remove tsx

* launch from dist

* add preLaunchTask

* update readme

* undo main in package.json

* remove typo

* Apply suggestion from @bwees

Co-authored-by: Brandon Wees <brandonwees@gmail.com>

* revert pnpm-lock changes

* @jrasm91 suggestions

* chore: run node with source maps

---------

Co-authored-by: Jason Rasmussen <jason@rasm.me>
Co-authored-by: Brandon Wees <brandonwees@gmail.com>
2025-10-08 17:08:33 +00:00
kaziu687
9838634067 feat(web): seconds and milliseconds in timestamps (#20337)
* fix(web): seconds in timestamps

* changed date-input step to provide millisecond precision
2025-10-08 16:30:54 +00:00
Jason Rasmussen
eee793bfe4 docs: add some external library notes (#22776) 2025-10-08 12:13:41 -04:00
shenlong
b3342323de fix: persist search page scroll offset between rebuilds (#22733)
fix: persist search scroll between rebuilds

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-10-08 10:00:51 -05:00
Pascal Sommer
6f3cb4f1bb fix(web): Uniform random distribution during shuffle (#19902)
feat: better random distribution
2025-10-08 10:19:33 -04:00
Saschl
54ed78d0bf fix: brief flashing when swiping from video (#22187) 2025-10-08 09:31:15 -04:00
shenlong
265ed0b38f fix: skip local only assets in move to lock action (#22728)
* fix:prefer trashing to deletions

* skip local only assets in move to lock action

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-10-08 04:21:34 -05:00
shenlong
63c2f4415b chore: use hosted isar flutter libs (#22757)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-10-08 04:19:46 -05:00
renovate[bot]
a7cfd7f183 fix(deps): update dependency connectivity_plus to v7 (#22723)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-10-07 21:21:43 -05:00
renovate[bot]
ee4c45d5d3 chore(deps): update dependency nodemailer to v7.0.7 [security] (#22740)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 21:24:33 +02:00
renovate[bot]
24334aa3df chore(deps): update dependency @types/node to ^22.18.8 (#22719)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 14:42:00 -04:00
Brandon Wees
882baecf21 fix: bottom sheet blank with local assets that have remote counterparts (#22743) 2025-10-07 18:04:23 +00:00
shenlong
f16327d0ab chore: use isar immich fork (#22738)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-10-07 13:13:35 -04:00
Mert
8353db6a50 chore(deps): cache pnpm dependencies in prod build (#22555)
* cache pnpm dependencies

use different ids to be safe

unnecessary lines

* use buildcache folder
2025-10-07 13:10:54 -04:00
Mert
5270107926 fix(ml): ipv6 check (#22735) 2025-10-07 12:24:23 -04:00
bo0tzz
740ca14a68 chore: track full actions/cache version in comment (#22359) 2025-10-07 15:58:26 +00:00
Min Idzelis
966ab22065 refactor(web): extract asset viewer logic from Timeline into TimelineAssetViewer component (#22268)
refactor(web): extract asset viewer logic from Timeline into TimelineAssetViewer component

- Extracted asset viewer navigation and action handling logic from Timeline.svelte into a dedicated TimelineAssetViewer component
- Reduces Timeline.svelte complexity by ~150 lines and improves separation of concerns
- No functional changes - purely a refactoring to improve code organization

## Changes
- Created new TimelineAssetViewer.svelte component containing all asset viewer-related logic
- Moved handlePrevious, handleNext, handleRandom, handleClose, handlePreAction, and handleAction methods
- Timeline.svelte now only passes required props to the new component
- Maintained all existing functionality including navigation, asset actions, and stack management
2025-10-07 14:01:06 +00:00
Min Idzelis
78fbe0fd49 feat: make skeleton title optional (#22396)
feat: skeleton title is optional

feat: skeleton title optional
2025-10-07 13:58:59 +00:00
Min Idzelis
5862c454b7 refactor(web): extract timeline keyboard actions into separate component (#22266)
refactor(web): extract timeline keyboard actions into separate component

Extracts keyboard shortcuts and related functionality from Timeline component into a dedicated TimelineKeyboardActions component for better separation of concerns and maintainability.
2025-10-07 13:52:19 +00:00
shenlong
8ee495b08f fix: promote to foreground service before starting engine (#22517)
fix: show notification from native

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-10-07 08:49:43 -05:00
Min Idzelis
83db851b00 refactor(web): Clarify property names in Timeline and Scrubber (#22265)
refactor(web): Clarify property names in Timeline and Scrubber

  Renamed properties across Timeline/Scrubber components for clarity:
  - scrubOverallPercent → timelineScrollPercent
  - scrubberMonthPercent → viewportTopMonthScrollPercent
  - scrubberMonth → viewportTopMonth
  - leadout → isInLeadOutSection

  Additional changes:
  - Updated ScrubberListener signature to accept object parameter
  - Added detailed JSDoc comments for all Scrubber props
  - Fixed callback invocations to use new object syntax
  - Aligned Timeline's local state variables with Scrubber prop names
2025-10-07 09:43:27 -04:00
bo0tzz
70037018c8 fix: --no-git-checks on pnpm publish (#22715)
* fix: --no-git-checks on sdk publish

* fix: --no-git-checks on cli publish
2025-10-07 08:33:19 -05:00
renovate[bot]
796444d211 chore(deps): update github-actions (#22721)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 08:33:07 -05:00
renovate[bot]
0d66a15d9b chore(deps): update dependency flutter to v3.35.5 (#22720)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 08:32:55 -05:00
renovate[bot]
3cf8ed5f2d fix(deps): update dependency device_info_plus to v12 (#22724)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 08:31:26 -05:00
Min Idzelis
ff01af2450 chore: update devcontainers for trixie, devenv changes (#22194) 2025-10-07 08:28:47 -05:00
renovate[bot]
2de1b832e5 chore(deps): update redis:6.2-alpine docker digest to 2185e74 (#22718)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-07 12:41:16 +02:00
Alex
25142bb6c6 fix: improve the selected sidebar item text color in dark mode (#22640) 2025-10-06 21:51:00 +00:00
TDR001
01660b20fd docs: update Synology install guide (#21996)
Co-authored-by: mertalev <101130780+mertalev@users.noreply.github.com>
2025-10-06 21:48:39 +00:00
Xantin
9affee1ea0 docs: update TrueNAS migration instructions (#22463)
Co-authored-by: bo0tzz <git@bo0tzz.me>
Co-authored-by: Nicholas Flamy <30300649+NicholasFlamy@users.noreply.github.com>
2025-10-06 23:02:43 +02:00
Yaros
d02a82b618 fix(mobile): closing editor goes back to main page (#22647)
Co-authored-by: bwees <brandonwees@gmail.com>
2025-10-06 20:56:35 +00:00
Mārtiņš Bruņenieks
ad87dff18d fix(docs): Remove immich_remove_offline_files as no longer functional (#21774)
Co-authored-by: Mert <101130780+mertalev@users.noreply.github.com>
Co-authored-by: Brandon Wees <brandonwees@gmail.com>
2025-10-06 20:54:35 +00:00
CuberL
b7e06e7b6f fix: Fix issue fail to download iOS live photos (#22708)
Co-authored-by: bwees <brandonwees@gmail.com>
2025-10-06 20:53:35 +00:00
Diogo Correia
21f49572b1 chore(server): support vectorchord 0.5.x (#21602)
Co-authored-by: Mert <101130780+mertalev@users.noreply.github.com>
2025-10-06 20:37:54 +00:00
Brandon Wees
2b7d28528d fix: hide view in timeline button on local timeline (#22713) 2025-10-06 15:28:59 -05:00
Alex
cf4cf56ac0 chore: post release tasks (#22616) 2025-10-06 20:30:23 +01:00
Jason Rasmussen
50ac27238e fix(web): do not notify on patch releases (#22591) 2025-10-06 13:00:37 -04:00
Min Idzelis
b06b8ceef6 chore(web): upgrade ESLint and plugins (#22495)
* chore(web): upgrade ESLint and plugins, simplify linting configuration

- Update eslint from ^9.18.0 to ^9.36.0
- Update eslint plugins:
  - eslint-plugin-svelte: ^3.9.0 → ^3.12.4
  - eslint-plugin-unicorn: ^60.0.0 → ^61.0.2
  - svelte-eslint-parser: ^1.2.0 → ^1.3.3
  - typescript-eslint: ^8.28.0 → ^8.45.0
- Remove eslint-p dependency in favor of native eslint concurrency
- Add unicorn/no-array-sort rule exception
- Update linting scripts to use eslint's native --concurrency flag
- Update Makefile and mise.toml to reflect simplified lint commands
- Update GitHub Actions workflow to use standard pnpm lint command

* pnpm dedupe

---------

Co-authored-by: Ben McCann <322311+benmccann@users.noreply.github.com>
2025-10-06 12:41:28 -04:00
Jorge Montejo
119f92bb20 chore: update cli docs to pnpm (#22702)
update cli docs to pnpm
2025-10-06 12:39:23 -04:00
grgergo
6973683ea7 fix: use full-size image for non-web-compatible panoramas (#20359)
* fix(web): use full-size image for non-web-compatible panoramas

* always generate full-size image for panoramas

* add unit test

* fix formatting

---------

Co-authored-by: gergo= <gergo@pitty.hu>
2025-10-06 12:38:41 -04:00
Min Idzelis
42f46b11f4 fix: missing responsive calculation in UserPageLayout (#22455) 2025-10-06 12:36:20 -04:00
USBAkimbo
0fd16a3c46 docs: add job order diagram (#22673)
* docs: add job order diagram

* wording

---------

Co-authored-by: mertalev <101130780+mertalev@users.noreply.github.com>
2025-10-06 16:34:01 +00:00
Mert
43b06a036d fix(mobile): video player using ref after disposal (#21843)
check if disposed
2025-10-06 16:20:30 +00:00
Yaros
55ad83d80d fix(mobile): empty album description does not save (#22649) 2025-10-06 11:19:57 -05:00
Yaros
a80b9be07c fix(mobile): trash description cut off (#22662) 2025-10-06 16:18:00 +00:00
Mert
24234bedf1 fix(server): override reserved color metadata for video thumbnails (#22348)
override reserved metadata
2025-10-06 12:13:10 -04:00
Cokodayo
51150a3ed1 fix(ml): Resolve IPv6 startup crash and healthcheck failure (#22387)
* fix(ml): Resolve IPv6 startup crash and healthcheck failure

Fixes #13782

* fix(ml): updated the fix to use the std lib

* Apply code formatting to __main__.py
2025-10-06 12:09:40 -04:00
Adrian Jost
075436a5d1 chore: mark VSCode tasks as background tasks (#22631)
VSCode expect tasks that aren't marked as background tasks to finish eventually. That's not how a dev-server is supposed to work, we expect it to run for basically infinite time.

By marking those tasks as background tasks, VSCode stops showing the infinite loading spinner on those processes.
2025-10-06 11:55:54 -04:00
Sergey Katsubo
9da138e01e feat(server): improve checkAlbumAccess query performance (#22467)
* Fix slow SQL query in checkAlbumAccess caused by the array overlap operator &&

* Update access.repository.sql

* Rewrite the query to pass assetIds once as a single array parameter
2025-10-06 11:54:07 -04:00
Sergey Katsubo
1a2a46014e fix(server): fix chunking Postgres query parameters (#22684) 2025-10-06 11:37:35 -04:00
Xavier Dupuis
29acf89979 fix(docs): link to immich docs does not lead correctly to docs (#22687) 2025-10-06 12:57:15 +02:00
215 changed files with 8958 additions and 3374 deletions


@@ -6,28 +6,35 @@ services:
- IMMICH_SERVER_URL=http://127.0.0.1:2283/
volumes: !override # bind mount host to /workspaces/immich
- ..:/workspaces/immich
- cli_node_modules:/workspaces/immich/cli/node_modules
- e2e_node_modules:/workspaces/immich/e2e/node_modules
- open_api_node_modules:/workspaces/immich/open-api/typescript-sdk/node_modules
- server_node_modules:/workspaces/immich/server/node_modules
- web_node_modules:/workspaces/immich/web/node_modules
- ${UPLOAD_LOCATION}/photos:/data
- ${UPLOAD_LOCATION:-upload-devcontainer-volume}${UPLOAD_LOCATION:+/photos}:/data
- pnpm-store:/usr/src/app/.pnpm-store
- server-node_modules:/usr/src/app/server/node_modules
- web-node_modules:/usr/src/app/web/node_modules
- github-node_modules:/usr/src/app/.github/node_modules
- cli-node_modules:/usr/src/app/cli/node_modules
- docs-node_modules:/usr/src/app/docs/node_modules
- e2e-node_modules:/usr/src/app/e2e/node_modules
- sdk-node_modules:/usr/src/app/open-api/typescript-sdk/node_modules
- app-node_modules:/usr/src/app/node_modules
- sveltekit:/usr/src/app/web/.svelte-kit
- coverage:/usr/src/app/web/coverage
- /etc/localtime:/etc/localtime:ro
immich-web:
env_file: !reset []
immich-machine-learning:
env_file: !reset []
database:
env_file: !reset []
environment: !override
POSTGRES_PASSWORD: ${DB_PASSWORD-postgres}
POSTGRES_USER: ${DB_USERNAME-postgres}
POSTGRES_DB: ${DB_DATABASE_NAME-immich}
POSTGRES_INITDB_ARGS: '--data-checksums'
POSTGRES_HOST_AUTH_METHOD: md5
volumes:
- ${UPLOAD_LOCATION}/postgres:/var/lib/postgresql/data
- ${UPLOAD_LOCATION:-postgres-devcontainer-volume}${UPLOAD_LOCATION:+/postgres}:/var/lib/postgresql/data
redis:
env_file: !reset []
volumes:
# Node modules for each service to avoid conflicts and ensure consistent dependencies
cli_node_modules:
e2e_node_modules:
open_api_node_modules:
server_node_modules:
web_node_modules:
# UPLOAD_LOCATION must be set to a absolute path or vol-upload
vol-upload:
# DB_DATA_LOCATION must be set to a absolute path or vol-database
vol-database:
upload-devcontainer-volume:
postgres-devcontainer-volume:
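
The new devcontainer volume lines above rely on shell-style parameter expansion, which Docker Compose interpolation also supports: when `UPLOAD_LOCATION` is unset, the named `upload-devcontainer-volume` (or `postgres-devcontainer-volume`) is used, and when it is set, the `/photos` (or `/postgres`) suffix is appended so the host path is bind-mounted instead. A minimal sketch of that behaviour; the `/srv/immich` path is only an illustrative value:

```bash
expand() {
  # Same expression as the compose volume line above
  echo "${UPLOAD_LOCATION:-upload-devcontainer-volume}${UPLOAD_LOCATION:+/photos}"
}

unset UPLOAD_LOCATION
expand                        # -> upload-devcontainer-volume (falls back to the named volume)

UPLOAD_LOCATION=/srv/immich
expand                        # -> /srv/immich/photos (bind-mounts the host path)
```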


@@ -40,7 +40,7 @@
"userEnvProbe": "loginInteractiveShell",
"remoteEnv": {
// The location where your uploaded files are stored
"UPLOAD_LOCATION": "${localEnv:UPLOAD_LOCATION:./Library}",
"UPLOAD_LOCATION": "${localEnv:UPLOAD_LOCATION:./library}",
// Connection secret for postgres. You should change it to a random password
// Please use only the characters `A-Za-z0-9`, without special characters or spaces
"DB_PASSWORD": "${localEnv:DB_PASSWORD:postgres}",


@@ -55,7 +55,7 @@ jobs:
runs-on: mich
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ inputs.ref || github.sha }}
persist-credentials: false
@@ -66,14 +66,14 @@ jobs:
working-directory: ./mobile
run: printf "%s" $KEY_JKS | base64 -d > android/key.jks
- uses: actions/setup-java@c5195efecf7bdfc987ee8bae7a71cb8b11521c00 # v4.7.1
- uses: actions/setup-java@dded0888837ed1f317902acf8a20df0ad188d165 # v5.0.0
with:
distribution: 'zulu'
java-version: '17'
- name: Restore Gradle Cache
id: cache-gradle-restore
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: |
~/.gradle/caches
@@ -130,7 +130,7 @@ jobs:
- name: Save Gradle Cache
id: cache-gradle-save
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
if: github.ref == 'refs/heads/main'
with:
path: |


@@ -19,7 +19,7 @@ jobs:
actions: write
steps:
- name: Check out code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false


@@ -29,7 +29,7 @@ jobs:
working-directory: ./cli
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
@@ -37,7 +37,7 @@ jobs:
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './cli/.nvmrc'
registry-url: 'https://registry.npmjs.org'
@@ -50,7 +50,7 @@ jobs:
- run: pnpm install --frozen-lockfile
- run: pnpm build
- run: pnpm publish
- run: pnpm publish --no-git-checks
if: ${{ github.event_name == 'release' }}
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
@@ -65,7 +65,7 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
@@ -76,7 +76,7 @@ jobs:
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Login to GitHub Container Registry
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
if: ${{ !github.event.pull_request.head.repo.fork }}
with:
registry: ghcr.io


@@ -44,13 +44,13 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
uses: github/codeql-action/init@3599b3baa15b485a2e49ef411a7a4bb2452e7f93 # v3.30.5
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@@ -63,7 +63,7 @@ jobs:
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
uses: github/codeql-action/autobuild@3599b3baa15b485a2e49ef411a7a4bb2452e7f93 # v3.30.5
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
@@ -76,6 +76,6 @@ jobs:
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@192325c86100d080feab897ff886c34abd4c83a3 # v3.30.3
uses: github/codeql-action/analyze@3599b3baa15b485a2e49ef411a7a4bb2452e7f93 # v3.30.5
with:
category: '/language:${{matrix.language}}'


@@ -53,7 +53,7 @@ jobs:
suffix: ['', '-cuda', '-rocm', '-openvino', '-armnn', '-rknn']
steps:
- name: Login to GitHub Container Registry
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -82,7 +82,7 @@ jobs:
suffix: ['']
steps:
- name: Login to GitHub Container Registry
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}


@@ -47,7 +47,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
@@ -55,7 +55,7 @@ jobs:
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './docs/.nvmrc'
cache: 'pnpm'


@@ -20,7 +20,7 @@ jobs:
run: echo 'The triggering workflow did not succeed' && exit 1
- name: Get artifact
id: get-artifact
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
let allArtifacts = await github.rest.actions.listWorkflowRunArtifacts({
@@ -38,7 +38,7 @@ jobs:
return { found: true, id: matchArtifact.id };
- name: Determine deploy parameters
id: parameters
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
HEAD_SHA: ${{ github.event.workflow_run.head_sha }}
with:
@@ -108,13 +108,13 @@ jobs:
if: ${{ fromJson(needs.checks.outputs.artifact).found && fromJson(needs.checks.outputs.parameters).shouldDeploy }}
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Load parameters
id: parameters
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
PARAM_JSON: ${{ needs.checks.outputs.parameters }}
with:
@@ -125,7 +125,7 @@ jobs:
core.setOutput("shouldDeploy", parameters.shouldDeploy);
- name: Download artifact
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
ARTIFACT_JSON: ${{ needs.checks.outputs.artifact }}
with:


@@ -14,7 +14,7 @@ jobs:
pull-requests: write
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false


@@ -22,7 +22,7 @@ jobs:
private-key: ${{ secrets.PUSH_O_MATIC_APP_KEY }}
- name: 'Checkout'
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.ref }}
token: ${{ steps.generate-token.outputs.token }}
@@ -32,7 +32,7 @@ jobs:
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'
@@ -48,7 +48,7 @@ jobs:
message: 'chore: fix formatting'
- name: Remove label
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
if: always()
with:
script: |


@@ -11,4 +11,4 @@ jobs:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/labeler@8558fd74291d67161a8a78ce36a881fa63b766a9 # v5.0.0
- uses: actions/labeler@634933edcd8ababfe52f92936142cc22ac488b1b # v6.0.1
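
The workflow changes in this compare consistently follow the repository's convention of pinning third-party actions to a full commit SHA with the release tag noted in a trailing comment. As a verification sketch (not part of the diff), one way to confirm that the new `actions/labeler` pin corresponds to `v6.0.1`:

```bash
# Lists the tag ref and, for an annotated tag, a `^{}` line with the commit it
# points to; that commit SHA is what appears after the @ in the `uses:` line.
git ls-remote https://github.com/actions/labeler 'refs/tags/v6.0.1*'
```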


@@ -55,20 +55,20 @@ jobs:
private-key: ${{ secrets.PUSH_O_MATIC_APP_KEY }}
- name: Checkout
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
persist-credentials: true
ref: main
- name: Install uv
uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'
@@ -117,13 +117,13 @@ jobs:
private-key: ${{ secrets.PUSH_O_MATIC_APP_KEY }}
- name: Checkout
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
token: ${{ steps.generate-token.outputs.token }}
persist-credentials: false
- name: Download APK
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: release-apk-signed


@@ -24,7 +24,7 @@ jobs:
permissions:
pull-requests: write
steps:
- uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
- uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
github.rest.issues.removeLabel({


@@ -16,7 +16,7 @@ jobs:
run:
working-directory: ./open-api/typescript-sdk
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
@@ -24,7 +24,7 @@ jobs:
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
# Setup .npmrc file to publish to npm
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './open-api/typescript-sdk/.nvmrc'
registry-url: 'https://registry.npmjs.org'
@@ -35,6 +35,6 @@ jobs:
- name: Build
run: pnpm build
- name: Publish
run: pnpm publish
run: pnpm publish --no-git-checks
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
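
The `--no-git-checks` flag added to both publish workflows matters because `pnpm publish` by default refuses to run from a dirty working tree or from a branch other than the configured publish branch, and a CI release checkout is typically a detached HEAD at the release tag. A rough sketch of the publish step under those assumptions:

```bash
# Typical CI release context: detached HEAD at the tag, NODE_AUTH_TOKEN in env.
pnpm install --frozen-lockfile
pnpm build
pnpm publish --no-git-checks   # skip pnpm's clean-tree / publish-branch checks
```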


@@ -42,7 +42,7 @@ jobs:
working-directory: ./mobile
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false


@@ -56,13 +56,13 @@ jobs:
working-directory: ./server
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'
@@ -93,13 +93,13 @@ jobs:
working-directory: ./cli
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './cli/.nvmrc'
cache: 'pnpm'
@@ -133,13 +133,13 @@ jobs:
working-directory: ./cli
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './cli/.nvmrc'
cache: 'pnpm'
@@ -168,13 +168,13 @@ jobs:
working-directory: ./web
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './web/.nvmrc'
cache: 'pnpm'
@@ -185,7 +185,7 @@ jobs:
- name: Run pnpm install
run: pnpm rebuild && pnpm install --frozen-lockfile
- name: Run linter
run: pnpm lint:p
run: pnpm lint
if: ${{ !cancelled() }}
- name: Run formatter
run: pnpm format
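
The `pnpm lint:p` → `pnpm lint` switch in this job pairs with the ESLint upgrade commit above, which drops the `eslint-p` wrapper in favour of ESLint's native `--concurrency` flag. Roughly what the web workspace's `lint` script is assumed to run after that change; the exact file globs and concurrency value are assumptions:

```bash
cd web
# Parallel linting via ESLint itself rather than the eslint-p wrapper.
pnpm exec eslint . --max-warnings 0 --concurrency auto
```
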
@@ -205,13 +205,13 @@ jobs:
working-directory: ./web
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './web/.nvmrc'
cache: 'pnpm'
@@ -236,13 +236,13 @@ jobs:
contents: read
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './web/.nvmrc'
cache: 'pnpm'
@@ -277,13 +277,13 @@ jobs:
working-directory: ./e2e
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './e2e/.nvmrc'
cache: 'pnpm'
@@ -316,13 +316,13 @@ jobs:
working-directory: ./server
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'
@@ -347,14 +347,14 @@ jobs:
runner: [ubuntu-latest, ubuntu-24.04-arm]
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
submodules: 'recursive'
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './e2e/.nvmrc'
cache: 'pnpm'
@@ -395,14 +395,14 @@ jobs:
runner: [ubuntu-latest, ubuntu-24.04-arm]
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
submodules: 'recursive'
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './e2e/.nvmrc'
cache: 'pnpm'
@@ -441,7 +441,7 @@ jobs:
permissions:
contents: read
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup Flutter SDK
@@ -466,12 +466,12 @@ jobs:
run:
working-directory: ./machine-learning
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Install uv
uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
- uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: astral-sh/setup-uv@d0cc045d04ccac9d8b7881df0226f9e82c39688e # v6.8.0
- uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
# TODO: add caching when supported (https://github.com/actions/setup-python/pull/818)
# with:
# python-version: 3.11
@@ -503,13 +503,13 @@ jobs:
working-directory: ./.github
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './.github/.nvmrc'
cache: 'pnpm'
@@ -525,7 +525,7 @@ jobs:
permissions:
contents: read
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Run ShellCheck
@@ -540,13 +540,13 @@ jobs:
contents: read
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'
@@ -595,13 +595,13 @@ jobs:
working-directory: ./server
steps:
- name: Checkout code
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: './server/.nvmrc'
cache: 'pnpm'

.vscode/launch.json

@@ -18,6 +18,20 @@
"name": "Immich Workers",
"remoteRoot": "/usr/src/app/server",
"localRoot": "${workspaceFolder}/server"
},
{
"type": "node",
"request": "launch",
"name": "Immich CLI",
"program": "${workspaceFolder}/cli/dist/index.js",
"args": ["upload", "--help"],
"runtimeArgs": ["--enable-source-maps"],
"console": "integratedTerminal",
"resolveSourceMapLocations": ["${workspaceFolder}/cli/dist/**/*.js.map"],
"sourceMaps": true,
"outFiles": ["${workspaceFolder}/cli/dist/**/*.js"],
"skipFiles": ["<node_internals>/**"],
"preLaunchTask": "Build Immich CLI"
}
]
}
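
The new "Immich CLI" launch configuration builds the CLI with source maps (via the `Build Immich CLI` pre-launch task shown further down) and then runs the bundled entry point under Node with source-map support. From the repository root, the equivalent manual invocation is roughly:

```bash
# Same steps the launch config performs: build with source maps, then run the
# bundled CLI with source-map resolution enabled (args taken from launch.json).
pnpm --filter cli build:dev
node --enable-source-maps cli/dist/index.js upload --help
```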

.vscode/tasks.json

@@ -5,6 +5,7 @@
"label": "Fix Permissions, Install Dependencies",
"type": "shell",
"command": "[ -f /immich-devcontainer/container-start.sh ] && /immich-devcontainer/container-start.sh || exit 0",
"isBackground": true,
"presentation": {
"echo": true,
"reveal": "always",
@@ -25,6 +26,7 @@
"dependsOn": ["Fix Permissions, Install Dependencies"],
"type": "shell",
"command": "[ -f /immich-devcontainer/container-start-backend.sh ] && /immich-devcontainer/container-start-backend.sh || exit 0",
"isBackground": true,
"presentation": {
"echo": true,
"reveal": "always",
@@ -45,6 +47,7 @@
"dependsOn": ["Fix Permissions, Install Dependencies"],
"type": "shell",
"command": "[ -f /immich-devcontainer/container-start-frontend.sh ] && /immich-devcontainer/container-start-frontend.sh || exit 0",
"isBackground": true,
"presentation": {
"echo": true,
"reveal": "always",
@@ -67,6 +70,11 @@
"runOn": "folderOpen"
},
"problemMatcher": []
},
{
"label": "Build Immich CLI",
"type": "shell",
"command": "pnpm --filter cli build:dev"
}
]
}


@@ -91,8 +91,6 @@ format-%:
pnpm --filter $(call map-package,$*) run format:fix
lint-%:
pnpm --filter $(call map-package,$*) run lint:fix
lint-web:
pnpm --filter $(call map-package,$*) run lint:p
check-%:
pnpm --filter $(call map-package,$*) run check
check-web:
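
With the dedicated `lint-web` target removed, `make lint-web` presumably falls through to the generic `lint-%` pattern rule above, so the web package is handled the same way as every other workspace. A sketch of the assumed behaviour:

```bash
# Assumed after this change: no special case for web any more.
make lint-web
# The lint-% rule expands this to roughly:
#   pnpm --filter <package that map-package resolves "web" to> run lint:fix
```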


@@ -28,7 +28,8 @@
<a href="readme_i18n/README_de_DE.md">Deutsch</a>
<a href="readme_i18n/README_nl_NL.md">Nederlands</a>
<a href="readme_i18n/README_tr_TR.md">Türkçe</a>
<a href="readme_i18n/README_zh_CN.md">中文</a>
<a href="readme_i18n/README_zh_CN.md">简体中文</a>
<a href="readme_i18n/README_zh_TW.md">正體中文</a>
<a href="readme_i18n/README_uk_UA.md">Українська</a>
<a href="readme_i18n/README_ru_RU.md">Русский</a>
<a href="readme_i18n/README_pt_BR.md">Português Brasileiro</a>


@@ -6,25 +6,33 @@ Please see the [Immich CLI documentation](https://docs.immich.app/features/comma
Before building the CLI, you must build the immich server and the open-api client. To build the server run the following in the server folder:
$ npm install
$ npm run build
$ pnpm install
$ pnpm run build
Then, to build the open-api client run the following in the open-api folder:
$ ./bin/generate-open-api.sh
To run the Immich CLI from source, run the following in the cli folder:
## Run from build
$ npm install
$ npm run build
$ ts-node .
Go to the cli folder and build it:
You'll need ts-node, the easiest way to install it is to use npm:
$ pnpm install
$ pnpm run build
$ node dist/index.js
$ npm i -g ts-node
## Run and Debug from source (VSCode)
With VScode you can run and debug the Immich CLI. Go to the launch.json file, find the Immich CLI config and change this with the command you need to debug
`"args": ["upload", "--help"],`
replace that for the command of your choice.
## Install from build
You can also build and install the CLI using
$ npm run build
$ npm install -g .
$ pnpm run build
$ pnpm install -g .
****


@@ -20,7 +20,7 @@
"@types/lodash-es": "^4.17.12",
"@types/micromatch": "^4.0.9",
"@types/mock-fs": "^4.13.1",
"@types/node": "^22.18.1",
"@types/node": "^22.18.8",
"@vitest/coverage-v8": "^3.0.0",
"byte-size": "^9.0.0",
"cli-progress": "^3.12.0",
@@ -43,6 +43,7 @@
},
"scripts": {
"build": "vite build",
"build:dev": "vite build --sourcemap true",
"lint": "eslint \"src/**/*.ts\" --max-warnings 0",
"lint:fix": "npm run lint -- --fix",
"prepack": "npm run build",


@@ -55,3 +55,19 @@ Additionally, some jobs (such as memories generation) run on a schedule, which i
:::note
Some jobs ([External Libraries](/features/libraries) scanning, Database Dump) are configured in their own sections in System Settings.
:::
## Job processing order
The below diagram shows the job run order for newly uploaded files
```mermaid
graph TD
A[Asset Upload] --> B[Metadata Extraction]
B --> C[Storage Template Migration]
C --> D["Thumbnail Generation (Large, small, blurred and person)"]
D --> E[Smart Search]
D --> F[Face Detection]
D --> G[Video Transcoding]
E --> H[Duplicate Detection]
F --> I[Facial Recognition]
```


@@ -1,5 +1,9 @@
# External Libraries
:::info
Currently an external library can only belong to a single user which is selected when the library is initially created.
:::
External libraries track assets stored in the filesystem outside of Immich. When the external library is scanned, Immich will load videos and photos from disk and create the corresponding assets. These assets will then be shown in the main timeline, and they will look and behave like any other asset, including viewing on the map, adding to albums, etc. Later, if a file is modified outside of Immich, you need to scan the library for the changes to show up.
If an external asset is deleted from disk, Immich will move it to trash on rescan. To restore the asset, you need to restore the original file. After 30 days the file will be removed from trash, and any changes to metadata within Immich will be lost.


@@ -21,6 +21,10 @@ Restart Immich by running `docker compose up -d`.
# Create the library
:::info
External library management requires administrator access and the steps below assume you are using an admin account.
:::
In the Immich web UI:
- click the **Administration** link in the upper right corner.


@@ -16,7 +16,7 @@ Immich can easily be installed on a Synology NAS using Container Manager within
## Step 1 - Download the required files
Create a directory of your choice (e.g. `./immich-app`) to house Immich. In general, it's a best practice to have all Docker-based applications running under the `./docker` directory, so in this case, your directory structure will look like `./docker/immich-app`.
Create a directory of your choice (e.g. `./immich-app`) to house Immich. In general, it's best practice to have all Docker-based applications running under the `./docker` directory, so in this case, your directory structure will look like `./docker/immich-app`.
Now create a `./postgres` and `./library` directory as sub-directories of the `./docker/immich-app`.
@@ -25,7 +25,7 @@ When you're all done, you should have the following:
- `./docker/immich-app/postgres`
- `./docker/immich-app/library`
Download [`docker-compose.yml`](https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml) and [`example.env`](https://github.com/immich-app/immich/releases/latest/download/example.env) to your computer. Upload the files to the `./docker/immich-app` directory, and rename `example.env` to `.env`.
Download [`docker-compose.yml`](https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml) and [`example.env`](https://github.com/immich-app/immich/releases/latest/download/example.env) to your computer. Upload the files to the `./docker/immich-app` directory, and rename `example.env` to `.env`. Note: If you plan to use the Synology Text editor to edit the `.env` file on the NAS within File Station, you will need to rename it to a temporary name (e.g. `example.txt`) in order to see 'Open with Text Editor' in the file context menu. Once saved, rename it back to `.env`.
## Step 2 - Populate the .env file with custom values
@@ -34,23 +34,23 @@ Follow [Step 2 in Docker Compose](/install/docker-compose#step-2---populate-the-
## Step 3 - Create a new project in Container Manager
Open Container Manager, and select the "**Project**" action on the left navigation bar and then click "**Create**".
![Create Project](../../static/img/synology-container-manager-create-project.png)
![Create project](../../static/img/synology-container-manager-create-project.png)
In the settings of your new project, set "**Project name**" to a name you'll remember, such as _immich-app_. When setting the "**Path**", select the `./docker/immich-app` directory you created earlier. Doing so will prompt a message to use the existing `docker-compose.yml` already present in the directory for your project. Click "**OK**" to continue.
![Set Path](../../static/img/synology-container-manager-set-path.png)
![Set path](../../static/img/synology-container-manager-set-path.png)
The following screen will give you the option to further customize your `docker-compose.yml` file, giving you a warning regarding the `start_interval` property. Under the `healthcheck` heading, remove the `start_interval: 30s` completely and click "**Next**".
The following screen will give you the option to further customize your `docker-compose.yml` file. Take note of `DB_STORAGE_TYPE: 'HDD'`and uncomment if applicable for your Synology setup.
![start interval](../../static/img/synology-container-manager-customize-docker-compose.png)
![DB storage](../../static/img/synology-container-manager-customize-docker-compose.png)
Skip the section asking to set-up a portal for Web Station, and then complete the wizard which will build and start the containers for your project.
Once your containers are successfully running, navigate to the "**Container**" section of Container Manager, right-click on the "**immich-server**" container, and choose the "**Details**".
Scroll to the bottom of the "**Details**" section, and find the `IP Address` of the container, located in the `Network` section. Take note of the container's IP address as you will need it for **Step 4**.
Scroll to the bottom of the "**Details**" section and find the `IP Address` listed in the `Network` section. Take note of the container's IP address as you will need it for **Step 4**.
![Container Details](../../static/img/synology-container-manager-container-details.png)
![Container details](../../static/img/synology-container-manager-container-details.png)
## Step 4 - Configure Firewall Settings
@@ -63,8 +63,66 @@ Open "**Control Panel**" on your Synology NAS, and select "**Security**". Naviga
Click "**Edit Rules**" and add the following firewall rules:
- Add a "**Source IP**" rule for the IP address of your container that you obtained in Step 3 above
![IP address rule](../../static/img/synology-ipaddress-firewall-rule.png)
- Add a "**Ports**" rule for the port specified in the `docker-compose.yml`, which should be `2283`
![Custom port rule](../../static/img/synology-custom-port-firewall-rule.png)
## Next Steps
Read the [Post Installation](/install/post-install.mdx) steps and [upgrade instructions](/install/upgrading.md).
<details>
<summary>Updating Immich using Container Manager</summary>
Check the post installation and upgrade instructions at the links above before proceeding with this section.
## Step 1. Backup
Ensure your photos and videos are backed up. Your `.env` settings will define where they are stored. There is no need to delete any files or folders within the `docker` folder when doing a release upgrade unless instructed in the release notes.
## Step 2. Check release notes
Always check the [release notes](https://github.com/immich-app/immich/releases) before proceeding with an update!
## Step 3. Stop containers & clean up
Open **Container Manager**. Select **Project** then your Immich app
![Select project](../../static/img/synology-select-proj.png)
Select **Stop**
![Stop project](../../static/img/synology-project-stop.png)
Select **Action** then **Clean**. This removes the containers.
![Clean project](../../static/img/synology-action-clean.png)
Go to **Image** and select **Remove Unused Images**.
![Remove unused](../../static/img/synology-remove-unused.png)
## Step 4. Build
Go to **Project**, select **Action** then **Build**. This will download, unpack, install and start the containers.
![Build](../../static/img/synology-build.png)
## Step 5. Update firewall rule
The default behavior is to automatically start the containers once installed. If `immich_server` runs for a few seconds and then stops, it may be because the firewall rule no longer matches the server IP address.
Go to the **Container** section. Click on `immich_server` and scroll down on **General** to find the IP address.
![Container IP](../../static/img/synology-container-ip.png)
Go to Synology **Control Panel**. Select **Security** and **Firewall**.
![Firewall](../../static/img/synology-fw-rules.png)
In this example, the IP addresses mismatch and the firewall rule needs to be edited to match above.
![Edit IP](../../static/img/synology-fw-ipedit.png)
</details>


@@ -387,27 +387,35 @@ To migrate from the old storage configuration to the new one, you will need to c
3. **Copy the data** from the old datasets to the new dataset. We advise using the `rsync` command to copy the data, as it will preserve the permissions and ownership of the files. The following commands are examples:
```bash
rsync -av /mnt/tank/immich/library/ /mnt/tank/immich/data/library/
rsync -av /mnt/tank/immich/upload/ /mnt/tank/immich/data/upload/
rsync -av /mnt/tank/immich/thumbs/ /mnt/tank/immich/data/thumbs/
rsync -av /mnt/tank/immich/profile/ /mnt/tank/immich/data/profile/
rsync -av /mnt/tank/immich/video/ /mnt/tank/immich/data/encoded-video/
rsync -av /mnt/tank/immich/backups/ /mnt/tank/immich/data/backups/
sudo rsync -av /mnt/tank/immich/library/ /mnt/tank/immich/data/library/
sudo rsync -av /mnt/tank/immich/upload/ /mnt/tank/immich/data/upload/
sudo rsync -av /mnt/tank/immich/thumbs/ /mnt/tank/immich/data/thumbs/
sudo rsync -av /mnt/tank/immich/profile/ /mnt/tank/immich/data/profile/
sudo rsync -av /mnt/tank/immich/video/ /mnt/tank/immich/data/encoded-video/
sudo rsync -av /mnt/tank/immich/backups/ /mnt/tank/immich/data/backups/
```
Make sure to replace `/mnt/tank/immich/` with the correct path to your old datasets and `/mnt/tank/immich/data/` with the correct path to your new dataset.
:::tip
If you were using **ixVolume (dataset created automatically by the system)** for Immich data storage, the path to the data should be `/mnt/.ix-apps/app_mounts/immich/`. You have to use this path instead of `/mnt/tank/immich/` in the `rsync` command above, for example:
If you were using **ixVolume (dataset created automatically by the system)** for some of Immich data storage, the path to the data should be `/mnt/.ix-apps/app_mounts/immich/`. You have to use this path instead of `/mnt/tank/immich/` in the `rsync` command above, for example:
```bash
rsync -av /mnt/.ix-apps/app_mounts/immich/library/ /mnt/tank/immich/data/library/
sudo rsync -av /mnt/.ix-apps/app_mounts/immich/library/ /mnt/tank/immich/data/library/
```
If you also were storing your files in the **ixVolume**, the **_upload_** folder is named `uploads` instead of `upload`, so the command to run should be:
```bash
sudo rsync -av /mnt/.ix-apps/app_mounts/immich/uploads/ /mnt/tank/immich/data/upload/
```
This means that depending on your old storage configuration, you might have to use a mix of paths in the `rsync` commands above.
If you were also using an ixVolume for Postgres data storage, you also should, first create the pgData dataset, as described in the [Setting up Storage Datasets](#setting-up-storage-datasets) section above, and then you can use the following command to copy the Postgres data:
```bash
rsync -av /mnt/.ix-apps/app_mounts/immich/pgData/ /mnt/tank/immich/pgData/
sudo rsync -av /mnt/.ix-apps/app_mounts/immich/pgData/ /mnt/tank/immich/pgData/
```
:::
@@ -416,7 +424,7 @@ rsync -av /mnt/.ix-apps/app_mounts/immich/pgData/ /mnt/tank/immich/pgData/
Make sure that for each folder, the `.immich` file is copied as well, as it contains important metadata for Immich. If for some reason the `.immich` file is not copied, you can copy it manually with the `rsync` command, for example:
```bash
rsync -av /mnt/tank/immich/library/.immich /mnt/tank/immich/data/library/
sudo rsync -av /mnt/tank/immich/library/.immich /mnt/tank/immich/data/library/
```
Replace `library` with the name of the folder where you are copying the file.
@@ -437,38 +445,37 @@ This will recreate the Immich container with the new storage configuration and s
If everything went well, you should now be able to access Immich with the new storage configuration. You can verify that the data has been copied correctly by checking the Immich web interface and ensuring that all your photos and videos are still available. You may delete the old datasets, if you no longer need them, using the TrueNAS web interface.
:::tip
If you were using **ixVolume (dataset created automatically by the system)** or folders for Immich data storage, you can delete the old datasets using the following commands:
```bash
rm -r /mnt/.ix-apps/app_mounts/immich/library
rm -r /mnt/.ix-apps/app_mounts/immich/uploads
rm -r /mnt/.ix-apps/app_mounts/immich/thumbs
rm -r /mnt/.ix-apps/app_mounts/immich/profile
rm -r /mnt/.ix-apps/app_mounts/immich/video
rm -r /mnt/.ix-apps/app_mounts/immich/backups
sudo rm -r /mnt/.ix-apps/app_mounts/immich/*
```
:::
</TabItem>
<TabItem value="migrate-old-dataset" label="Keep the existing datasets">
To migrate from the old storage configuration to the new one without creating new datasets, follow these steps:
1. **Stop the Immich app** from the TrueNAS web interface to ensure no data is being written while you are updating the app.
2. **Update the datasets permissions**: Ensure that the datasets used for Immich data storage (`library`, `upload`, `thumbs`, `profile`, `video`, `backups`) have the correct permissions set for the user who will run Immich. The user should have ***modify*** permissions on these datasets. The default user for Immich is `apps` (UID 568) and the default group is `apps` (GID 568). If you are using a different user, make sure to set the permissions accordingly. You can do this from the TrueNAS web interface by going to the **Datasets** screen, selecting each dataset, clicking on the **Edit** button next to **Permissions**, and adding the user with ***modify*** permissions.
2. **Update the datasets permissions**: Ensure that the datasets used for Immich data storage (`library`, `upload`, `thumbs`, `profile`, `video`, `backups`) have the correct permissions set for the user who will run Immich. The user should have **_modify_** permissions on these datasets. The default user for Immich is `apps` (UID 568) and the default group is `apps` (GID 568). If you are using a different user, make sure to set the permissions accordingly. You can do this from the TrueNAS web interface by going to the **Datasets** screen, selecting each dataset, clicking on the **Edit** button next to **Permissions**, and adding the user with **_modify_** permissions. A command-line alternative is sketched after this list.
3. **Update the Immich app** to use the existing datasets:
- Go to the **Installed Applications** screen and select Immich from the list of installed applications.
- Click **Edit** on the **Application Info** widget.
- In the **Storage Configuration** section, untick the **Use Old Storage Configuration (Deprecated)** checkbox.
   - For the **Data Storage**, you can keep the **ixVolume (dataset created automatically by the system)**, as no data will be directly written to it. We recommend selecting **Host Path (Path that already exists on the system)** and then selecting a **new** dataset you created for Immich data storage, for example, `data`.
- For the **Postgres Data Storage**, keep **Host Path (Path that already exists on the system)** and then select the existing dataset you used for Postgres data storage, for example, `pgData`.
- Following the instructions in the [Multiple Datasets for Immich Storage](#additional-storage-advanced-users) section, you can add, **for each old dataset**, a new Additional Storage with the following settings:
- **Type**: `Host Path (Path that already exists on the system)`
- **Mount Path**: `/data/<folder-name>` (e.g. `/data/library`)
- **Host Path**: `/mnt/<your-pool-name>/<dataset-name>` (e.g. `/mnt/tank/immich/library`)
:::danger Ensure you use the correct path names
Make sure to replace `<folder-name>` with the actual name of the folder used by Immich: `library`, `upload`, `thumbs`, `profile`, `encoded-video`, and `backups`. Also, replace `<your-pool-name>` and `<dataset-name>` with the actual names of your pool and dataset.
:::
- **Read Only**: Keep it unticked as Immich needs to write to these datasets.
- Click **Update** at the bottom of the page to save changes.
4. **Start the Immich app** from the TrueNAS web interface. This will recreate the Immich container with the new storage configuration and start the app. If everything went well, you should now be able to access Immich with the new storage configuration. You can verify that the data is still available by checking the Immich web interface and ensuring that all your photos and videos are still accessible.
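As a command-line alternative to the permissions update in step 2, ownership of the old datasets can also be set from a shell. This is only a sketch: it assumes the default `apps` user and group (UID/GID 568) and the example pool and dataset names used throughout this guide, so adjust both to your setup.
```bash
# Give the default apps user (UID 568) and group (GID 568) ownership of the old datasets
# so the Immich container can modify them. Replace the paths with your own pool/dataset names.
sudo chown -R 568:568 \
  /mnt/tank/immich/library /mnt/tank/immich/upload /mnt/tank/immich/thumbs \
  /mnt/tank/immich/profile /mnt/tank/immich/video /mnt/tank/immich/backups
```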
</TabItem>

View File

@@ -17,9 +17,9 @@
"write-heading-ids": "docusaurus write-heading-ids"
},
"dependencies": {
"@docusaurus/core": "~3.8.0",
"@docusaurus/preset-classic": "~3.8.0",
"@docusaurus/theme-common": "~3.8.0",
"@docusaurus/core": "~3.9.0",
"@docusaurus/preset-classic": "~3.9.0",
"@docusaurus/theme-common": "~3.9.0",
"@mdi/js": "^7.3.67",
"@mdi/react": "^1.6.1",
"@mdx-js/react": "^3.0.0",
@@ -35,7 +35,7 @@
"url": "^0.11.0"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "~3.8.0",
"@docusaurus/module-type-aliases": "~3.9.0",
"@docusaurus/tsconfig": "^3.7.0",
"@docusaurus/types": "^3.7.0",
"prettier": "^3.2.4",

View File

@@ -23,11 +23,6 @@ const projects: CommunityProjectProps[] = [
description: 'A Python script to sync folders as albums.',
url: 'https://git.orenit.solutions/open/immichalbumpull',
},
{
title: 'Remove offline files',
description: 'A simple way to remove orphaned offline assets from the Immich database',
url: 'https://github.com/Thoroslives/immich_remove_offline_files',
},
{
title: 'Immich-Tools',
description: 'Provides scripts for handling problems on the repair page.',
@@ -120,6 +115,11 @@ const projects: CommunityProjectProps[] = [
description: 'Auto-stack photos with identical filenames and differing extensions (i.e. JPG+RAW)',
url: 'https://github.com/sid3windr/immich-stack',
},
{
title: 'Immich Stack',
description: 'Automatically groups similar photos into stacks within the Immich photo management system.',
url: 'https://github.com/Majorfi/immich-stack/',
},
];
function CommunityProject({ title, description, url }: CommunityProjectProps): JSX.Element {

[Binary image diffs not shown: several Synology screenshots were added or updated under `docs/static/img/`, including `synology-build.png`, `synology-fw-ipedit.png`, `synology-fw-rules.png`, and `synology-select-proj.png`, plus a handful of unnamed new images; one existing image was replaced (50 KiB before, 153 KiB after).]

View File

@@ -35,7 +35,7 @@ services:
- 2285:2285
redis:
image: redis:6.2-alpine@sha256:7fe72c486b910f6b1a9769c937dad5d63648ddee82e056f47417542dd40825bb
image: redis:6.2-alpine@sha256:2185e741f4c1e7b0ea9ca1e163a3767c4270a73086b6bbea2049a7203212fb7f
database:
image: ghcr.io/immich-app/postgres:14-vectorchord0.3.0@sha256:11ced39d65a92a54d12890ced6a26cc2003f92697d6f0d4d944b98459dba7138

View File

@@ -25,7 +25,7 @@
"@playwright/test": "^1.44.1",
"@socket.io/component-emitter": "^3.1.2",
"@types/luxon": "^3.4.2",
"@types/node": "^22.18.1",
"@types/node": "^22.18.8",
"@types/oidc-provider": "^9.0.0",
"@types/pg": "^8.15.1",
"@types/pngjs": "^6.0.4",
@@ -43,7 +43,7 @@
"pngjs": "^7.0.0",
"prettier": "^3.2.5",
"prettier-plugin-organize-imports": "^4.0.0",
"sharp": "^0.34.3",
"sharp": "^0.34.4",
"socket.io-client": "^4.7.4",
"supertest": "^7.0.0",
"typescript": "^5.3.3",
@@ -53,5 +53,8 @@
},
"volta": {
"node": "22.20.0"
},
"dependencies": {
"structured-headers": "^2.0.2"
}
}

File diff suppressed because it is too large.

View File

@@ -561,6 +561,16 @@ export const utils = {
await utils.waitForQueueFinish(accessToken, 'sidecar');
await utils.waitForQueueFinish(accessToken, 'metadataExtraction');
},
downloadAsset: async (accessToken: string, id: string) => {
const downloadedRes = await fetch(`${baseUrl}/api/assets/${id}/original`, {
headers: asBearerAuth(accessToken),
});
if (!downloadedRes.ok) {
throw new Error(`Failed to download asset ${id}: ${downloadedRes.status} ${await downloadedRes.text()}`);
}
return await downloadedRes.blob();
},
};
utils.initSdk();

View File

@@ -33,6 +33,7 @@
"add_to_albums": "Add to albums",
"add_to_albums_count": "Add to albums ({count})",
"add_to_shared_album": "Add to shared album",
"add_upload_to_stack": "Add upload to stack",
"add_url": "Add URL",
"added_to_archive": "Added to archive",
"added_to_favorites": "Added to favorites",

View File

@@ -1,6 +1,7 @@
import os
import signal
import subprocess
from ipaddress import ip_address
from pathlib import Path
from .config import log, non_prefixed_settings, settings
@@ -12,6 +13,19 @@ else:
module_dir = Path(__file__).parent
def is_ipv6(host: str) -> bool:
try:
return ip_address(host).version == 6
except ValueError:
return False
bind_host = non_prefixed_settings.immich_host
if is_ipv6(bind_host):
bind_host = f"[{bind_host}]"
bind_address = f"{bind_host}:{non_prefixed_settings.immich_port}"
try:
with subprocess.Popen(
[
@@ -24,7 +38,7 @@ try:
"-c",
module_dir / "gunicorn_conf.py",
"-b",
f"{non_prefixed_settings.immich_host}:{non_prefixed_settings.immich_port}",
bind_address,
"-w",
str(settings.workers),
"-t",

View File

@@ -1,12 +1,22 @@
import os
import sys
from ipaddress import ip_address
import requests
port = os.getenv("IMMICH_PORT", 3003)
host = os.getenv("IMMICH_HOST", "0.0.0.0")
def is_ipv6(host: str) -> bool:
try:
return ip_address(host).version == 6
except ValueError:
return False
host = "localhost" if host == "0.0.0.0" else host
host = f"[{host}]" if is_ipv6(host) else host
try:
response = requests.get(f"http://{host}:{port}/ping", timeout=2)

View File

@@ -1,7 +1,7 @@
[tools]
node = "22.20.0"
flutter = "3.35.4"
pnpm = "10.15.1"
flutter = "3.35.5"
pnpm = "10.18.0"
[tools."github:CQLabs/homebrew-dcm"]
version = "1.30.0"
@@ -278,12 +278,7 @@ run = "prettier --write ."
[tasks."web:lint"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "eslint . --max-warnings 0"
[tasks."web:lint-p"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "eslint-p . --max-warnings 0 --concurrency=4"
run = "eslint . --max-warnings 0 --concurrency 4"
[tasks."web:lint-fix"]
run = "mise run web:lint --fix"

View File

@@ -136,6 +136,7 @@ private open class BackgroundWorkerPigeonCodec : StandardMessageCodec() {
/** Generated interface from Pigeon that represents a handler of messages from Flutter. */
interface BackgroundWorkerFgHostApi {
fun enable()
fun saveNotificationMessage(title: String, body: String)
fun configure(settings: BackgroundWorkerSettings)
fun disable()
@@ -164,6 +165,25 @@ interface BackgroundWorkerFgHostApi {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.saveNotificationMessage$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { message, reply ->
val args = message as List<Any?>
val titleArg = args[0] as String
val bodyArg = args[1] as String
val wrapped: List<Any?> = try {
api.saveNotificationMessage(titleArg, bodyArg)
listOf(null)
} catch (exception: Throwable) {
BackgroundWorkerPigeonUtils.wrapError(exception)
}
reply.reply(wrapped)
}
} else {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.configure$separatedMessageChannelSuffix", codec)
if (api != null) {
@@ -204,7 +224,6 @@ interface BackgroundWorkerFgHostApi {
/** Generated interface from Pigeon that represents a handler of messages from Flutter. */
interface BackgroundWorkerBgHostApi {
fun onInitialized()
fun showNotification(title: String, content: String)
fun close()
companion object {
@@ -232,25 +251,6 @@ interface BackgroundWorkerBgHostApi {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.showNotification$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { message, reply ->
val args = message as List<Any?>
val titleArg = args[0] as String
val contentArg = args[1] as String
val wrapped: List<Any?> = try {
api.showNotification(titleArg, contentArg)
listOf(null)
} catch (exception: Throwable) {
BackgroundWorkerPigeonUtils.wrapError(exception)
}
reply.reply(wrapped)
}
} else {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.close$separatedMessageChannelSuffix", codec)
if (api != null) {

View File

@@ -73,6 +73,8 @@ class BackgroundWorker(context: Context, params: WorkerParameters) :
NotificationManager.IMPORTANCE_LOW
)
notificationManager.createNotificationChannel(notificationChannel)
val notificationConfig = BackgroundWorkerPreferences(ctx).getNotificationConfig()
showNotification(notificationConfig.first, notificationConfig.second)
loader.ensureInitializationCompleteAsync(ctx, null, Handler(Looper.getMainLooper())) {
engine = FlutterEngine(ctx)
@@ -109,7 +111,7 @@ class BackgroundWorker(context: Context, params: WorkerParameters) :
}
// TODO: Move this to a separate NotificationManager class
override fun showNotification(title: String, content: String) {
private fun showNotification(title: String, content: String) {
val notification = NotificationCompat.Builder(applicationContext, NOTIFICATION_CHANNEL_ID)
.setSmallIcon(R.drawable.notification_icon)
.setOnlyAlertOnce(true)

View File

@@ -20,6 +20,10 @@ class BackgroundWorkerApiImpl(context: Context) : BackgroundWorkerFgHostApi {
enqueueMediaObserver(ctx)
}
override fun saveNotificationMessage(title: String, body: String) {
BackgroundWorkerPreferences(ctx).updateNotificationConfig(title, body)
}
override fun configure(settings: BackgroundWorkerSettings) {
BackgroundWorkerPreferences(ctx).updateSettings(settings)
enqueueMediaObserver(ctx)

View File

@@ -10,9 +10,13 @@ class BackgroundWorkerPreferences(private val ctx: Context) {
private const val SHARED_PREF_MIN_DELAY_KEY = "BackgroundWorker::minDelaySeconds"
private const val SHARED_PREF_REQUIRE_CHARGING_KEY = "BackgroundWorker::requireCharging"
private const val SHARED_PREF_LOCK_KEY = "BackgroundWorker::isLocked"
private const val SHARED_PREF_NOTIF_TITLE_KEY = "BackgroundWorker::notificationTitle"
private const val SHARED_PREF_NOTIF_MSG_KEY = "BackgroundWorker::notificationMessage"
private const val DEFAULT_MIN_DELAY_SECONDS = 30L
private const val DEFAULT_REQUIRE_CHARGING = false
private const val DEFAULT_NOTIF_TITLE = "Uploading media"
private const val DEFAULT_NOTIF_MSG = "Checking for new assets…"
}
private val sp: SharedPreferences by lazy {
@@ -38,6 +42,20 @@ class BackgroundWorkerPreferences(private val ctx: Context) {
)
}
fun updateNotificationConfig(title: String, message: String) {
sp.edit {
putString(SHARED_PREF_NOTIF_TITLE_KEY, title)
putString(SHARED_PREF_NOTIF_MSG_KEY, message)
}
}
fun getNotificationConfig(): Pair<String, String> {
val title =
sp.getString(SHARED_PREF_NOTIF_TITLE_KEY, DEFAULT_NOTIF_TITLE) ?: DEFAULT_NOTIF_TITLE
val message = sp.getString(SHARED_PREF_NOTIF_MSG_KEY, DEFAULT_NOTIF_MSG) ?: DEFAULT_NOTIF_MSG
return Pair(title, message)
}
fun setLocked(paused: Boolean) {
sp.edit {
putBoolean(SHARED_PREF_LOCK_KEY, paused)

View File

@@ -304,6 +304,7 @@ interface NativeSyncApi {
fun getAssetsCountSince(albumId: String, timestamp: Long): Long
fun getAssetsForAlbum(albumId: String, updatedTimeCond: Long?): List<PlatformAsset>
fun hashAssets(assetIds: List<String>, allowNetworkAccess: Boolean, callback: (Result<List<HashResult>>) -> Unit)
fun uploadAsset(callback: (Result<Boolean>) -> Unit)
fun cancelHashing()
companion object {
@@ -467,6 +468,24 @@ interface NativeSyncApi {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.NativeSyncApi.uploadAsset$separatedMessageChannelSuffix", codec)
if (api != null) {
channel.setMessageHandler { _, reply ->
api.uploadAsset{ result: Result<Boolean> ->
val error = result.exceptionOrNull()
if (error != null) {
reply.reply(MessagesPigeonUtils.wrapError(error))
} else {
val data = result.getOrNull()
reply.reply(MessagesPigeonUtils.wrapResult(data))
}
}
}
} else {
channel.setMessageHandler(null)
}
}
run {
val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.NativeSyncApi.cancelHashing$separatedMessageChannelSuffix", codec)
if (api != null) {

View File

@@ -64,7 +64,7 @@ PODS:
- Flutter
- integration_test (0.0.1):
- Flutter
- isar_flutter_libs (1.0.0):
- isar_community_flutter_libs (1.0.0):
- Flutter
- local_auth_darwin (0.0.1):
- Flutter
@@ -149,7 +149,7 @@ DEPENDENCIES:
- home_widget (from `.symlinks/plugins/home_widget/ios`)
- image_picker_ios (from `.symlinks/plugins/image_picker_ios/ios`)
- integration_test (from `.symlinks/plugins/integration_test/ios`)
- isar_flutter_libs (from `.symlinks/plugins/isar_flutter_libs/ios`)
- isar_community_flutter_libs (from `.symlinks/plugins/isar_community_flutter_libs/ios`)
- local_auth_darwin (from `.symlinks/plugins/local_auth_darwin/darwin`)
- maplibre_gl (from `.symlinks/plugins/maplibre_gl/ios`)
- native_video_player (from `.symlinks/plugins/native_video_player/ios`)
@@ -210,8 +210,8 @@ EXTERNAL SOURCES:
:path: ".symlinks/plugins/image_picker_ios/ios"
integration_test:
:path: ".symlinks/plugins/integration_test/ios"
isar_flutter_libs:
:path: ".symlinks/plugins/isar_flutter_libs/ios"
isar_community_flutter_libs:
:path: ".symlinks/plugins/isar_community_flutter_libs/ios"
local_auth_darwin:
:path: ".symlinks/plugins/local_auth_darwin/darwin"
maplibre_gl:
@@ -264,7 +264,7 @@ SPEC CHECKSUMS:
home_widget: f169fc41fd807b4d46ab6615dc44d62adbf9f64f
image_picker_ios: 7fe1ff8e34c1790d6fff70a32484959f563a928a
integration_test: 4a889634ef21a45d28d50d622cf412dc6d9f586e
isar_flutter_libs: bc909e72c3d756c2759f14c8776c13b5b0556e26
isar_community_flutter_libs: bede843185a61a05ff364a05c9b23209523f7e0d
local_auth_darwin: 553ce4f9b16d3fdfeafce9cf042e7c9f77c1c391
MapLibre: 69e572367f4ef6287e18246cfafc39c80cdcabcd
maplibre_gl: 3c924e44725147b03dda33430ad216005b40555f

View File

@@ -182,6 +182,7 @@ class BackgroundWorkerPigeonCodec: FlutterStandardMessageCodec, @unchecked Senda
/// Generated protocol from Pigeon that represents a handler of messages from Flutter.
protocol BackgroundWorkerFgHostApi {
func enable() throws
func saveNotificationMessage(title: String, body: String) throws
func configure(settings: BackgroundWorkerSettings) throws
func disable() throws
}
@@ -205,6 +206,22 @@ class BackgroundWorkerFgHostApiSetup {
} else {
enableChannel.setMessageHandler(nil)
}
let saveNotificationMessageChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.saveNotificationMessage\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
saveNotificationMessageChannel.setMessageHandler { message, reply in
let args = message as! [Any?]
let titleArg = args[0] as! String
let bodyArg = args[1] as! String
do {
try api.saveNotificationMessage(title: titleArg, body: bodyArg)
reply(wrapResult(nil))
} catch {
reply(wrapError(error))
}
}
} else {
saveNotificationMessageChannel.setMessageHandler(nil)
}
let configureChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.configure\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
configureChannel.setMessageHandler { message, reply in
@@ -238,7 +255,6 @@ class BackgroundWorkerFgHostApiSetup {
/// Generated protocol from Pigeon that represents a handler of messages from Flutter.
protocol BackgroundWorkerBgHostApi {
func onInitialized() throws
func showNotification(title: String, content: String) throws
func close() throws
}
@@ -261,22 +277,6 @@ class BackgroundWorkerBgHostApiSetup {
} else {
onInitializedChannel.setMessageHandler(nil)
}
let showNotificationChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.showNotification\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
showNotificationChannel.setMessageHandler { message, reply in
let args = message as! [Any?]
let titleArg = args[0] as! String
let contentArg = args[1] as! String
do {
try api.showNotification(title: titleArg, content: contentArg)
reply(wrapResult(nil))
} catch {
reply(wrapError(error))
}
}
} else {
showNotificationChannel.setMessageHandler(nil)
}
let closeChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.close\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
closeChannel.setMessageHandler { _, reply in

View File

@@ -119,10 +119,6 @@ class BackgroundWorker: BackgroundWorkerBgHostApi {
})
}
func showNotification(title: String, content: String) throws {
// No-op on iOS for the time being
}
/**
* Cancels the currently running background task, either due to timeout or external request.
* Sends a cancel signal to the Flutter side and sets up a fallback timer to ensure

View File

@@ -12,6 +12,10 @@ class BackgroundWorkerApiImpl: BackgroundWorkerFgHostApi {
// Android only
}
func saveNotificationMessage(title: String, body: String) throws {
// Android only
}
func disable() throws {
BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: BackgroundWorkerApiImpl.refreshTaskID);
BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: BackgroundWorkerApiImpl.processingTaskID);

View File

@@ -80,7 +80,7 @@
<key>CFBundlePackageType</key>
<string>APPL</string>
<key>CFBundleShortVersionString</key>
<string>2.0.0</string>
<string>2.0.1</string>
<key>CFBundleSignature</key>
<string>????</string>
<key>CFBundleURLTypes</key>

View File

@@ -363,6 +363,7 @@ protocol NativeSyncApi {
func getAssetsCountSince(albumId: String, timestamp: Int64) throws -> Int64
func getAssetsForAlbum(albumId: String, updatedTimeCond: Int64?) throws -> [PlatformAsset]
func hashAssets(assetIds: [String], allowNetworkAccess: Bool, completion: @escaping (Result<[HashResult], Error>) -> Void)
func uploadAsset(completion: @escaping (Result<Bool, Error>) -> Void)
func cancelHashing() throws
}
@@ -519,6 +520,21 @@ class NativeSyncApiSetup {
} else {
hashAssetsChannel.setMessageHandler(nil)
}
let uploadAssetChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.NativeSyncApi.uploadAsset\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
uploadAssetChannel.setMessageHandler { _, reply in
api.uploadAsset { result in
switch result {
case .success(let res):
reply(wrapResult(res))
case .failure(let error):
reply(wrapError(error))
}
}
}
} else {
uploadAssetChannel.setMessageHandler(nil)
}
let cancelHashingChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.NativeSyncApi.cancelHashing\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
if let api = api {
cancelHashingChannel.setMessageHandler { _, reply in

View File

@@ -363,4 +363,38 @@ class NativeSyncApiImpl: NativeSyncApi {
PHAssetResourceManager.default().cancelDataRequest(requestId)
})
}
func uploadAsset(completion: @escaping (Result<Bool, Error>) -> Void) {
let bufferSize = 200 * 1024 * 1024
var buffer = Data(count: bufferSize)
buffer.withUnsafeMutableBytes { bufferPointer in
arc4random_buf(bufferPointer.baseAddress!, bufferSize)
}
var hasher = Insecure.SHA1()
hasher.update(data: buffer)
let checksum = Data(hasher.finalize()).base64EncodedString()
let tempDirectory = FileManager.default.temporaryDirectory
let tempFileURL = tempDirectory.appendingPathComponent("buffer.tmp")
do {
try buffer.write(to: tempFileURL)
print("File saved to: \(tempFileURL.path)")
} catch {
print("Error writing file: \(error)")
return completion(Result.failure(error))
}
let config = URLSessionConfiguration.background(withIdentifier: "app.mertalev.immich.upload")
let session = URLSession(configuration: config)
var request = URLRequest(url: URL(string: "https://<hardcoded-host>/api/upload")!)
request.httpMethod = "POST"
request.setValue("<hardcoded-api-key>", forHTTPHeaderField: "X-Api-Key")
request.setValue("filename=\"test-image.jpg\", device-asset-id=\"rufh\", device-id=\"test\", file-created-at=\"2025-01-02T00:00:00.000Z\", file-modified-at=\"2025-01-01T00:00:00.000Z\", is-favorite, icloud-id=\"example-icloud-id\"", forHTTPHeaderField: "X-Immich-Asset-Data")
request.setValue("sha=:\(checksum):", forHTTPHeaderField: "Repr-Digest")
let task = session.uploadTask(with: request, fromFile: tempFileURL)
task.resume()
completion(Result.success(true))
}
}

View File

@@ -3,27 +3,30 @@ import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
class SearchResult {
final List<BaseAsset> assets;
final double scrollOffset;
final int? nextPage;
const SearchResult({required this.assets, this.nextPage});
const SearchResult({required this.assets, this.scrollOffset = 0.0, this.nextPage});
int get totalAssets => assets.length;
SearchResult copyWith({List<BaseAsset>? assets, int? nextPage}) {
return SearchResult(assets: assets ?? this.assets, nextPage: nextPage ?? this.nextPage);
SearchResult copyWith({List<BaseAsset>? assets, int? nextPage, double? scrollOffset}) {
return SearchResult(
assets: assets ?? this.assets,
nextPage: nextPage ?? this.nextPage,
scrollOffset: scrollOffset ?? this.scrollOffset,
);
}
@override
String toString() => 'SearchResult(assets: $assets, nextPage: $nextPage)';
String toString() => 'SearchResult(assets: ${assets.length}, nextPage: $nextPage, scrollOffset: $scrollOffset)';
@override
bool operator ==(covariant SearchResult other) {
if (identical(this, other)) return true;
final listEquals = const DeepCollectionEquality().equals;
return listEquals(other.assets, assets) && other.nextPage == nextPage;
return listEquals(other.assets, assets) && other.nextPage == nextPage && other.scrollOffset == scrollOffset;
}
@override
int get hashCode => assets.hashCode ^ nextPage.hashCode;
int get hashCode => assets.hashCode ^ nextPage.hashCode ^ scrollOffset.hashCode;
}

View File

@@ -11,8 +11,6 @@ import 'package:immich_mobile/domain/services/log.service.dart';
import 'package:immich_mobile/entities/store.entity.dart';
import 'package:immich_mobile/extensions/network_capability_extensions.dart';
import 'package:immich_mobile/extensions/platform_extensions.dart';
import 'package:immich_mobile/extensions/translate_extensions.dart';
import 'package:immich_mobile/generated/intl_keys.g.dart';
import 'package:immich_mobile/infrastructure/repositories/db.repository.dart';
import 'package:immich_mobile/infrastructure/repositories/logger_db.repository.dart';
import 'package:immich_mobile/platform/background_worker_api.g.dart';
@@ -44,6 +42,9 @@ class BackgroundWorkerFgService {
// TODO: Move this call to native side once old timeline is removed
Future<void> enable() => _foregroundHostApi.enable();
Future<void> saveNotificationMessage(String title, String body) =>
_foregroundHostApi.saveNotificationMessage(title, body);
Future<void> configure({int? minimumDelaySeconds, bool? requireCharging}) => _foregroundHostApi.configure(
BackgroundWorkerSettings(
minimumDelaySeconds:
@@ -112,13 +113,6 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
configureFileDownloaderNotifications();
if (Platform.isAndroid) {
await _backgroundHostApi.showNotification(
IntlKeys.uploading_media.t(),
IntlKeys.backup_background_service_default_notification.t(),
);
}
// Notify the host that the background worker service has been initialized and is ready to use
_backgroundHostApi.onInitialized();
} catch (error, stack) {

View File

@@ -203,7 +203,7 @@ class TimelineService {
Future<void> dispose() async {
await _bucketSubscription?.cancel();
_bucketSubscription = null;
_buffer.clear();
_buffer = [];
_bufferOffset = 0;
}
}

View File

@@ -132,7 +132,7 @@ const AlbumSchema = CollectionSchema(
getId: _albumGetId,
getLinks: _albumGetLinks,
attach: _albumAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _albumEstimateSize(

View File

@@ -47,7 +47,7 @@ const AndroidDeviceAssetSchema = CollectionSchema(
getId: _androidDeviceAssetGetId,
getLinks: _androidDeviceAssetGetLinks,
attach: _androidDeviceAssetAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _androidDeviceAssetEstimateSize(

View File

@@ -168,7 +168,7 @@ const AssetSchema = CollectionSchema(
getId: _assetGetId,
getLinks: _assetGetLinks,
attach: _assetAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _assetEstimateSize(

View File

@@ -43,7 +43,7 @@ const BackupAlbumSchema = CollectionSchema(
getId: _backupAlbumGetId,
getLinks: _backupAlbumGetLinks,
attach: _backupAlbumAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _backupAlbumEstimateSize(

View File

@@ -32,7 +32,7 @@ const DuplicatedAssetSchema = CollectionSchema(
getId: _duplicatedAssetGetId,
getLinks: _duplicatedAssetGetLinks,
attach: _duplicatedAssetAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _duplicatedAssetEstimateSize(

View File

@@ -52,7 +52,7 @@ const ETagSchema = CollectionSchema(
getId: _eTagGetId,
getLinks: _eTagGetLinks,
attach: _eTagAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _eTagEstimateSize(

View File

@@ -60,7 +60,7 @@ const IOSDeviceAssetSchema = CollectionSchema(
getId: _iOSDeviceAssetGetId,
getLinks: _iOSDeviceAssetGetLinks,
attach: _iOSDeviceAssetAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _iOSDeviceAssetEstimateSize(

View File

@@ -65,7 +65,7 @@ const DeviceAssetEntitySchema = CollectionSchema(
getId: _deviceAssetEntityGetId,
getLinks: _deviceAssetEntityGetLinks,
attach: _deviceAssetEntityAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _deviceAssetEntityEstimateSize(

View File

@@ -68,7 +68,7 @@ const ExifInfoSchema = CollectionSchema(
getId: _exifInfoGetId,
getLinks: _exifInfoGetLinks,
attach: _exifInfoAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _exifInfoEstimateSize(

View File

@@ -37,7 +37,7 @@ const StoreValueSchema = CollectionSchema(
getId: _storeValueGetId,
getLinks: _storeValueGetLinks,
attach: _storeValueAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _storeValueEstimateSize(

View File

@@ -95,7 +95,7 @@ const UserSchema = CollectionSchema(
getId: _userGetId,
getLinks: _userGetLinks,
attach: _userAttach,
version: '3.1.8',
version: '3.3.0-dev.3',
);
int _userEstimateSize(

View File

@@ -15,7 +15,9 @@ import 'package:immich_mobile/constants/locales.dart';
import 'package:immich_mobile/domain/services/background_worker.service.dart';
import 'package:immich_mobile/entities/store.entity.dart';
import 'package:immich_mobile/extensions/build_context_extensions.dart';
import 'package:immich_mobile/extensions/translate_extensions.dart';
import 'package:immich_mobile/generated/codegen_loader.g.dart';
import 'package:immich_mobile/generated/intl_keys.g.dart';
import 'package:immich_mobile/platform/background_worker_lock_api.g.dart';
import 'package:immich_mobile/providers/app_life_cycle.provider.dart';
import 'package:immich_mobile/providers/asset_viewer/share_intent_upload.provider.dart';
@@ -210,6 +212,14 @@ class ImmichAppState extends ConsumerState<ImmichApp> with WidgetsBindingObserve
if (Store.isBetaTimelineEnabled) {
ref.read(backgroundServiceProvider).disableService();
ref.read(backgroundWorkerFgServiceProvider).enable();
if (Platform.isAndroid) {
ref
.read(backgroundWorkerFgServiceProvider)
.saveNotificationMessage(
IntlKeys.uploading_media.t(),
IntlKeys.backup_background_service_default_notification.t(),
);
}
} else {
ref.read(backgroundWorkerFgServiceProvider).disable();
ref.read(backgroundServiceProvider).resumeServiceIfEnabled();

View File

@@ -15,6 +15,7 @@ import 'package:immich_mobile/presentation/widgets/backup/backup_toggle_button.w
import 'package:immich_mobile/providers/background_sync.provider.dart';
import 'package:immich_mobile/providers/backup/backup_album.provider.dart';
import 'package:immich_mobile/providers/backup/drift_backup.provider.dart';
import 'package:immich_mobile/providers/infrastructure/platform.provider.dart';
import 'package:immich_mobile/providers/sync_status.provider.dart';
import 'package:immich_mobile/providers/user.provider.dart';
import 'package:immich_mobile/routing/router.dart';
@@ -141,6 +142,84 @@ class _DriftBackupPageState extends ConsumerState<DriftBackupPage> {
await stopBackup();
},
),
Container(
margin: const EdgeInsets.symmetric(horizontal: 4, vertical: 4),
decoration: BoxDecoration(
borderRadius: const BorderRadius.all(Radius.circular(20)),
gradient: LinearGradient(
colors: [
context.primaryColor.withValues(alpha: 0.5),
context.primaryColor.withValues(alpha: 0.4),
context.primaryColor.withValues(alpha: 0.5),
],
stops: const [0.0, 0.5, 1.0],
begin: Alignment.topLeft,
end: Alignment.bottomRight,
),
boxShadow: [
BoxShadow(
color: context.primaryColor.withValues(alpha: 0.1),
blurRadius: 12,
offset: const Offset(0, 2),
),
],
),
child: Container(
margin: const EdgeInsets.all(1.5),
decoration: BoxDecoration(
borderRadius: const BorderRadius.all(Radius.circular(18.5)),
color: context.colorScheme.surfaceContainerLow,
),
child: Material(
color: context.colorScheme.surfaceContainerLow,
borderRadius: const BorderRadius.all(Radius.circular(20.5)),
child: InkWell(
borderRadius: const BorderRadius.all(Radius.circular(20.5)),
onTap: () => ref.read(nativeSyncApiProvider).uploadAsset(),
child: Padding(
padding: const EdgeInsets.symmetric(horizontal: 20, vertical: 16),
child: Row(
children: [
Container(
padding: const EdgeInsets.all(8),
decoration: BoxDecoration(
shape: BoxShape.circle,
gradient: LinearGradient(
colors: [
context.primaryColor.withValues(alpha: 0.2),
context.primaryColor.withValues(alpha: 0.1),
],
),
),
child: Icon(Icons.upload, color: context.primaryColor, size: 24),
),
const SizedBox(width: 16),
Expanded(
child: Column(
crossAxisAlignment: CrossAxisAlignment.start,
children: [
Row(
crossAxisAlignment: CrossAxisAlignment.center,
children: [
Text(
"Upload Asset (for testing)",
style: context.textTheme.titleMedium?.copyWith(
fontWeight: FontWeight.w600,
color: context.primaryColor,
),
),
],
),
],
),
),
],
),
),
),
),
),
),
switch (error) {
BackupError.none => const SizedBox.shrink(),
BackupError.syncFailed => Padding(

View File

@@ -138,6 +138,29 @@ class BackgroundWorkerFgHostApi {
}
}
Future<void> saveNotificationMessage(String title, String body) async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.saveNotificationMessage$pigeonVar_messageChannelSuffix';
final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
pigeonVar_channelName,
pigeonChannelCodec,
binaryMessenger: pigeonVar_binaryMessenger,
);
final Future<Object?> pigeonVar_sendFuture = pigeonVar_channel.send(<Object?>[title, body]);
final List<Object?>? pigeonVar_replyList = await pigeonVar_sendFuture as List<Object?>?;
if (pigeonVar_replyList == null) {
throw _createConnectionError(pigeonVar_channelName);
} else if (pigeonVar_replyList.length > 1) {
throw PlatformException(
code: pigeonVar_replyList[0]! as String,
message: pigeonVar_replyList[1] as String?,
details: pigeonVar_replyList[2],
);
} else {
return;
}
}
Future<void> configure(BackgroundWorkerSettings settings) async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.configure$pigeonVar_messageChannelSuffix';
@@ -221,29 +244,6 @@ class BackgroundWorkerBgHostApi {
}
}
Future<void> showNotification(String title, String content) async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.showNotification$pigeonVar_messageChannelSuffix';
final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
pigeonVar_channelName,
pigeonChannelCodec,
binaryMessenger: pigeonVar_binaryMessenger,
);
final Future<Object?> pigeonVar_sendFuture = pigeonVar_channel.send(<Object?>[title, content]);
final List<Object?>? pigeonVar_replyList = await pigeonVar_sendFuture as List<Object?>?;
if (pigeonVar_replyList == null) {
throw _createConnectionError(pigeonVar_channelName);
} else if (pigeonVar_replyList.length > 1) {
throw PlatformException(
code: pigeonVar_replyList[0]! as String,
message: pigeonVar_replyList[1] as String?,
details: pigeonVar_replyList[2],
);
} else {
return;
}
}
Future<void> close() async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.BackgroundWorkerBgHostApi.close$pigeonVar_messageChannelSuffix';

View File

@@ -540,6 +540,34 @@ class NativeSyncApi {
}
}
Future<bool> uploadAsset() async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.NativeSyncApi.uploadAsset$pigeonVar_messageChannelSuffix';
final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
pigeonVar_channelName,
pigeonChannelCodec,
binaryMessenger: pigeonVar_binaryMessenger,
);
final Future<Object?> pigeonVar_sendFuture = pigeonVar_channel.send(null);
final List<Object?>? pigeonVar_replyList = await pigeonVar_sendFuture as List<Object?>?;
if (pigeonVar_replyList == null) {
throw _createConnectionError(pigeonVar_channelName);
} else if (pigeonVar_replyList.length > 1) {
throw PlatformException(
code: pigeonVar_replyList[0]! as String,
message: pigeonVar_replyList[1] as String?,
details: pigeonVar_replyList[2],
);
} else if (pigeonVar_replyList[0] == null) {
throw PlatformException(
code: 'null-error',
message: 'Host platform returned null value for non-null return value.',
);
} else {
return (pigeonVar_replyList[0] as bool?)!;
}
}
Future<void> cancelHashing() async {
final String pigeonVar_channelName =
'dev.flutter.pigeon.immich_mobile.NativeSyncApi.cancelHashing$pigeonVar_messageChannelSuffix';

View File

@@ -300,7 +300,7 @@ class _EditAlbumDialogState extends ConsumerState<_EditAlbumDialog> {
await ref
.read(remoteAlbumProvider.notifier)
.updateAlbum(widget.album.id, name: newTitle, description: newDescription.isEmpty ? null : newDescription);
.updateAlbum(widget.album.id, name: newTitle, description: newDescription);
if (mounted) {
Navigator.of(

View File

@@ -44,10 +44,7 @@ class DriftTrashPage extends StatelessWidget {
return SliverPadding(
padding: const EdgeInsets.all(16.0),
sliver: SliverToBoxAdapter(
child: SizedBox(
height: 24.0,
child: const Text("trash_page_info").t(context: context, args: {"days": "$trashDays"}),
),
child: const Text("trash_page_info").t(context: context, args: {"days": "$trashDays"}),
),
);
},

View File

@@ -50,6 +50,11 @@ class DriftEditImagePage extends ConsumerWidget {
return completer.future;
}
void _exitEditing(BuildContext context) {
// this assumes that the only way to get to this page is from the AssetViewerRoute
context.navigator.popUntil((route) => route.data?.name == AssetViewerRoute.name);
}
Future<void> _saveEditedImage(BuildContext context, BaseAsset asset, Image image, WidgetRef ref) async {
try {
final Uint8List imageData = await _imageToUint8List(image);
@@ -66,7 +71,7 @@ class DriftEditImagePage extends ConsumerWidget {
}
ref.read(backgroundSyncProvider).syncLocal(full: true);
context.navigator.popUntil((route) => route.isFirst);
_exitEditing(context);
ImmichToast.show(durationInSecond: 3, context: context, msg: 'Image Saved!');
if (localAsset == null) {
@@ -91,7 +96,7 @@ class DriftEditImagePage extends ConsumerWidget {
backgroundColor: context.scaffoldBackgroundColor,
leading: IconButton(
icon: Icon(Icons.close_rounded, color: context.primaryColor, size: 24),
onPressed: () => context.navigator.popUntil((route) => route.isFirst),
onPressed: () => _exitEditing(context),
),
actions: <Widget>[
TextButton(

View File

@@ -599,9 +599,9 @@ class _SearchResultGrid extends ConsumerWidget {
@override
Widget build(BuildContext context, WidgetRef ref) {
final searchResult = ref.watch(paginatedSearchProvider);
final assets = ref.watch(paginatedSearchProvider.select((s) => s.assets));
if (searchResult.totalAssets == 0) {
if (assets.isEmpty) {
return const _SearchEmptyContent();
}
@@ -615,6 +615,7 @@ class _SearchResultGrid extends ConsumerWidget {
if (metrics.pixels >= metrics.maxScrollExtent && isVerticalScroll && !isBottomSheetNotification) {
onScrollEnd();
ref.read(paginatedSearchProvider.notifier).setScrollOffset(metrics.maxScrollExtent);
}
return true;
@@ -623,17 +624,18 @@ class _SearchResultGrid extends ConsumerWidget {
child: ProviderScope(
overrides: [
timelineServiceProvider.overrideWith((ref) {
final timelineService = ref.watch(timelineFactoryProvider).fromAssets(searchResult.assets);
final timelineService = ref.watch(timelineFactoryProvider).fromAssets(assets);
ref.onDispose(timelineService.dispose);
return timelineService;
}),
],
child: Timeline(
key: ValueKey(searchResult.totalAssets),
key: ValueKey(assets.length),
groupBy: GroupAssetsBy.none,
appBar: null,
bottomSheet: const GeneralBottomSheet(minChildSize: 0.20),
snapToMonth: false,
initialScrollOffset: ref.read(paginatedSearchProvider.select((s) => s.scrollOffset)),
),
),
),

View File

@@ -24,12 +24,20 @@ class PaginatedSearchNotifier extends StateNotifier<SearchResult> {
return false;
}
state = SearchResult(assets: [...state.assets, ...result.assets], nextPage: result.nextPage);
state = SearchResult(
assets: [...state.assets, ...result.assets],
nextPage: result.nextPage,
scrollOffset: state.scrollOffset,
);
return true;
}
void setScrollOffset(double offset) {
state = state.copyWith(scrollOffset: offset);
}
clear() {
state = const SearchResult(assets: [], nextPage: 1);
state = const SearchResult(assets: [], nextPage: 1, scrollOffset: 0.0);
}
}

View File

@@ -1,11 +1,11 @@
import 'package:auto_route/auto_route.dart';
import 'package:flutter/material.dart';
import 'package:hooks_riverpod/hooks_riverpod.dart';
import 'package:immich_mobile/extensions/build_context_extensions.dart';
import 'package:immich_mobile/extensions/translate_extensions.dart';
import 'package:immich_mobile/presentation/pages/editing/drift_edit.page.dart';
import 'package:immich_mobile/presentation/widgets/action_buttons/base_action_button.widget.dart';
import 'package:immich_mobile/presentation/widgets/images/image_provider.dart';
import 'package:immich_mobile/providers/infrastructure/asset_viewer/current_asset.provider.dart';
import 'package:immich_mobile/routing/router.dart';
class EditImageActionButton extends ConsumerWidget {
const EditImageActionButton({super.key});
@@ -20,12 +20,7 @@ class EditImageActionButton extends ConsumerWidget {
}
final image = Image(image: getFullImageProvider(currentAsset));
context.navigator.push(
MaterialPageRoute(
builder: (context) => DriftEditImagePage(asset: currentAsset, image: image, isEdited: false),
),
);
context.pushRoute(DriftEditImageRoute(asset: currentAsset, image: image, isEdited: false));
}
return BaseActionButton(

View File

@@ -51,7 +51,7 @@ class AssetDetailBottomSheet extends ConsumerWidget {
isArchived: isArchived,
isTrashEnabled: isTrashEnable,
isInLockedView: isInLockedView,
isStacked: asset.hasRemote && (asset as RemoteAsset).stackId != null,
isStacked: asset is RemoteAsset && asset.stackId != null,
currentAlbum: currentAlbum,
advancedTroubleshooting: advancedTroubleshooting,
source: ActionSource.viewer,

View File

@@ -44,7 +44,8 @@ class ViewerTopAppBar extends ConsumerWidget implements PreferredSizeWidget {
final showViewInTimelineButton =
(previousRouteName != TabShellRoute.name || tabRoute == TabEnum.search) &&
previousRouteName != AssetViewerRoute.name &&
previousRouteName != null;
previousRouteName != null &&
previousRouteName != LocalTimelineRoute.name;
final isShowingSheet = ref.watch(assetViewerProvider.select((state) => state.showingBottomSheet));
int opacity = ref.watch(assetViewerProvider.select((state) => state.backgroundOpacity));

View File

@@ -88,10 +88,18 @@ class NativeVideoViewer extends HookConsumerWidget {
}
final videoAsset = await ref.read(assetServiceProvider).getAsset(asset) ?? asset;
if (!context.mounted) {
return null;
}
try {
if (videoAsset.hasLocal && videoAsset.livePhotoVideoId == null) {
final id = videoAsset is LocalAsset ? videoAsset.id : (videoAsset as RemoteAsset).localId!;
final file = await const StorageRepository().getFileForAsset(id);
if (!context.mounted) {
return null;
}
if (file == null) {
throw Exception('No file found for the video');
}
@@ -289,7 +297,7 @@ class NativeVideoViewer extends HookConsumerWidget {
ref.read(videoPlaybackValueProvider.notifier).reset();
final source = await videoSource;
if (source == null) {
if (source == null || !context.mounted) {
return;
}
@@ -314,6 +322,9 @@ class NativeVideoViewer extends HookConsumerWidget {
removeListeners(playerController);
}
if (value != null) {
isVisible.value = _isCurrentAsset(value, asset);
}
final curAsset = currentAsset.value;
if (curAsset == asset) {
return;

View File

@@ -40,6 +40,7 @@ class Timeline extends StatelessWidget {
this.groupBy,
this.withScrubber = true,
this.snapToMonth = true,
this.initialScrollOffset,
});
final Widget? topSliverWidget;
@@ -51,6 +52,7 @@ class Timeline extends StatelessWidget {
final GroupAssetsBy? groupBy;
final bool withScrubber;
final bool snapToMonth;
final double? initialScrollOffset;
@override
Widget build(BuildContext context) {
@@ -78,6 +80,7 @@ class Timeline extends StatelessWidget {
bottomSheet: bottomSheet,
withScrubber: withScrubber,
snapToMonth: snapToMonth,
initialScrollOffset: initialScrollOffset,
),
),
),
@@ -93,6 +96,7 @@ class _SliverTimeline extends ConsumerStatefulWidget {
this.bottomSheet,
this.withScrubber = true,
this.snapToMonth = true,
this.initialScrollOffset,
});
final Widget? topSliverWidget;
@@ -101,6 +105,7 @@ class _SliverTimeline extends ConsumerStatefulWidget {
final Widget? bottomSheet;
final bool withScrubber;
final bool snapToMonth;
final double? initialScrollOffset;
@override
ConsumerState createState() => _SliverTimelineState();
@@ -124,7 +129,10 @@ class _SliverTimelineState extends ConsumerState<_SliverTimeline> {
@override
void initState() {
super.initState();
_scrollController = ScrollController(onAttach: _restoreScalePosition);
_scrollController = ScrollController(
initialScrollOffset: widget.initialScrollOffset ?? 0.0,
onAttach: _restoreScalePosition,
);
_eventSubscription = EventStream.shared.listen(_onEvent);
final currentTilesPerRow = ref.read(settingsProvider).get(Setting.tilesPerRow);

View File

@@ -77,11 +77,14 @@ class ActionNotifier extends Notifier<void> {
return _getAssets(source).whereType<RemoteAsset>().toIds().toList(growable: false);
}
List<String> _getLocalIdsForSource(ActionSource source) {
List<String> _getLocalIdsForSource(ActionSource source, {bool ignoreLocalOnly = false}) {
final Set<BaseAsset> assets = _getAssets(source);
final List<String> localIds = [];
for (final asset in assets) {
if (ignoreLocalOnly && asset.storage != AssetState.merged) {
continue;
}
if (asset is LocalAsset) {
localIds.add(asset.id);
} else if (asset is RemoteAsset && asset.localId != null) {
@@ -189,7 +192,7 @@ class ActionNotifier extends Notifier<void> {
Future<ActionResult> moveToLockFolder(ActionSource source) async {
final ids = _getOwnedRemoteIdsForSource(source);
final localIds = _getLocalIdsForSource(source);
final localIds = _getLocalIdsForSource(source, ignoreLocalOnly: true);
try {
await _service.moveToLockFolder(ids, localIds);
return ActionResult(count: ids.length, success: true);

View File

@@ -1,5 +1,6 @@
import 'dart:io';
import 'package:device_info_plus/device_info_plus.dart';
import 'package:flutter/widgets.dart';
import 'package:hooks_riverpod/hooks_riverpod.dart';
import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
@@ -25,7 +26,28 @@ class AssetMediaRepository {
const AssetMediaRepository(this._assetApiRepository);
Future<List<String>> deleteAll(List<String> ids) => PhotoManager.editor.deleteWithIds(ids);
Future<bool> _androidSupportsTrash() async {
if (Platform.isAndroid) {
DeviceInfoPlugin deviceInfo = DeviceInfoPlugin();
AndroidDeviceInfo androidInfo = await deviceInfo.androidInfo;
int sdkVersion = androidInfo.version.sdkInt;
return sdkVersion >= 31;
}
return false;
}
Future<List<String>> deleteAll(List<String> ids) async {
if (CurrentPlatform.isAndroid) {
if (await _androidSupportsTrash()) {
return PhotoManager.editor.android.moveToTrash(
ids.map((e) => AssetEntity(id: e, width: 1, height: 1, typeInt: 0)).toList(),
);
} else {
return PhotoManager.editor.deleteWithIds(ids);
}
}
return PhotoManager.editor.deleteWithIds(ids);
}
Future<asset_entity.Asset?> get(String id) async {
final entity = await AssetEntity.fromId(id);

View File

@@ -121,7 +121,7 @@ class DownloadRepository {
_dummyMetadata['part'] = LivePhotosPart.video.index;
tasks[taskIndex++] = DownloadTask(
taskId: livePhotoVideoId,
url: url,
url: getOriginalUrlForRemoteId(livePhotoVideoId),
headers: headers,
filename: asset.name.toUpperCase().replaceAll(RegExp(r"\.(JPG|HEIC)$"), '.MOV'),
updates: Updates.statusAndProgress,

View File

@@ -263,6 +263,11 @@ Class | Method | HTTP request | Description
*TrashApi* | [**emptyTrash**](doc//TrashApi.md#emptytrash) | **POST** /trash/empty |
*TrashApi* | [**restoreAssets**](doc//TrashApi.md#restoreassets) | **POST** /trash/restore/assets |
*TrashApi* | [**restoreTrash**](doc//TrashApi.md#restoretrash) | **POST** /trash/restore |
*UploadApi* | [**cancelUpload**](doc//UploadApi.md#cancelupload) | **DELETE** /upload/{id} |
*UploadApi* | [**getUploadOptions**](doc//UploadApi.md#getuploadoptions) | **OPTIONS** /upload |
*UploadApi* | [**getUploadStatus**](doc//UploadApi.md#getuploadstatus) | **HEAD** /upload/{id} |
*UploadApi* | [**resumeUpload**](doc//UploadApi.md#resumeupload) | **PATCH** /upload/{id} |
*UploadApi* | [**startUpload**](doc//UploadApi.md#startupload) | **POST** /upload |
*UsersApi* | [**createProfileImage**](doc//UsersApi.md#createprofileimage) | **POST** /users/profile-image |
*UsersApi* | [**deleteProfileImage**](doc//UsersApi.md#deleteprofileimage) | **DELETE** /users/profile-image |
*UsersApi* | [**deleteUserLicense**](doc//UsersApi.md#deleteuserlicense) | **DELETE** /users/me/license |
@@ -572,6 +577,8 @@ Class | Method | HTTP request | Description
- [UpdateAlbumUserDto](doc//UpdateAlbumUserDto.md)
- [UpdateAssetDto](doc//UpdateAssetDto.md)
- [UpdateLibraryDto](doc//UpdateLibraryDto.md)
- [UploadBackupConfig](doc//UploadBackupConfig.md)
- [UploadOkDto](doc//UploadOkDto.md)
- [UsageByUserDto](doc//UsageByUserDto.md)
- [UserAdminCreateDto](doc//UserAdminCreateDto.md)
- [UserAdminDeleteDto](doc//UserAdminDeleteDto.md)

View File

@@ -60,6 +60,7 @@ part 'api/system_metadata_api.dart';
part 'api/tags_api.dart';
part 'api/timeline_api.dart';
part 'api/trash_api.dart';
part 'api/upload_api.dart';
part 'api/users_api.dart';
part 'api/users_admin_api.dart';
part 'api/view_api.dart';
@@ -343,6 +344,8 @@ part 'model/update_album_dto.dart';
part 'model/update_album_user_dto.dart';
part 'model/update_asset_dto.dart';
part 'model/update_library_dto.dart';
part 'model/upload_backup_config.dart';
part 'model/upload_ok_dto.dart';
part 'model/usage_by_user_dto.dart';
part 'model/user_admin_create_dto.dart';
part 'model/user_admin_delete_dto.dart';

379
mobile/openapi/lib/api/upload_api.dart generated Normal file
View File

@@ -0,0 +1,379 @@
//
// AUTO-GENERATED FILE, DO NOT MODIFY!
//
// @dart=2.18
// ignore_for_file: unused_element, unused_import
// ignore_for_file: always_put_required_named_parameters_first
// ignore_for_file: constant_identifier_names
// ignore_for_file: lines_longer_than_80_chars
part of openapi.api;
class UploadApi {
UploadApi([ApiClient? apiClient]) : apiClient = apiClient ?? defaultApiClient;
final ApiClient apiClient;
/// This endpoint requires the `asset.upload` permission.
///
/// Note: This method returns the HTTP [Response].
///
/// Parameters:
///
/// * [String] id (required):
///
/// * [String] key:
///
/// * [String] slug:
Future<Response> cancelUploadWithHttpInfo(String id, { String? key, String? slug, }) async {
// ignore: prefer_const_declarations
final apiPath = r'/upload/{id}'
.replaceAll('{id}', id);
// ignore: prefer_final_locals
Object? postBody;
final queryParams = <QueryParam>[];
final headerParams = <String, String>{};
final formParams = <String, String>{};
if (key != null) {
queryParams.addAll(_queryParams('', 'key', key));
}
if (slug != null) {
queryParams.addAll(_queryParams('', 'slug', slug));
}
const contentTypes = <String>[];
return apiClient.invokeAPI(
apiPath,
'DELETE',
queryParams,
postBody,
headerParams,
formParams,
contentTypes.isEmpty ? null : contentTypes.first,
);
}
/// This endpoint requires the `asset.upload` permission.
///
/// Parameters:
///
/// * [String] id (required):
///
/// * [String] key:
///
/// * [String] slug:
Future<void> cancelUpload(String id, { String? key, String? slug, }) async {
final response = await cancelUploadWithHttpInfo(id, key: key, slug: slug, );
if (response.statusCode >= HttpStatus.badRequest) {
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
}
}
/// Performs an HTTP 'OPTIONS /upload' operation and returns the [Response].
Future<Response> getUploadOptionsWithHttpInfo() async {
// ignore: prefer_const_declarations
final apiPath = r'/upload';
// ignore: prefer_final_locals
Object? postBody;
final queryParams = <QueryParam>[];
final headerParams = <String, String>{};
final formParams = <String, String>{};
const contentTypes = <String>[];
return apiClient.invokeAPI(
apiPath,
'OPTIONS',
queryParams,
postBody,
headerParams,
formParams,
contentTypes.isEmpty ? null : contentTypes.first,
);
}
Future<void> getUploadOptions() async {
final response = await getUploadOptionsWithHttpInfo();
if (response.statusCode >= HttpStatus.badRequest) {
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
}
}
/// This endpoint requires the `asset.upload` permission.
///
/// Note: This method returns the HTTP [Response].
///
/// Parameters:
///
/// * [String] id (required):
///
/// * [String] uploadDraftInteropVersion (required):
/// Indicates the version of the RUFH protocol supported by the client.
///
/// * [String] key:
///
/// * [String] slug:
Future<Response> getUploadStatusWithHttpInfo(String id, String uploadDraftInteropVersion, { String? key, String? slug, }) async {
// ignore: prefer_const_declarations
final apiPath = r'/upload/{id}'
.replaceAll('{id}', id);
// ignore: prefer_final_locals
Object? postBody;
final queryParams = <QueryParam>[];
final headerParams = <String, String>{};
final formParams = <String, String>{};
if (key != null) {
queryParams.addAll(_queryParams('', 'key', key));
}
if (slug != null) {
queryParams.addAll(_queryParams('', 'slug', slug));
}
headerParams[r'upload-draft-interop-version'] = parameterToString(uploadDraftInteropVersion);
const contentTypes = <String>[];
return apiClient.invokeAPI(
apiPath,
'HEAD',
queryParams,
postBody,
headerParams,
formParams,
contentTypes.isEmpty ? null : contentTypes.first,
);
}
/// This endpoint requires the `asset.upload` permission.
///
/// Parameters:
///
/// * [String] id (required):
///
/// * [String] uploadDraftInteropVersion (required):
/// Indicates the version of the RUFH protocol supported by the client.
///
/// * [String] key:
///
/// * [String] slug:
Future<void> getUploadStatus(String id, String uploadDraftInteropVersion, { String? key, String? slug, }) async {
final response = await getUploadStatusWithHttpInfo(id, uploadDraftInteropVersion, key: key, slug: slug, );
if (response.statusCode >= HttpStatus.badRequest) {
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
}
}
/// This endpoint requires the `asset.upload` permission.
///
/// Note: This method returns the HTTP [Response].
///
/// Parameters:
///
/// * [String] contentLength (required):
/// Non-negative size of the request body in bytes.
///
/// * [String] id (required):
///
/// * [String] uploadComplete (required):
/// Structured boolean indicating whether this request completes the file. Use Upload-Incomplete instead for version <= 3.
///
/// * [String] uploadDraftInteropVersion (required):
/// Indicates the version of the RUFH protocol supported by the client.
///
/// * [String] uploadOffset (required):
/// Non-negative byte offset indicating the starting position of the data in the request body within the entire file.
///
/// * [String] key:
///
/// * [String] slug:
Future<Response> resumeUploadWithHttpInfo(String contentLength, String id, String uploadComplete, String uploadDraftInteropVersion, String uploadOffset, { String? key, String? slug, }) async {
// ignore: prefer_const_declarations
final apiPath = r'/upload/{id}'
.replaceAll('{id}', id);
// ignore: prefer_final_locals
Object? postBody;
final queryParams = <QueryParam>[];
final headerParams = <String, String>{};
final formParams = <String, String>{};
if (key != null) {
queryParams.addAll(_queryParams('', 'key', key));
}
if (slug != null) {
queryParams.addAll(_queryParams('', 'slug', slug));
}
headerParams[r'content-length'] = parameterToString(contentLength);
headerParams[r'upload-complete'] = parameterToString(uploadComplete);
headerParams[r'upload-draft-interop-version'] = parameterToString(uploadDraftInteropVersion);
headerParams[r'upload-offset'] = parameterToString(uploadOffset);
const contentTypes = <String>[];
return apiClient.invokeAPI(
apiPath,
'PATCH',
queryParams,
postBody,
headerParams,
formParams,
contentTypes.isEmpty ? null : contentTypes.first,
);
}
/// This endpoint requires the `asset.upload` permission.
///
/// Parameters:
///
/// * [String] contentLength (required):
/// Non-negative size of the request body in bytes.
///
/// * [String] id (required):
///
/// * [String] uploadComplete (required):
/// Structured boolean indicating whether this request completes the file. Use Upload-Incomplete instead for version <= 3.
///
/// * [String] uploadDraftInteropVersion (required):
/// Indicates the version of the RUFH protocol supported by the client.
///
/// * [String] uploadOffset (required):
/// Non-negative byte offset indicating the starting position of the data in the request body within the entire file.
///
/// * [String] key:
///
/// * [String] slug:
Future<UploadOkDto?> resumeUpload(String contentLength, String id, String uploadComplete, String uploadDraftInteropVersion, String uploadOffset, { String? key, String? slug, }) async {
final response = await resumeUploadWithHttpInfo(contentLength, id, uploadComplete, uploadDraftInteropVersion, uploadOffset, key: key, slug: slug, );
if (response.statusCode >= HttpStatus.badRequest) {
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
}
// When a remote server returns no body with a status of 204, we shall not decode it.
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
// FormatException when trying to decode an empty string.
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'UploadOkDto',) as UploadOkDto;
}
return null;
}
/// This endpoint requires the `asset.upload` permission.
///
/// Note: This method returns the HTTP [Response].
///
/// Parameters:
///
/// * [String] contentLength (required):
/// Non-negative size of the request body in bytes.
///
/// * [String] reprDigest (required):
/// RFC 9651 structured dictionary containing an `sha` (bytesequence) checksum used to detect duplicate files and validate data integrity.
///
/// * [String] xImmichAssetData (required):
/// RFC 9651 structured dictionary containing asset metadata with the following keys: - device-asset-id (string, required): Unique device asset identifier - device-id (string, required): Device identifier - file-created-at (string/date, required): ISO 8601 date string or Unix timestamp - file-modified-at (string/date, required): ISO 8601 date string or Unix timestamp - filename (string, required): Original filename - is-favorite (boolean, optional): Favorite status - live-photo-video-id (string, optional): Live photo ID for assets from iOS devices - icloud-id (string, optional): iCloud identifier for assets from iOS devices
///
/// * [String] key:
///
/// * [String] slug:
///
/// * [String] uploadComplete:
/// Structured boolean indicating whether this request completes the file. Use Upload-Incomplete instead for version <= 3.
///
/// * [String] uploadDraftInteropVersion:
/// Indicates the version of the RUFH protocol supported by the client.
Future<Response> startUploadWithHttpInfo(String contentLength, String reprDigest, String xImmichAssetData, { String? key, String? slug, String? uploadComplete, String? uploadDraftInteropVersion, }) async {
// ignore: prefer_const_declarations
final apiPath = r'/upload';
// ignore: prefer_final_locals
Object? postBody;
final queryParams = <QueryParam>[];
final headerParams = <String, String>{};
final formParams = <String, String>{};
if (key != null) {
queryParams.addAll(_queryParams('', 'key', key));
}
if (slug != null) {
queryParams.addAll(_queryParams('', 'slug', slug));
}
headerParams[r'content-length'] = parameterToString(contentLength);
headerParams[r'repr-digest'] = parameterToString(reprDigest);
if (uploadComplete != null) {
headerParams[r'upload-complete'] = parameterToString(uploadComplete);
}
if (uploadDraftInteropVersion != null) {
headerParams[r'upload-draft-interop-version'] = parameterToString(uploadDraftInteropVersion);
}
headerParams[r'x-immich-asset-data'] = parameterToString(xImmichAssetData);
const contentTypes = <String>[];
return apiClient.invokeAPI(
apiPath,
'POST',
queryParams,
postBody,
headerParams,
formParams,
contentTypes.isEmpty ? null : contentTypes.first,
);
}
/// This endpoint requires the `asset.upload` permission.
///
/// Parameters:
///
/// * [String] contentLength (required):
/// Non-negative size of the request body in bytes.
///
/// * [String] reprDigest (required):
/// RFC 9651 structured dictionary containing an `sha` (bytesequence) checksum used to detect duplicate files and validate data integrity.
///
/// * [String] xImmichAssetData (required):
/// RFC 9651 structured dictionary containing asset metadata with the following keys: - device-asset-id (string, required): Unique device asset identifier - device-id (string, required): Device identifier - file-created-at (string/date, required): ISO 8601 date string or Unix timestamp - file-modified-at (string/date, required): ISO 8601 date string or Unix timestamp - filename (string, required): Original filename - is-favorite (boolean, optional): Favorite status - live-photo-video-id (string, optional): Live photo ID for assets from iOS devices - icloud-id (string, optional): iCloud identifier for assets from iOS devices
///
/// * [String] key:
///
/// * [String] slug:
///
/// * [String] uploadComplete:
/// Structured boolean indicating whether this request completes the file. Use Upload-Incomplete instead for version <= 3.
///
/// * [String] uploadDraftInteropVersion:
/// Indicates the version of the RUFH protocol supported by the client.
Future<UploadOkDto?> startUpload(String contentLength, String reprDigest, String xImmichAssetData, { String? key, String? slug, String? uploadComplete, String? uploadDraftInteropVersion, }) async {
final response = await startUploadWithHttpInfo(contentLength, reprDigest, xImmichAssetData, key: key, slug: slug, uploadComplete: uploadComplete, uploadDraftInteropVersion: uploadDraftInteropVersion, );
if (response.statusCode >= HttpStatus.badRequest) {
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
}
// When a remote server returns no body with a status of 204, we shall not decode it.
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
// FormatException when trying to decode an empty string.
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'UploadOkDto',) as UploadOkDto;
}
return null;
}
}
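
The stubs above only assemble headers and query parameters; `postBody` is never assigned, so they do not stream the file bytes themselves, but they do document the wire format. Below is a minimal sketch of how the structured-field header values for `POST /upload` could be built and handed to `startUpload`. The package import, the use of `sha1` for the `sha` member, the sample metadata values, and the interop version are assumptions for illustration, not taken from this diff.

import 'dart:convert';

import 'package:crypto/crypto.dart'; // assumed helper for the checksum
import 'package:openapi/api.dart';   // assumed name of the generated package

/// Sketch: start an upload session for [bytes]. The generated stub only sends
/// headers (postBody stays null), so in a real client the raw bytes would be
/// streamed by whatever HTTP client backs ApiClient.
Future<UploadOkDto?> startSession(List<int> bytes, String filename, String interopVersion) async {
  final api = UploadApi(); // uses defaultApiClient

  // Repr-Digest: structured dictionary with an `sha` byte-sequence member,
  // i.e. base64 wrapped in colons (assuming SHA-1, as the `sha` key suggests).
  final reprDigest = 'sha=:${base64.encode(sha1.convert(bytes).bytes)}:';

  // X-Immich-Asset-Data: structured dictionary with the documented keys.
  // Strings are quoted; booleans use the ?1 / ?0 form.
  final assetData = [
    'device-asset-id="example-asset-1"',
    'device-id="example-device"',
    'file-created-at="2025-10-13T00:00:00Z"',
    'file-modified-at="2025-10-13T00:00:00Z"',
    'filename="$filename"',
    'is-favorite=?0',
  ].join(', ');

  return api.startUpload(
    '${bytes.length}',      // content-length of the (separately streamed) body
    reprDigest,
    assetData,
    uploadComplete: '?1',   // this single request carries the whole file
    uploadDraftInteropVersion: interopVersion, // placeholder draft version
  );
}
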

@@ -740,6 +740,10 @@ class ApiClient {
return UpdateAssetDto.fromJson(value);
case 'UpdateLibraryDto':
return UpdateLibraryDto.fromJson(value);
+ case 'UploadBackupConfig':
+ return UploadBackupConfig.fromJson(value);
+ case 'UploadOkDto':
+ return UploadOkDto.fromJson(value);
case 'UsageByUserDto':
return UsageByUserDto.fromJson(value);
case 'UserAdminCreateDto':

@@ -14,25 +14,31 @@ class SystemConfigBackupsDto {
/// Returns a new [SystemConfigBackupsDto] instance.
SystemConfigBackupsDto({
required this.database,
+ required this.upload,
});
DatabaseBackupConfig database;
+ UploadBackupConfig upload;
@override
bool operator ==(Object other) => identical(this, other) || other is SystemConfigBackupsDto &&
- other.database == database;
+ other.database == database &&
+ other.upload == upload;
@override
int get hashCode =>
// ignore: unnecessary_parenthesis
- (database.hashCode);
+ (database.hashCode) +
+ (upload.hashCode);
@override
- String toString() => 'SystemConfigBackupsDto[database=$database]';
+ String toString() => 'SystemConfigBackupsDto[database=$database, upload=$upload]';
Map<String, dynamic> toJson() {
final json = <String, dynamic>{};
json[r'database'] = this.database;
+ json[r'upload'] = this.upload;
return json;
}
@@ -46,6 +52,7 @@ class SystemConfigBackupsDto {
return SystemConfigBackupsDto(
database: DatabaseBackupConfig.fromJson(json[r'database'])!,
+ upload: UploadBackupConfig.fromJson(json[r'upload'])!,
);
}
return null;
@@ -94,6 +101,7 @@ class SystemConfigBackupsDto {
/// The list of required keys that must be present in a JSON.
static const requiredKeys = <String>{
'database',
+ 'upload',
};
}

@@ -17,6 +17,7 @@ class SystemConfigNightlyTasksDto {
required this.databaseCleanup,
required this.generateMemories,
required this.missingThumbnails,
+ required this.removeStaleUploads,
required this.startTime,
required this.syncQuotaUsage,
});
@@ -29,6 +30,8 @@ class SystemConfigNightlyTasksDto {
bool missingThumbnails;
+ bool removeStaleUploads;
String startTime;
bool syncQuotaUsage;
@@ -39,6 +42,7 @@ class SystemConfigNightlyTasksDto {
other.databaseCleanup == databaseCleanup &&
other.generateMemories == generateMemories &&
other.missingThumbnails == missingThumbnails &&
+ other.removeStaleUploads == removeStaleUploads &&
other.startTime == startTime &&
other.syncQuotaUsage == syncQuotaUsage;
@@ -49,11 +53,12 @@ class SystemConfigNightlyTasksDto {
(databaseCleanup.hashCode) +
(generateMemories.hashCode) +
(missingThumbnails.hashCode) +
+ (removeStaleUploads.hashCode) +
(startTime.hashCode) +
(syncQuotaUsage.hashCode);
@override
- String toString() => 'SystemConfigNightlyTasksDto[clusterNewFaces=$clusterNewFaces, databaseCleanup=$databaseCleanup, generateMemories=$generateMemories, missingThumbnails=$missingThumbnails, startTime=$startTime, syncQuotaUsage=$syncQuotaUsage]';
+ String toString() => 'SystemConfigNightlyTasksDto[clusterNewFaces=$clusterNewFaces, databaseCleanup=$databaseCleanup, generateMemories=$generateMemories, missingThumbnails=$missingThumbnails, removeStaleUploads=$removeStaleUploads, startTime=$startTime, syncQuotaUsage=$syncQuotaUsage]';
Map<String, dynamic> toJson() {
final json = <String, dynamic>{};
@@ -61,6 +66,7 @@ class SystemConfigNightlyTasksDto {
json[r'databaseCleanup'] = this.databaseCleanup;
json[r'generateMemories'] = this.generateMemories;
json[r'missingThumbnails'] = this.missingThumbnails;
+ json[r'removeStaleUploads'] = this.removeStaleUploads;
json[r'startTime'] = this.startTime;
json[r'syncQuotaUsage'] = this.syncQuotaUsage;
return json;
@@ -79,6 +85,7 @@ class SystemConfigNightlyTasksDto {
databaseCleanup: mapValueOfType<bool>(json, r'databaseCleanup')!,
generateMemories: mapValueOfType<bool>(json, r'generateMemories')!,
missingThumbnails: mapValueOfType<bool>(json, r'missingThumbnails')!,
+ removeStaleUploads: mapValueOfType<bool>(json, r'removeStaleUploads')!,
startTime: mapValueOfType<String>(json, r'startTime')!,
syncQuotaUsage: mapValueOfType<bool>(json, r'syncQuotaUsage')!,
);
@@ -132,6 +139,7 @@ class SystemConfigNightlyTasksDto {
'databaseCleanup',
'generateMemories',
'missingThumbnails',
+ 'removeStaleUploads',
'startTime',
'syncQuotaUsage',
};
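
For completeness, a sketch of building the nightly-tasks DTO with the new flag enabled. The arguments other than removeStaleUploads are reproduced from the fields visible above; clusterNewFaces being a required bool and the startTime format are assumptions.

import 'package:openapi/api.dart'; // assumed name of the generated package

void main() {
  // Enable the new stale-upload cleanup alongside the existing nightly tasks.
  final nightly = SystemConfigNightlyTasksDto(
    clusterNewFaces: true,    // assumed required bool (it appears in toString)
    databaseCleanup: true,
    generateMemories: true,
    missingThumbnails: true,
    removeStaleUploads: true, // new in this diff
    startTime: '00:00',       // assumed HH:mm format
    syncQuotaUsage: true,
  );
  print(nightly.toJson()['removeStaleUploads']); // true
}
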

@@ -0,0 +1,100 @@
//
// AUTO-GENERATED FILE, DO NOT MODIFY!
//
// @dart=2.18
// ignore_for_file: unused_element, unused_import
// ignore_for_file: always_put_required_named_parameters_first
// ignore_for_file: constant_identifier_names
// ignore_for_file: lines_longer_than_80_chars
part of openapi.api;
class UploadBackupConfig {
/// Returns a new [UploadBackupConfig] instance.
UploadBackupConfig({
required this.maxAgeHours,
});
/// Minimum value: 1
num maxAgeHours;
@override
bool operator ==(Object other) => identical(this, other) || other is UploadBackupConfig &&
other.maxAgeHours == maxAgeHours;
@override
int get hashCode =>
// ignore: unnecessary_parenthesis
(maxAgeHours.hashCode);
@override
String toString() => 'UploadBackupConfig[maxAgeHours=$maxAgeHours]';
Map<String, dynamic> toJson() {
final json = <String, dynamic>{};
json[r'maxAgeHours'] = this.maxAgeHours;
return json;
}
/// Returns a new [UploadBackupConfig] instance and imports its values from
/// [value] if it's a [Map], null otherwise.
// ignore: prefer_constructors_over_static_methods
static UploadBackupConfig? fromJson(dynamic value) {
upgradeDto(value, "UploadBackupConfig");
if (value is Map) {
final json = value.cast<String, dynamic>();
return UploadBackupConfig(
maxAgeHours: num.parse('${json[r'maxAgeHours']}'),
);
}
return null;
}
static List<UploadBackupConfig> listFromJson(dynamic json, {bool growable = false,}) {
final result = <UploadBackupConfig>[];
if (json is List && json.isNotEmpty) {
for (final row in json) {
final value = UploadBackupConfig.fromJson(row);
if (value != null) {
result.add(value);
}
}
}
return result.toList(growable: growable);
}
static Map<String, UploadBackupConfig> mapFromJson(dynamic json) {
final map = <String, UploadBackupConfig>{};
if (json is Map && json.isNotEmpty) {
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
for (final entry in json.entries) {
final value = UploadBackupConfig.fromJson(entry.value);
if (value != null) {
map[entry.key] = value;
}
}
}
return map;
}
// maps a json object with a list of UploadBackupConfig-objects as value to a dart map
static Map<String, List<UploadBackupConfig>> mapListFromJson(dynamic json, {bool growable = false,}) {
final map = <String, List<UploadBackupConfig>>{};
if (json is Map && json.isNotEmpty) {
// ignore: parameter_assignments
json = json.cast<String, dynamic>();
for (final entry in json.entries) {
map[entry.key] = UploadBackupConfig.listFromJson(entry.value, growable: growable,);
}
}
return map;
}
/// The list of required keys that must be present in a JSON.
static const requiredKeys = <String>{
'maxAgeHours',
};
}
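
A quick round-trip of the new config object, mainly to show that `fromJson` coerces the value through `num.parse` and that `maxAgeHours` has a schema minimum of 1. Presumably this is the age cutoff used by the stale-upload cleanup, though that is an inference rather than something this diff states.

import 'package:openapi/api.dart'; // assumed name of the generated package

void main() {
  // fromJson accepts anything num.parse can handle (72, 72.0, "72", ...).
  final config = UploadBackupConfig.fromJson({'maxAgeHours': 72});
  if (config != null) {
    print(config.maxAgeHours >= 1); // true; the schema minimum is 1
    print(config.toJson());         // {maxAgeHours: 72}
  }
}
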

@@ -0,0 +1,99 @@
//
// AUTO-GENERATED FILE, DO NOT MODIFY!
//
// @dart=2.18
// ignore_for_file: unused_element, unused_import
// ignore_for_file: always_put_required_named_parameters_first
// ignore_for_file: constant_identifier_names
// ignore_for_file: lines_longer_than_80_chars
part of openapi.api;
class UploadOkDto {
/// Returns a new [UploadOkDto] instance.
UploadOkDto({
required this.id,
});
String id;
@override
bool operator ==(Object other) => identical(this, other) || other is UploadOkDto &&
other.id == id;
@override
int get hashCode =>
// ignore: unnecessary_parenthesis
(id.hashCode);
@override
String toString() => 'UploadOkDto[id=$id]';
Map<String, dynamic> toJson() {
final json = <String, dynamic>{};
json[r'id'] = this.id;
return json;
}
/// Returns a new [UploadOkDto] instance and imports its values from
/// [value] if it's a [Map], null otherwise.
// ignore: prefer_constructors_over_static_methods
static UploadOkDto? fromJson(dynamic value) {
upgradeDto(value, "UploadOkDto");
if (value is Map) {
final json = value.cast<String, dynamic>();
return UploadOkDto(
id: mapValueOfType<String>(json, r'id')!,
);
}
return null;
}
static List<UploadOkDto> listFromJson(dynamic json, {bool growable = false,}) {
final result = <UploadOkDto>[];
if (json is List && json.isNotEmpty) {
for (final row in json) {
final value = UploadOkDto.fromJson(row);
if (value != null) {
result.add(value);
}
}
}
return result.toList(growable: growable);
}
static Map<String, UploadOkDto> mapFromJson(dynamic json) {
final map = <String, UploadOkDto>{};
if (json is Map && json.isNotEmpty) {
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
for (final entry in json.entries) {
final value = UploadOkDto.fromJson(entry.value);
if (value != null) {
map[entry.key] = value;
}
}
}
return map;
}
// maps a json object with a list of UploadOkDto-objects as value to a dart map
static Map<String, List<UploadOkDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
final map = <String, List<UploadOkDto>>{};
if (json is Map && json.isNotEmpty) {
// ignore: parameter_assignments
json = json.cast<String, dynamic>();
for (final entry in json.entries) {
map[entry.key] = UploadOkDto.listFromJson(entry.value, growable: growable,);
}
}
return map;
}
/// The list of required keys that must be present in a JSON.
static const requiredKeys = <String>{
'id',
};
}
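
For an interrupted session, the `*WithHttpInfo` variants are the ones that expose the raw response, which is where the resumable-upload draft reports progress. A short sketch, assuming the offset comes back in an `upload-offset` response header and that `package:openapi/api.dart` is the generated package name:

import 'package:openapi/api.dart'; // assumed name of the generated package

/// Sketch: inspect how far an interrupted upload got, then abandon it.
/// [uploadId] is the id returned in UploadOkDto; [interopVersion] is a
/// placeholder for the draft version the server expects.
Future<void> inspectAndCancel(String uploadId, String interopVersion) async {
  final api = UploadApi();

  // HEAD /upload/{id}: getUploadStatus() only surfaces errors, so use the
  // WithHttpInfo variant to get at the response headers.
  final status = await api.getUploadStatusWithHttpInfo(uploadId, interopVersion);

  // Assumption: the persisted byte count is reported via Upload-Offset
  // (package:http exposes response header names in lower case).
  final offset = status.headers['upload-offset'];
  print('server has persisted $offset bytes for upload $uploadId');

  // DELETE /upload/{id}: drop the partial upload server-side.
  await api.cancelUpload(uploadId);
}
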

Some files were not shown because too many files have changed in this diff.