Compare commits

...

19 Commits

Author SHA1 Message Date
Jason Rasmussen
4202dd6c9a feat: mise in CI 2025-09-04 13:17:55 -04:00
bo0tzz
7bd79b551c feat: use mise for core dev tools (#21566)
* feat: use mise for core tools

* feat: mise handle dart

* feat: install dcm through mise

* fix: enable experimental in mise config

* feat: use mise.lock

* chore: always pin mise use

---------

Co-authored-by: bwees <brandonwees@gmail.com>
2025-09-04 12:58:42 -04:00
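The commit series above (pin tools, enable experimental mode, adopt `mise.lock`) implies a configuration roughly like the following sketch. Tool names, versions, and settings here are illustrative assumptions, not the repository's actual `mise` config:

```toml
# Hypothetical sketch of a mise config reflecting the commits above.
# Actual file contents in the repository may differ.

[settings]
experimental = true   # "fix: enable experimental in mise config"
lockfile = true       # "feat: use mise.lock" — write resolved versions to mise.lock

[tools]
node = "22.18.0"      # pinned ("chore: always pin mise use")
dart = "3.5.0"        # "feat: mise handle dart" (version assumed)
```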
shenlong
5fe954b3c9 fix: use lock to synchronise foreground and background backup (#21522)
* fix: use lock to synchronise foreground and background backup

# Conflicts:
#	mobile/lib/domain/services/background_worker.service.dart
#	mobile/lib/platform/background_worker_api.g.dart
#	mobile/pigeon/background_worker_api.dart

* add timeout to the splash-screen acquire lock

* fix: null check on created date

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-09-04 11:44:33 -05:00
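The locking approach in this commit can be illustrated with a small sketch. This is not the actual Dart implementation; the class and method names below are hypothetical. It only shows the pattern: one shared lock serializes foreground and background backup, and acquisition times out (as in the splash-screen fix) instead of blocking forever.

```python
import threading

class BackupCoordinator:
    """Illustrative sketch only: serialize foreground and background backup
    runs with a shared lock. Names are hypothetical, not Immich's API."""

    def __init__(self):
        self._lock = threading.Lock()

    def run_backup(self, task, timeout=5.0):
        # Try to take the lock, but give up after `timeout` seconds rather
        # than blocking the caller (e.g. a splash screen) indefinitely.
        if not self._lock.acquire(timeout=timeout):
            raise TimeoutError("another backup is already running")
        try:
            return task()
        finally:
            self._lock.release()
```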
Jason Rasmussen
7f81a5bd6f fix: sidecar check job (#21312) 2025-09-04 16:23:58 +00:00
Arthur Normand
37a79292c0 feat: view similar photos (#21108)
* Enable filtering by example

* Drop `@GenerateSql` for `getEmbedding`?

* Improve error message

* PR Feedback

* Sort en.json

* Add SQL

* Fix lint

* Drop test that is no longer valid

* Fix i18n file sorting

* Fix TS error

* Add a `requireAccess` before pulling the embedding

* Fix decorators

* Run `make open-api`

---------

Co-authored-by: Alex <alex.tran1502@gmail.com>
2025-09-04 09:22:09 -05:00
Brandon Wees
bf6211776f fix: retain filter and sort options when pulling to refresh (#21452)
* fix: retain filter and sort options when pulling to refresh

* chore: use classes to manage state

* chore: format

* chore: refactor to keep local state of filter/sorted albums instead of a global filteredAlbums

* fix: keep sort when page is navigated away and returned

* chore: lint

* chore: format

why is autoformat not working

* fix: default sort direction state

* fix: search clears sorting

we have to cache our sorted albums since sorting is very computationally expensive and cannot be run on every keystroke. For searches, instead of pulling from the full list of albums, we now pull from the cached sorted list and then filter it; the filtered result is what is shown to the user
2025-09-04 09:08:17 -05:00
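The caching idea in the commit message above can be sketched as follows. This is an illustrative Python rewrite of the pattern, not the app's Dart code, and the names are hypothetical: sort once and cache the result, then run only the cheap filter per keystroke against the cached list.

```python
class AlbumSearchCache:
    """Illustrative sketch: keep an expensive sort cached, filter cheaply."""

    def __init__(self, albums):
        # Sorting is expensive, so do it once up front and cache the result.
        self._sorted = sorted(albums, key=str.lower)

    def update(self, albums):
        # Re-sort only when the underlying album list actually changes.
        self._sorted = sorted(albums, key=str.lower)

    def search(self, query):
        # Cheap enough to run on every keystroke: filter the cached sorted
        # list instead of re-sorting the full album list each time.
        q = query.lower()
        return [name for name in self._sorted if q in name.lower()]
```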
waclaw66
6c178a04dc fix(mobile): pinch + move scale (#21332)
* fix: pinch + move scale

* added lost changes from #18744
2025-09-04 09:01:39 -05:00
Snowknight26
036d314cb6 fix(web): Make Manage location utility header responsive (#21480)
* fix(web): Make Manage location utility header responsive

* Consolidate <p> into <Text>
2025-09-04 08:59:26 -05:00
Noel S
1fc5da398a fix(mobile): Hide system UI when entering immersive mode in asset viewer (#21539)
Implement hiding system ui in asset viewer
2025-09-04 08:57:34 -05:00
Sudheer Reddy Puthana
4d84338086 fix(mobile): readonly mode fixes (#21545)
* fix: Enables videotimeline in readonly mode

- Enables only the video controls in the bottom bar when readonlyMode is enabled.
- Fixes the message on the app profile bar when readOnlyMode is enabled **but** betaTimeline is not enabled.

Fixes https://github.com/immich-app/immich/issues/21441

Signed-off-by: Sudheer Puthana <Sud-Puth@users.noreply.github.com>

* cleanup bottom bar handling

---------

Signed-off-by: Sudheer Puthana <Sud-Puth@users.noreply.github.com>
Co-authored-by: bwees <brandonwees@gmail.com>
2025-09-04 08:50:38 -05:00
Yaros
0ac49b00ee feat(mobile): scrubber haptics (beta timeline) (#21351)
* feat(mobile): scrubber haptics beta timeline

* changed haptic to selectionClick
2025-09-04 08:47:16 -05:00
Mert
e427778a96 fix(mobile): pause image loading on inactive state (#21543)
* pause image loading

* make thumbhashes wait too
2025-09-04 08:40:38 -05:00
Pedro Simão
b82e29fbb4 feat(mobile): add to albums from existing albums (#21554)
* feat(mobile): add to albums from existing albums

* formatted files

* used the new t() method for translation

* removed unused import
2025-09-04 08:39:10 -05:00
shenlong
ff19aea4ac fix: keyboard not dismissed in places page (#21583)
Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-09-04 08:38:44 -05:00
Jason Rasmussen
28179a3a1d feat: audit cleanup (#21567) 2025-09-03 22:50:27 +00:00
Min Idzelis
af1e18d07e fix: docker upload_location perm fix for dev (#21501) 2025-09-03 18:27:30 +01:00
shenlong
270a0ff986 chore: log name and createdAt of asset on hash failures (#21546)
* chore: log name and createdAt of asset on hash failures

* add album name to hash failure logs

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-09-03 09:58:03 -05:00
shenlong
9d3f10372d refactor: simplify background worker (#21558)
* chore: log hash starting

* chore: android - bump the min worker delay

* remove local sync only task and always enqueue background workers

---------

Co-authored-by: shenlong-tanwen <139912620+shalong-tanwen@users.noreply.github.com>
2025-09-03 09:57:30 -05:00
bo0tzz
2f1385a236 chore: request LLM disclosure in PR template (#21553)
Suggestions for different wording/placeholder are welcome
2025-09-03 09:11:24 -05:00
117 changed files with 2432 additions and 1533 deletions

.github/.nvmrc vendored

@@ -1 +0,0 @@
-22.18.0


@@ -1,8 +1,5 @@
 {
-  "scripts": {
-    "format": "prettier --check .",
-    "format:fix": "prettier --write ."
-  },
+  "name": "github",
   "devDependencies": {
     "prettier": "^3.5.3"
   }
 }


@@ -34,3 +34,7 @@ The `/api/something` endpoint is now `/api/something-else`
 - [ ] I have followed naming conventions/patterns in the surrounding code
 - [ ] All code in `src/services/` uses repositories implementations for database calls, filesystem operations, etc.
 - [ ] All code in `src/repositories/` is pretty basic/simple and does not have any immich specific logic (that belongs in `src/services/`)
+
+## Please describe to which degree, if any, an LLM was used in creating this pull request.
+
+...


@@ -33,24 +33,20 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './cli/.nvmrc'
-          registry-url: 'https://registry.npmjs.org'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Setup typescript-sdk
-        run: pnpm install && pnpm run build
-        working-directory: ./open-api/typescript-sdk
-      - run: pnpm install --frozen-lockfile
-      - run: pnpm build
-      - run: pnpm publish
+        run: mise run sdk:install && mise run sdk:build
+      - name: Install dependencies
+        run: mise run cli:install
+      - name: Run build
+        run: mise run cli:build
+      - name: Publish package
+        run: pnpm publish
         if: ${{ github.event_name == 'release' }}
         env:
           NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}


@@ -55,24 +55,17 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './docs/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run install
-        run: pnpm install
+        run: mise run docs:install
       - name: Check formatting
-        run: pnpm format
+        run: mise run docs:format-fix
       - name: Run build
-        run: pnpm build
+        run: mise run docs:build
       - name: Upload build output
         uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2


@@ -28,15 +28,11 @@ jobs:
           token: ${{ steps.generate-token.outputs.token }}
           persist-credentials: true
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Fix formatting
-        run: make install-all && make format-all
+        run: mise run server:format-fix && mise run web:format-fix && mise run docs:format-fix
       - name: Commit and push
         uses: EndBug/add-and-commit@a94899bca583c204427a224a7af87c02f9b325d5 # v9.1.4


@@ -46,15 +46,8 @@ jobs:
       - name: Install uv
         uses: astral-sh/setup-uv@d4b2f3b6ecc6e67c4457f6d3e41ec42d3d0fcb86 # v5.4.2
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Bump version
         env:


@@ -20,20 +20,15 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      # Setup .npmrc file to publish to npm
-      - uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './open-api/typescript-sdk/.nvmrc'
-          registry-url: 'https://registry.npmjs.org'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Install deps
-        run: pnpm install --frozen-lockfile
+        run: mise run sdk:install
       - name: Build
-        run: pnpm build
+        run: mise run sdk:build
       - name: Publish
         run: pnpm publish
         env:


@@ -72,27 +72,21 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run package manager install
-        run: pnpm install
+        run: mise run server:install
       - name: Run linter
-        run: pnpm lint
+        run: mise run server:lint
         if: ${{ !cancelled() }}
       - name: Run formatter
-        run: pnpm format
+        run: mise run server:format
         if: ${{ !cancelled() }}
       - name: Run tsc
-        run: pnpm check
+        run: mise run server:check
         if: ${{ !cancelled() }}
       - name: Run small tests & coverage
-        run: pnpm test
+        run: mise run server:test
         if: ${{ !cancelled() }}
   cli-unit-tests:
     name: Unit Test CLI
@@ -109,30 +103,23 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './cli/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Setup typescript-sdk
-        run: pnpm install && pnpm run build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
       - name: Install deps
-        run: pnpm install
+        run: mise run cli:install
       - name: Run linter
-        run: pnpm lint
+        run: mise run cli:lint
         if: ${{ !cancelled() }}
       - name: Run formatter
-        run: pnpm format
+        run: mise run cli:format
         if: ${{ !cancelled() }}
       - name: Run tsc
-        run: pnpm check
+        run: mise run cli:check
         if: ${{ !cancelled() }}
       - name: Run unit tests & coverage
-        run: pnpm test
+        run: mise run cli:test
         if: ${{ !cancelled() }}
   cli-unit-tests-win:
     name: Unit Test CLI (Windows)
@@ -149,25 +136,18 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './cli/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
       - name: Install deps
-        run: pnpm install --frozen-lockfile
+        run: mise run cli:install
       # Skip linter & formatter in Windows test.
       - name: Run tsc
-        run: pnpm check
+        run: mise run cli:check
         if: ${{ !cancelled() }}
       - name: Run unit tests & coverage
-        run: pnpm test
+        run: mise run cli:test
         if: ${{ !cancelled() }}
   web-lint:
     name: Lint Web
@@ -176,35 +156,25 @@ jobs:
     runs-on: mich
     permissions:
       contents: read
-    defaults:
-      run:
-        working-directory: ./web
     steps:
       - name: Checkout code
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './web/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
-      - name: Run pnpm install
-        run: pnpm rebuild && pnpm install --frozen-lockfile
+        run: mise run sdk:install && mise run sdk:build
+      - name: Run install
+        run: mise run web:install
       - name: Run linter
-        run: pnpm lint:p
+        run: mise run web:lint-p
         if: ${{ !cancelled() }}
       - name: Run formatter
-        run: pnpm format
+        run: mise run web:format
         if: ${{ !cancelled() }}
       - name: Run svelte checks
-        run: pnpm check:svelte
+        run: mise run web:check-svelte
         if: ${{ !cancelled() }}
   web-unit-tests:
     name: Test Web
@@ -213,32 +183,22 @@ jobs:
     runs-on: ubuntu-latest
     permissions:
       contents: read
-    defaults:
-      run:
-        working-directory: ./web
     steps:
       - name: Checkout code
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './web/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
       - name: Run npm install
-        run: pnpm install --frozen-lockfile
+        run: mise run web:install
       - name: Run tsc
-        run: pnpm check:typescript
+        run: mise run web:check
         if: ${{ !cancelled() }}
       - name: Run unit tests & coverage
-        run: pnpm test
+        run: mise run web:test
         if: ${{ !cancelled() }}
   i18n-tests:
     name: Test i18n
@@ -252,18 +212,12 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './web/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Install dependencies
-        run: pnpm --filter=immich-web install --frozen-lockfile
+        run: mise run web:install
       - name: Format
-        run: pnpm --filter=immich-web format:i18n
+        run: mise run i18n:format-fix
       - name: Find file changes
         uses: tj-actions/verify-changed-files@a1c6acee9df209257a246f2cc6ae8cb6581c1edf # v20.0.4
         id: verify-changed-files
@@ -293,29 +247,22 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './e2e/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
         if: ${{ !cancelled() }}
       - name: Install dependencies
-        run: pnpm install --frozen-lockfile
+        run: mise run e2e:install
         if: ${{ !cancelled() }}
       - name: Run linter
-        run: pnpm lint
+        run: mise run e2e:lint
         if: ${{ !cancelled() }}
       - name: Run formatter
-        run: pnpm format
+        run: mise run e2e:format
         if: ${{ !cancelled() }}
       - name: Run tsc
-        run: pnpm check
+        run: mise run e2e:check
         if: ${{ !cancelled() }}
   server-medium-tests:
     name: Medium Tests (Server)
@@ -324,26 +271,17 @@ jobs:
     runs-on: ubuntu-latest
     permissions:
       contents: read
-    defaults:
-      run:
-        working-directory: ./server
     steps:
       - name: Checkout code
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
-      - name: Run pnpm install
-        run: SHARP_IGNORE_GLOBAL_LIBVIPS=true pnpm install --frozen-lockfile
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
+      - name: Run install
+        run: SHARP_IGNORE_GLOBAL_LIBVIPS=true mise run server:install
       - name: Run medium tests
-        run: pnpm test:medium
+        run: mise run server:test-medium
         if: ${{ !cancelled() }}
   e2e-tests-server-cli:
     name: End-to-End Tests (Server & CLI)
@@ -352,9 +290,6 @@ jobs:
     runs-on: ${{ matrix.runner }}
     permissions:
       contents: read
-    defaults:
-      run:
-        working-directory: ./e2e
     strategy:
       matrix:
         runner: [ubuntu-latest, ubuntu-24.04-arm]
@@ -364,34 +299,25 @@ jobs:
         with:
           persist-credentials: false
           submodules: 'recursive'
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './e2e/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
         if: ${{ !cancelled() }}
       - name: Run setup web
-        run: pnpm install --frozen-lockfile && pnpm exec svelte-kit sync
-        working-directory: ./web
+        run: mise run web:install && mise run web:svelte-kit-sync
         if: ${{ !cancelled() }}
       - name: Run setup cli
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./cli
+        run: mise run cli:install && mise run cli:build
         if: ${{ !cancelled() }}
       - name: Install dependencies
-        run: pnpm install --frozen-lockfile
+        run: mise run e2e:install
         if: ${{ !cancelled() }}
       - name: Docker build
-        run: docker compose build
+        run: docker compose -f e2e/docker-compose.yml build
         if: ${{ !cancelled() }}
       - name: Run e2e tests (api & cli)
-        run: pnpm test
+        run: mise run e2e:test
         if: ${{ !cancelled() }}
   e2e-tests-web:
     name: End-to-End Tests (Web)
@@ -400,9 +326,6 @@ jobs:
     runs-on: ${{ matrix.runner }}
     permissions:
       contents: read
-    defaults:
-      run:
-        working-directory: ./e2e
     strategy:
       matrix:
         runner: [ubuntu-latest, ubuntu-24.04-arm]
@@ -412,29 +335,26 @@ jobs:
         with:
           persist-credentials: false
           submodules: 'recursive'
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './e2e/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Run setup typescript-sdk
-        run: pnpm install --frozen-lockfile && pnpm build
-        working-directory: ./open-api/typescript-sdk
+        run: mise run sdk:install && mise run sdk:build
         if: ${{ !cancelled() }}
-      - name: Install dependencies
-        run: pnpm install --frozen-lockfile
+      - name: Run setup web
+        run: mise run web:install && mise run web:svelte-kit-sync
+        if: ${{ !cancelled() }}
+      - name: Run setup cli
+        run: mise run cli:install && mise run cli:build
         if: ${{ !cancelled() }}
       - name: Install Playwright Browsers
         run: npx playwright install chromium --only-shell
+        working-directory: e2e
         if: ${{ !cancelled() }}
       - name: Docker build
-        run: docker compose build
+        run: docker compose -f e2e/docker-compose.yml build
         if: ${{ !cancelled() }}
       - name: Run e2e tests (web)
-        run: npx playwright test
+        run: mise run e2e:test-web
         if: ${{ !cancelled() }}
   success-check-e2e:
     name: End-to-End Tests Success
@@ -519,18 +439,12 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './.github/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
-      - name: Run pnpm install
-        run: pnpm install --frozen-lockfile
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
+      - name: Run install
+        run: mise run github:install
       - name: Run formatter
-        run: pnpm format
+        run: mise run github:format
         if: ${{ !cancelled() }}
   shellcheck:
     name: ShellCheck
@@ -556,18 +470,12 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Install server dependencies
-        run: SHARP_IGNORE_GLOBAL_LIBVIPS=true pnpm --filter immich install --frozen-lockfile
+        run: SHARP_IGNORE_GLOBAL_LIBVIPS=true mise run server:install
       - name: Build the app
-        run: pnpm --filter immich build
+        run: mise run server:build
       - name: Run API generation
         run: ./bin/generate-open-api.sh
         working-directory: open-api
@@ -611,25 +519,19 @@ jobs:
         uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
         with:
           persist-credentials: false
-      - name: Setup pnpm
-        uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
-      - name: Setup Node
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
-        with:
-          node-version-file: './server/.nvmrc'
-          cache: 'pnpm'
-          cache-dependency-path: '**/pnpm-lock.yaml'
+      - name: Setup mise
+        uses: jdx/mise-action@5ac50f778e26fac95da98d50503682459e86d566 # v3.2.0
       - name: Install server dependencies
         run: SHARP_IGNORE_GLOBAL_LIBVIPS=true pnpm install --frozen-lockfile
       - name: Build the app
-        run: pnpm build
+        run: mise run server:build
       - name: Run existing migrations
-        run: pnpm migrations:run
+        run: mise run server:migrations run
       - name: Test npm run schema:reset command works
-        run: pnpm schema:reset
+        run: mise run server:schema-reset
       - name: Generate new migrations
         continue-on-error: true
-        run: pnpm migrations:generate src/TestMigration
+        run: mise run server:migrations generate src/TestMigration
       - name: Find file changes
         uses: tj-actions/verify-changed-files@a1c6acee9df209257a246f2cc6ae8cb6581c1edf # v20.0.4
         id: verify-changed-files
@@ -646,7 +548,7 @@ jobs:
           cat ./src/*-TestMigration.ts
           exit 1
       - name: Run SQL generation
-        run: pnpm sync:sql
+        run: mise run server:sql
         env:
           DB_URL: postgres://postgres:postgres@localhost:5432/immich
       - name: Find file changes


@@ -10,14 +10,14 @@ dev-update: prepare-volumes
 dev-scale: prepare-volumes
 	@trap 'make dev-down' EXIT; COMPOSE_BAKE=true docker compose -f ./docker/docker-compose.dev.yml up --build -V --scale immich-server=3 --remove-orphans
-dev-docs: prepare-volumes
+dev-docs:
 	npm --prefix docs run start
 .PHONY: e2e
-e2e: prepare-volumes
+e2e:
 	@trap 'make e2e-down' EXIT; COMPOSE_BAKE=true docker compose -f ./e2e/docker-compose.yml up --remove-orphans
-e2e-update: prepare-volumes
+e2e-update:
 	@trap 'make e2e-down' EXIT; COMPOSE_BAKE=true docker compose -f ./e2e/docker-compose.yml up --build -V --remove-orphans
 e2e-down:
@@ -73,6 +73,8 @@ define safe_chown
 	if chown $(2) $(or $(UID),1000):$(or $(GID),1000) "$(1)" 2>/dev/null; then \
 		true; \
 	else \
+		STATUS=$$?; echo "Exit code: $$STATUS $(1)"; \
+		echo "$$STATUS $(1)"; \
 		echo "Permission denied when changing owner of volumes and upload location. Try running 'sudo make prepare-volumes' first."; \
 		exit 1; \
 	fi;
@@ -83,11 +85,13 @@ prepare-volumes:
 	@$(foreach dir,$(VOLUME_DIRS),$(call safe_chown,$(dir),-R))
 ifneq ($(UPLOAD_LOCATION),)
 ifeq ($(filter /%,$(UPLOAD_LOCATION)),)
-	@mkdir -p "docker/$(UPLOAD_LOCATION)"
+	@mkdir -p "docker/$(UPLOAD_LOCATION)/photos/upload"
 	@$(call safe_chown,docker/$(UPLOAD_LOCATION),)
+	@$(call safe_chown,docker/$(UPLOAD_LOCATION)/photos,-R)
 else
-	@mkdir -p "$(UPLOAD_LOCATION)"
+	@mkdir -p "$(UPLOAD_LOCATION)/photos/upload"
 	@$(call safe_chown,$(UPLOAD_LOCATION),)
+	@$(call safe_chown,$(UPLOAD_LOCATION)/photos,-R)
 endif
 endif
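The two lines added to `safe_chown` follow a common shell pattern: read `$?` into a variable immediately after the failing command, before any other command can overwrite it. A minimal standalone sketch of that pattern (not the Makefile itself; `false` stands in for the `chown` call that may fail):

```shell
# Capture the exit status of a failing command right away, then report it.
if false; then            # stand-in for the chown call that may fail
  echo "ok"
else
  STATUS=$?               # must be read immediately after the failure
  echo "Exit code: $STATUS"
fi
```

Any intervening command (even `echo`) would reset `$?`, which is why the capture happens on the first statement of the `else` branch.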


@@ -1 +0,0 @@
-22.18.0


@@ -42,17 +42,6 @@
 "vitest-fetch-mock": "^0.4.0",
 "yaml": "^2.3.1"
 },
-"scripts": {
-"build": "vite build",
-"lint": "eslint \"src/**/*.ts\" --max-warnings 0",
-"lint:fix": "npm run lint -- --fix",
-"prepack": "npm run build",
-"test": "vitest",
-"test:cov": "vitest --coverage",
-"format": "prettier --check .",
-"format:fix": "prettier --write .",
-"check": "tsc --noEmit"
-},
 "repository": {
 "type": "git",
 "url": "git+https://github.com/immich-app/immich.git",
@@ -67,8 +56,5 @@
 "fastq": "^1.17.1",
 "lodash-es": "^4.17.21",
 "micromatch": "^4.0.8"
-},
-"volta": {
-"node": "22.18.0"
 }
 }


@@ -1 +0,0 @@
-22.18.0


@@ -5,7 +5,7 @@ After making any changes in the `server/src/schema`, a database migration need t
 1. Run the command
    ```bash
-   pnpm run migrations:generate <migration-name>
+   mise run server:migrations generate <migration-name>
    ```
 2. Check if the migration file makes sense.


@@ -8,11 +8,11 @@ When contributing code through a pull request, please check the following:
 ## Web Checks
-- [ ] `pnpm run lint` (linting via ESLint)
-- [ ] `pnpm run format` (formatting via Prettier)
-- [ ] `pnpm run check:svelte` (Type checking via SvelteKit)
-- [ ] `pnpm run check:typescript` (check typescript)
-- [ ] `pnpm test` (unit tests)
+- [ ] `mise run web:lint` (linting via ESLint)
+- [ ] `mise run web:format` (formatting via Prettier)
+- [ ] `mise run web:check` (check typescript)
+- [ ] `mise run web:check-svelte` (Type checking via SvelteKit)
+- [ ] `mise run web:test` (unit tests)
 ## Documentation
@@ -25,17 +25,18 @@ Run all web checks with `pnpm run check:all`
 ## Server Checks
-- [ ] `pnpm run lint` (linting via ESLint)
-- [ ] `pnpm run format` (formatting via Prettier)
-- [ ] `pnpm run check` (Type checking via `tsc`)
-- [ ] `pnpm test` (unit tests)
+- [ ] `mise run server:lint` (linting via ESLint)
+- [ ] `mise run server:format` (formatting via Prettier)
+- [ ] `mise run server:check` (type checking via `tsc`)
+- [ ] `mise run server:test` (unit tests)
+- [ ] `mise run server:test-medium` (medium tests)
 :::tip AIO
 Run all server checks with `pnpm run check:all`
 :::
 :::info Auto Fix
-You can use `pnpm run __:fix` to potentially correct some issues automatically for `pnpm run format` and `lint`.
+You can use `mise run server:lint-fix` and `mise run server:format-fix` to potentially correct some issues automatically.
 :::
 ## Mobile Checks


@@ -2,20 +2,6 @@
 "name": "documentation",
 "version": "0.0.0",
 "private": true,
-"scripts": {
-"docusaurus": "docusaurus",
-"format": "prettier --check .",
-"format:fix": "prettier --write .",
-"start": "docusaurus start --port 3005",
-"copy:openapi": "jq -c < ../open-api/immich-openapi-specs.json > ./static/openapi.json || exit 0",
-"build": "npm run copy:openapi && docusaurus build",
-"swizzle": "docusaurus swizzle",
-"deploy": "docusaurus deploy",
-"clear": "docusaurus clear",
-"serve": "docusaurus serve",
-"write-translations": "docusaurus write-translations",
-"write-heading-ids": "docusaurus write-heading-ids"
-},
 "dependencies": {
 "@docusaurus/core": "~3.8.0",
 "@docusaurus/preset-classic": "~3.8.0",
@@ -58,8 +44,5 @@
 },
 "engines": {
 "node": ">=20"
-},
-"volta": {
-"node": "22.18.0"
 }
 }


@@ -1 +0,0 @@
-22.18.0


@@ -4,17 +4,6 @@
 "description": "",
 "main": "index.js",
 "type": "module",
-"scripts": {
-"test": "vitest --run",
-"test:watch": "vitest",
-"test:web": "npx playwright test",
-"start:web": "npx playwright test --ui",
-"format": "prettier --check .",
-"format:fix": "prettier --write .",
-"lint": "eslint \"src/**/*.ts\" --max-warnings 0",
-"lint:fix": "npm run lint -- --fix",
-"check": "tsc --noEmit"
-},
 "keywords": [],
 "author": "",
 "license": "GNU Affero General Public License version 3",
@@ -52,8 +41,5 @@
 "typescript-eslint": "^8.28.0",
 "utimes": "^5.2.1",
 "vitest": "^3.0.0"
-},
-"volta": {
-"node": "22.18.0"
 }
 }


@@ -1557,6 +1557,7 @@
 "purchase_server_description_2": "Supporter status",
 "purchase_server_title": "Server",
 "purchase_settings_server_activated": "The server product key is managed by the admin",
+"query_asset_id": "Query Asset ID",
 "queue_status": "Queuing {count}/{total}",
 "rating": "Star rating",
 "rating_clear": "Clear rating",
@@ -1735,7 +1736,7 @@
 "select_user_for_sharing_page_err_album": "Failed to create album",
 "selected": "Selected",
 "selected_count": "{count, plural, other {# selected}}",
-"selected_gps_coordinates": "selected gps coordinates",
+"selected_gps_coordinates": "Selected GPS Coordinates",
 "send_message": "Send message",
 "send_welcome_email": "Send welcome email",
 "server_endpoint": "Server Endpoint",
@@ -2077,6 +2078,7 @@
 "view_next_asset": "View next asset",
 "view_previous_asset": "View previous asset",
 "view_qr_code": "View QR code",
+"view_similar_photos": "View similar photos",
 "view_stack": "View Stack",
 "view_user": "View User",
 "viewer_remove_from_stack": "Remove from Stack",

mise.lock (new file, 34 lines)

@@ -0,0 +1,34 @@
[tools.dart]
version = "3.8.2"
backend = "asdf:dart"
[tools.flutter]
version = "3.32.8-stable"
backend = "asdf:flutter"
[tools."github:CQLabs/homebrew-dcm"]
version = "1.31.4"
backend = "github:CQLabs/homebrew-dcm"
[tools."github:CQLabs/homebrew-dcm".platforms.linux-x64]
checksum = "blake3:e9df5b765df327e1248fccf2c6165a89d632a065667f99c01765bf3047b94955"
size = 8821083
url = "https://github.com/CQLabs/homebrew-dcm/releases/download/1.31.4/dcm-linux-x64-release.zip"
[tools.node]
version = "22.18.0"
backend = "core:node"
[tools.node.platforms.linux-x64]
checksum = "sha256:a2e703725d8683be86bb5da967bf8272f4518bdaf10f21389e2b2c9eaeae8c8a"
size = 54824343
url = "https://nodejs.org/dist/v22.18.0/node-v22.18.0-linux-x64.tar.gz"
[tools.pnpm]
version = "10.14.0"
backend = "aqua:pnpm/pnpm"
[tools.pnpm.platforms.linux-x64]
checksum = "blake3:13dfa46b7173d3cad3bad60a756a492ecf0bce48b23eb9f793e7ccec5a09b46d"
size = 66231525
url = "https://github.com/pnpm/pnpm/releases/download/v10.14.0/pnpm-linux-x64"
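Each mise.lock entry pins a tool to an exact URL, size, and checksum. A hypothetical standalone sketch of the integrity check such a pinned digest enables: recompute the downloaded artifact's hash and compare it to the lockfile value before trusting it (the file contents and hash below are illustrative, not a real mise artifact):

```shell
# Verify a downloaded artifact against a pinned sha256 checksum.
tmpfile=$(mktemp)
printf 'hello' > "$tmpfile"   # stand-in for the downloaded artifact
expected="2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
actual=$(sha256sum "$tmpfile" | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
  echo "checksum ok"
else
  echo "checksum mismatch" >&2
  exit 1
fi
rm -f "$tmpfile"
```

With `lockfile = true` in the settings below, every contributor and CI runner resolves the same bits, not merely the same version string.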

mise.toml (new file, 312 lines)

@@ -0,0 +1,312 @@
[tools]
node = "22.18.0"
flutter = "3.32.8"
pnpm = "10.14.0"
dart = "3.8.2"
[tools."github:CQLabs/homebrew-dcm"]
version = "1.31.4"
bin = "dcm"
postinstall = "chmod +x $MISE_TOOL_INSTALL_PATH/dcm"
[settings]
experimental = true
lockfile = true
pin = true
# .github
[tasks."github:install"]
run = "pnpm install --filter github --frozen-lockfile"
[tasks."github:format"]
env._.path = "./.github/node_modules/.bin"
dir = ".github"
run = "prettier --check ."
[tasks."github:format-fix"]
env._.path = "./.github/node_modules/.bin"
dir = ".github"
run = "prettier --write ."
# @immich/cli
[tasks."cli:install"]
run = "pnpm install --filter @immich/cli --frozen-lockfile"
[tasks."cli:build"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "vite build"
[tasks."cli:test"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "vite"
[tasks."cli:lint"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "eslint \"src/**/*.ts\" --max-warnings 0"
[tasks."cli:lint-fix"]
run = "mise run cli:lint --fix"
[tasks."cli:format"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "prettier --check ."
[tasks."cli:format-fix"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "prettier --write ."
[tasks."cli:check"]
env._.path = "./cli/node_modules/.bin"
dir = "cli"
run = "tsc --noEmit"
# @immich/sdk
[tasks."sdk:install"]
run = "pnpm install --filter @immich/sdk --frozen-lockfile"
[tasks."sdk:build"]
env._.path = "./open-api/typescript-sdk/node_modules/.bin"
dir = "./open-api/typescript-sdk"
run = "tsc"
# docs
[tasks."docs:install"]
run = "pnpm install --filter documentation --frozen-lockfile"
[tasks."docs:start"]
env._.path = "./docs/node_modules/.bin"
dir = "docs"
run = "docusaurus --port 3005"
[tasks."docs:build"]
env._.path = "./docs/node_modules/.bin"
dir = "docs"
run = [
"jq -c < ../open-api/immich-openapi-specs.json > ./static/openapi.json || exit 0",
"docusaurus build",
]
[tasks."docs:preview"]
env._.path = "./docs/node_modules/.bin"
dir = "docs"
run = "docusaurus serve"
[tasks."docs:format"]
env._.path = "./docs/node_modules/.bin"
dir = "docs"
run = "prettier --check ."
[tasks."docs:format-fix"]
env._.path = "./docs/node_modules/.bin"
dir = "docs"
run = "prettier --write ."
# e2e
[tasks."e2e:install"]
run = "pnpm install --filter immich-e2e --frozen-lockfile"
[tasks."e2e:test"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "vitest --run"
[tasks."e2e:test-web"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "playwright test"
[tasks."e2e:format"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "prettier --check ."
[tasks."e2e:format-fix"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "prettier --write ."
[tasks."e2e:lint"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "eslint \"src/**/*.ts\" --max-warnings 0"
[tasks."e2e:lint-fix"]
run = "mise run e2e:lint --fix"
[tasks."e2e:check"]
env._.path = "./e2e/node_modules/.bin"
dir = "e2e"
run = "tsc --noEmit"
# i18n
[tasks."i18n:format"]
run = "mise run i18n:format-fix"
[tasks."i18n:format-fix"]
run = "pnpm dlx sort-json ./i18n/*.json"
# server
[tasks."server:install"]
run = "pnpm install --filter immich --frozen-lockfile"
[tasks."server:build"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "nest build"
[tasks."server:test"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "vitest --config test/vitest.config.mjs"
[tasks."server:test-medium"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "vitest --config test/vitest.config.medium.mjs"
[tasks."server:format"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "prettier --check ."
[tasks."server:format-fix"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "prettier --write ."
[tasks."server:lint"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "eslint \"src/**/*.ts\" \"test/**/*.ts\" --max-warnings 0"
[tasks."server:lint-fix"]
run = "mise run server:lint --fix"
[tasks."server:check"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "tsc --noEmit"
[tasks."server:sql"]
dir = "server"
run = "node ./dist/bin/sync-open-api.js"
[tasks."server:open-api"]
dir = "server"
run = "node ./dist/bin/sync-open-api.js"
[tasks."server:migrations"]
dir = "server"
run = "node ./dist/bin/migrations.js"
description = "Run database migration commands (create, generate, run, debug, or query)"
[tasks."server:schema-drop"]
run = "mise run server:migrations query 'DROP schema public cascade; CREATE schema public;'"
[tasks."server:schema-reset"]
run = "mise run server:schema-drop && mise run server:migrations run"
[tasks."server:email-dev"]
env._.path = "./server/node_modules/.bin"
dir = "server"
run = "email dev -p 3050 --dir src/emails"
[tasks."server:checklist"]
run = [
"mise run server:install",
"mise run server:format",
"mise run server:lint",
"mise run server:check",
"mise run server:test-medium --run",
"mise run server:test --run",
]
# web
[tasks."web:install"]
run = "pnpm install --filter immich-web --frozen-lockfile"
[tasks."web:svelte-kit-sync"]
env._.path = "./web/node_modules/.bin"
dir = "web"
run = "svelte-kit sync"
[tasks."web:build"]
env._.path = "./web/node_modules/.bin"
dir = "web"
run = "vite build"
[tasks."web:build-stats"]
env.BUILD_STATS = "true"
env._.path = "./web/node_modules/.bin"
dir = "web"
run = "vite build"
[tasks."web:preview"]
env._.path = "./web/node_modules/.bin"
dir = "web"
run = "vite preview"
[tasks."web:start"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "vite dev --host 0.0.0.0 --port 3000"
[tasks."web:test"]
depends = "web:svelte-kit-sync"
env._.path = "web/node_modules/.bin"
dir = "web"
run = "vitest"
[tasks."web:format"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "prettier --check ."
[tasks."web:format-fix"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "prettier --write ."
[tasks."web:lint"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "eslint . --max-warnings 0"
[tasks."web:lint-p"]
env._.path = "web/node_modules/.bin"
dir = "web"
run = "eslint-p . --max-warnings 0 --concurrency=4"
[tasks."web:lint-fix"]
run = "mise run web:lint --fix"
[tasks."web:check"]
depends = "web:svelte-kit-sync"
env._.path = "web/node_modules/.bin"
dir = "web"
run = "tsc --noEmit"
[tasks."web:check-svelte"]
depends = "web:svelte-kit-sync"
env._.path = "web/node_modules/.bin"
dir = "web"
run = "svelte-check --no-tsconfig --fail-on-warnings --compiler-warnings 'reactive_declaration_non_reactive_property:ignore' --ignore src/lib/components/photos-page/asset-grid.svelte"
[tasks."web:checklist"]
run = [
"mise run web:install",
"mise run web:format",
"mise run web:check",
"mise run web:test --run",
"mise run web:lint",
]
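Most tasks above rely on `env._.path` to make project-local binaries callable by their bare names. A hypothetical sketch of what `env._.path = "./web/node_modules/.bin"` amounts to for the duration of a task: the directory is prepended to `PATH`, so `prettier`, `eslint`, and `vite` resolve to the project's own copies instead of any global install:

```shell
# Prepend a project-local bin directory to PATH for tool resolution.
PROJECT_BIN="./web/node_modules/.bin"
PATH="$PROJECT_BIN:$PATH"
first_entry=${PATH%%:*}   # the first PATH entry wins command lookup
echo "$first_entry"
```

This is why the tasks can invoke `prettier --check .` directly without `npx` or a `node_modules/.bin/` prefix.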


@@ -61,9 +61,8 @@ private open class BackgroundWorkerPigeonCodec : StandardMessageCodec() {
 /** Generated interface from Pigeon that represents a handler of messages from Flutter. */
 interface BackgroundWorkerFgHostApi {
-fun enableSyncWorker()
-fun enableUploadWorker()
-fun disableUploadWorker()
+fun enable()
+fun disable()
 companion object {
 /** The codec used by BackgroundWorkerFgHostApi. */
@@ -75,11 +74,11 @@ interface BackgroundWorkerFgHostApi {
 fun setUp(binaryMessenger: BinaryMessenger, api: BackgroundWorkerFgHostApi?, messageChannelSuffix: String = "") {
 val separatedMessageChannelSuffix = if (messageChannelSuffix.isNotEmpty()) ".$messageChannelSuffix" else ""
 run {
-val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableSyncWorker$separatedMessageChannelSuffix", codec)
+val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enable$separatedMessageChannelSuffix", codec)
 if (api != null) {
 channel.setMessageHandler { _, reply ->
 val wrapped: List<Any?> = try {
-api.enableSyncWorker()
+api.enable()
 listOf(null)
 } catch (exception: Throwable) {
 BackgroundWorkerPigeonUtils.wrapError(exception)
@@ -91,27 +90,11 @@ interface BackgroundWorkerFgHostApi {
 }
 }
 run {
-val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableUploadWorker$separatedMessageChannelSuffix", codec)
+val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disable$separatedMessageChannelSuffix", codec)
 if (api != null) {
 channel.setMessageHandler { _, reply ->
 val wrapped: List<Any?> = try {
-api.enableUploadWorker()
+api.disable()
-listOf(null)
-} catch (exception: Throwable) {
-BackgroundWorkerPigeonUtils.wrapError(exception)
-}
-reply.reply(wrapped)
-}
-} else {
-channel.setMessageHandler(null)
-}
-}
-run {
-val channel = BasicMessageChannel<Any?>(binaryMessenger, "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disableUploadWorker$separatedMessageChannelSuffix", codec)
-if (api != null) {
-channel.setMessageHandler { _, reply ->
-val wrapped: List<Any?> = try {
-api.disableUploadWorker()
 listOf(null)
 } catch (exception: Throwable) {
 BackgroundWorkerPigeonUtils.wrapError(exception)
@@ -182,23 +165,6 @@ class BackgroundWorkerFlutterApi(private val binaryMessenger: BinaryMessenger, p
 BackgroundWorkerPigeonCodec()
 }
 }
-fun onLocalSync(maxSecondsArg: Long?, callback: (Result<Unit>) -> Unit)
-{
-val separatedMessageChannelSuffix = if (messageChannelSuffix.isNotEmpty()) ".$messageChannelSuffix" else ""
-val channelName = "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onLocalSync$separatedMessageChannelSuffix"
-val channel = BasicMessageChannel<Any?>(binaryMessenger, channelName, codec)
-channel.send(listOf(maxSecondsArg)) {
-if (it is List<*>) {
-if (it.size > 1) {
-callback(Result.failure(FlutterError(it[0] as String, it[1] as String, it[2] as String?)))
-} else {
-callback(Result.success(Unit))
-}
-} else {
-callback(Result.failure(BackgroundWorkerPigeonUtils.createConnectionError(channelName)))
-}
-}
-}
 fun onIosUpload(isRefreshArg: Boolean, maxSecondsArg: Long?, callback: (Result<Unit>) -> Unit)
 {
 val separatedMessageChannelSuffix = if (messageChannelSuffix.isNotEmpty()) ".$messageChannelSuffix" else ""


@@ -16,11 +16,6 @@ import io.flutter.embedding.engine.loader.FlutterLoader
 private const val TAG = "BackgroundWorker"
-enum class BackgroundTaskType {
-LOCAL_SYNC,
-UPLOAD,
-}
 class BackgroundWorker(context: Context, params: WorkerParameters) :
 ListenableWorker(context, params), BackgroundWorkerBgHostApi {
 private val ctx: Context = context.applicationContext
@@ -84,13 +79,7 @@ class BackgroundWorker(context: Context, params: WorkerParameters) :
 * This method acts as a bridge between the native Android background task system and Flutter.
 */
 override fun onInitialized() {
-val taskTypeIndex = inputData.getInt(BackgroundWorkerApiImpl.WORKER_DATA_TASK_TYPE, 0)
-val taskType = BackgroundTaskType.entries[taskTypeIndex]
-when (taskType) {
-BackgroundTaskType.LOCAL_SYNC -> flutterApi?.onLocalSync(null) { handleHostResult(it) }
-BackgroundTaskType.UPLOAD -> flutterApi?.onAndroidUpload { handleHostResult(it) }
-}
+flutterApi?.onAndroidUpload { handleHostResult(it) }
 }
 override fun close() {
@@ -141,8 +130,10 @@ class BackgroundWorker(context: Context, params: WorkerParameters) :
 * - Parameter success: Indicates whether the background task completed successfully
 */
 private fun complete(success: Result) {
+Log.d(TAG, "About to complete BackupWorker with result: $success")
 isComplete = true
 engine?.destroy()
+engine = null
 flutterApi = null
 completionHandler.set(success)
 }


@@ -3,10 +3,8 @@ package app.alextran.immich.background
 import android.content.Context
 import android.provider.MediaStore
 import android.util.Log
-import androidx.core.content.edit
 import androidx.work.BackoffPolicy
 import androidx.work.Constraints
-import androidx.work.Data
 import androidx.work.ExistingWorkPolicy
 import androidx.work.OneTimeWorkRequest
 import androidx.work.WorkManager
@@ -16,18 +14,13 @@ private const val TAG = "BackgroundUploadImpl"
 class BackgroundWorkerApiImpl(context: Context) : BackgroundWorkerFgHostApi {
 private val ctx: Context = context.applicationContext
-override fun enableSyncWorker() {
+override fun enable() {
 enqueueMediaObserver(ctx)
-Log.i(TAG, "Scheduled media observer")
 }
-override fun enableUploadWorker() {
-updateUploadEnabled(ctx, true)
-Log.i(TAG, "Scheduled background upload tasks")
-}
-override fun disableUploadWorker() {
-updateUploadEnabled(ctx, false)
+override fun disable() {
+WorkManager.getInstance(ctx).cancelUniqueWork(OBSERVER_WORKER_NAME)
 WorkManager.getInstance(ctx).cancelUniqueWork(BACKGROUND_WORKER_NAME)
 Log.i(TAG, "Cancelled background upload tasks")
 }
@@ -36,25 +29,14 @@ class BackgroundWorkerApiImpl(context: Context) : BackgroundWorkerFgHostApi {
 private const val BACKGROUND_WORKER_NAME = "immich/BackgroundWorkerV1"
 private const val OBSERVER_WORKER_NAME = "immich/MediaObserverV1"
-const val WORKER_DATA_TASK_TYPE = "taskType"
-const val SHARED_PREF_NAME = "Immich::Background"
-const val SHARED_PREF_BACKUP_ENABLED = "Background::backup::enabled"
-private fun updateUploadEnabled(context: Context, enabled: Boolean) {
-context.getSharedPreferences(SHARED_PREF_NAME, Context.MODE_PRIVATE).edit {
-putBoolean(SHARED_PREF_BACKUP_ENABLED, enabled)
-}
-}
 fun enqueueMediaObserver(ctx: Context) {
 val constraints = Constraints.Builder()
 .addContentUriTrigger(MediaStore.Images.Media.INTERNAL_CONTENT_URI, true)
 .addContentUriTrigger(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, true)
 .addContentUriTrigger(MediaStore.Video.Media.INTERNAL_CONTENT_URI, true)
 .addContentUriTrigger(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, true)
-.setTriggerContentUpdateDelay(5, TimeUnit.SECONDS)
-.setTriggerContentMaxDelay(1, TimeUnit.MINUTES)
+.setTriggerContentUpdateDelay(30, TimeUnit.SECONDS)
+.setTriggerContentMaxDelay(3, TimeUnit.MINUTES)
 .build()
 val work = OneTimeWorkRequest.Builder(MediaObserver::class.java)
@@ -66,15 +48,13 @@ class BackgroundWorkerApiImpl(context: Context) : BackgroundWorkerFgHostApi {
 Log.i(TAG, "Enqueued media observer worker with name: $OBSERVER_WORKER_NAME")
 }
-fun enqueueBackgroundWorker(ctx: Context, taskType: BackgroundTaskType) {
+fun enqueueBackgroundWorker(ctx: Context) {
 val constraints = Constraints.Builder().setRequiresBatteryNotLow(true).build()
-val data = Data.Builder()
-data.putInt(WORKER_DATA_TASK_TYPE, taskType.ordinal)
 val work = OneTimeWorkRequest.Builder(BackgroundWorker::class.java)
 .setConstraints(constraints)
 .setBackoffCriteria(BackoffPolicy.EXPONENTIAL, 1, TimeUnit.MINUTES)
-.setInputData(data.build()).build()
+.build()
 WorkManager.getInstance(ctx)
 .enqueueUniqueWork(BACKGROUND_WORKER_NAME, ExistingWorkPolicy.REPLACE, work)


@@ -6,29 +6,17 @@ import androidx.work.Worker
 import androidx.work.WorkerParameters
 class MediaObserver(context: Context, params: WorkerParameters) : Worker(context, params) {
 private val ctx: Context = context.applicationContext
 override fun doWork(): Result {
 Log.i("MediaObserver", "Content change detected, starting background worker")
+// Re-enqueue itself to listen for future changes
+BackgroundWorkerApiImpl.enqueueMediaObserver(ctx)
 // Enqueue backup worker only if there are new media changes
 if (triggeredContentUris.isNotEmpty()) {
-val type =
+BackgroundWorkerApiImpl.enqueueBackgroundWorker(ctx)
-if (isBackupEnabled(ctx)) BackgroundTaskType.UPLOAD else BackgroundTaskType.LOCAL_SYNC
-BackgroundWorkerApiImpl.enqueueBackgroundWorker(ctx, type)
-}
-// Re-enqueue itself to listen for future changes
-BackgroundWorkerApiImpl.enqueueMediaObserver(ctx)
-return Result.success()
-}
-private fun isBackupEnabled(context: Context): Boolean {
-val prefs =
-context.getSharedPreferences(
-BackgroundWorkerApiImpl.SHARED_PREF_NAME,
-Context.MODE_PRIVATE
-)
-return prefs.getBoolean(BackgroundWorkerApiImpl.SHARED_PREF_BACKUP_ENABLED, false)
 }
+return Result.success()
+}
 }


@@ -3,7 +3,7 @@
 archiveVersion = 1;
 classes = {
 };
-objectVersion = 77;
+objectVersion = 54;
 objects = {
 /* Begin PBXBuildFile section */
@@ -507,14 +507,10 @@
 inputFileListPaths = (
 "${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-resources-${CONFIGURATION}-input-files.xcfilelist",
 );
-inputPaths = (
-);
 name = "[CP] Copy Pods Resources";
 outputFileListPaths = (
 "${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-resources-${CONFIGURATION}-output-files.xcfilelist",
 );
-outputPaths = (
-);
 runOnlyForDeploymentPostprocessing = 0;
 shellPath = /bin/sh;
 shellScript = "\"${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-resources.sh\"\n";
@@ -543,14 +539,10 @@
 inputFileListPaths = (
 "${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-frameworks-${CONFIGURATION}-input-files.xcfilelist",
 );
-inputPaths = (
-);
 name = "[CP] Embed Pods Frameworks";
 outputFileListPaths = (
 "${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-frameworks-${CONFIGURATION}-output-files.xcfilelist",
 );
-outputPaths = (
-);
 runOnlyForDeploymentPostprocessing = 0;
 shellPath = /bin/sh;
 shellScript = "\"${PODS_ROOT}/Target Support Files/Pods-Runner/Pods-Runner-frameworks.sh\"\n";


@@ -73,9 +73,8 @@ class BackgroundWorkerPigeonCodec: FlutterStandardMessageCodec, @unchecked Senda
 /// Generated protocol from Pigeon that represents a handler of messages from Flutter.
 protocol BackgroundWorkerFgHostApi {
-func enableSyncWorker() throws
-func enableUploadWorker() throws
-func disableUploadWorker() throws
+func enable() throws
+func disable() throws
 }
 /// Generated setup class from Pigeon to handle messages through the `binaryMessenger`.
@@ -84,44 +83,31 @@ class BackgroundWorkerFgHostApiSetup {
 /// Sets up an instance of `BackgroundWorkerFgHostApi` to handle messages through the `binaryMessenger`.
 static func setUp(binaryMessenger: FlutterBinaryMessenger, api: BackgroundWorkerFgHostApi?, messageChannelSuffix: String = "") {
 let channelSuffix = messageChannelSuffix.count > 0 ? ".\(messageChannelSuffix)" : ""
-let enableSyncWorkerChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableSyncWorker\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
+let enableChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enable\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
 if let api = api {
-enableSyncWorkerChannel.setMessageHandler { _, reply in
+enableChannel.setMessageHandler { _, reply in
 do {
-try api.enableSyncWorker()
+try api.enable()
 reply(wrapResult(nil))
 } catch {
 reply(wrapError(error))
 }
 }
 } else {
-enableSyncWorkerChannel.setMessageHandler(nil)
+enableChannel.setMessageHandler(nil)
 }
-let enableUploadWorkerChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableUploadWorker\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
+let disableChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disable\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
 if let api = api {
-enableUploadWorkerChannel.setMessageHandler { _, reply in
+disableChannel.setMessageHandler { _, reply in
 do {
-try api.enableUploadWorker()
+try api.disable()
 reply(wrapResult(nil))
 } catch {
 reply(wrapError(error))
 }
 }
 } else {
-enableUploadWorkerChannel.setMessageHandler(nil)
+disableChannel.setMessageHandler(nil)
-}
-let disableUploadWorkerChannel = FlutterBasicMessageChannel(name: "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disableUploadWorker\(channelSuffix)", binaryMessenger: binaryMessenger, codec: codec)
-if let api = api {
-disableUploadWorkerChannel.setMessageHandler { _, reply in
-do {
-try api.disableUploadWorker()
-reply(wrapResult(nil))
-} catch {
-reply(wrapError(error))
-}
-}
-} else {
-disableUploadWorkerChannel.setMessageHandler(nil)
 }
 }
 }
@@ -167,7 +153,6 @@ class BackgroundWorkerBgHostApiSetup {
 }
 /// Generated protocol from Pigeon that represents Flutter messages that can be called from Swift.
 protocol BackgroundWorkerFlutterApiProtocol {
-func onLocalSync(maxSeconds maxSecondsArg: Int64?, completion: @escaping (Result<Void, PigeonError>) -> Void)
 func onIosUpload(isRefresh isRefreshArg: Bool, maxSeconds maxSecondsArg: Int64?, completion: @escaping (Result<Void, PigeonError>) -> Void)
func onAndroidUpload(completion: @escaping (Result<Void, PigeonError>) -> Void) func onAndroidUpload(completion: @escaping (Result<Void, PigeonError>) -> Void)
func cancel(completion: @escaping (Result<Void, PigeonError>) -> Void) func cancel(completion: @escaping (Result<Void, PigeonError>) -> Void)
@@ -182,24 +167,6 @@ class BackgroundWorkerFlutterApi: BackgroundWorkerFlutterApiProtocol {
var codec: BackgroundWorkerPigeonCodec { var codec: BackgroundWorkerPigeonCodec {
return BackgroundWorkerPigeonCodec.shared return BackgroundWorkerPigeonCodec.shared
} }
func onLocalSync(maxSeconds maxSecondsArg: Int64?, completion: @escaping (Result<Void, PigeonError>) -> Void) {
let channelName: String = "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onLocalSync\(messageChannelSuffix)"
let channel = FlutterBasicMessageChannel(name: channelName, binaryMessenger: binaryMessenger, codec: codec)
channel.sendMessage([maxSecondsArg] as [Any?]) { response in
guard let listResponse = response as? [Any?] else {
completion(.failure(createConnectionError(withChannelName: channelName)))
return
}
if listResponse.count > 1 {
let code: String = listResponse[0] as! String
let message: String? = nilOrValue(listResponse[1])
let details: String? = nilOrValue(listResponse[2])
completion(.failure(PigeonError(code: code, message: message, details: details)))
} else {
completion(.success(()))
}
}
}
func onIosUpload(isRefresh isRefreshArg: Bool, maxSeconds maxSecondsArg: Int64?, completion: @escaping (Result<Void, PigeonError>) -> Void) { func onIosUpload(isRefresh isRefreshArg: Bool, maxSeconds maxSecondsArg: Int64?, completion: @escaping (Result<Void, PigeonError>) -> Void) {
let channelName: String = "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onIosUpload\(messageChannelSuffix)" let channelName: String = "dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onIosUpload\(messageChannelSuffix)"
let channel = FlutterBasicMessageChannel(name: channelName, binaryMessenger: binaryMessenger, codec: codec) let channel = FlutterBasicMessageChannel(name: channelName, binaryMessenger: binaryMessenger, codec: codec)


@@ -1,7 +1,7 @@
 import BackgroundTasks
 import Flutter
-enum BackgroundTaskType { case localSync, refreshUpload, processingUpload }
+enum BackgroundTaskType { case refresh, processing }
 /*
  * DEBUG: Testing Background Tasks in Xcode
@@ -9,10 +9,6 @@ enum BackgroundTaskType { case localSync, refreshUpload, processingUpload }
  * To test background task functionality during development:
  * 1. Pause the application in Xcode debugger
  * 2. In the debugger console, enter one of the following commands:
- ## For local sync (short-running sync):
- e -l objc -- (void)[[BGTaskScheduler sharedScheduler] _simulateLaunchForTaskWithIdentifier:@"app.alextran.immich.background.localSync"]
  ## For background refresh (short-running sync):
@@ -24,8 +20,6 @@ enum BackgroundTaskType { case localSync, refreshUpload, processingUpload }
  * To simulate task expiration (useful for testing expiration handlers):
- e -l objc -- (void)[[BGTaskScheduler sharedScheduler] _simulateExpirationForTaskWithIdentifier:@"app.alextran.immich.background.localSync"]
  e -l objc -- (void)[[BGTaskScheduler sharedScheduler] _simulateExpirationForTaskWithIdentifier:@"app.alextran.immich.background.refreshUpload"]
  e -l objc -- (void)[[BGTaskScheduler sharedScheduler] _simulateExpirationForTaskWithIdentifier:@"app.alextran.immich.background.processingUpload"]
@@ -120,19 +114,11 @@ class BackgroundWorker: BackgroundWorkerBgHostApi {
   * This method acts as a bridge between the native iOS background task system and Flutter.
   */
  func onInitialized() throws {
-    switch self.taskType {
-    case .refreshUpload, .processingUpload:
-      flutterApi?.onIosUpload(isRefresh: self.taskType == .refreshUpload,
-                              maxSeconds: maxSeconds.map { Int64($0) }, completion: { result in
-        self.handleHostResult(result: result)
-      })
-    case .localSync:
-      flutterApi?.onLocalSync(maxSeconds: maxSeconds.map { Int64($0) }, completion: { result in
-        self.handleHostResult(result: result)
-      })
-    }
+    flutterApi?.onIosUpload(isRefresh: self.taskType == .refresh, maxSeconds: maxSeconds.map { Int64($0) }, completion: { result in
+      self.handleHostResult(result: result)
+    })
  }
  /**
   * Cancels the currently running background task, either due to timeout or external request.
   * Sends a cancel signal to the Flutter side and sets up a fallback timer to ensure
@@ -154,6 +140,7 @@ class BackgroundWorker: BackgroundWorkerBgHostApi {
      self.complete(success: false)
    }
  }
  /**
   * Handles the result from Flutter API calls and determines the success/failure status.
@@ -177,6 +164,10 @@ class BackgroundWorker: BackgroundWorkerBgHostApi {
   * - Parameter success: Indicates whether the background task completed successfully
   */
  private func complete(success: Bool) {
+    if(isComplete) {
+      return
+    }
    isComplete = true
    engine.destroyContext()
    completionHandler(success)
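The guard added to `complete(success:)` above matters because a `BGTask`'s completion handler must be invoked exactly once, yet both the normal finish path and the OS expiration handler can reach the method. A standalone sketch of the same idempotent-completion pattern (the type and names here are illustrative, not taken from the Immich sources):

```swift
import Foundation

// One-shot completion wrapper: the first call wins, later calls are no-ops.
// Mirrors the isComplete guard in the diff above; a real implementation
// should also serialize access (lock or serial queue) if the normal finish
// and the expiration handler can run on different threads.
final class OneShotCompletion {
  private var isComplete = false
  private let onFinish: (Bool) -> Void

  init(onFinish: @escaping (Bool) -> Void) {
    self.onFinish = onFinish
  }

  func complete(success: Bool) {
    if isComplete { return } // the losing caller returns harmlessly
    isComplete = true
    onFinish(success)
  }
}
```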


@@ -1,77 +1,40 @@
 import BackgroundTasks
 class BackgroundWorkerApiImpl: BackgroundWorkerFgHostApi {
-  func enableSyncWorker() throws {
-    BackgroundWorkerApiImpl.scheduleLocalSync()
-    print("BackgroundUploadImpl:enableSyncWorker Local Sync worker scheduled")
-  }
-  func enableUploadWorker() throws {
-    BackgroundWorkerApiImpl.updateUploadEnabled(true)
-    BackgroundWorkerApiImpl.scheduleRefreshUpload()
-    BackgroundWorkerApiImpl.scheduleProcessingUpload()
-    print("BackgroundUploadImpl:enableUploadWorker Scheduled background upload tasks")
-  }
-  func disableUploadWorker() throws {
-    BackgroundWorkerApiImpl.updateUploadEnabled(false)
-    BackgroundWorkerApiImpl.cancelUploadTasks()
-    print("BackgroundUploadImpl:disableUploadWorker Disabled background upload tasks")
-  }
-  public static let backgroundUploadEnabledKey = "immich:background:backup:enabled"
-  private static let localSyncTaskID = "app.alextran.immich.background.localSync"
-  private static let refreshUploadTaskID = "app.alextran.immich.background.refreshUpload"
-  private static let processingUploadTaskID = "app.alextran.immich.background.processingUpload"
-  private static func updateUploadEnabled(_ isEnabled: Bool) {
-    return UserDefaults.standard.set(isEnabled, forKey: BackgroundWorkerApiImpl.backgroundUploadEnabledKey)
+  func enable() throws {
+    BackgroundWorkerApiImpl.scheduleRefreshWorker()
+    BackgroundWorkerApiImpl.scheduleProcessingWorker()
+    print("BackgroundUploadImpl:enbale Background worker scheduled")
   }
-  private static func cancelUploadTasks() {
-    BackgroundWorkerApiImpl.updateUploadEnabled(false)
-    BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: refreshUploadTaskID);
-    BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: processingUploadTaskID);
+  func disable() throws {
+    BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: BackgroundWorkerApiImpl.refreshTaskID);
+    BGTaskScheduler.shared.cancel(taskRequestWithIdentifier: BackgroundWorkerApiImpl.processingTaskID);
+    print("BackgroundUploadImpl:disableUploadWorker Disabled background workers")
   }
+  private static let refreshTaskID = "app.alextran.immich.background.refreshUpload"
+  private static let processingTaskID = "app.alextran.immich.background.processingUpload"
   public static func registerBackgroundWorkers() {
     BGTaskScheduler.shared.register(
-      forTaskWithIdentifier: processingUploadTaskID, using: nil) { task in
+      forTaskWithIdentifier: processingTaskID, using: nil) { task in
       if task is BGProcessingTask {
        handleBackgroundProcessing(task: task as! BGProcessingTask)
      }
    }
     BGTaskScheduler.shared.register(
-      forTaskWithIdentifier: refreshUploadTaskID, using: nil) { task in
+      forTaskWithIdentifier: refreshTaskID, using: nil) { task in
       if task is BGAppRefreshTask {
-        handleBackgroundRefresh(task: task as! BGAppRefreshTask, taskType: .refreshUpload)
+        handleBackgroundRefresh(task: task as! BGAppRefreshTask)
       }
     }
-    BGTaskScheduler.shared.register(
-      forTaskWithIdentifier: localSyncTaskID, using: nil) { task in
-      if task is BGAppRefreshTask {
-        handleBackgroundRefresh(task: task as! BGAppRefreshTask, taskType: .localSync)
-      }
-    }
   }
-  private static func scheduleLocalSync() {
-    let backgroundRefresh = BGAppRefreshTaskRequest(identifier: localSyncTaskID)
-    backgroundRefresh.earliestBeginDate = Date(timeIntervalSinceNow: 5 * 60) // 5 mins
-    do {
-      try BGTaskScheduler.shared.submit(backgroundRefresh)
-    } catch {
-      print("Could not schedule the local sync task \(error.localizedDescription)")
-    }
-  }
-  private static func scheduleRefreshUpload() {
-    let backgroundRefresh = BGAppRefreshTaskRequest(identifier: refreshUploadTaskID)
+  private static func scheduleRefreshWorker() {
+    let backgroundRefresh = BGAppRefreshTaskRequest(identifier: refreshTaskID)
     backgroundRefresh.earliestBeginDate = Date(timeIntervalSinceNow: 5 * 60) // 5 mins
     do {
@@ -81,8 +44,8 @@ class BackgroundWorkerApiImpl: BackgroundWorkerFgHostApi {
     }
   }
-  private static func scheduleProcessingUpload() {
-    let backgroundProcessing = BGProcessingTaskRequest(identifier: processingUploadTaskID)
+  private static func scheduleProcessingWorker() {
+    let backgroundProcessing = BGProcessingTaskRequest(identifier: processingTaskID)
     backgroundProcessing.requiresNetworkConnectivity = true
     backgroundProcessing.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60) // 15 mins
@@ -94,29 +57,16 @@ class BackgroundWorkerApiImpl: BackgroundWorkerFgHostApi {
     }
   }
-  private static func handleBackgroundRefresh(task: BGAppRefreshTask, taskType: BackgroundTaskType) {
-    let maxSeconds: Int?
-    switch taskType {
-    case .localSync:
-      maxSeconds = 15
-      scheduleLocalSync()
-    case .refreshUpload:
-      maxSeconds = 20
-      scheduleRefreshUpload()
-    case .processingUpload:
-      print("Unexpected background refresh task encountered")
-      return;
-    }
+  private static func handleBackgroundRefresh(task: BGAppRefreshTask) {
+    scheduleRefreshWorker()
     // Restrict the refresh task to run only for a maximum of (maxSeconds) seconds
-    runBackgroundWorker(task: task, taskType: taskType, maxSeconds: maxSeconds)
+    runBackgroundWorker(task: task, taskType: .refresh, maxSeconds: 20)
   }
   private static func handleBackgroundProcessing(task: BGProcessingTask) {
-    scheduleProcessingUpload()
+    scheduleProcessingWorker()
     // There are no restrictions for processing tasks. Although, the OS could signal expiration at any time
-    runBackgroundWorker(task: task, taskType: .processingUpload, maxSeconds: nil)
+    runBackgroundWorker(task: task, taskType: .processing, maxSeconds: nil)
   }
   /**


@@ -46,6 +46,23 @@ class ThumbnailApiImpl: ThumbnailApi {
     assetCache.countLimit = 10000
     return assetCache
   }()
+  private static let activitySemaphore = DispatchSemaphore(value: 1)
+  private static let willResignActiveObserver = NotificationCenter.default.addObserver(
+    forName: UIApplication.willResignActiveNotification,
+    object: nil,
+    queue: .main
+  ) { _ in
+    processingQueue.suspend()
+    activitySemaphore.wait()
+  }
+  private static let didBecomeActiveObserver = NotificationCenter.default.addObserver(
+    forName: UIApplication.didBecomeActiveNotification,
+    object: nil,
+    queue: .main
+  ) { _ in
+    processingQueue.resume()
+    activitySemaphore.signal()
+  }
   func getThumbhash(thumbhash: String, completion: @escaping (Result<[String : Int64], any Error>) -> Void) {
     Self.processingQueue.async {
@@ -53,6 +70,7 @@ class ThumbnailApiImpl: ThumbnailApi {
       else { return completion(.failure(PigeonError(code: "", message: "Invalid base64 string: \(thumbhash)", details: nil)))}
       let (width, height, pointer) = thumbHashToRGBA(hash: data)
+      self.waitForActiveState()
       completion(.success(["pointer": Int64(Int(bitPattern: pointer.baseAddress)), "width": Int64(width), "height": Int64(height)]))
     }
   }
@@ -142,6 +160,7 @@ class ThumbnailApiImpl: ThumbnailApi {
         return completion(Self.cancelledResult)
       }
+      self.waitForActiveState()
       completion(.success(["pointer": Int64(Int(bitPattern: pointer)), "width": Int64(cgImage.width), "height": Int64(cgImage.height)]))
       Self.removeRequest(requestId: requestId)
     }
@@ -184,4 +203,9 @@ class ThumbnailApiImpl: ThumbnailApi {
     assetQueue.async { assetCache.setObject(asset, forKey: assetId as NSString) }
     return asset
   }
+  func waitForActiveState() {
+    Self.activitySemaphore.wait()
+    Self.activitySemaphore.signal()
+  }
 }
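The thumbnail change above gates completion callbacks on the app being in the active state: a semaphore with an initial count of 1 is taken when the app resigns active and released when it becomes active again, so a wait()-then-signal() pair in worker code blocks only while the app is inactive. A minimal, platform-neutral sketch of that pattern (the type name is illustrative, and UIApplication notifications are omitted so the sketch compiles without UIKit):

```swift
import Foundation

// Active-state gate, as in the diff above: lifecycle callbacks hold or
// release the single token, and workers pass through the gate only while
// the token is available (app active).
final class ActiveStateGate {
  private let semaphore = DispatchSemaphore(value: 1)

  func appWillResignActive() {
    semaphore.wait()   // take the token: workers now block at the gate
  }

  func appDidBecomeActive() {
    semaphore.signal() // return the token: workers proceed
  }

  // Blocks the caller while the app is inactive, then immediately
  // re-signals so other waiters can pass as well.
  func waitForActiveState() {
    semaphore.wait()
    semaphore.signal()
  }
}
```

The wait-then-signal trick makes the semaphore behave like a shared gate rather than a mutex: many workers can pass while the app is active, and all of them park while it is not.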


@@ -1,190 +1,189 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
 <plist version="1.0">
 <dict>
   <key>AppGroupId</key>
   <string>$(CUSTOM_GROUP_ID)</string>
   <key>BGTaskSchedulerPermittedIdentifiers</key>
   <array>
-    <string>app.alextran.immich.background.localSync</string>
     <string>app.alextran.immich.background.refreshUpload</string>
     <string>app.alextran.immich.background.processingUpload</string>
     <string>app.alextran.immich.backgroundFetch</string>
     <string>app.alextran.immich.backgroundProcessing</string>
   </array>
   <key>CADisableMinimumFrameDurationOnPhone</key>
-  <true />
+  <true/>
   <key>CFBundleDevelopmentRegion</key>
   <string>$(DEVELOPMENT_LANGUAGE)</string>
   <key>CFBundleDisplayName</key>
   <string>${PRODUCT_NAME}</string>
   <key>CFBundleDocumentTypes</key>
   <array>
     <dict>
       <key>CFBundleTypeName</key>
       <string>ShareHandler</string>
       <key>LSHandlerRank</key>
       <string>Alternate</string>
       <key>LSItemContentTypes</key>
       <array>
         <string>public.file-url</string>
         <string>public.image</string>
         <string>public.text</string>
         <string>public.movie</string>
         <string>public.url</string>
         <string>public.data</string>
       </array>
     </dict>
   </array>
   <key>CFBundleExecutable</key>
   <string>$(EXECUTABLE_NAME)</string>
   <key>CFBundleIdentifier</key>
   <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
   <key>CFBundleInfoDictionaryVersion</key>
   <string>6.0</string>
   <key>CFBundleLocalizations</key>
   <array>
     <string>en</string>
     <string>ar</string>
     <string>ca</string>
     <string>cs</string>
     <string>da</string>
     <string>de</string>
     <string>es</string>
     <string>fi</string>
     <string>fr</string>
     <string>he</string>
     <string>hi</string>
     <string>hu</string>
     <string>it</string>
     <string>ja</string>
     <string>ko</string>
     <string>lv</string>
     <string>mn</string>
     <string>nb</string>
     <string>nl</string>
     <string>pl</string>
     <string>pt</string>
     <string>ro</string>
     <string>ru</string>
     <string>sk</string>
     <string>sl</string>
     <string>sr</string>
     <string>sv</string>
     <string>th</string>
     <string>uk</string>
     <string>vi</string>
     <string>zh</string>
   </array>
   <key>CFBundleName</key>
   <string>immich_mobile</string>
   <key>CFBundlePackageType</key>
   <string>APPL</string>
   <key>CFBundleShortVersionString</key>
   <string>1.140.0</string>
   <key>CFBundleSignature</key>
   <string>????</string>
   <key>CFBundleURLTypes</key>
   <array>
     <dict>
       <key>CFBundleTypeRole</key>
       <string>Editor</string>
       <key>CFBundleURLName</key>
       <string>Share Extension</string>
       <key>CFBundleURLSchemes</key>
       <array>
         <string>ShareMedia-$(PRODUCT_BUNDLE_IDENTIFIER)</string>
       </array>
     </dict>
     <dict>
       <key>CFBundleTypeRole</key>
       <string>Editor</string>
       <key>CFBundleURLName</key>
       <string>Deep Link</string>
       <key>CFBundleURLSchemes</key>
       <array>
         <string>immich</string>
       </array>
     </dict>
   </array>
   <key>CFBundleVersion</key>
   <string>219</string>
   <key>FLTEnableImpeller</key>
-  <true />
+  <true/>
   <key>ITSAppUsesNonExemptEncryption</key>
-  <false />
+  <false/>
   <key>LSApplicationQueriesSchemes</key>
   <array>
     <string>https</string>
   </array>
   <key>LSRequiresIPhoneOS</key>
-  <true />
+  <true/>
   <key>LSSupportsOpeningDocumentsInPlace</key>
   <string>No</string>
   <key>MGLMapboxMetricsEnabledSettingShownInApp</key>
-  <true />
+  <true/>
   <key>NSAppTransportSecurity</key>
   <dict>
     <key>NSAllowsArbitraryLoads</key>
-    <true />
+    <true/>
   </dict>
   <key>NSBonjourServices</key>
   <array>
     <string>_googlecast._tcp</string>
     <string>_CC1AD845._googlecast._tcp</string>
   </array>
   <key>NSCameraUsageDescription</key>
   <string>We need to access the camera to let you take beautiful video using this app</string>
   <key>NSFaceIDUsageDescription</key>
   <string>We need to use FaceID to allow access to your locked folder</string>
   <key>NSLocalNetworkUsageDescription</key>
   <string>We need local network permission to connect to the local server using IP address and
   allow the casting feature to work</string>
   <key>NSLocationAlwaysAndWhenInUseUsageDescription</key>
   <string>We require this permission to access the local WiFi name for background upload mechanism</string>
   <key>NSLocationUsageDescription</key>
   <string>We require this permission to access the local WiFi name</string>
   <key>NSLocationWhenInUseUsageDescription</key>
   <string>We require this permission to access the local WiFi name</string>
   <key>NSMicrophoneUsageDescription</key>
   <string>We need to access the microphone to let you take beautiful video using this app</string>
   <key>NSPhotoLibraryAddUsageDescription</key>
   <string>We need to manage backup your photos album</string>
   <key>NSPhotoLibraryUsageDescription</key>
   <string>We need to manage backup your photos album</string>
   <key>NSUserActivityTypes</key>
   <array>
     <string>INSendMessageIntent</string>
   </array>
   <key>UIApplicationSupportsIndirectInputEvents</key>
-  <true />
+  <true/>
   <key>UIBackgroundModes</key>
   <array>
     <string>fetch</string>
     <string>processing</string>
   </array>
   <key>UILaunchStoryboardName</key>
   <string>LaunchScreen</string>
   <key>UIMainStoryboardFile</key>
   <string>Main</string>
   <key>UIStatusBarHidden</key>
-  <false />
+  <false/>
   <key>UISupportedInterfaceOrientations</key>
   <array>
     <string>UIInterfaceOrientationPortrait</string>
     <string>UIInterfaceOrientationLandscapeLeft</string>
     <string>UIInterfaceOrientationLandscapeRight</string>
   </array>
   <key>UISupportedInterfaceOrientations~ipad</key>
   <array>
     <string>UIInterfaceOrientationPortrait</string>
     <string>UIInterfaceOrientationPortraitUpsideDown</string>
     <string>UIInterfaceOrientationLandscapeLeft</string>
     <string>UIInterfaceOrientationLandscapeRight</string>
   </array>
   <key>UIViewControllerBasedStatusBarAppearance</key>
-  <true />
+  <true/>
   <key>io.flutter.embedded_views_preview</key>
-  <true />
+  <true/>
 </dict>
 </plist>


@@ -5,6 +5,7 @@ import 'package:background_downloader/background_downloader.dart';
 import 'package:flutter/material.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/constants/constants.dart';
+import 'package:immich_mobile/domain/utils/isolate_lock_manager.dart';
 import 'package:immich_mobile/infrastructure/repositories/db.repository.dart';
 import 'package:immich_mobile/infrastructure/repositories/logger_db.repository.dart';
 import 'package:immich_mobile/platform/background_worker_api.g.dart';
@@ -30,11 +31,9 @@ class BackgroundWorkerFgService {
   const BackgroundWorkerFgService(this._foregroundHostApi);
   // TODO: Move this call to native side once old timeline is removed
-  Future<void> enableSyncService() => _foregroundHostApi.enableSyncWorker();
-  Future<void> enableUploadService() => _foregroundHostApi.enableUploadWorker();
-  Future<void> disableUploadService() => _foregroundHostApi.disableUploadWorker();
+  Future<void> enable() => _foregroundHostApi.enable();
+  Future<void> disable() => _foregroundHostApi.disable();
 }
 class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
@@ -43,7 +42,8 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
   final Drift _drift;
   final DriftLogger _driftLogger;
   final BackgroundWorkerBgHostApi _backgroundHostApi;
-  final Logger _logger = Logger('BackgroundWorkerBgService');
+  final Logger _logger = Logger('BackgroundUploadBgService');
+  late final IsolateLockManager _lockManager;
   bool _isCleanedUp = false;
@@ -59,6 +59,7 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
         driftProvider.overrideWith(driftOverride(drift)),
       ],
     );
+    _lockManager = IsolateLockManager(onCloseRequest: _cleanup);
     BackgroundWorkerFlutterApi.setUp(this);
   }
@@ -82,41 +83,31 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
       await FileDownloader().trackTasksInGroup(kDownloadGroupLivePhoto, markDownloadedComplete: false);
       await FileDownloader().trackTasks();
       configureFileDownloaderNotifications();
       await _ref.read(fileMediaRepositoryProvider).enableBackgroundAccess();
-      // Notify the host that the background worker service has been initialized and is ready to use
-      _backgroundHostApi.onInitialized();
+      // Notify the host that the background upload service has been initialized and is ready to use
+      debugPrint("Acquiring background worker lock");
+      if (await _lockManager.acquireLock().timeout(
+        const Duration(seconds: 5),
+        onTimeout: () {
+          _lockManager.cancel();
+          return false;
+        },
+      )) {
+        _logger.info("Acquired background worker lock");
+        await _backgroundHostApi.onInitialized();
+        return;
+      }
+      _logger.warning("Failed to acquire background worker lock");
+      await _cleanup();
+      await _backgroundHostApi.close();
     } catch (error, stack) {
       _logger.severe("Failed to initialize background worker", error, stack);
       _backgroundHostApi.close();
     }
   }
-  @override
-  Future<void> onLocalSync(int? maxSeconds) async {
-    try {
-      _logger.info('Local background syncing started');
-      final sw = Stopwatch()..start();
-      final timeout = maxSeconds != null ? Duration(seconds: maxSeconds) : null;
-      await _syncAssets(hashTimeout: timeout, syncRemote: false);
-      sw.stop();
-      _logger.info("Local sync completed in ${sw.elapsed.inSeconds}s");
-    } catch (error, stack) {
-      _logger.severe("Failed to complete local sync", error, stack);
-    } finally {
-      await _cleanup();
-    }
-  }
+  /* We do the following on Android upload
+   * - Sync local assets
+   * - Hash local assets 3 / 6 minutes
+   * - Sync remote assets
+   * - Check and requeue upload tasks
+   */
   @override
   Future<void> onAndroidUpload() async {
     try {
@@ -135,14 +126,6 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
     }
   }
-  /* We do the following on background upload
-   * - Sync local assets
-   * - Hash local assets
-   * - Sync remote assets
-   * - Check and requeue upload tasks
-   *
-   * The native side will not send the maxSeconds value for processing tasks
-   */
   @override
   Future<void> onIosUpload(bool isRefresh, int? maxSeconds) async {
     try {
@@ -194,7 +177,8 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
       await _drift.close();
       await _driftLogger.close();
       _ref.dispose();
-      debugPrint("Background worker cleaned up");
+      _lockManager.releaseLock();
+      _logger.info("Background worker resources cleaned up");
     } catch (error, stack) {
       debugPrint('Failed to cleanup background worker: $error with stack: $stack');
     }
@@ -222,7 +206,7 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
     }
   }
-  Future<void> _syncAssets({Duration? hashTimeout, bool syncRemote = true}) async {
+  Future<void> _syncAssets({Duration? hashTimeout}) async {
final futures = <Future<void>>[]; final futures = <Future<void>>[];
final localSyncFuture = _ref.read(backgroundSyncProvider).syncLocal().then((_) async { final localSyncFuture = _ref.read(backgroundSyncProvider).syncLocal().then((_) async {
@@ -244,10 +228,7 @@ class BackgroundWorkerBgService extends BackgroundWorkerFlutterApi {
}); });
futures.add(localSyncFuture); futures.add(localSyncFuture);
if (syncRemote) { futures.add(_ref.read(backgroundSyncProvider).syncRemote());
final remoteSyncFuture = _ref.read(backgroundSyncProvider).syncRemote();
futures.add(remoteSyncFuture);
}
await Future.wait(futures); await Future.wait(futures);
} }

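The worker above gates its initialization behind a lock acquisition with a 5-second timeout, cancelling the attempt and closing the worker when it expires. A minimal standalone sketch of that timeout-guarded pattern (this is not the app's `IsolateLockManager`; the function, its return strings, and the configurable timeout are illustrative):

```dart
import 'dart:async';

/// Gate initialization behind an acquire() future; fall back to the
/// cleanup/close path when acquisition does not finish in time.
Future<String> initWithLock(
  Future<bool> Function() acquire, {
  Duration timeout = const Duration(seconds: 5),
}) async {
  final acquired = await acquire().timeout(
    timeout,
    onTimeout: () => false, // give up instead of waiting forever
  );
  // 'initialized' stands in for onInitialized(), 'closed' for cleanup + close.
  return acquired ? 'initialized' : 'closed';
}

Future<void> main() async {
  // Fast acquisition succeeds.
  print(await initWithLock(() async => true));
  // A holder that never releases trips the timeout.
  print(await initWithLock(
    () => Completer<bool>().future,
    timeout: const Duration(milliseconds: 100),
  ));
}
```

The key design point mirrored here is that `onTimeout` turns a hung acquisition into an explicit failure value rather than an unhandled `TimeoutException`, so the worker can run its failure path deliberately.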

@@ -1,6 +1,7 @@
 import 'dart:convert';

 import 'package:immich_mobile/constants/constants.dart';
+import 'package:immich_mobile/domain/models/album/local_album.model.dart';
 import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
 import 'package:immich_mobile/infrastructure/repositories/local_album.repository.dart';
 import 'package:immich_mobile/infrastructure/repositories/local_asset.repository.dart';
@@ -35,6 +36,7 @@ class HashService {
   bool get isCancelled => _cancelChecker?.call() ?? false;

   Future<void> hashAssets() async {
+    _log.info("Starting hashing of assets");
     final Stopwatch stopwatch = Stopwatch()..start();
     // Sorted by backupSelection followed by isCloud
     final localAlbums = await _localAlbumRepository.getAll(
@@ -49,7 +51,7 @@ class HashService {
       final assetsToHash = await _localAlbumRepository.getAssetsToHash(album.id);
       if (assetsToHash.isNotEmpty) {
-        await _hashAssets(assetsToHash);
+        await _hashAssets(album, assetsToHash);
       }
     }
@@ -60,7 +62,7 @@ class HashService {
   /// Processes a list of [LocalAsset]s, storing their hash and updating the assets in the DB
   /// with hash for those that were successfully hashed. Hashes are looked up in a table
   /// [LocalAssetHashEntity] by local id. Only missing entries are newly hashed and added to the DB.
-  Future<void> _hashAssets(List<LocalAsset> assetsToHash) async {
+  Future<void> _hashAssets(LocalAlbum album, List<LocalAsset> assetsToHash) async {
     int bytesProcessed = 0;
     final toHash = <_AssetToPath>[];
@@ -72,6 +74,9 @@ class HashService {
       final file = await _storageRepository.getFileForAsset(asset.id);
       if (file == null) {
+        _log.warning(
+          "Cannot get file for asset ${asset.id}, name: ${asset.name}, created on: ${asset.createdAt} from album: ${album.name}",
+        );
         continue;
       }
@@ -79,17 +84,17 @@ class HashService {
       toHash.add(_AssetToPath(asset: asset, path: file.path));

       if (toHash.length >= batchFileLimit || bytesProcessed >= batchSizeLimit) {
-        await _processBatch(toHash);
+        await _processBatch(album, toHash);
         toHash.clear();
         bytesProcessed = 0;
       }
     }

-    await _processBatch(toHash);
+    await _processBatch(album, toHash);
   }

   /// Processes a batch of assets.
-  Future<void> _processBatch(List<_AssetToPath> toHash) async {
+  Future<void> _processBatch(LocalAlbum album, List<_AssetToPath> toHash) async {
     if (toHash.isEmpty) {
       return;
     }
@@ -114,7 +119,9 @@ class HashService {
       if (hash?.length == 20) {
         hashed.add(asset.copyWith(checksum: base64.encode(hash!)));
       } else {
-        _log.warning("Failed to hash file for ${asset.id}: ${asset.name} created at ${asset.createdAt}");
+        _log.warning(
+          "Failed to hash file for ${asset.id}: ${asset.name} created at ${asset.createdAt} from album: ${album.name}",
+        );
       }
     }

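The hashing loop in `_hashAssets` flushes a batch whenever it reaches a file-count limit or an accumulated-byte limit, then flushes whatever remains at the end. A minimal sketch of that batching rule (names and the integer "sizes" standing in for asset files are illustrative, not the service's API):

```dart
/// Split `sizes` into batches, flushing when a batch reaches `fileLimit`
/// entries or `sizeLimit` accumulated bytes, then flushing the remainder
/// (like the trailing _processBatch call in the diff).
List<List<int>> batchBySize(List<int> sizes, {required int fileLimit, required int sizeLimit}) {
  final batches = <List<int>>[];
  var batch = <int>[];
  var bytes = 0;
  for (final size in sizes) {
    batch.add(size);
    bytes += size;
    if (batch.length >= fileLimit || bytes >= sizeLimit) {
      batches.add(batch);
      batch = <int>[];
      bytes = 0;
    }
  }
  if (batch.isNotEmpty) {
    batches.add(batch); // final partial batch
  }
  return batches;
}

void main() {
  print(batchBySize([1, 1, 1, 1, 1], fileLimit: 2, sizeLimit: 100)); // [[1, 1], [1, 1], [1]]
  print(batchBySize([60, 50, 10], fileLimit: 10, sizeLimit: 100)); // [[60, 50], [10]]
}
```

Capping on both dimensions keeps memory bounded for many small files (count limit) and for a few huge videos (byte limit) at the same time.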

@@ -0,0 +1,235 @@
import 'dart:isolate';
import 'dart:ui';

import 'package:flutter/foundation.dart';
import 'package:logging/logging.dart';

const String kIsolateLockManagerPort = "immich://isolate_mutex";

enum _LockStatus { active, released }

class _IsolateRequest {
  const _IsolateRequest();
}

class _HeartbeatRequest extends _IsolateRequest {
  // Port for the receiver to send replies back
  final SendPort sendPort;

  const _HeartbeatRequest(this.sendPort);

  Map<String, dynamic> toJson() {
    return {'type': 'heartbeat', 'sendPort': sendPort};
  }
}

class _CloseRequest extends _IsolateRequest {
  const _CloseRequest();

  Map<String, dynamic> toJson() {
    return {'type': 'close'};
  }
}

class _IsolateResponse {
  const _IsolateResponse();
}

class _HeartbeatResponse extends _IsolateResponse {
  final _LockStatus status;

  const _HeartbeatResponse(this.status);

  Map<String, dynamic> toJson() {
    return {'type': 'heartbeat', 'status': status.index};
  }
}

typedef OnCloseLockHolderRequest = void Function();

class IsolateLockManager {
  final String _portName;
  bool _hasLock = false;
  ReceivePort? _receivePort;
  final OnCloseLockHolderRequest? _onCloseRequest;
  final Set<SendPort> _waitingIsolates = {};

  // Token object - a new one is created for each acquisition attempt
  Object? _currentAcquisitionToken;

  IsolateLockManager({String? portName, OnCloseLockHolderRequest? onCloseRequest})
    : _portName = portName ?? kIsolateLockManagerPort,
      _onCloseRequest = onCloseRequest;

  Future<bool> acquireLock() async {
    if (_hasLock) {
      Logger('BackgroundWorkerLockManager').warning("WARNING: [acquireLock] called more than once");
      return true;
    }

    // Create a new token - this invalidates any previous attempt
    final token = _currentAcquisitionToken = Object();
    final ReceivePort rp = _receivePort = ReceivePort(_portName);
    final SendPort sp = rp.sendPort;

    while (!IsolateNameServer.registerPortWithName(sp, _portName)) {
      // This attempt was superseded by a newer one in the same isolate
      if (_currentAcquisitionToken != token) {
        return false;
      }
      await _lockReleasedByHolder(token);
    }

    _hasLock = true;
    rp.listen(_onRequest);
    return true;
  }

  Future<void> _lockReleasedByHolder(Object token) async {
    SendPort? holder = IsolateNameServer.lookupPortByName(_portName);
    debugPrint("Found lock holder: $holder");
    if (holder == null) {
      // No holder, try to acquire the lock
      return;
    }

    final ReceivePort tempRp = ReceivePort();
    final SendPort tempSp = tempRp.sendPort;
    final bs = tempRp.asBroadcastStream();

    try {
      while (true) {
        // Send a heartbeat request with the send port to receive a reply from the holder
        debugPrint("Sending heartbeat request to lock holder");
        holder.send(_HeartbeatRequest(tempSp).toJson());
        dynamic answer = await bs.first.timeout(const Duration(seconds: 3), onTimeout: () => null);
        debugPrint("Received heartbeat response from lock holder: $answer");

        // This attempt was superseded by a newer one in the same isolate
        if (_currentAcquisitionToken != token) {
          break;
        }

        if (answer == null) {
          // Holder failed, most likely killed without calling releaseLock.
          // Check if a different waiting isolate took the lock
          if (holder == IsolateNameServer.lookupPortByName(_portName)) {
            // No, remove the stale lock
            IsolateNameServer.removePortNameMapping(_portName);
          }
          break;
        }

        // Unknown message type received for the heartbeat request; give up on this attempt
        _IsolateResponse? response = _parseResponse(answer);
        if (response == null || response is! _HeartbeatResponse) {
          break;
        }

        if (response.status == _LockStatus.released) {
          // Holder has released the lock
          break;
        }

        // The holder reported an active lock. Wait briefly for a release message
        // in case it finished right after the heartbeat; otherwise loop and send
        // a new heartbeat
        answer = await bs.first.timeout(
          const Duration(seconds: 3),
          onTimeout: () => const _HeartbeatResponse(_LockStatus.active).toJson(),
        );
        response = _parseResponse(answer);
        if (response is _HeartbeatResponse && response.status == _LockStatus.released) {
          break;
        }
      }
    } catch (e) {
      // Timeout or error
    } finally {
      tempRp.close();
    }
    return;
  }

  _IsolateRequest? _parseRequest(dynamic msg) {
    if (msg is! Map<String, dynamic>) {
      return null;
    }

    return switch (msg['type']) {
      'heartbeat' => _HeartbeatRequest(msg['sendPort']),
      'close' => const _CloseRequest(),
      _ => null,
    };
  }

  _IsolateResponse? _parseResponse(dynamic msg) {
    if (msg is! Map<String, dynamic>) {
      return null;
    }

    return switch (msg['type']) {
      'heartbeat' => _HeartbeatResponse(_LockStatus.values[msg['status']]),
      _ => null,
    };
  }

  // Executed in the isolate with the lock
  void _onRequest(dynamic msg) {
    final request = _parseRequest(msg);
    if (request == null) {
      return;
    }

    if (request is _HeartbeatRequest) {
      // Add the send port to the list of waiting isolates
      _waitingIsolates.add(request.sendPort);
      request.sendPort.send(const _HeartbeatResponse(_LockStatus.active).toJson());
      return;
    }

    if (request is _CloseRequest) {
      _onCloseRequest?.call();
      return;
    }
  }

  void releaseLock() {
    if (_hasLock) {
      IsolateNameServer.removePortNameMapping(_portName);
      // Notify waiting isolates
      for (final port in _waitingIsolates) {
        port.send(const _HeartbeatResponse(_LockStatus.released).toJson());
      }
      _waitingIsolates.clear();
      _hasLock = false;
    }

    _receivePort?.close();
    _receivePort = null;
  }

  void cancel() {
    if (_hasLock) {
      return;
    }

    debugPrint("Cancelling ongoing acquire lock attempts");
    // Create a new token to invalidate ongoing acquire lock attempts
    _currentAcquisitionToken = Object();
  }

  void requestHolderToClose() {
    if (_hasLock) {
      return;
    }

    IsolateNameServer.lookupPortByName(_portName)?.send(const _CloseRequest().toJson());
  }
}

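The lock manager above detects a dead holder with a heartbeat handshake: a waiter sends `{'type': 'heartbeat', 'sendPort': ...}` to the registered port, and a live holder answers with its lock status, while a killed holder never answers and its registration is treated as stale. A minimal in-process sketch of that handshake using plain `dart:isolate` ports (simplified from the file above — no `IsolateNameServer`, and the probe/holder names are illustrative):

```dart
import 'dart:async';
import 'dart:isolate';

/// Send one heartbeat to `holder` and classify the outcome:
/// 'active' / 'released' from a reply, 'stale' when nothing answers in time.
Future<String> probeHolder(SendPort holder) async {
  final reply = ReceivePort();
  holder.send({'type': 'heartbeat', 'sendPort': reply.sendPort});
  final answer = await reply.first.timeout(
    const Duration(seconds: 3),
    onTimeout: () => null, // a dead holder never answers
  );
  reply.close();
  if (answer == null) return 'stale';
  // status uses _LockStatus.index: 0 = active, 1 = released
  return answer['status'] == 0 ? 'active' : 'released';
}

Future<void> main() async {
  // Simulate a live holder that answers heartbeats with "active" (index 0).
  final inbox = ReceivePort();
  final sub = inbox.listen((msg) {
    if (msg is Map && msg['type'] == 'heartbeat') {
      (msg['sendPort'] as SendPort).send({'type': 'heartbeat', 'status': 0});
    }
  });
  print(await probeHolder(inbox.sendPort)); // active
  await sub.cancel();
  inbox.close();
}
```

The timeout-as-liveness-check is what lets `acquireLock` recover from a holder that was killed without ever calling `releaseLock`.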

@@ -16,7 +16,6 @@ import 'package:immich_mobile/entities/store.entity.dart';
 import 'package:immich_mobile/extensions/build_context_extensions.dart';
 import 'package:immich_mobile/generated/codegen_loader.g.dart';
 import 'package:immich_mobile/providers/app_life_cycle.provider.dart';
-import 'package:immich_mobile/providers/app_settings.provider.dart';
 import 'package:immich_mobile/providers/asset_viewer/share_intent_upload.provider.dart';
 import 'package:immich_mobile/providers/backup/backup.provider.dart';
 import 'package:immich_mobile/providers/db.provider.dart';
@@ -26,7 +25,6 @@ import 'package:immich_mobile/providers/routes.provider.dart';
 import 'package:immich_mobile/providers/theme.provider.dart';
 import 'package:immich_mobile/routing/app_navigation_observer.dart';
 import 'package:immich_mobile/routing/router.dart';
-import 'package:immich_mobile/services/app_settings.service.dart';
 import 'package:immich_mobile/services/background.service.dart';
 import 'package:immich_mobile/services/deep_link.service.dart';
 import 'package:immich_mobile/services/local_notification.service.dart';
@@ -207,12 +205,9 @@ class ImmichAppState extends ConsumerState<ImmichApp> with WidgetsBindingObserve
       // needs to be delayed so that EasyLocalization is working
       if (Store.isBetaTimelineEnabled) {
         ref.read(backgroundServiceProvider).disableService();
-        ref.read(driftBackgroundUploadFgService).enableSyncService();
-        if (ref.read(appSettingsServiceProvider).getSetting(AppSettingsEnum.enableBackup)) {
-          ref.read(driftBackgroundUploadFgService).enableUploadService();
-        }
+        ref.read(driftBackgroundUploadFgService).enable();
       } else {
-        ref.read(driftBackgroundUploadFgService).disableUploadService();
+        ref.read(driftBackgroundUploadFgService).disable();
         ref.read(backgroundServiceProvider).resumeServiceIfEnabled();
       }
     });


@@ -8,7 +8,6 @@ import 'package:immich_mobile/extensions/theme_extensions.dart';
 import 'package:immich_mobile/extensions/translate_extensions.dart';
 import 'package:immich_mobile/presentation/widgets/backup/backup_toggle_button.widget.dart';
 import 'package:immich_mobile/providers/background_sync.provider.dart';
-import 'package:immich_mobile/providers/backup/backup.provider.dart';
 import 'package:immich_mobile/providers/backup/backup_album.provider.dart';
 import 'package:immich_mobile/providers/backup/drift_backup.provider.dart';
 import 'package:immich_mobile/providers/user.provider.dart';
@@ -43,12 +42,10 @@ class _DriftBackupPageState extends ConsumerState<DriftBackupPage> {
     await ref.read(backgroundSyncProvider).syncRemote();
     await ref.read(driftBackupProvider.notifier).getBackupStatus(currentUser.id);
-    await ref.read(driftBackgroundUploadFgService).enableUploadService();
     await ref.read(driftBackupProvider.notifier).startBackup(currentUser.id);
   }

   Future<void> stopBackup() async {
-    await ref.read(driftBackgroundUploadFgService).disableUploadService();
     await ref.read(driftBackupProvider.notifier).cancel();
   }


@@ -79,7 +79,7 @@ class _ChangeExperiencePageState extends ConsumerState<ChangeExperiencePage> {
       ref.read(readonlyModeProvider.notifier).setReadonlyMode(false);
       await migrateStoreToIsar(ref.read(isarProvider), ref.read(driftProvider));
       await ref.read(backgroundServiceProvider).resumeServiceIfEnabled();
-      await ref.read(driftBackgroundUploadFgService).disableUploadService();
+      await ref.read(driftBackgroundUploadFgService).disable();
     }

     await IsarStoreRepository(ref.read(isarProvider)).upsert(StoreKey.betaTimeline, widget.switchingToBeta);


@@ -2,8 +2,10 @@ import 'package:auto_route/auto_route.dart';
 import 'package:flutter/material.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/domain/models/store.model.dart';
+import 'package:immich_mobile/domain/utils/isolate_lock_manager.dart';
 import 'package:immich_mobile/entities/store.entity.dart';
 import 'package:immich_mobile/providers/auth.provider.dart';
+import 'package:immich_mobile/providers/background_sync.provider.dart';
 import 'package:immich_mobile/providers/backup/backup.provider.dart';
 import 'package:immich_mobile/providers/gallery_permission.provider.dart';
 import 'package:immich_mobile/providers/server_info.provider.dart';
@@ -21,14 +23,23 @@ class SplashScreenPage extends StatefulHookConsumerWidget {
 class SplashScreenPageState extends ConsumerState<SplashScreenPage> {
   final log = Logger("SplashScreenPage");

   @override
   void initState() {
     super.initState();
-    ref
-        .read(authProvider.notifier)
-        .setOpenApiServiceEndpoint()
-        .then(logConnectionInfo)
-        .whenComplete(() => resumeSession());
+    final lockManager = ref.read(isolateLockManagerProvider(kIsolateLockManagerPort));
+    lockManager.requestHolderToClose();
+    lockManager
+        .acquireLock()
+        .timeout(const Duration(seconds: 5))
+        .whenComplete(
+          () => ref
+              .read(authProvider.notifier)
+              .setOpenApiServiceEndpoint()
+              .then(logConnectionInfo)
+              .whenComplete(() => resumeSession()),
+        );
   }

   void logConnectionInfo(String? endpoint) {


@@ -59,9 +59,9 @@ class BackgroundWorkerFgHostApi {
   final String pigeonVar_messageChannelSuffix;

-  Future<void> enableSyncWorker() async {
+  Future<void> enable() async {
     final String pigeonVar_channelName =
-        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableSyncWorker$pigeonVar_messageChannelSuffix';
+        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enable$pigeonVar_messageChannelSuffix';
     final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
       pigeonVar_channelName,
       pigeonChannelCodec,
@@ -82,32 +82,9 @@ class BackgroundWorkerFgHostApi {
     }
   }

-  Future<void> enableUploadWorker() async {
+  Future<void> disable() async {
     final String pigeonVar_channelName =
-        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.enableUploadWorker$pigeonVar_messageChannelSuffix';
+        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disable$pigeonVar_messageChannelSuffix';
-    final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
-      pigeonVar_channelName,
-      pigeonChannelCodec,
-      binaryMessenger: pigeonVar_binaryMessenger,
-    );
-    final Future<Object?> pigeonVar_sendFuture = pigeonVar_channel.send(null);
-    final List<Object?>? pigeonVar_replyList = await pigeonVar_sendFuture as List<Object?>?;
-    if (pigeonVar_replyList == null) {
-      throw _createConnectionError(pigeonVar_channelName);
-    } else if (pigeonVar_replyList.length > 1) {
-      throw PlatformException(
-        code: pigeonVar_replyList[0]! as String,
-        message: pigeonVar_replyList[1] as String?,
-        details: pigeonVar_replyList[2],
-      );
-    } else {
-      return;
-    }
-  }
-
-  Future<void> disableUploadWorker() async {
-    final String pigeonVar_channelName =
-        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFgHostApi.disableUploadWorker$pigeonVar_messageChannelSuffix';
     final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
       pigeonVar_channelName,
       pigeonChannelCodec,
@@ -192,8 +169,6 @@ class BackgroundWorkerBgHostApi {
 abstract class BackgroundWorkerFlutterApi {
   static const MessageCodec<Object?> pigeonChannelCodec = _PigeonCodec();

-  Future<void> onLocalSync(int? maxSeconds);
-
   Future<void> onIosUpload(bool isRefresh, int? maxSeconds);

   Future<void> onAndroidUpload();
@@ -206,35 +181,6 @@ abstract class BackgroundWorkerFlutterApi {
     String messageChannelSuffix = '',
   }) {
     messageChannelSuffix = messageChannelSuffix.isNotEmpty ? '.$messageChannelSuffix' : '';
-    {
-      final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
-        'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onLocalSync$messageChannelSuffix',
-        pigeonChannelCodec,
-        binaryMessenger: binaryMessenger,
-      );
-      if (api == null) {
-        pigeonVar_channel.setMessageHandler(null);
-      } else {
-        pigeonVar_channel.setMessageHandler((Object? message) async {
-          assert(
-            message != null,
-            'Argument for dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onLocalSync was null.',
-          );
-          final List<Object?> args = (message as List<Object?>?)!;
-          final int? arg_maxSeconds = (args[0] as int?);
-          try {
-            await api.onLocalSync(arg_maxSeconds);
-            return wrapResponse(empty: true);
-          } on PlatformException catch (e) {
-            return wrapResponse(error: e);
-          } catch (e) {
-            return wrapResponse(
-              error: PlatformException(code: 'error', message: e.toString()),
-            );
-          }
-        });
-      }
-    }
     {
       final BasicMessageChannel<Object?> pigeonVar_channel = BasicMessageChannel<Object?>(
         'dev.flutter.pigeon.immich_mobile.BackgroundWorkerFlutterApi.onIosUpload$messageChannelSuffix',


@@ -1,5 +1,6 @@
 import 'package:auto_route/auto_route.dart';
 import 'package:flutter/material.dart';
+import 'package:flutter_hooks/flutter_hooks.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/extensions/build_context_extensions.dart';
 import 'package:immich_mobile/extensions/translate_extensions.dart';
@@ -38,14 +39,14 @@ class DriftPlacePage extends StatelessWidget {
   }
 }

-class _PlaceSliverAppBar extends StatelessWidget {
+class _PlaceSliverAppBar extends HookWidget {
   const _PlaceSliverAppBar({required this.search});

   final ValueNotifier<String?> search;

   @override
   Widget build(BuildContext context) {
-    final searchFocusNode = FocusNode();
+    final searchFocusNode = useFocusNode();

     return SliverAppBar(
       floating: true,


@@ -19,6 +19,7 @@ import 'package:immich_mobile/providers/infrastructure/current_album.provider.da
 import 'package:immich_mobile/providers/timeline/multiselect.provider.dart';
 import 'package:immich_mobile/providers/user.provider.dart';
 import 'package:immich_mobile/routing/router.dart';
+import 'package:immich_mobile/utils/album_filter.utils.dart';
 import 'package:immich_mobile/widgets/common/confirm_dialog.dart';
 import 'package:immich_mobile/widgets/common/immich_toast.dart';
 import 'package:immich_mobile/widgets/common/search_field.dart';
@@ -39,8 +40,12 @@ class AlbumSelector extends ConsumerStatefulWidget {
 class _AlbumSelectorState extends ConsumerState<AlbumSelector> {
   bool isGrid = false;
   final searchController = TextEditingController();
-  QuickFilterMode filterMode = QuickFilterMode.all;
   final searchFocusNode = FocusNode();
+  List<RemoteAlbum> sortedAlbums = [];
+  List<RemoteAlbum> shownAlbums = [];
+  AlbumFilter filter = AlbumFilter(query: "", mode: QuickFilterMode.all);
+  AlbumSort sort = AlbumSort(mode: RemoteAlbumSortMode.lastModified, isReverse: true);

   @override
   void initState() {
@@ -52,7 +57,7 @@ class _AlbumSelectorState extends ConsumerState<AlbumSelector> {
     });

     searchController.addListener(() {
-      onSearch(searchController.text, filterMode);
+      onSearch(searchController.text, filter.mode);
     });

     searchFocusNode.addListener(() {
@@ -62,9 +67,11 @@ class _AlbumSelectorState extends ConsumerState<AlbumSelector> {
     });
   }

-  void onSearch(String searchTerm, QuickFilterMode sortMode) {
+  void onSearch(String searchTerm, QuickFilterMode filterMode) {
     final userId = ref.watch(currentUserProvider)?.id;
-    ref.read(remoteAlbumProvider.notifier).searchAlbums(searchTerm, userId, sortMode);
+    filter = filter.copyWith(query: searchTerm, userId: userId, mode: filterMode);
+    filterAlbums();
   }

   Future<void> onRefresh() async {
@@ -77,17 +84,60 @@ class _AlbumSelectorState extends ConsumerState<AlbumSelector> {
     });
   }

-  void changeFilter(QuickFilterMode sortMode) {
+  void changeFilter(QuickFilterMode mode) {
     setState(() {
-      filterMode = sortMode;
+      filter = filter.copyWith(mode: mode);
     });
+    filterAlbums();
+  }
+
+  Future<void> changeSort(AlbumSort sort) async {
+    setState(() {
+      this.sort = sort;
+    });
+    await sortAlbums();
   }

   void clearSearch() {
     setState(() {
-      filterMode = QuickFilterMode.all;
+      filter = filter.copyWith(mode: QuickFilterMode.all, query: null);
       searchController.clear();
-      ref.read(remoteAlbumProvider.notifier).clearSearch();
     });
+    filterAlbums();
+  }
+
+  Future<void> sortAlbums() async {
+    final sorted = await ref
+        .read(remoteAlbumProvider.notifier)
+        .sortAlbums(ref.read(remoteAlbumProvider).albums, sort.mode, isReverse: sort.isReverse);
+    setState(() {
+      sortedAlbums = sorted;
+    });
+    // we need to re-filter the albums after sorting
+    // so shownAlbums gets updated
+    filterAlbums();
+  }
+
+  Future<void> filterAlbums() async {
+    if (filter.query == null) {
+      setState(() {
+        shownAlbums = sortedAlbums;
+      });
+      return;
+    }
+    final filteredAlbums = ref
+        .read(remoteAlbumProvider.notifier)
+        .searchAlbums(sortedAlbums, filter.query!, filter.userId, filter.mode);
+    setState(() {
+      shownAlbums = filteredAlbums;
+    });
   }
@@ -100,36 +150,41 @@ class _AlbumSelectorState extends ConsumerState<AlbumSelector> {
   @override
   Widget build(BuildContext context) {
-    final albums = ref.watch(remoteAlbumProvider.select((s) => s.filteredAlbums));
     final userId = ref.watch(currentUserProvider)?.id;

+    // refilter and sort when albums change
+    ref.listen(remoteAlbumProvider.select((state) => state.albums), (_, _) async {
+      await sortAlbums();
+    });
+
     return MultiSliver(
       children: [
         _SearchBar(
           searchController: searchController,
           searchFocusNode: searchFocusNode,
           onSearch: onSearch,
-          filterMode: filterMode,
+          filterMode: filter.mode,
           onClearSearch: clearSearch,
         ),
         _QuickFilterButtonRow(
-          filterMode: filterMode,
+          filterMode: filter.mode,
           onChangeFilter: changeFilter,
           onSearch: onSearch,
           searchController: searchController,
         ),
-        _QuickSortAndViewMode(isGrid: isGrid, onToggleViewMode: toggleViewMode),
+        _QuickSortAndViewMode(isGrid: isGrid, onToggleViewMode: toggleViewMode, onSortChanged: changeSort),
         isGrid
-            ? _AlbumGrid(albums: albums, userId: userId, onAlbumSelected: widget.onAlbumSelected)
-            : _AlbumList(albums: albums, userId: userId, onAlbumSelected: widget.onAlbumSelected),
+            ? _AlbumGrid(albums: shownAlbums, userId: userId, onAlbumSelected: widget.onAlbumSelected)
+            : _AlbumList(albums: shownAlbums, userId: userId, onAlbumSelected: widget.onAlbumSelected),
       ],
     );
   }
 }

 class _SortButton extends ConsumerStatefulWidget {
-  const _SortButton();
+  const _SortButton(this.onSortChanged);
+
+  final Future<void> Function(AlbumSort) onSortChanged;

   @override
   ConsumerState<_SortButton> createState() => _SortButtonState();
@@ -148,15 +203,15 @@ class _SortButtonState extends ConsumerState<_SortButton> {
         albumSortIsReverse = !albumSortIsReverse;
         isSorting = true;
       });
-      await ref.read(remoteAlbumProvider.notifier).sortFilteredAlbums(sortMode, isReverse: albumSortIsReverse);
     } else {
       setState(() {
         albumSortOption = sortMode;
         isSorting = true;
       });
-      await ref.read(remoteAlbumProvider.notifier).sortFilteredAlbums(sortMode, isReverse: albumSortIsReverse);
     }
+    await widget.onSortChanged.call(AlbumSort(mode: albumSortOption, isReverse: albumSortIsReverse));
     setState(() {
       isSorting = false;
     });
@@ -394,10 +449,11 @@ class _QuickFilterButton extends StatelessWidget {
 }

 class _QuickSortAndViewMode extends StatelessWidget {
-  const _QuickSortAndViewMode({required this.isGrid, required this.onToggleViewMode});
+  const _QuickSortAndViewMode({required this.isGrid, required this.onToggleViewMode, required this.onSortChanged});

   final bool isGrid;
   final VoidCallback onToggleViewMode;
+  final Future<void> Function(AlbumSort) onSortChanged;

   @override
   Widget build(BuildContext context) {
@@ -407,7 +463,7 @@ class _QuickSortAndViewMode extends StatelessWidget {
       child: Row(
         mainAxisAlignment: MainAxisAlignment.spaceBetween,
         children: [
-          const _SortButton(),
+          _SortButton(onSortChanged),
           IconButton(
             icon: Icon(isGrid ? Icons.view_list_outlined : Icons.grid_view_outlined, size: 24),
             onPressed: onToggleViewMode,

View File

@@ -3,6 +3,7 @@ import 'dart:async';
 import 'package:auto_route/auto_route.dart';
 import 'package:easy_localization/easy_localization.dart';
 import 'package:flutter/material.dart';
+import 'package:flutter/services.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
 import 'package:immich_mobile/domain/models/timeline.model.dart';
@@ -129,6 +130,7 @@ class _AssetViewerState extends ConsumerState<AssetViewer> {
     reloadSubscription?.cancel();
     _prevPreCacheStream?.removeListener(_dummyListener);
     _nextPreCacheStream?.removeListener(_dummyListener);
+    SystemChrome.setEnabledSystemUIMode(SystemUiMode.edgeToEdge);
     super.dispose();
   }
@@ -596,6 +598,7 @@ class _AssetViewerState extends ConsumerState<AssetViewer> {
     // Rebuild the widget when the asset viewer state changes
     // Using multiple selectors to avoid unnecessary rebuilds for other state changes
     ref.watch(assetViewerProvider.select((s) => s.showingBottomSheet));
+    ref.watch(assetViewerProvider.select((s) => s.showingControls));
     ref.watch(assetViewerProvider.select((s) => s.backgroundOpacity));
     ref.watch(assetViewerProvider.select((s) => s.stackIndex));
     ref.watch(isPlayingMotionVideoProvider);
@@ -612,6 +615,15 @@ class _AssetViewerState extends ConsumerState<AssetViewer> {
       });
     });

+    // Listen for control visibility changes and change system UI mode accordingly
+    ref.listen(assetViewerProvider.select((value) => value.showingControls), (_, showingControls) async {
+      if (showingControls) {
+        SystemChrome.setEnabledSystemUIMode(SystemUiMode.edgeToEdge);
+      } else {
+        SystemChrome.setEnabledSystemUIMode(SystemUiMode.immersiveSticky);
+      }
+    });
+
     // Currently it is not possible to scroll the asset when the bottom sheet is open all the way.
     // Issue: https://github.com/flutter/flutter/issues/109037
     // TODO: Add a custom scrum builder once the fix lands on stable
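
The listener above follows a common Flutter pattern: map a boolean "controls visible" flag onto `SystemChrome.setEnabledSystemUIMode`, and restore edge-to-edge on dispose so the app never stays stuck in immersive mode. A minimal standalone sketch of the same pattern (the widget name here is illustrative, not from this PR):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

/// Toggles the system bars in step with a controls-visible flag and restores
/// edge-to-edge mode on dispose, mirroring the viewer changes in this diff.
class ImmersiveToggle extends StatefulWidget {
  const ImmersiveToggle({super.key, required this.showingControls, required this.child});

  final bool showingControls;
  final Widget child;

  @override
  State<ImmersiveToggle> createState() => _ImmersiveToggleState();
}

class _ImmersiveToggleState extends State<ImmersiveToggle> {
  @override
  void didUpdateWidget(ImmersiveToggle oldWidget) {
    super.didUpdateWidget(oldWidget);
    if (oldWidget.showingControls != widget.showingControls) {
      SystemChrome.setEnabledSystemUIMode(
        widget.showingControls ? SystemUiMode.edgeToEdge : SystemUiMode.immersiveSticky,
      );
    }
  }

  @override
  void dispose() {
    // Never leave the app stuck in immersive mode after the viewer closes.
    SystemChrome.setEnabledSystemUIMode(SystemUiMode.edgeToEdge);
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => widget.child;
}
```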

View File

@@ -62,7 +62,7 @@ class ViewerBottomBar extends ConsumerWidget {
       duration: Durations.short2,
       child: AnimatedSwitcher(
         duration: Durations.short4,
-        child: isSheetOpen || isReadonlyModeEnabled
+        child: isSheetOpen
             ? const SizedBox.shrink()
             : Theme(
                 data: context.themeData.copyWith(
@@ -72,14 +72,14 @@ class ViewerBottomBar extends ConsumerWidget {
                   ),
                 ),
                 child: Container(
-                  height: context.padding.bottom + (asset.isVideo ? 160 : 90),
                   color: Colors.black.withAlpha(125),
-                  padding: EdgeInsets.only(bottom: context.padding.bottom),
+                  padding: EdgeInsets.only(bottom: context.padding.bottom, top: 16),
                   child: Column(
                     mainAxisAlignment: MainAxisAlignment.end,
                     children: [
                       if (asset.isVideo) const VideoControls(),
-                      if (!isInLockedView) Row(mainAxisAlignment: MainAxisAlignment.spaceEvenly, children: actions),
+                      if (!isInLockedView && !isReadonlyModeEnabled)
+                        Row(mainAxisAlignment: MainAxisAlignment.spaceEvenly, children: actions),
                     ],
                   ),
                 ),

View File

@@ -1,7 +1,9 @@
 import 'package:flutter/material.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
+import 'package:immich_mobile/extensions/translate_extensions.dart';
 import 'package:immich_mobile/constants/enums.dart';
 import 'package:immich_mobile/domain/models/album/album.model.dart';
+import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
 import 'package:immich_mobile/presentation/widgets/action_buttons/archive_action_button.widget.dart';
 import 'package:immich_mobile/presentation/widgets/action_buttons/delete_permanent_action_button.widget.dart';
 import 'package:immich_mobile/presentation/widgets/action_buttons/delete_local_action_button.widget.dart';
@@ -16,22 +18,74 @@ import 'package:immich_mobile/presentation/widgets/action_buttons/share_link_act
 import 'package:immich_mobile/presentation/widgets/action_buttons/stack_action_button.widget.dart';
 import 'package:immich_mobile/presentation/widgets/action_buttons/trash_action_button.widget.dart';
 import 'package:immich_mobile/presentation/widgets/action_buttons/upload_action_button.widget.dart';
+import 'package:immich_mobile/presentation/widgets/album/album_selector.widget.dart';
 import 'package:immich_mobile/presentation/widgets/bottom_sheet/base_bottom_sheet.widget.dart';
+import 'package:immich_mobile/providers/infrastructure/album.provider.dart';
 import 'package:immich_mobile/providers/server_info.provider.dart';
 import 'package:immich_mobile/providers/timeline/multiselect.provider.dart';
+import 'package:immich_mobile/widgets/common/immich_toast.dart';

-class RemoteAlbumBottomSheet extends ConsumerWidget {
+class RemoteAlbumBottomSheet extends ConsumerStatefulWidget {
   final RemoteAlbum album;

   const RemoteAlbumBottomSheet({super.key, required this.album});

   @override
-  Widget build(BuildContext context, WidgetRef ref) {
+  ConsumerState<RemoteAlbumBottomSheet> createState() => _RemoteAlbumBottomSheetState();
+}
+
+class _RemoteAlbumBottomSheetState extends ConsumerState<RemoteAlbumBottomSheet> {
+  late DraggableScrollableController sheetController;
+
+  @override
+  void initState() {
+    super.initState();
+    sheetController = DraggableScrollableController();
+  }
+
+  @override
+  void dispose() {
+    sheetController.dispose();
+    super.dispose();
+  }
+
+  @override
+  Widget build(BuildContext context) {
     final multiselect = ref.watch(multiSelectProvider);
     final isTrashEnable = ref.watch(serverInfoProvider.select((state) => state.serverFeatures.trash));

+    Future<void> addAssetsToAlbum(RemoteAlbum album) async {
+      final selectedAssets = multiselect.selectedAssets;
+      if (selectedAssets.isEmpty) {
+        return;
+      }
+
+      final addedCount = await ref
+          .read(remoteAlbumProvider.notifier)
+          .addAssets(album.id, selectedAssets.map((e) => (e as RemoteAsset).id).toList());
+
+      if (addedCount != selectedAssets.length) {
+        ImmichToast.show(
+          context: context,
+          msg: 'add_to_album_bottom_sheet_already_exists'.t(context: context, args: {"album": album.name}),
+        );
+      } else {
+        ImmichToast.show(
+          context: context,
+          msg: 'add_to_album_bottom_sheet_added'.t(context: context, args: {"album": album.name}),
+        );
+      }
+
+      ref.read(multiSelectProvider.notifier).reset();
+    }
+
+    Future<void> onKeyboardExpand() {
+      return sheetController.animateTo(0.85, duration: const Duration(milliseconds: 200), curve: Curves.easeInOut);
+    }
+
     return BaseBottomSheet(
-      initialChildSize: 0.25,
-      maxChildSize: 0.4,
+      controller: sheetController,
+      initialChildSize: 0.45,
+      maxChildSize: 0.85,
       shouldCloseOnMinExtent: false,
       actions: [
         const ShareActionButton(source: ActionSource.timeline),
@@ -52,7 +106,11 @@ class RemoteAlbumBottomSheet extends ConsumerWidget {
           const DeleteLocalActionButton(source: ActionSource.timeline),
           const UploadActionButton(source: ActionSource.timeline),
         ],
-        RemoveFromAlbumActionButton(source: ActionSource.timeline, albumId: album.id),
+        RemoveFromAlbumActionButton(source: ActionSource.timeline, albumId: widget.album.id),
+      ],
+      slivers: [
+        const AddToAlbumHeader(),
+        AlbumSelector(onAlbumSelected: addAssetsToAlbum, onKeyboardExpanded: onKeyboardExpand),
       ],
     );
   }

View File

@@ -10,6 +10,7 @@ import 'package:immich_mobile/presentation/widgets/timeline/constants.dart';
 import 'package:immich_mobile/presentation/widgets/timeline/segment.model.dart';
 import 'package:immich_mobile/presentation/widgets/timeline/timeline.state.dart';
 import 'package:intl/intl.dart' hide TextDirection;
+import 'package:immich_mobile/providers/haptic_feedback.provider.dart';

 /// A widget that will display a BoxScrollView with a ScrollThumb that can be dragged
 /// for quick navigation of the BoxScrollView.
@@ -74,6 +75,7 @@ List<_Segment> _buildSegments({required List<Segment> layoutSegments, required d
 }

 class ScrubberState extends ConsumerState<Scrubber> with TickerProviderStateMixin {
+  String? _lastLabel;
   double _thumbTopOffset = 0.0;
   bool _isDragging = false;
   List<_Segment> _segments = [];
@@ -172,6 +174,7 @@ class ScrubberState extends ConsumerState<Scrubber> with TickerProviderStateMixi
       _isDragging = true;
       _labelAnimationController.forward();
       _fadeOutTimer?.cancel();
+      _lastLabel = null;
     });
   }
@@ -189,6 +192,11 @@ class ScrubberState extends ConsumerState<Scrubber> with TickerProviderStateMixi
     if (nearestMonthSegment != null) {
       _snapToSegment(nearestMonthSegment);
+
+      final label = nearestMonthSegment.scrollLabel;
+      if (_lastLabel != label) {
+        ref.read(hapticFeedbackProvider.notifier).selectionClick();
+        _lastLabel = label;
+      }
     }
   }
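
The scrubber change fires haptic feedback only when the snapped label actually changes, and resets on drag start so the first snap of a new drag always ticks. The dedup guard can be isolated into a few lines (class and method names here are illustrative, not from this PR):

```dart
/// Remembers the last scrub label and reports whether a newly snapped label
/// should trigger a haptic tick, mirroring the `_lastLabel` guard above.
class LabelHaptics {
  String? _lastLabel;

  /// Returns true exactly when [label] differs from the previous call.
  bool shouldTick(String label) {
    if (_lastLabel == label) return false;
    _lastLabel = label;
    return true;
  }

  /// Matches the diff's `_lastLabel = null` on drag start, so the first
  /// snapped segment of a new drag always produces feedback.
  void reset() => _lastLabel = null;
}
```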

View File

@@ -3,6 +3,7 @@ import 'dart:async';
 import 'package:flutter/foundation.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/domain/services/log.service.dart';
+import 'package:immich_mobile/domain/utils/isolate_lock_manager.dart';
 import 'package:immich_mobile/entities/store.entity.dart';
 import 'package:immich_mobile/models/backup/backup_state.model.dart';
 import 'package:immich_mobile/providers/album/album.provider.dart';
@@ -81,6 +82,12 @@ class AppLifeCycleNotifier extends StateNotifier<AppLifeCycleEnum> {
       }
     } else {
       _ref.read(backupProvider.notifier).cancelBackup();
+
+      final lockManager = _ref.read(isolateLockManagerProvider(kIsolateLockManagerPort));
+      lockManager.requestHolderToClose();
+      debugPrint("Requested lock holder to close on resume");
+      await lockManager.acquireLock();
+      debugPrint("Lock acquired for background sync on resume");

       final backgroundManager = _ref.read(backgroundSyncProvider);
       // Ensure proper cleanup before starting new background tasks
@@ -130,7 +137,7 @@ class AppLifeCycleNotifier extends StateNotifier<AppLifeCycleEnum> {
     // do not stop/clean up anything on inactivity: issued on every orientation change
   }

-  void handleAppPause() {
+  Future<void> handleAppPause() async {
     state = AppLifeCycleEnum.paused;
     _wasPaused = true;
@@ -140,6 +147,12 @@ class AppLifeCycleNotifier extends StateNotifier<AppLifeCycleEnum> {
       if (_ref.read(backupProvider.notifier).backupProgress != BackUpProgressEnum.manualInProgress) {
         _ref.read(backupProvider.notifier).cancelBackup();
       }
+    } else {
+      final backgroundManager = _ref.read(backgroundSyncProvider);
+      await backgroundManager.cancel();
+      await backgroundManager.cancelLocal();
+      _ref.read(isolateLockManagerProvider(kIsolateLockManagerPort)).releaseLock();
+      debugPrint("Lock released on app pause");
     }

     _ref.read(websocketProvider.notifier).disconnect();
@@ -173,6 +186,7 @@ class AppLifeCycleNotifier extends StateNotifier<AppLifeCycleEnum> {
     }

     if (Store.isBetaTimelineEnabled) {
+      _ref.read(isolateLockManagerProvider(kIsolateLockManagerPort)).releaseLock();
       return;
     }

View File

@@ -1,5 +1,6 @@
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/domain/utils/background_sync.dart';
+import 'package:immich_mobile/domain/utils/isolate_lock_manager.dart';
 import 'package:immich_mobile/providers/sync_status.provider.dart';

 final backgroundSyncProvider = Provider<BackgroundSyncManager>((ref) {
@@ -18,3 +19,7 @@ final backgroundSyncProvider = Provider<BackgroundSyncManager>((ref) {
   ref.onDispose(manager.cancel);
   return manager;
 });
+
+final isolateLockManagerProvider = Provider.family<IsolateLockManager, String>((ref, name) {
+  return IsolateLockManager(portName: name);
+});
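
The provider above hands out one `IsolateLockManager` per named port, which the lifecycle code uses to hand the backup lock between the foreground and background isolates. The underlying idea can be sketched with `IsolateNameServer`: whichever isolate registers the named port holds the lock, and another isolate can message the holder before acquiring. This class is a simplified assumption for illustration, not the actual `IsolateLockManager` implementation:

```dart
import 'dart:isolate';
import 'dart:ui';

/// Simplified cross-isolate lock: whichever isolate registers the named
/// port holds the lock; others can ask the holder to close and release.
class SimpleIsolateLock {
  SimpleIsolateLock(this.portName);

  final String portName;
  ReceivePort? _port;

  /// Polls until the name registration succeeds, i.e. the previous holder
  /// has released the lock and removed its port-name mapping.
  Future<void> acquire() async {
    while (true) {
      final port = ReceivePort();
      if (IsolateNameServer.registerPortWithName(port.sendPort, portName)) {
        _port = port;
        return;
      }
      port.close();
      await Future<void>.delayed(const Duration(milliseconds: 100));
    }
  }

  /// Asks the current holder (if any) to shut down and release the lock.
  void requestHolderToClose() =>
      IsolateNameServer.lookupPortByName(portName)?.send('close');

  void release() {
    IsolateNameServer.removePortNameMapping(portName);
    _port?.close();
    _port = null;
  }
}
```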

View File

@@ -12,43 +12,42 @@ import 'album.provider.dart';

 class RemoteAlbumState {
   final List<RemoteAlbum> albums;
-  final List<RemoteAlbum> filteredAlbums;

-  const RemoteAlbumState({required this.albums, List<RemoteAlbum>? filteredAlbums})
-      : filteredAlbums = filteredAlbums ?? albums;
+  const RemoteAlbumState({required this.albums});

-  RemoteAlbumState copyWith({List<RemoteAlbum>? albums, List<RemoteAlbum>? filteredAlbums}) {
-    return RemoteAlbumState(albums: albums ?? this.albums, filteredAlbums: filteredAlbums ?? this.filteredAlbums);
+  RemoteAlbumState copyWith({List<RemoteAlbum>? albums}) {
+    return RemoteAlbumState(albums: albums ?? this.albums);
   }

   @override
-  String toString() => 'RemoteAlbumState(albums: ${albums.length}, filteredAlbums: ${filteredAlbums.length})';
+  String toString() => 'RemoteAlbumState(albums: ${albums.length})';

   @override
   bool operator ==(covariant RemoteAlbumState other) {
     if (identical(this, other)) return true;
     final listEquals = const DeepCollectionEquality().equals;

-    return listEquals(other.albums, albums) && listEquals(other.filteredAlbums, filteredAlbums);
+    return listEquals(other.albums, albums);
   }

   @override
-  int get hashCode => albums.hashCode ^ filteredAlbums.hashCode;
+  int get hashCode => albums.hashCode;
 }

 class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
   late RemoteAlbumService _remoteAlbumService;
   final _logger = Logger('RemoteAlbumNotifier');

   @override
   RemoteAlbumState build() {
     _remoteAlbumService = ref.read(remoteAlbumServiceProvider);
-    return const RemoteAlbumState(albums: [], filteredAlbums: []);
+    return const RemoteAlbumState(albums: []);
   }

   Future<List<RemoteAlbum>> _getAll() async {
     try {
       final albums = await _remoteAlbumService.getAll();
-      state = state.copyWith(albums: albums, filteredAlbums: albums);
+      state = state.copyWith(albums: albums);
       return albums;
     } catch (error, stack) {
       _logger.severe('Failed to fetch albums', error, stack);
@@ -60,19 +59,21 @@ class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
     await _getAll();
   }

-  void searchAlbums(String query, String? userId, [QuickFilterMode filterMode = QuickFilterMode.all]) {
-    final filtered = _remoteAlbumService.searchAlbums(state.albums, query, userId, filterMode);
-
-    state = state.copyWith(filteredAlbums: filtered);
+  List<RemoteAlbum> searchAlbums(
+    List<RemoteAlbum> albums,
+    String query,
+    String? userId, [
+    QuickFilterMode filterMode = QuickFilterMode.all,
+  ]) {
+    return _remoteAlbumService.searchAlbums(albums, query, userId, filterMode);
   }

-  void clearSearch() {
-    state = state.copyWith(filteredAlbums: state.albums);
-  }
-
-  Future<void> sortFilteredAlbums(RemoteAlbumSortMode sortMode, {bool isReverse = false}) async {
-    final sortedAlbums = await _remoteAlbumService.sortAlbums(state.filteredAlbums, sortMode, isReverse: isReverse);
-    state = state.copyWith(filteredAlbums: sortedAlbums);
+  Future<List<RemoteAlbum>> sortAlbums(
+    List<RemoteAlbum> albums,
+    RemoteAlbumSortMode sortMode, {
+    bool isReverse = false,
+  }) async {
+    return await _remoteAlbumService.sortAlbums(albums, sortMode, isReverse: isReverse);
   }

   Future<RemoteAlbum?> createAlbum({
@@ -83,7 +84,7 @@ class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
     try {
       final album = await _remoteAlbumService.createAlbum(title: title, description: description, assetIds: assetIds);

-      state = state.copyWith(albums: [...state.albums, album], filteredAlbums: [...state.filteredAlbums, album]);
+      state = state.copyWith(albums: [...state.albums, album]);

       return album;
     } catch (error, stack) {
@@ -114,11 +115,7 @@ class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
         return album.id == albumId ? updatedAlbum : album;
       }).toList();

-      final updatedFilteredAlbums = state.filteredAlbums.map((album) {
-        return album.id == albumId ? updatedAlbum : album;
-      }).toList();
-
-      state = state.copyWith(albums: updatedAlbums, filteredAlbums: updatedFilteredAlbums);
+      state = state.copyWith(albums: updatedAlbums);

       return updatedAlbum;
     } catch (error, stack) {
@@ -139,9 +136,7 @@ class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
     await _remoteAlbumService.deleteAlbum(albumId);

     final updatedAlbums = state.albums.where((album) => album.id != albumId).toList();
-    final updatedFilteredAlbums = state.filteredAlbums.where((album) => album.id != albumId).toList();
-
-    state = state.copyWith(albums: updatedAlbums, filteredAlbums: updatedFilteredAlbums);
+    state = state.copyWith(albums: updatedAlbums);
   }

   Future<List<RemoteAsset>> getAssets(String albumId) {
@@ -164,9 +159,7 @@ class RemoteAlbumNotifier extends Notifier<RemoteAlbumState> {
     await _remoteAlbumService.removeUser(albumId, userId: userId);

     final updatedAlbums = state.albums.where((album) => album.id != albumId).toList();
-    final updatedFilteredAlbums = state.filteredAlbums.where((album) => album.id != albumId).toList();
-
-    state = state.copyWith(albums: updatedAlbums, filteredAlbums: updatedFilteredAlbums);
+    state = state.copyWith(albums: updatedAlbums);
   }

   Future<void> setActivityStatus(String albumId, bool enabled) {

View File

@@ -0,0 +1,25 @@
+import 'package:immich_mobile/domain/services/remote_album.service.dart';
+import 'package:immich_mobile/models/albums/album_search.model.dart';
+
+class AlbumFilter {
+  String? userId;
+  String? query;
+  QuickFilterMode mode;
+
+  AlbumFilter({required this.mode, this.userId, this.query});
+
+  AlbumFilter copyWith({String? userId, String? query, QuickFilterMode? mode}) {
+    return AlbumFilter(userId: userId ?? this.userId, query: query ?? this.query, mode: mode ?? this.mode);
+  }
+}
+
+class AlbumSort {
+  RemoteAlbumSortMode mode;
+  bool isReverse;
+
+  AlbumSort({required this.mode, this.isReverse = false});
+
+  AlbumSort copyWith({RemoteAlbumSortMode? mode, bool? isReverse}) {
+    return AlbumSort(mode: mode ?? this.mode, isReverse: isReverse ?? this.isReverse);
+  }
+}
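
With filter and sort captured in plain value objects, a page can hold them in local widget state and re-derive the visible list on demand, instead of mutating a shared `filteredAlbums` in the provider. A rough sketch of that derivation, assuming the provider's pure `searchAlbums`/`sortAlbums` from this PR (the function name is illustrative):

```dart
// Illustrative: derive the shown list from local filter/sort state.
// The full album list stays untouched in the provider, so pull-to-refresh
// and navigation can no longer clobber the user's filter or sort choice.
Future<List<RemoteAlbum>> deriveShownAlbums(
  List<RemoteAlbum> albums,
  AlbumFilter filter,
  AlbumSort sort,
  RemoteAlbumNotifier notifier,
) async {
  final filtered = notifier.searchAlbums(albums, filter.query ?? '', filter.userId, filter.mode);
  return notifier.sortAlbums(filtered, sort.mode, isReverse: sort.isReverse);
}
```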

View File

@@ -8,12 +8,14 @@ import 'package:immich_mobile/extensions/build_context_extensions.dart';
 import 'package:immich_mobile/providers/album/album.provider.dart';
 import 'package:immich_mobile/providers/routes.provider.dart';
 import 'package:immich_mobile/widgets/album/add_to_album_sliverlist.dart';
+import 'package:immich_mobile/widgets/album/add_to_album_bottom_sheet.dart';
 import 'package:immich_mobile/models/asset_selection_state.dart';
 import 'package:immich_mobile/widgets/asset_grid/delete_dialog.dart';
 import 'package:immich_mobile/widgets/asset_grid/upload_dialog.dart';
 import 'package:immich_mobile/providers/server_info.provider.dart';
 import 'package:immich_mobile/widgets/common/drag_sheet.dart';
 import 'package:immich_mobile/entities/album.entity.dart';
+import 'package:immich_mobile/entities/asset.entity.dart';
 import 'package:immich_mobile/utils/draggable_scroll_controller.dart';

 final controlBottomAppBarNotifier = ControlBottomAppBarNotifier();
@@ -45,6 +47,7 @@ class ControlBottomAppBar extends HookConsumerWidget {
   final bool unfavorite;
   final bool unarchive;
   final AssetSelectionState selectionAssetState;
+  final List<Asset> selectedAssets;

   const ControlBottomAppBar({
     super.key,
@@ -64,6 +67,7 @@ class ControlBottomAppBar extends HookConsumerWidget {
     this.onRemoveFromAlbum,
     this.onToggleLocked,
     this.selectionAssetState = const AssetSelectionState(),
+    this.selectedAssets = const [],
     this.enabled = true,
     this.unarchive = false,
     this.unfavorite = false,
@@ -100,6 +104,18 @@ class ControlBottomAppBar extends HookConsumerWidget {
       );
     }

+    /// Show existing AddToAlbumBottomSheet
+    void showAddToAlbumBottomSheet() {
+      showModalBottomSheet(
+        elevation: 0,
+        shape: const RoundedRectangleBorder(borderRadius: BorderRadius.all(Radius.circular(15.0))),
+        context: context,
+        builder: (BuildContext _) {
+          return AddToAlbumBottomSheet(assets: selectedAssets);
+        },
+      );
+    }
+
     void handleRemoteDelete(bool force, Function(bool) deleteCb, {String? alertMsg}) {
       if (!force) {
         deleteCb(force);
@@ -121,6 +137,15 @@ class ControlBottomAppBar extends HookConsumerWidget {
           label: "share_link".tr(),
           onPressed: enabled ? () => onShare(false) : null,
         ),
+        if (!isInLockedView && hasRemote && albums.isNotEmpty)
+          ConstrainedBox(
+            constraints: const BoxConstraints(maxWidth: 100),
+            child: ControlBoxButton(
+              iconData: Icons.photo_album,
+              label: "add_to_album".tr(),
+              onPressed: enabled ? showAddToAlbumBottomSheet : null,
+            ),
+          ),
         if (hasRemote && onArchive != null)
           ControlBoxButton(
             iconData: unarchive ? Icons.unarchive_outlined : Icons.archive_outlined,

View File

@@ -440,6 +440,7 @@ class MultiselectGrid extends HookConsumerWidget {
       onUpload: onUpload,
       enabled: !processing.value,
       selectionAssetState: selectionAssetState.value,
+      selectedAssets: selection.value.toList(),
       onStack: stackEnabled ? onStack : null,
       onEditTime: editEnabled ? onEditTime : null,
       onEditLocation: editEnabled ? onEditLocation : null,

View File

@@ -1,7 +1,8 @@
 import 'package:auto_route/auto_route.dart';
 import 'package:easy_localization/easy_localization.dart';
 import 'package:flutter/material.dart';
-import 'package:flutter_hooks/flutter_hooks.dart';
+import 'package:flutter_hooks/flutter_hooks.dart' hide Store;
+import 'package:immich_mobile/entities/store.entity.dart';
 import 'package:hooks_riverpod/hooks_riverpod.dart';
 import 'package:immich_mobile/extensions/build_context_extensions.dart';
 import 'package:immich_mobile/models/backup/backup_state.model.dart';
@@ -259,7 +260,7 @@ class ImmichAppBarDialog extends HookConsumerWidget {
           const AppBarProfileInfoBox(),
           buildStorageInformation(),
           const AppBarServerInfo(),
-          if (isReadonlyModeEnabled) buildReadonlyMessage(),
+          if (Store.isBetaTimelineEnabled && isReadonlyModeEnabled) buildReadonlyMessage(),
           buildAppLogButton(),
           buildSettingButton(),
           buildSignOutButton(),

View File

@@ -121,7 +121,6 @@ class PhotoViewCore extends StatefulWidget {
 class PhotoViewCoreState extends State<PhotoViewCore>
 with TickerProviderStateMixin, PhotoViewControllerDelegate, HitCornersDetector {
-Offset? _normalizedPosition;
 double? _scaleBefore;
 double? _rotationBefore;
@@ -154,7 +153,6 @@ class PhotoViewCoreState extends State<PhotoViewCore>
 void onScaleStart(ScaleStartDetails details) {
 _rotationBefore = controller.rotation;
 _scaleBefore = scale;
-_normalizedPosition = details.focalPoint - controller.position;
 _scaleAnimationController.stop();
 _positionAnimationController.stop();
 _rotationAnimationController.stop();
@@ -166,8 +164,14 @@ class PhotoViewCoreState extends State<PhotoViewCore>
 };
 void onScaleUpdate(ScaleUpdateDetails details) {
+final centeredFocalPoint = Offset(
+details.focalPoint.dx - scaleBoundaries.outerSize.width / 2,
+details.focalPoint.dy - scaleBoundaries.outerSize.height / 2,
+);
 final double newScale = _scaleBefore! * details.scale;
-Offset delta = details.focalPoint - _normalizedPosition!;
+final double scaleDelta = newScale / scale;
+final Offset newPosition =
+(controller.position + details.focalPointDelta) * scaleDelta - centeredFocalPoint * (scaleDelta - 1);
 updateScaleStateFromNewScale(newScale);
@@ -176,7 +180,7 @@ class PhotoViewCoreState extends State<PhotoViewCore>
 updateMultiple(
 scale: newScale,
-position: panEnabled ? delta : clampPosition(position: delta * details.scale),
+position: panEnabled ? newPosition : clampPosition(position: newPosition),
 rotation: rotationEnabled ? _rotationBefore! + details.rotation : null,
 rotationFocusPoint: rotationEnabled ? details.focalPoint : null,
 );
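The rewritten onScaleUpdate above anchors the zoom on the pinch focal point instead of the gesture's start position: the current offset is multiplied by scaleDelta and corrected by centeredFocalPoint * (scaleDelta - 1), so the content under the fingers stays put. A minimal sketch of that invariant, under a simplified coordinate model (not the actual PhotoView internals; names mirror the patch):

```typescript
// Simplified model: `position` is the content offset, `scale` the zoom factor,
// and points are expressed relative to the viewport centre.
interface Point { x: number; y: number }

// The position update from the patch: scale the (pan-adjusted) offset, then
// subtract the focal-point correction so the focal point maps to the same content.
function zoomUpdate(
  position: Point,
  scale: number,
  newScale: number,
  focalPointDelta: Point,
  centeredFocalPoint: Point,
): Point {
  const k = newScale / scale; // scaleDelta in the patch
  return {
    x: (position.x + focalPointDelta.x) * k - centeredFocalPoint.x * (k - 1),
    y: (position.y + focalPointDelta.y) * k - centeredFocalPoint.y * (k - 1),
  };
}

// Which content coordinate sits under a (centred) screen point at a given state.
function contentAt(p: Point, position: Point, scale: number): Point {
  return { x: (p.x - position.x) / scale, y: (p.y - position.y) / scale };
}
```

With focalPointDelta at zero, the content coordinate under the focal point is identical before and after the scale change, which is exactly the "pinch stays anchored" behaviour the fix is after.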


@@ -31,7 +31,8 @@ class SmartSearchDto {
 this.model,
 this.page,
 this.personIds = const [],
-required this.query,
+this.query,
+this.queryAssetId,
 this.rating,
 this.size,
 this.state,
@@ -151,7 +152,21 @@ class SmartSearchDto {
 List<String> personIds;
-String query;
+///
+/// Please note: This property should have been non-nullable! Since the specification file
+/// does not include a default value (using the "default:" property), however, the generated
+/// source code must fall back to having a nullable type.
+/// Consider adding a "default:" property in the specification file to hide this note.
+///
+String? query;
+
+///
+/// Please note: This property should have been non-nullable! Since the specification file
+/// does not include a default value (using the "default:" property), however, the generated
+/// source code must fall back to having a nullable type.
+/// Consider adding a "default:" property in the specification file to hide this note.
+///
+String? queryAssetId;
 /// Minimum value: -1
 /// Maximum value: 5
@@ -278,6 +293,7 @@ class SmartSearchDto {
 other.page == page &&
 _deepEquality.equals(other.personIds, personIds) &&
 other.query == query &&
+other.queryAssetId == queryAssetId &&
 other.rating == rating &&
 other.size == size &&
 other.state == state &&
@@ -314,7 +330,8 @@ class SmartSearchDto {
 (model == null ? 0 : model!.hashCode) +
 (page == null ? 0 : page!.hashCode) +
 (personIds.hashCode) +
-(query.hashCode) +
+(query == null ? 0 : query!.hashCode) +
+(queryAssetId == null ? 0 : queryAssetId!.hashCode) +
 (rating == null ? 0 : rating!.hashCode) +
 (size == null ? 0 : size!.hashCode) +
 (state == null ? 0 : state!.hashCode) +
@@ -331,7 +348,7 @@ class SmartSearchDto {
 (withExif == null ? 0 : withExif!.hashCode);
 @override
-String toString() => 'SmartSearchDto[albumIds=$albumIds, city=$city, country=$country, createdAfter=$createdAfter, createdBefore=$createdBefore, deviceId=$deviceId, isEncoded=$isEncoded, isFavorite=$isFavorite, isMotion=$isMotion, isNotInAlbum=$isNotInAlbum, isOffline=$isOffline, language=$language, lensModel=$lensModel, libraryId=$libraryId, make=$make, model=$model, page=$page, personIds=$personIds, query=$query, rating=$rating, size=$size, state=$state, tagIds=$tagIds, takenAfter=$takenAfter, takenBefore=$takenBefore, trashedAfter=$trashedAfter, trashedBefore=$trashedBefore, type=$type, updatedAfter=$updatedAfter, updatedBefore=$updatedBefore, visibility=$visibility, withDeleted=$withDeleted, withExif=$withExif]';
+String toString() => 'SmartSearchDto[albumIds=$albumIds, city=$city, country=$country, createdAfter=$createdAfter, createdBefore=$createdBefore, deviceId=$deviceId, isEncoded=$isEncoded, isFavorite=$isFavorite, isMotion=$isMotion, isNotInAlbum=$isNotInAlbum, isOffline=$isOffline, language=$language, lensModel=$lensModel, libraryId=$libraryId, make=$make, model=$model, page=$page, personIds=$personIds, query=$query, queryAssetId=$queryAssetId, rating=$rating, size=$size, state=$state, tagIds=$tagIds, takenAfter=$takenAfter, takenBefore=$takenBefore, trashedAfter=$trashedAfter, trashedBefore=$trashedBefore, type=$type, updatedAfter=$updatedAfter, updatedBefore=$updatedBefore, visibility=$visibility, withDeleted=$withDeleted, withExif=$withExif]';
 Map<String, dynamic> toJson() {
 final json = <String, dynamic>{};
@@ -417,7 +434,16 @@ class SmartSearchDto {
 // json[r'page'] = null;
 }
 json[r'personIds'] = this.personIds;
+if (this.query != null) {
 json[r'query'] = this.query;
+} else {
+// json[r'query'] = null;
+}
+if (this.queryAssetId != null) {
+json[r'queryAssetId'] = this.queryAssetId;
+} else {
+// json[r'queryAssetId'] = null;
+}
 if (this.rating != null) {
 json[r'rating'] = this.rating;
 } else {
@@ -522,7 +548,8 @@ class SmartSearchDto {
 personIds: json[r'personIds'] is Iterable
 ? (json[r'personIds'] as Iterable).cast<String>().toList(growable: false)
 : const [],
-query: mapValueOfType<String>(json, r'query')!,
+query: mapValueOfType<String>(json, r'query'),
+queryAssetId: mapValueOfType<String>(json, r'queryAssetId'),
 rating: num.parse('${json[r'rating']}'),
 size: num.parse('${json[r'size']}'),
 state: mapValueOfType<String>(json, r'state'),
@@ -586,7 +613,6 @@ class SmartSearchDto {
 /// The list of required keys that must be present in a JSON.
 static const requiredKeys = <String>{
-'query',
 };
 }


@@ -69,6 +69,7 @@
 static const userMetadataDeleteV1 = SyncEntityType._(r'UserMetadataDeleteV1');
 static const syncAckV1 = SyncEntityType._(r'SyncAckV1');
 static const syncResetV1 = SyncEntityType._(r'SyncResetV1');
+static const syncCompleteV1 = SyncEntityType._(r'SyncCompleteV1');
 /// List of all possible values in this [enum][SyncEntityType].
 static const values = <SyncEntityType>[
@@ -118,6 +119,7 @@
 userMetadataDeleteV1,
 syncAckV1,
 syncResetV1,
+syncCompleteV1,
 ];
 static SyncEntityType? fromJson(dynamic value) => SyncEntityTypeTypeTransformer().decode(value);
@@ -202,6 +204,7 @@
 case r'UserMetadataDeleteV1': return SyncEntityType.userMetadataDeleteV1;
 case r'SyncAckV1': return SyncEntityType.syncAckV1;
 case r'SyncResetV1': return SyncEntityType.syncResetV1;
+case r'SyncCompleteV1': return SyncEntityType.syncCompleteV1;
 default:
 if (!allowNull) {
 throw ArgumentError('Unknown enum value to decode: $data');


@@ -13,12 +13,9 @@ import 'package:pigeon/pigeon.dart';
 )
 @HostApi()
 abstract class BackgroundWorkerFgHostApi {
-void enableSyncWorker();
-void enableUploadWorker();
-// Disables the background upload service
-void disableUploadWorker();
+void enable();
+void disable();
 }
 @HostApi()
@@ -27,15 +24,12 @@ abstract class BackgroundWorkerBgHostApi {
 // required platform channels to notify the native side to start the background upload
 void onInitialized();
-// Called from the background flutter engine to request the native side to cleanup
 void close();
 }
 @FlutterApi()
 abstract class BackgroundWorkerFlutterApi {
-// Android & iOS: Called when the local sync is triggered
-@async
-void onLocalSync(int? maxSeconds);
 // iOS Only: Called when the iOS background upload is triggered
 @async
 void onIosUpload(bool isRefresh, int? maxSeconds);


@@ -15,7 +15,7 @@ function dart {
 patch --no-backup-if-mismatch -u api.mustache <api.mustache.patch
 cd ../../
-pnpx @openapitools/openapi-generator-cli generate -g dart -i ./immich-openapi-specs.json -o ../mobile/openapi -t ./templates/mobile
+pnpm dlx @openapitools/openapi-generator-cli generate -g dart -i ./immich-openapi-specs.json -o ../mobile/openapi -t ./templates/mobile
 # Post generate patches
 patch --no-backup-if-mismatch -u ../mobile/openapi/lib/api_client.dart <./patch/api_client.dart.patch
@@ -27,7 +27,7 @@ function dart {
 }
 function typescript {
-pnpx oazapfts --optimistic --argumentStyle=object --useEnumType immich-openapi-specs.json typescript-sdk/src/fetch-client.ts
+pnpm dlx oazapfts --optimistic --argumentStyle=object --useEnumType immich-openapi-specs.json typescript-sdk/src/fetch-client.ts
 pnpm --filter @immich/sdk install --frozen-lockfile
 pnpm --filter @immich/sdk build
 }
@@ -35,8 +35,8 @@ function typescript {
 # requires server to be built
 (
 cd ..
-SHARP_IGNORE_GLOBAL_LIBVIPS=true pnpm --filter immich build
-pnpm --filter immich sync:open-api
+SHARP_IGNORE_GLOBAL_LIBVIPS=true mise run server:build
+mise run server:open-api
 )
 if [[ $1 == 'dart' ]]; then


@@ -14571,6 +14571,10 @@
 "query": {
 "type": "string"
 },
+"queryAssetId": {
+"format": "uuid",
+"type": "string"
+},
 "rating": {
 "maximum": 5,
 "minimum": -1,
@@ -14638,9 +14642,6 @@
 "type": "boolean"
 }
 },
-"required": [
-"query"
-],
 "type": "object"
 },
 "SourceType": {
@@ -15416,6 +15417,10 @@
 ],
 "type": "object"
 },
+"SyncCompleteV1": {
+"properties": {},
+"type": "object"
+},
 "SyncEntityType": {
 "enum": [
 "AuthUserV1",
@@ -15463,7 +15468,8 @@
 "UserMetadataV1",
 "UserMetadataDeleteV1",
 "SyncAckV1",
-"SyncResetV1"
+"SyncResetV1",
+"SyncCompleteV1"
 ],
 "type": "string"
 },


@@ -1 +0,0 @@
-22.18.0


@@ -11,9 +11,6 @@
 "default": "./build/index.js"
 }
 },
-"scripts": {
-"build": "tsc"
-},
 "license": "GNU Affero General Public License version 3",
 "dependencies": {
 "@oazapfts/runtime": "^1.0.2"
@@ -26,8 +23,5 @@
 "type": "git",
 "url": "git+https://github.com/immich-app/immich.git",
 "directory": "open-api/typescript-sdk"
-},
-"volta": {
-"node": "22.18.0"
 }
 }


@@ -1014,7 +1014,8 @@ export type SmartSearchDto = {
 model?: string | null;
 page?: number;
 personIds?: string[];
-query: string;
+query?: string;
+queryAssetId?: string;
 rating?: number;
 size?: number;
 state?: string | null;
@@ -4921,7 +4922,8 @@ export enum SyncEntityType {
 UserMetadataV1 = "UserMetadataV1",
 UserMetadataDeleteV1 = "UserMetadataDeleteV1",
 SyncAckV1 = "SyncAckV1",
-SyncResetV1 = "SyncResetV1"
+SyncResetV1 = "SyncResetV1",
+SyncCompleteV1 = "SyncCompleteV1"
 }
 export enum SyncRequestType {
 AlbumsV1 = "AlbumsV1",
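With query now optional and queryAssetId added to SmartSearchDto, a client can request "view similar photos" by example asset instead of free text. A hedged sketch of how a caller might build the request body (the helper and its either/or check are illustrative, not part of the generated SDK):

```typescript
// Subset of SmartSearchDto relevant here; the full type has many more filters.
type SmartSearchBody = { query?: string; queryAssetId?: string; page?: number; size?: number };

// Hypothetical helper: build a smart-search body from either free text or an
// example asset id. The server presumably expects at least one of the two,
// since `query` alone is no longer required.
function buildSmartSearch(input: { text?: string; exampleAssetId?: string }): SmartSearchBody {
  if (!input.text && !input.exampleAssetId) {
    throw new Error('either a text query or an example asset id is required');
  }
  return { query: input.text, queryAssetId: input.exampleAssetId };
}
```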


@@ -1 +0,0 @@
-22.18.0


@@ -5,34 +5,6 @@
 "author": "",
 "private": true,
 "license": "GNU Affero General Public License version 3",
-"scripts": {
-"build": "nest build",
-"format": "prettier --check .",
-"format:fix": "prettier --write .",
-"start": "npm run start:dev",
-"nest": "nest",
-"start:dev": "nest start --watch --",
-"start:debug": "nest start --debug 0.0.0.0:9230 --watch --",
-"lint": "eslint \"src/**/*.ts\" \"test/**/*.ts\" --max-warnings 0",
-"lint:fix": "npm run lint -- --fix",
-"check": "tsc --noEmit",
-"check:code": "npm run format && npm run lint && npm run check",
-"check:all": "npm run check:code && npm run test:cov",
-"test": "vitest --config test/vitest.config.mjs",
-"test:cov": "vitest --config test/vitest.config.mjs --coverage",
-"test:medium": "vitest --config test/vitest.config.medium.mjs",
-"typeorm": "typeorm",
-"lifecycle": "node ./dist/utils/lifecycle.js",
-"migrations:debug": "node ./dist/bin/migrations.js debug",
-"migrations:generate": "node ./dist/bin/migrations.js generate",
-"migrations:create": "node ./dist/bin/migrations.js create",
-"migrations:run": "node ./dist/bin/migrations.js run",
-"schema:drop": "node ./dist/bin/migrations.js query 'DROP schema public cascade; CREATE schema public;'",
-"schema:reset": "npm run schema:drop && npm run migrations:run",
-"sync:open-api": "node ./dist/bin/sync-open-api.js",
-"sync:sql": "node ./dist/bin/sync-sql.js",
-"email:dev": "email dev -p 3050 --dir src/emails"
-},
 "dependencies": {
 "@nestjs/bullmq": "^11.0.1",
 "@nestjs/common": "^11.0.4",
@@ -172,9 +144,6 @@
 "vite-tsconfig-paths": "^5.0.0",
 "vitest": "^3.0.0"
 },
-"volta": {
-"node": "22.18.0"
-},
 "overrides": {
 "sharp": "^0.34.2"
 }


@@ -128,12 +128,6 @@ describe(SearchController.name, () => {
 await request(ctx.getHttpServer()).post('/search/smart');
 expect(ctx.authenticate).toHaveBeenCalled();
 });
-it('should require a query', async () => {
-const { status, body } = await request(ctx.getHttpServer()).post('/search/smart').send({});
-expect(status).toBe(400);
-expect(body).toEqual(errorDto.badRequest(['query should not be empty', 'query must be a string']));
-});
 });
 describe('GET /search/explore', () => {


@@ -199,7 +199,12 @@ export class StatisticsSearchDto extends BaseSearchDto {
 export class SmartSearchDto extends BaseSearchWithResultsDto {
 @IsString()
 @IsNotEmpty()
-query!: string;
+@Optional()
+query?: string;
+
+@ValidateUUID({ optional: true })
+@Optional()
+queryAssetId?: string;
 @IsString()
 @IsNotEmpty()


@@ -336,6 +336,9 @@ export class SyncAckV1 {}
 @ExtraModel()
 export class SyncResetV1 {}
+@ExtraModel()
+export class SyncCompleteV1 {}
 export type SyncItem = {
 [SyncEntityType.AuthUserV1]: SyncAuthUserV1;
 [SyncEntityType.UserV1]: SyncUserV1;
@@ -382,6 +385,7 @@ export type SyncItem = {
 [SyncEntityType.UserMetadataV1]: SyncUserMetadataV1;
 [SyncEntityType.UserMetadataDeleteV1]: SyncUserMetadataDeleteV1;
 [SyncEntityType.SyncAckV1]: SyncAckV1;
+[SyncEntityType.SyncCompleteV1]: SyncCompleteV1;
 [SyncEntityType.SyncResetV1]: SyncResetV1;
 };


@@ -530,6 +530,7 @@ export enum JobName {
 AssetGenerateThumbnails = 'AssetGenerateThumbnails',
 AuditLogCleanup = 'AuditLogCleanup',
+AuditTableCleanup = 'AuditTableCleanup',
 DatabaseBackup = 'DatabaseBackup',
@@ -570,8 +571,7 @@ export enum JobName {
 SendMail = 'SendMail',
 SidecarQueueAll = 'SidecarQueueAll',
-SidecarDiscovery = 'SidecarDiscovery',
-SidecarSync = 'SidecarSync',
+SidecarCheck = 'SidecarCheck',
 SidecarWrite = 'SidecarWrite',
 SmartSearchQueueAll = 'SmartSearchQueueAll',
@@ -708,6 +708,7 @@ export enum SyncEntityType {
 SyncAckV1 = 'SyncAckV1',
 SyncResetV1 = 'SyncResetV1',
+SyncCompleteV1 = 'SyncCompleteV1',
 }
 export enum NotificationLevel {


@@ -43,6 +43,18 @@ where
 limit
 $2
+-- AssetJobRepository.getForSidecarCheckJob
+select
+"id",
+"sidecarPath",
+"originalPath"
+from
+"asset"
+where
+"asset"."id" = $1::uuid
+limit
+$2
 -- AssetJobRepository.streamForThumbnailJob
 select
 "asset"."id",


@@ -123,6 +123,14 @@ offset
 $8
 commit
+-- SearchRepository.getEmbedding
+select
+*
+from
+"smart_search"
+where
+"assetId" = $1
 -- SearchRepository.searchFaces
 begin
 set


@@ -957,7 +957,7 @@ where
 order by
 "stack"."updateId" asc
--- SyncRepository.people.getDeletes
+-- SyncRepository.person.getDeletes
 select
 "id",
 "personId"
@@ -970,7 +970,7 @@ where
 order by
 "person_audit"."id" asc
--- SyncRepository.people.getUpserts
+-- SyncRepository.person.getUpserts
 select
 "id",
 "createdAt",


@@ -39,10 +39,8 @@ export class AssetJobRepository {
 return this.db
 .selectFrom('asset')
 .where('asset.id', '=', asUuid(id))
-.select((eb) => [
-'id',
-'sidecarPath',
-'originalPath',
+.select(['id', 'sidecarPath', 'originalPath'])
+.select((eb) =>
 jsonArrayFrom(
 eb
 .selectFrom('tag')
@@ -50,7 +48,17 @@ export class AssetJobRepository {
 .innerJoin('tag_asset', 'tag.id', 'tag_asset.tagsId')
 .whereRef('asset.id', '=', 'tag_asset.assetsId'),
 ).as('tags'),
-])
+)
+.limit(1)
+.executeTakeFirst();
+}
+
+@GenerateSql({ params: [DummyValue.UUID] })
+getForSidecarCheckJob(id: string) {
+return this.db
+.selectFrom('asset')
+.where('asset.id', '=', asUuid(id))
+.select(['id', 'sidecarPath', 'originalPath'])
 .limit(1)
 .executeTakeFirst();
 }


@@ -293,6 +293,13 @@ export class SearchRepository {
 });
 }
+@GenerateSql({
+params: [DummyValue.UUID],
+})
+async getEmbedding(assetId: string) {
+return this.db.selectFrom('smart_search').selectAll().where('assetId', '=', assetId).executeTakeFirst();
+}
 @GenerateSql({
 params: [
 {
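getEmbedding fetches the stored smart_search vector for one asset; "view similar photos" then ranks other assets against that vector. The actual ranking is done in SQL on the vector column, but the underlying metric can be sketched in plain TypeScript (illustrative only; cosineSimilarity and rankBySimilarity are not repository code):

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank candidate assets by similarity to the query asset's embedding,
// most similar first. In production this is a vector-index query, not a scan.
function rankBySimilarity(queryEmbedding: number[], candidates: { id: string; embedding: number[] }[]) {
  return candidates
    .map((c) => ({ id: c.id, score: cosineSimilarity(queryEmbedding, c.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```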


@@ -1,5 +1,5 @@
 import { Injectable } from '@nestjs/common';
-import { Kysely } from 'kysely';
+import { Kysely, sql } from 'kysely';
 import { InjectKysely } from 'nestjs-kysely';
 import { columns } from 'src/database';
 import { DummyValue, GenerateSql } from 'src/decorators';
@@ -62,7 +62,7 @@ export class SyncRepository {
 partnerAsset: PartnerAssetsSync;
 partnerAssetExif: PartnerAssetExifsSync;
 partnerStack: PartnerStackSync;
-people: PersonSync;
+person: PersonSync;
 stack: StackSync;
 user: UserSync;
 userMetadata: UserMetadataSync;
@@ -84,7 +84,7 @@ export class SyncRepository {
 this.partnerAsset = new PartnerAssetsSync(this.db);
 this.partnerAssetExif = new PartnerAssetExifsSync(this.db);
 this.partnerStack = new PartnerStackSync(this.db);
-this.people = new PersonSync(this.db);
+this.person = new PersonSync(this.db);
 this.stack = new StackSync(this.db);
 this.user = new UserSync(this.db);
 this.userMetadata = new UserMetadataSync(this.db);
@@ -117,6 +117,15 @@ class BaseSync {
 .orderBy(idRef, 'asc');
 }
+protected auditCleanup<T extends keyof DB>(t: T, days: number) {
+const { table, ref } = this.db.dynamic;
+return this.db
+.deleteFrom(table(t).as(t))
+.where(ref(`${t}.deletedAt`), '<', sql.raw(`now() - interval '${days} days'`))
+.execute();
+}
 protected upsertQuery<T extends keyof DB>(t: T, { nowId, ack }: SyncQueryOptions) {
 const { table, ref } = this.db.dynamic;
 const updateIdRef = ref(`${t}.updateId`);
@@ -150,6 +159,10 @@ class AlbumSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('album_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 const userId = options.userId;
@@ -286,6 +299,10 @@ class AlbumToAssetSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('album_asset_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 const userId = options.userId;
@@ -334,6 +351,10 @@ class AlbumUserSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('album_user_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 const userId = options.userId;
@@ -371,6 +392,10 @@ class AssetSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('asset_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('asset', options)
@@ -400,6 +425,10 @@ class PersonSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('person_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('person', options)
@@ -431,6 +460,10 @@ class AssetFaceSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('asset_face_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('asset_face', options)
@@ -473,6 +506,10 @@ class MemorySync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('memory_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('memory', options)
@@ -505,6 +542,10 @@ class MemoryToAssetSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('memory_asset_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('memory_asset', options)
@@ -537,6 +578,10 @@ class PartnerSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('partner_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 const userId = options.userId;
@@ -616,6 +661,10 @@ class StackSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('stack_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('stack', options)
@@ -664,6 +713,10 @@ class UserSync extends BaseSync {
 return this.auditQuery('user_audit', options).select(['id', 'userId']).stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('user_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('user', options).select(columns.syncUser).stream();
@@ -679,6 +732,10 @@ class UserMetadataSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('user_metadata_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions], stream: true })
 getUpserts(options: SyncQueryOptions) {
 return this.upsertQuery('user_metadata', options)
@@ -698,6 +755,10 @@ class AssetMetadataSync extends BaseSync {
 .stream();
 }
+cleanupAuditTable(daysAgo: number) {
+return this.auditCleanup('asset_metadata_audit', daysAgo);
+}
 @GenerateSql({ params: [dummyQueryOptions, DummyValue.UUID], stream: true })
 getUpserts(options: SyncQueryOptions, userId: string) {
 return this.upsertQuery('asset_metadata', options)
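Each sync class gains a cleanupAuditTable that delegates to BaseSync.auditCleanup, deleting audit rows whose deletedAt is older than now() - interval 'N days'. The retention predicate is simple enough to state directly; a JS-side sketch of the same cutoff (illustrative helper, not the repository code):

```typescript
// Cutoff equivalent to SQL's now() - interval '<days> days'.
function auditCutoff(now: Date, days: number): Date {
  return new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
}

// An audit row is purged when its deletedAt timestamp predates the cutoff.
function shouldPurge(deletedAt: Date, now: Date, days: number): boolean {
  return deletedAt < auditCutoff(now, days);
}
```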


@@ -166,6 +166,7 @@ export interface DB {
 api_key: ApiKeyTable;
 asset: AssetTable;
+asset_audit: AssetAuditTable;
 asset_exif: AssetExifTable;
 asset_face: AssetFaceTable;
 asset_face_audit: AssetFaceAuditTable;
@@ -173,7 +174,6 @@ export interface DB {
 asset_metadata: AssetMetadataTable;
 asset_metadata_audit: AssetMetadataAuditTable;
 asset_job_status: AssetJobStatusTable;
-asset_audit: AssetAuditTable;
 audit: AuditTable;


@@ -1,11 +1,11 @@
 import { PrimaryGeneratedUuidV7Column } from 'src/decorators';
 import { MemoryTable } from 'src/schema/tables/memory.table';
-import { Column, CreateDateColumn, ForeignKeyColumn, Table } from 'src/sql-tools';
+import { Column, CreateDateColumn, ForeignKeyColumn, Generated, Table, Timestamp } from 'src/sql-tools';
 @Table('memory_asset_audit')
 export class MemoryAssetAuditTable {
 @PrimaryGeneratedUuidV7Column()
-id!: string;
+id!: Generated<string>;
 @ForeignKeyColumn(() => MemoryTable, { type: 'uuid', onDelete: 'CASCADE', onUpdate: 'CASCADE' })
 memoryId!: string;
@@ -14,5 +14,5 @@ export class MemoryAssetAuditTable {
 assetId!: string;
 @CreateDateColumn({ default: () => 'clock_timestamp()', index: true })
-deletedAt!: Date;
+deletedAt!: Generated<Timestamp>;
 }


@@ -42,6 +42,7 @@ describe(JobService.name, () => {
 { name: JobName.PersonCleanup },
 { name: JobName.MemoryCleanup },
 { name: JobName.SessionCleanup },
+{ name: JobName.AuditTableCleanup },
 { name: JobName.AuditLogCleanup },
 { name: JobName.MemoryGenerate },
 { name: JobName.UserSyncUsage },
@@ -238,11 +239,11 @@ describe(JobService.name, () => {
 const tests: Array<{ item: JobItem; jobs: JobName[]; stub?: any }> = [
 {
-item: { name: JobName.SidecarSync, data: { id: 'asset-1' } },
+item: { name: JobName.SidecarCheck, data: { id: 'asset-1' } },
 jobs: [JobName.AssetExtractMetadata],
 },
 {
-item: { name: JobName.SidecarDiscovery, data: { id: 'asset-1' } },
+item: { name: JobName.SidecarCheck, data: { id: 'asset-1' } },
 jobs: [JobName.AssetExtractMetadata],
 },
 {


@@ -281,6 +281,7 @@ export class JobService extends BaseService {
 { name: JobName.PersonCleanup },
 { name: JobName.MemoryCleanup },
 { name: JobName.SessionCleanup },
+{ name: JobName.AuditTableCleanup },
 { name: JobName.AuditLogCleanup },
 );
 }
@@ -309,8 +310,7 @@ export class JobService extends BaseService {
 */
 private async onDone(item: JobItem) {
 switch (item.name) {
-case JobName.SidecarSync:
-case JobName.SidecarDiscovery: {
+case JobName.SidecarCheck: {
 await this.jobRepository.queue({ name: JobName.AssetExtractMetadata, data: item.data });
 break;
 }


@@ -527,7 +527,7 @@ describe(LibraryService.name, () => {
       expect(mocks.job.queueAll).toHaveBeenCalledWith([
         {
-          name: JobName.SidecarDiscovery,
+          name: JobName.SidecarCheck,
           data: {
             id: assetStub.external.id,
             source: 'upload',
@@ -573,7 +573,7 @@ describe(LibraryService.name, () => {
       expect(mocks.job.queueAll).toHaveBeenCalledWith([
         {
-          name: JobName.SidecarDiscovery,
+          name: JobName.SidecarCheck,
           data: {
             id: assetStub.image.id,
             source: 'upload',


@@ -414,7 +414,7 @@ export class LibraryService extends BaseService {
     // We queue a sidecar discovery which, in turn, queues metadata extraction
     await this.jobRepository.queueAll(
       assetIds.map((assetId) => ({
-        name: JobName.SidecarDiscovery,
+        name: JobName.SidecarCheck,
         data: { id: assetId, source: 'upload' },
       })),
     );


@@ -1,7 +1,6 @@
 import { BinaryField, ExifDateTime } from 'exiftool-vendored';
 import { randomBytes } from 'node:crypto';
 import { Stats } from 'node:fs';
-import { constants } from 'node:fs/promises';
 import { defaults } from 'src/config';
 import { MapAsset } from 'src/dtos/asset-response.dto';
 import { AssetType, AssetVisibility, ExifOrientation, ImmichWorker, JobName, JobStatus, SourceType } from 'src/enum';
@@ -15,6 +14,21 @@ import { tagStub } from 'test/fixtures/tag.stub';
 import { factory } from 'test/small.factory';
 import { makeStream, newTestService, ServiceMocks } from 'test/utils';

+const forSidecarJob = (
+  asset: {
+    id?: string;
+    originalPath?: string;
+    sidecarPath?: string | null;
+  } = {},
+) => {
+  return {
+    id: factory.uuid(),
+    originalPath: '/path/to/IMG_123.jpg',
+    sidecarPath: null,
+    ...asset,
+  };
+};
+
 const makeFaceTags = (face: Partial<{ Name: string }> = {}, orientation?: ImmichTags['Orientation']) => ({
   Orientation: orientation,
   RegionInfo: {
@@ -1457,7 +1471,7 @@ describe(MetadataService.name, () => {
       expect(mocks.job.queueAll).toHaveBeenCalledWith([
         {
-          name: JobName.SidecarSync,
+          name: JobName.SidecarCheck,
           data: { id: assetStub.sidecar.id },
         },
       ]);
@@ -1471,133 +1485,65 @@ describe(MetadataService.name, () => {
       expect(mocks.assetJob.streamForSidecar).toHaveBeenCalledWith(false);
       expect(mocks.job.queueAll).toHaveBeenCalledWith([
         {
-          name: JobName.SidecarDiscovery,
+          name: JobName.SidecarCheck,
           data: { id: assetStub.image.id },
         },
       ]);
     });
   });

-  describe('handleSidecarSync', () => {
+  describe('handleSidecarCheck', () => {
     it('should do nothing if asset could not be found', async () => {
-      mocks.asset.getByIds.mockResolvedValue([]);
+      mocks.assetJob.getForSidecarCheckJob.mockResolvedValue(void 0);

-      await expect(sut.handleSidecarSync({ id: assetStub.image.id })).resolves.toBe(JobStatus.Failed);
+      await expect(sut.handleSidecarCheck({ id: assetStub.image.id })).resolves.toBeUndefined();

       expect(mocks.asset.update).not.toHaveBeenCalled();
     });

-    it('should do nothing if asset has no sidecar path', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.image]);
-
-      await expect(sut.handleSidecarSync({ id: assetStub.image.id })).resolves.toBe(JobStatus.Failed);
-      expect(mocks.asset.update).not.toHaveBeenCalled();
-    });
-
-    it('should set sidecar path if exists (sidecar named photo.ext.xmp)', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.sidecar]);
-      mocks.storage.checkFileExists.mockResolvedValue(true);
-
-      await expect(sut.handleSidecarSync({ id: assetStub.sidecar.id })).resolves.toBe(JobStatus.Success);
-      expect(mocks.storage.checkFileExists).toHaveBeenCalledWith(
-        `${assetStub.sidecar.originalPath}.xmp`,
-        constants.R_OK,
-      );
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.sidecar.id,
-        sidecarPath: assetStub.sidecar.sidecarPath,
-      });
-    });
-
-    it('should set sidecar path if exists (sidecar named photo.xmp)', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.sidecarWithoutExt as any]);
-      mocks.storage.checkFileExists.mockResolvedValueOnce(false);
-      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
-
-      await expect(sut.handleSidecarSync({ id: assetStub.sidecarWithoutExt.id })).resolves.toBe(JobStatus.Success);
-      expect(mocks.storage.checkFileExists).toHaveBeenNthCalledWith(
-        2,
-        assetStub.sidecarWithoutExt.sidecarPath,
-        constants.R_OK,
-      );
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.sidecarWithoutExt.id,
-        sidecarPath: assetStub.sidecarWithoutExt.sidecarPath,
-      });
-    });
-
-    it('should set sidecar path if exists (two sidecars named photo.ext.xmp and photo.xmp, should pick photo.ext.xmp)', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.sidecar]);
-      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
-      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
-
-      await expect(sut.handleSidecarSync({ id: assetStub.sidecar.id })).resolves.toBe(JobStatus.Success);
-      expect(mocks.storage.checkFileExists).toHaveBeenNthCalledWith(1, assetStub.sidecar.sidecarPath, constants.R_OK);
-      expect(mocks.storage.checkFileExists).toHaveBeenNthCalledWith(
-        2,
-        assetStub.sidecarWithoutExt.sidecarPath,
-        constants.R_OK,
-      );
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.sidecar.id,
-        sidecarPath: assetStub.sidecar.sidecarPath,
-      });
+    it('should detect a new sidecar at .jpg.xmp', async () => {
+      const asset = forSidecarJob({ originalPath: '/path/to/IMG_123.jpg' });
+
+      mocks.assetJob.getForSidecarCheckJob.mockResolvedValue(asset);
+      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
+
+      await expect(sut.handleSidecarCheck({ id: asset.id })).resolves.toBe(JobStatus.Success);
+
+      expect(mocks.asset.update).toHaveBeenCalledWith({ id: asset.id, sidecarPath: `/path/to/IMG_123.jpg.xmp` });
+    });
+
+    it('should detect a new sidecar at .xmp', async () => {
+      const asset = forSidecarJob({ originalPath: '/path/to/IMG_123.jpg' });
+
+      mocks.assetJob.getForSidecarCheckJob.mockResolvedValue(asset);
+      mocks.storage.checkFileExists.mockResolvedValueOnce(false);
+      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
+
+      await expect(sut.handleSidecarCheck({ id: asset.id })).resolves.toBe(JobStatus.Success);
+
+      expect(mocks.asset.update).toHaveBeenCalledWith({ id: asset.id, sidecarPath: '/path/to/IMG_123.xmp' });
     });

     it('should unset sidecar path if file does not exist anymore', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.sidecar]);
+      const asset = forSidecarJob({ originalPath: '/path/to/IMG_123.jpg', sidecarPath: '/path/to/IMG_123.jpg.xmp' });
+
+      mocks.assetJob.getForSidecarCheckJob.mockResolvedValue(asset);
       mocks.storage.checkFileExists.mockResolvedValue(false);

-      await expect(sut.handleSidecarSync({ id: assetStub.sidecar.id })).resolves.toBe(JobStatus.Success);
-      expect(mocks.storage.checkFileExists).toHaveBeenCalledWith(
-        `${assetStub.sidecar.originalPath}.xmp`,
-        constants.R_OK,
-      );
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.sidecar.id,
-        sidecarPath: null,
-      });
-    });
-  });
+      await expect(sut.handleSidecarCheck({ id: asset.id })).resolves.toBe(JobStatus.Success);

-  describe('handleSidecarDiscovery', () => {
-    it('should skip hidden assets', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.livePhotoMotionAsset as any]);
-      await sut.handleSidecarDiscovery({ id: assetStub.livePhotoMotionAsset.id });
-      expect(mocks.storage.checkFileExists).not.toHaveBeenCalled();
+      expect(mocks.asset.update).toHaveBeenCalledWith({ id: asset.id, sidecarPath: null });
     });

-    it('should skip assets with a sidecar path', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.sidecar]);
-      await sut.handleSidecarDiscovery({ id: assetStub.sidecar.id });
-      expect(mocks.storage.checkFileExists).not.toHaveBeenCalled();
-    });
+    it('should do nothing if the sidecar file still exists', async () => {
+      const asset = forSidecarJob({ originalPath: '/path/to/IMG_123.jpg', sidecarPath: '/path/to/IMG_123.jpg' });

-    it('should do nothing when a sidecar is not found ', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.image]);
-      mocks.storage.checkFileExists.mockResolvedValue(false);
-      await sut.handleSidecarDiscovery({ id: assetStub.image.id });
+      mocks.assetJob.getForSidecarCheckJob.mockResolvedValue(asset);
+      mocks.storage.checkFileExists.mockResolvedValueOnce(true);
+
+      await expect(sut.handleSidecarCheck({ id: asset.id })).resolves.toBe(JobStatus.Skipped);
+
       expect(mocks.asset.update).not.toHaveBeenCalled();
     });
-
-    it('should update a image asset when a sidecar is found', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.image]);
-      mocks.storage.checkFileExists.mockResolvedValue(true);
-      await sut.handleSidecarDiscovery({ id: assetStub.image.id });
-      expect(mocks.storage.checkFileExists).toHaveBeenCalledWith('/original/path.jpg.xmp', constants.R_OK);
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.image.id,
-        sidecarPath: '/original/path.jpg.xmp',
-      });
-    });
-
-    it('should update a video asset when a sidecar is found', async () => {
-      mocks.asset.getByIds.mockResolvedValue([assetStub.video]);
-      mocks.storage.checkFileExists.mockResolvedValue(true);
-      await sut.handleSidecarDiscovery({ id: assetStub.video.id });
-      expect(mocks.storage.checkFileExists).toHaveBeenCalledWith('/original/path.ext.xmp', constants.R_OK);
-      expect(mocks.asset.update).toHaveBeenCalledWith({
-        id: assetStub.image.id,
-        sidecarPath: '/original/path.ext.xmp',
-      });
-    });
   });

   describe('handleSidecarWrite', () => {


@@ -5,7 +5,7 @@ import _ from 'lodash';
 import { Duration } from 'luxon';
 import { Stats } from 'node:fs';
 import { constants } from 'node:fs/promises';
-import path from 'node:path';
+import { join, parse } from 'node:path';
 import { JOBS_ASSET_PAGINATION_SIZE } from 'src/constants';
 import { StorageCore } from 'src/cores/storage.core';
 import { Asset, AssetFace } from 'src/database';
@@ -331,7 +331,7 @@ export class MetadataService extends BaseService {
     const assets = this.assetJobRepository.streamForSidecar(force);
     for await (const asset of assets) {
-      jobs.push({ name: force ? JobName.SidecarSync : JobName.SidecarDiscovery, data: { id: asset.id } });
+      jobs.push({ name: JobName.SidecarCheck, data: { id: asset.id } });
       if (jobs.length >= JOBS_ASSET_PAGINATION_SIZE) {
         await queueAll();
       }
@@ -342,14 +342,37 @@ export class MetadataService extends BaseService {
     return JobStatus.Success;
   }

-  @OnJob({ name: JobName.SidecarSync, queue: QueueName.Sidecar })
-  handleSidecarSync({ id }: JobOf<JobName.SidecarSync>): Promise<JobStatus> {
-    return this.processSidecar(id, true);
-  }
+  @OnJob({ name: JobName.SidecarCheck, queue: QueueName.Sidecar })
+  async handleSidecarCheck({ id }: JobOf<JobName.SidecarCheck>): Promise<JobStatus | undefined> {
+    const asset = await this.assetJobRepository.getForSidecarCheckJob(id);
+    if (!asset) {
+      return;
+    }

-  @OnJob({ name: JobName.SidecarDiscovery, queue: QueueName.Sidecar })
-  handleSidecarDiscovery({ id }: JobOf<JobName.SidecarDiscovery>): Promise<JobStatus> {
-    return this.processSidecar(id, false);
+    let sidecarPath = null;
+    for (const candidate of this.getSidecarCandidates(asset)) {
+      const exists = await this.storageRepository.checkFileExists(candidate, constants.R_OK);
+      if (!exists) {
+        continue;
+      }
+
+      sidecarPath = candidate;
+      break;
+    }
+
+    const isChanged = sidecarPath !== asset.sidecarPath;
+    this.logger.debug(
+      `Sidecar check found old=${asset.sidecarPath}, new=${sidecarPath} will ${isChanged ? 'update' : 'do nothing for'} asset ${asset.id}: ${asset.originalPath}`,
+    );
+
+    if (!isChanged) {
+      return JobStatus.Skipped;
+    }
+
+    await this.assetRepository.update({ id: asset.id, sidecarPath });
+
+    return JobStatus.Success;
   }

   @OnEvent({ name: 'AssetTag' })
@@ -399,6 +422,25 @@ export class MetadataService extends BaseService {
     return JobStatus.Success;
   }

+  private getSidecarCandidates({ sidecarPath, originalPath }: { sidecarPath: string | null; originalPath: string }) {
+    const candidates: string[] = [];
+    if (sidecarPath) {
+      candidates.push(sidecarPath);
+    }
+
+    const assetPath = parse(originalPath);
+    candidates.push(
+      // IMG_123.jpg.xmp
+      `${originalPath}.xmp`,
+      // IMG_123.xmp
+      `${join(assetPath.dir, assetPath.name)}.xmp`,
+    );
+
+    return candidates;
+  }
+
   private getImageDimensions(exifTags: ImmichTags): { width?: number; height?: number } {
     /*
      * The "true" values for width and height are a bit hidden, depending on the camera model and file format.
@@ -564,7 +606,7 @@ export class MetadataService extends BaseService {
       checksum,
       ownerId: asset.ownerId,
       originalPath: StorageCore.getAndroidMotionPath(asset, motionAssetId),
-      originalFileName: `${path.parse(asset.originalFileName).name}.mp4`,
+      originalFileName: `${parse(asset.originalFileName).name}.mp4`,
       visibility: AssetVisibility.Hidden,
       deviceAssetId: 'NONE',
       deviceId: 'NONE',
@@ -905,60 +947,4 @@ export class MetadataService extends BaseService {
     return tags;
   }
-
-  private async processSidecar(id: string, isSync: boolean): Promise<JobStatus> {
-    const [asset] = await this.assetRepository.getByIds([id]);
-
-    if (!asset) {
-      return JobStatus.Failed;
-    }
-
-    if (isSync && !asset.sidecarPath) {
-      return JobStatus.Failed;
-    }
-
-    if (!isSync && (asset.visibility === AssetVisibility.Hidden || asset.sidecarPath) && !asset.isExternal) {
-      return JobStatus.Failed;
-    }
-
-    // XMP sidecars can come in two filename formats. For a photo named photo.ext, the filenames are photo.ext.xmp and photo.xmp
-    const assetPath = path.parse(asset.originalPath);
-    const assetPathWithoutExt = path.join(assetPath.dir, assetPath.name);
-    const sidecarPathWithoutExt = `${assetPathWithoutExt}.xmp`;
-    const sidecarPathWithExt = `${asset.originalPath}.xmp`;
-
-    const [sidecarPathWithExtExists, sidecarPathWithoutExtExists] = await Promise.all([
-      this.storageRepository.checkFileExists(sidecarPathWithExt, constants.R_OK),
-      this.storageRepository.checkFileExists(sidecarPathWithoutExt, constants.R_OK),
-    ]);
-
-    let sidecarPath = null;
-    if (sidecarPathWithExtExists) {
-      sidecarPath = sidecarPathWithExt;
-    } else if (sidecarPathWithoutExtExists) {
-      sidecarPath = sidecarPathWithoutExt;
-    }
-
-    if (asset.isExternal) {
-      if (sidecarPath !== asset.sidecarPath) {
-        await this.assetRepository.update({ id: asset.id, sidecarPath });
-      }
-      return JobStatus.Success;
-    }
-
-    if (sidecarPath) {
-      this.logger.debug(`Detected sidecar at '${sidecarPath}' for asset ${asset.id}: ${asset.originalPath}`);
-      await this.assetRepository.update({ id: asset.id, sidecarPath });
-      return JobStatus.Success;
-    }
-
-    if (!isSync) {
-      return JobStatus.Failed;
-    }
-
-    this.logger.debug(`No sidecar found for asset ${asset.id}: ${asset.originalPath}`);
-    await this.assetRepository.update({ id: asset.id, sidecarPath: null });
-    return JobStatus.Success;
-  }
 }
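The two XMP naming conventions handled above (for a photo `IMG_123.jpg`, a sidecar may live at `IMG_123.jpg.xmp` or `IMG_123.xmp`) can be illustrated with a standalone sketch of the candidate ordering; the function name here is hypothetical, but the logic mirrors the new `getSidecarCandidates` helper: the currently recorded sidecar path, if any, is checked first, so an existing valid path wins over a rediscovered one.

```typescript
import { join, parse } from 'node:path';

// Build the ordered list of paths a sidecar check should probe.
function sidecarCandidates(originalPath: string, sidecarPath: string | null): string[] {
  const candidates: string[] = [];
  if (sidecarPath) {
    candidates.push(sidecarPath); // current value, checked first
  }

  const assetPath = parse(originalPath);
  candidates.push(
    `${originalPath}.xmp`, // IMG_123.jpg.xmp
    `${join(assetPath.dir, assetPath.name)}.xmp`, // IMG_123.xmp
  );
  return candidates;
}

console.log(sidecarCandidates('/photos/IMG_123.jpg', null));
// [ '/photos/IMG_123.jpg.xmp', '/photos/IMG_123.xmp' ]
```

The first candidate that exists on disk becomes the new `sidecarPath`; if none exist, the column is cleared.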


@@ -18,7 +18,7 @@ import {
   SmartSearchDto,
   StatisticsSearchDto,
 } from 'src/dtos/search.dto';
-import { AssetOrder, AssetVisibility } from 'src/enum';
+import { AssetOrder, AssetVisibility, Permission } from 'src/enum';
 import { BaseService } from 'src/services/base.service';
 import { requireElevatedPermission } from 'src/utils/access';
 import { getMyPartnerIds } from 'src/utils/asset.util';
@@ -113,14 +113,27 @@ export class SearchService extends BaseService {
     }

     const userIds = this.getUserIdsToSearch(auth);
-    const key = machineLearning.clip.modelName + dto.query + dto.language;
-    let embedding = this.embeddingCache.get(key);
-    if (!embedding) {
-      embedding = await this.machineLearningRepository.encodeText(machineLearning.urls, dto.query, {
-        modelName: machineLearning.clip.modelName,
-        language: dto.language,
-      });
-      this.embeddingCache.set(key, embedding);
+    let embedding;
+    if (dto.query) {
+      const key = machineLearning.clip.modelName + dto.query + dto.language;
+      embedding = this.embeddingCache.get(key);
+      if (!embedding) {
+        embedding = await this.machineLearningRepository.encodeText(machineLearning.urls, dto.query, {
+          modelName: machineLearning.clip.modelName,
+          language: dto.language,
+        });
+        this.embeddingCache.set(key, embedding);
+      }
+    } else if (dto.queryAssetId) {
+      await this.requireAccess({ auth, permission: Permission.AssetRead, ids: [dto.queryAssetId] });
+      const getEmbeddingResponse = await this.searchRepository.getEmbedding(dto.queryAssetId);
+      const assetEmbedding = getEmbeddingResponse?.embedding;
+      if (!assetEmbedding) {
+        throw new BadRequestException(`Asset ${dto.queryAssetId} has no embedding`);
+      }
+      embedding = assetEmbedding;
+    } else {
+      throw new BadRequestException('Either `query` or `queryAssetId` must be set');
     }
     const page = dto.page ?? 1;
     const size = dto.size || 100;
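The memoization pattern in the `dto.query` branch can be shown in isolation. This is a minimal synchronous sketch with hypothetical names (the real `encodeText` call is async and goes to the ML service): embeddings are cached per model/query/language triple, so repeated smart searches with the same text skip a round trip, while search-by-example bypasses the cache entirely and reads the asset's stored embedding.

```typescript
type Embedding = number[];

const embeddingCache = new Map<string, Embedding>();

// Return the cached embedding for this model/query/language triple,
// or compute and cache it. `encodeText` stands in for the ML service.
function getTextEmbedding(
  modelName: string,
  query: string,
  language: string,
  encodeText: (query: string) => Embedding,
): Embedding {
  const key = modelName + query + language;
  let embedding = embeddingCache.get(key);
  if (!embedding) {
    embedding = encodeText(query);
    embeddingCache.set(key, embedding);
  }
  return embedding;
}
```

Note that the key is a plain string concatenation, so a cache entry is only reused for an identical model, query text, and language.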


@@ -1,8 +1,9 @@
 import { BadRequestException, ForbiddenException, Injectable } from '@nestjs/common';
 import { Insertable } from 'kysely';
-import { DateTime } from 'luxon';
+import { DateTime, Duration } from 'luxon';
 import { Writable } from 'node:stream';
 import { AUDIT_LOG_MAX_DURATION } from 'src/constants';
+import { OnJob } from 'src/decorators';
 import { AssetResponseDto, mapAsset } from 'src/dtos/asset-response.dto';
 import { AuthDto } from 'src/dtos/auth.dto';
 import {
@@ -15,7 +16,16 @@ import {
   SyncItem,
   SyncStreamDto,
 } from 'src/dtos/sync.dto';
-import { AssetVisibility, DatabaseAction, EntityType, Permission, SyncEntityType, SyncRequestType } from 'src/enum';
+import {
+  AssetVisibility,
+  DatabaseAction,
+  EntityType,
+  JobName,
+  Permission,
+  QueueName,
+  SyncEntityType,
+  SyncRequestType,
+} from 'src/enum';
 import { SyncQueryOptions } from 'src/repositories/sync.repository';
 import { SessionSyncCheckpointTable } from 'src/schema/tables/sync-checkpoint.table';
 import { BaseService } from 'src/services/base.service';
@@ -32,6 +42,8 @@ type AssetLike = Omit<SyncAssetV1, 'checksum' | 'thumbhash'> & {
 };

 const COMPLETE_ID = 'complete';
+const MAX_DAYS = 30;
+const MAX_DURATION = Duration.fromObject({ days: MAX_DAYS });

 const mapSyncAssetV1 = ({ checksum, thumbhash, ...data }: AssetLike): SyncAssetV1 => ({
   ...data,
@@ -137,19 +149,24 @@ export class SyncService extends BaseService {
     }

     const isPendingSyncReset = await this.sessionRepository.isPendingSyncReset(session.id);
     if (isPendingSyncReset) {
       send(response, { type: SyncEntityType.SyncResetV1, ids: ['reset'], data: {} });
       response.end();
       return;
     }

+    const checkpoints = await this.syncCheckpointRepository.getAll(session.id);
+    const checkpointMap: CheckpointMap = Object.fromEntries(checkpoints.map(({ type, ack }) => [type, fromAck(ack)]));
+
+    if (this.needsFullSync(checkpointMap)) {
+      send(response, { type: SyncEntityType.SyncResetV1, ids: ['reset'], data: {} });
+      response.end();
+      return;
+    }
+
     const { nowId } = await this.syncCheckpointRepository.getNow();
     const options: SyncQueryOptions = { nowId, userId: auth.user.id };
-    const checkpoints = await this.syncCheckpointRepository.getAll(session.id);
-    const checkpointMap: CheckpointMap = Object.fromEntries(checkpoints.map(({ type, ack }) => [type, fromAck(ack)]));
     const handlers: Record<SyncRequestType, () => Promise<void>> = {
       [SyncRequestType.AuthUsersV1]: () => this.syncAuthUsersV1(options, response, checkpointMap),
       [SyncRequestType.UsersV1]: () => this.syncUsersV1(options, response, checkpointMap),
@@ -180,9 +197,41 @@ export class SyncService extends BaseService {
       await handler();
     }

+    send(response, { type: SyncEntityType.SyncCompleteV1, ids: [nowId], data: {} });
     response.end();
   }

+  @OnJob({ name: JobName.AuditTableCleanup, queue: QueueName.BackgroundTask })
+  async onAuditTableCleanup() {
+    const pruneThreshold = MAX_DAYS + 1;
+    await this.syncRepository.album.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.albumUser.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.albumToAsset.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.asset.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.assetFace.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.assetMetadata.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.memory.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.memoryToAsset.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.partner.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.person.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.stack.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.user.cleanupAuditTable(pruneThreshold);
+    await this.syncRepository.userMetadata.cleanupAuditTable(pruneThreshold);
+  }
+
+  private needsFullSync(checkpointMap: CheckpointMap) {
+    const completeAck = checkpointMap[SyncEntityType.SyncCompleteV1];
+    if (!completeAck) {
+      return false;
+    }
+
+    const milliseconds = Number.parseInt(completeAck.updateId.replaceAll('-', '').slice(0, 12), 16);
+    return DateTime.fromMillis(milliseconds) < DateTime.now().minus(MAX_DURATION);
+  }
+
   private async syncAuthUsersV1(options: SyncQueryOptions, response: Writable, checkpointMap: CheckpointMap) {
     const upsertType = SyncEntityType.AuthUserV1;
     const upserts = this.syncRepository.authUser.getUpserts({ ...options, ack: checkpointMap[upsertType] });
@@ -719,13 +768,13 @@ export class SyncService extends BaseService {
   private async syncPeopleV1(options: SyncQueryOptions, response: Writable, checkpointMap: CheckpointMap) {
     const deleteType = SyncEntityType.PersonDeleteV1;
-    const deletes = this.syncRepository.people.getDeletes({ ...options, ack: checkpointMap[deleteType] });
+    const deletes = this.syncRepository.person.getDeletes({ ...options, ack: checkpointMap[deleteType] });

     for await (const { id, ...data } of deletes) {
       send(response, { type: deleteType, ids: [id], data });
     }

     const upsertType = SyncEntityType.PersonV1;
-    const upserts = this.syncRepository.people.getUpserts({ ...options, ack: checkpointMap[upsertType] });
+    const upserts = this.syncRepository.person.getUpserts({ ...options, ack: checkpointMap[upsertType] });
     for await (const { updateId, ...data } of upserts) {
       send(response, { type: upsertType, ids: [updateId], data });
     }


@@ -275,6 +275,9 @@ export interface QueueStatus {
 }

 export type JobItem =
+  // Audit
+  | { name: JobName.AuditTableCleanup; data?: IBaseJob }
+
   // Backups
   | { name: JobName.DatabaseBackup; data?: IBaseJob }
@@ -309,8 +312,7 @@ export type JobItem =
   // Sidecar Scanning
   | { name: JobName.SidecarQueueAll; data: IBaseJob }
-  | { name: JobName.SidecarDiscovery; data: IEntityJob }
-  | { name: JobName.SidecarSync; data: IEntityJob }
+  | { name: JobName.SidecarCheck; data: IEntityJob }
   | { name: JobName.SidecarWrite; data: ISidecarWriteJob }

   // Facial Recognition
@@ -397,8 +399,8 @@ export interface VectorUpdateResult {
 }

 export interface ImmichFile extends Express.Multer.File {
-  /** sha1 hash of file */
   uuid: string;
+
+  /** sha1 hash of file */
   checksum: Buffer;
 }


@@ -35,7 +35,7 @@ export const stackStub = (stackId: string, assets: (MapAsset & { exifInfo: Exif
     primaryAssetId: assets[0].id,
     createdAt: new Date('2023-02-23T05:06:29.716Z'),
     updatedAt: new Date('2023-02-23T05:06:29.716Z'),
-    updateId: 'uuid-v7',
+    updateId: expect.any(String),
   };
 };


@@ -1,5 +1,6 @@
 import { Tag } from 'src/database';
 import { TagResponseDto } from 'src/dtos/tag.dto';
+import { newUuidV7 } from 'test/small.factory';

 const parent = Object.freeze<Tag>({
   id: 'tag-parent',
@@ -37,7 +38,10 @@ const color = {
   parentId: null,
 };

-const upsert = { userId: 'tag-user', updateId: 'uuid-v7' };
+const upsert = {
+  userId: 'tag-user',
+  updateId: newUuidV7(),
+};

 export const tagStub = {
   tag,


@@ -258,6 +258,12 @@ export class SyncTestContext extends MediumTestContext<SyncService> {
     return stream.getResponse();
   }

+  async assertSyncIsComplete(auth: AuthDto, types: SyncRequestType[]) {
+    await expect(this.syncStream(auth, types)).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+  }
+
   async syncAckAll(auth: AuthDto, response: Array<{ type: string; ack: string }>) {
     const acks: Record<string, string> = {};
     const syncAcks: string[] = [];


@@ -0,0 +1,226 @@
import { Kysely } from 'kysely';
import { DateTime } from 'luxon';
import { AssetMetadataKey, UserMetadataKey } from 'src/enum';
import { DatabaseRepository } from 'src/repositories/database.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
import { SyncRepository } from 'src/repositories/sync.repository';
import { DB } from 'src/schema';
import { SyncService } from 'src/services/sync.service';
import { newMediumService } from 'test/medium.factory';
import { getKyselyDB } from 'test/utils';
import { v4 } from 'uuid';

let defaultDatabase: Kysely<DB>;

const setup = (db?: Kysely<DB>) => {
  return newMediumService(SyncService, {
    database: db || defaultDatabase,
    real: [DatabaseRepository, SyncRepository],
    mock: [LoggingRepository],
  });
};

beforeAll(async () => {
  defaultDatabase = await getKyselyDB();
});

const deletedLongAgo = DateTime.now().minus({ days: 35 }).toISO();

const assertTableCount = async <T extends keyof DB>(db: Kysely<DB>, t: T, count: number) => {
  const { table } = db.dynamic;
  const results = await db.selectFrom(table(t).as(t)).selectAll().execute();
  expect(results).toHaveLength(count);
};

describe(SyncService.name, () => {
  describe('onAuditTableCleanup', () => {
    it('should work', async () => {
      const { sut } = setup();
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
    });

    it('should cleanup the album_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'album_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ albumId: v4(), userId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the album_asset_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'album_asset_audit';
      const { user } = await ctx.newUser();
      const { album } = await ctx.newAlbum({ ownerId: user.id });
      await ctx.database
        .insertInto(tableName)
        .values({ albumId: album.id, assetId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the album_user_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'album_user_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ albumId: v4(), userId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the asset_audit table', async () => {
      const { sut, ctx } = setup();
      await ctx.database
        .insertInto('asset_audit')
        .values({ assetId: v4(), ownerId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, 'asset_audit', 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, 'asset_audit', 0);
    });

    it('should cleanup the asset_face_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'asset_face_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ assetFaceId: v4(), assetId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the asset_metadata_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'asset_metadata_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ assetId: v4(), key: AssetMetadataKey.MobileApp, deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the memory_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'memory_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ memoryId: v4(), userId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the memory_asset_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'memory_asset_audit';
      const { user } = await ctx.newUser();
      const { memory } = await ctx.newMemory({ ownerId: user.id });
      await ctx.database
        .insertInto(tableName)
        .values({ memoryId: memory.id, assetId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the partner_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'partner_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ sharedById: v4(), sharedWithId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the stack_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'stack_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ stackId: v4(), userId: v4(), deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the user_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'user_audit';
      await ctx.database.insertInto(tableName).values({ userId: v4(), deletedAt: deletedLongAgo }).execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should cleanup the user_metadata_audit table', async () => {
      const { sut, ctx } = setup();
      const tableName = 'user_metadata_audit';
      await ctx.database
        .insertInto(tableName)
        .values({ userId: v4(), key: UserMetadataKey.Onboarding, deletedAt: deletedLongAgo })
        .execute();
      await assertTableCount(ctx.database, tableName, 1);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      await assertTableCount(ctx.database, tableName, 0);
    });

    it('should skip recent records', async () => {
      const { sut, ctx } = setup();
      const keep = {
        id: v4(),
        assetId: v4(),
        ownerId: v4(),
        deletedAt: DateTime.now().minus({ days: 25 }).toISO(),
      };
      const remove = {
        id: v4(),
        assetId: v4(),
        ownerId: v4(),
        deletedAt: DateTime.now().minus({ days: 35 }).toISO(),
      };
      await ctx.database.insertInto('asset_audit').values([keep, remove]).execute();
      await assertTableCount(ctx.database, 'asset_audit', 2);
      await expect(sut.onAuditTableCleanup()).resolves.toBeUndefined();
      const after = await ctx.database.selectFrom('asset_audit').select(['id']).execute();
      expect(after).toHaveLength(1);
      expect(after[0].id).toBe(keep.id);
    });
  });
});


@@ -74,11 +74,11 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         },
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(response).toHaveLength(2);
 
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetExifsV1]);
   });
 
   it('should sync album asset exif for own user', async () => {
@@ -88,8 +88,15 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
     const { album } = await ctx.newAlbum({ ownerId: auth.user.id });
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetExifsV1])).resolves.toHaveLength(1);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toHaveLength(2);
+    await expect(ctx.syncStream(auth, [SyncRequestType.AssetExifsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetExifV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.SyncAckV1 }),
+      expect.objectContaining({ type: SyncEntityType.AlbumAssetExifCreateV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
   });
 
   it('should not sync album asset exif for unrelated user', async () => {
@@ -104,8 +111,11 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
     const { session } = await ctx.newSession({ userId: user3.id });
     const authUser3 = factory.auth({ session, user: user3 });
 
-    await expect(ctx.syncStream(authUser3, [SyncRequestType.AssetExifsV1])).resolves.toHaveLength(1);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toHaveLength(0);
+    await expect(ctx.syncStream(authUser3, [SyncRequestType.AssetExifsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetExifV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetExifsV1]);
   });
 
   it('should backfill album assets exif when a user shares an album with you', async () => {
@@ -139,8 +149,8 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(response).toHaveLength(2);
 
     // ack initial album asset exif sync
     await ctx.syncAckAll(auth, response);
@@ -174,11 +184,11 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(newResponse).toHaveLength(5);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetExifsV1]);
   });
 
   it('should sync old asset exif when a user adds them to an album they share you', async () => {
@@ -207,8 +217,8 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(firstAlbumResponse).toHaveLength(2);
 
     await ctx.syncAckAll(auth, firstAlbumResponse);
@@ -224,8 +234,8 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         type: SyncEntityType.AlbumAssetExifBackfillV1,
       },
       backfillSyncAck,
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(response).toHaveLength(2);
 
     // ack initial album asset sync
     await ctx.syncAckAll(auth, response);
@@ -244,11 +254,11 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(newResponse).toHaveLength(2);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetExifsV1]);
   });
 
   it('should sync asset exif updates for an album shared with you', async () => {
@@ -262,7 +272,6 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
     await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1]);
-    expect(response).toHaveLength(2);
     expect(response).toEqual([
       updateSyncAck,
       {
@@ -272,6 +281,7 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
@@ -283,9 +293,7 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
       city: 'New City',
     });
 
-    const updateResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1]);
-    expect(updateResponse).toHaveLength(1);
-    expect(updateResponse).toEqual([
+    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([
       {
         ack: expect.any(String),
         data: expect.objectContaining({
@@ -294,6 +302,7 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifUpdateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
   });
@@ -330,8 +339,8 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(response).toHaveLength(3);
 
     await ctx.syncAckAll(auth, response);
@@ -342,8 +351,7 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
       city: 'Delayed Exif',
     });
 
-    const updateResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1]);
-    expect(updateResponse).toEqual([
+    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetExifsV1])).resolves.toEqual([
       {
         ack: expect.any(String),
         data: expect.objectContaining({
@@ -352,7 +360,7 @@ describe(SyncRequestType.AlbumAssetExifsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetExifUpdateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
-    expect(updateResponse).toHaveLength(1);
   });
 });


@@ -58,7 +58,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    expect(response).toHaveLength(2);
     expect(response).toEqual([
       updateSyncAck,
       {
@@ -83,10 +82,11 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         },
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetsV1]);
   });
 
   it('should sync album asset for own user', async () => {
@@ -95,8 +95,15 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     const { album } = await ctx.newAlbum({ ownerId: auth.user.id });
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetsV1])).resolves.toHaveLength(1);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toHaveLength(2);
+    await expect(ctx.syncStream(auth, [SyncRequestType.AssetsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.SyncAckV1 }),
+      expect.objectContaining({ type: SyncEntityType.AlbumAssetCreateV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
   });
 
   it('should not sync album asset for unrelated user', async () => {
@@ -110,8 +117,11 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     const { session } = await ctx.newSession({ userId: user3.id });
     const authUser3 = factory.auth({ session, user: user3 });
 
-    await expect(ctx.syncStream(authUser3, [SyncRequestType.AssetsV1])).resolves.toHaveLength(1);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toHaveLength(0);
+    await expect(ctx.syncStream(authUser3, [SyncRequestType.AssetsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetsV1]);
   });
 
   it('should backfill album assets when a user shares an album with you', async () => {
@@ -133,7 +143,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album1.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    expect(response).toHaveLength(2);
     expect(response).toEqual([
       updateSyncAck,
       {
@@ -143,6 +152,7 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     // ack initial album asset sync
@@ -176,10 +186,11 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetsV1]);
   });
 
   it('should sync old assets when a user adds them to an album they share you', async () => {
@@ -196,7 +207,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album1.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const firstAlbumResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    expect(firstAlbumResponse).toHaveLength(2);
     expect(firstAlbumResponse).toEqual([
       updateSyncAck,
       {
@@ -206,6 +216,7 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, firstAlbumResponse);
@@ -213,7 +224,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album2.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    // expect(response).toHaveLength(2);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -223,6 +233,7 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         type: SyncEntityType.AlbumAssetBackfillV1,
       },
       backfillSyncAck,
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     // ack initial album asset sync
@@ -242,10 +253,11 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumAssetsV1]);
   });
 
   it('should sync asset updates for an album shared with you', async () => {
@@ -258,7 +270,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    expect(response).toHaveLength(2);
     expect(response).toEqual([
       updateSyncAck,
       {
@@ -268,6 +279,7 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetCreateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
@@ -280,7 +292,6 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
     });
 
     const updateResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumAssetsV1]);
-    expect(updateResponse).toHaveLength(1);
     expect(updateResponse).toEqual([
       {
         ack: expect.any(String),
@@ -290,6 +301,7 @@ describe(SyncRequestType.AlbumAssetsV1, () => {
         }),
         type: SyncEntityType.AlbumAssetUpdateV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
   });
 });


@@ -28,7 +28,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -38,10 +37,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should sync album to asset for owned albums', async () => {
@@ -51,7 +51,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -61,10 +60,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should detect and sync the album to asset for shared albums', async () => {
@@ -76,7 +76,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -86,10 +85,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should not sync album to asset for an album owned by another user', async () => {
@@ -98,7 +98,7 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     const { asset } = await ctx.newAsset({ ownerId: user2.id });
     const { album } = await ctx.newAlbum({ ownerId: user2.id });
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should backfill album to assets when a user shares an album with you', async () => {
@@ -114,7 +114,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumAsset({ albumId: album1.id, assetId: album1Asset.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -124,6 +123,7 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     // ack initial album to asset sync
@@ -148,10 +148,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         data: {},
         type: SyncEntityType.SyncAckV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should detect and sync a deleted album to asset relation', async () => {
@@ -162,7 +163,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -172,6 +172,7 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
@@ -179,7 +180,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await wait(2);
 
     const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(newResponse).toHaveLength(1);
     expect(newResponse).toEqual([
       {
         ack: expect.any(String),
@@ -189,10 +189,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetDeleteV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should detect and sync a deleted album to asset relation when an asset is deleted', async () => {
@@ -203,7 +204,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -213,6 +213,7 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, response);
@@ -220,7 +221,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await wait(2);
 
     const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(newResponse).toHaveLength(1);
     expect(newResponse).toEqual([
       {
         ack: expect.any(String),
@@ -230,10 +230,11 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetDeleteV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
 
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 
   it('should not sync a deleted album to asset relation when the album is deleted', async () => {
@@ -244,7 +245,6 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
     await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
 
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -254,11 +254,12 @@ describe(SyncRequestType.AlbumToAssetsV1, () => {
         },
         type: SyncEntityType.AlbumToAssetV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
 
     await albumRepo.delete(album.id);
     await wait(2);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumToAssetsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumToAssetsV1]);
   });
 });


@@ -34,6 +34,7 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
     });
@@ -45,7 +46,6 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       const { albumUser } = await ctx.newAlbumUser({ albumId: album.id, userId: user1.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(1);
       expect(response).toEqual([
         {
           ack: expect.any(String),
@@ -56,10 +56,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
     it('should detect and sync an updated shared user', async () => {
@@ -71,11 +72,10 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
       await albumUserRepo.update({ albumsId: album.id, usersId: user1.id }, { role: AlbumUserRole.Viewer });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(newResponse).toHaveLength(1);
       expect(newResponse).toEqual([
         {
           ack: expect.any(String),
@@ -86,10 +86,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
     it('should detect and sync a deleted shared user', async () => {
@@ -100,9 +101,8 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       const { albumUser } = await ctx.newAlbumUser({ albumId: album.id, userId: user1.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(1);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
       await albumUserRepo.delete({ albumsId: album.id, usersId: user1.id });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
@@ -115,10 +115,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserDeleteV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
   });
@@ -134,7 +135,6 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(1);
       expect(response).toEqual([
         {
           ack: expect.any(String),
@@ -145,10 +145,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
     it('should detect and sync an updated shared user', async () => {
@@ -161,10 +162,14 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(2);
+      expect(response).toEqual([
+        expect.objectContaining({ type: SyncEntityType.AlbumUserV1 }),
+        expect.objectContaining({ type: SyncEntityType.AlbumUserV1 }),
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+      ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
       await albumUserRepo.update({ albumsId: album.id, usersId: user.id }, { role: AlbumUserRole.Viewer });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
@@ -178,10 +183,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
     it('should detect and sync a deleted shared user', async () => {
@@ -194,10 +200,14 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(2);
+      expect(response).toEqual([
+        expect.objectContaining({ type: SyncEntityType.AlbumUserV1 }),
+        expect.objectContaining({ type: SyncEntityType.AlbumUserV1 }),
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+      ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
       await albumUserRepo.delete({ albumsId: album.id, usersId: user.id });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
@@ -210,10 +220,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserDeleteV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
     it('should backfill album users when a user shares an album with you', async () => {
@@ -232,7 +243,6 @@ describe(SyncRequestType.AlbumUsersV1, () => {
       await ctx.newAlbumUser({ albumId: album1.id, userId: user2.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1]);
-      expect(response).toHaveLength(1);
       expect(response).toEqual([
         {
           ack: expect.any(String),
@@ -243,6 +253,7 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       // ack initial user
@@ -285,10 +296,11 @@ describe(SyncRequestType.AlbumUsersV1, () => {
           }),
           type: SyncEntityType.AlbumUserV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumUsersV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumUsersV1]);
     });
   });
 });


@@ -24,7 +24,6 @@ describe(SyncRequestType.AlbumsV1, () => {
     const { album } = await ctx.newAlbum({ ownerId: auth.user.id });
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -35,10 +34,11 @@ describe(SyncRequestType.AlbumsV1, () => {
         }),
         type: SyncEntityType.AlbumV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
   });
   it('should detect and sync a new album', async () => {
@@ -46,7 +46,6 @@ describe(SyncRequestType.AlbumsV1, () => {
     const { album } = await ctx.newAlbum({ ownerId: auth.user.id });
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -55,10 +54,11 @@ describe(SyncRequestType.AlbumsV1, () => {
         }),
         type: SyncEntityType.AlbumV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
   });
   it('should detect and sync an album delete', async () => {
@@ -67,7 +67,6 @@ describe(SyncRequestType.AlbumsV1, () => {
     const { album } = await ctx.newAlbum({ ownerId: auth.user.id });
     const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -76,12 +75,12 @@ describe(SyncRequestType.AlbumsV1, () => {
         }),
         type: SyncEntityType.AlbumV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await albumRepo.delete(album.id);
     const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-    expect(newResponse).toHaveLength(1);
     expect(newResponse).toEqual([
       {
         ack: expect.any(String),
@@ -90,10 +89,11 @@ describe(SyncRequestType.AlbumsV1, () => {
         },
         type: SyncEntityType.AlbumDeleteV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, newResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
   });
   describe('shared albums', () => {
@@ -104,17 +104,17 @@ describe(SyncRequestType.AlbumsV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(response).toHaveLength(1);
       expect(response).toEqual([
         {
           ack: expect.any(String),
           data: expect.objectContaining({ id: album.id }),
           type: SyncEntityType.AlbumV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
     });
     it('should detect and sync an album share (share before sync)', async () => {
@@ -124,17 +124,17 @@ describe(SyncRequestType.AlbumsV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(response).toHaveLength(1);
       expect(response).toEqual([
         {
           ack: expect.any(String),
           data: expect.objectContaining({ id: album.id }),
           type: SyncEntityType.AlbumV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
     });
     it('should detect and sync an album share (share after sync)', async () => {
@@ -150,23 +150,24 @@ describe(SyncRequestType.AlbumsV1, () => {
           data: expect.objectContaining({ id: userAlbum.id }),
           type: SyncEntityType.AlbumV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, response);
       await ctx.newAlbumUser({ userId: auth.user.id, albumId: user2Album.id, role: AlbumUserRole.Editor });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(newResponse).toHaveLength(1);
       expect(newResponse).toEqual([
         {
           ack: expect.any(String),
           data: expect.objectContaining({ id: user2Album.id }),
           type: SyncEntityType.AlbumV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
     });
     it('should detect and sync an album delete`', async () => {
@@ -177,24 +178,27 @@ describe(SyncRequestType.AlbumsV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(response).toHaveLength(1);
+      expect(response).toEqual([
+        expect.objectContaining({ type: SyncEntityType.AlbumV1 }),
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+      ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
       await albumRepo.delete(album.id);
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(newResponse).toHaveLength(1);
       expect(newResponse).toEqual([
         {
           ack: expect.any(String),
           data: { albumId: album.id },
           type: SyncEntityType.AlbumDeleteV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
     });
     it('should detect and sync an album unshare as an album delete', async () => {
@@ -205,10 +209,13 @@ describe(SyncRequestType.AlbumsV1, () => {
       await ctx.newAlbumUser({ albumId: album.id, userId: auth.user.id, role: AlbumUserRole.Editor });
       const response = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
-      expect(response).toHaveLength(1);
+      expect(response).toEqual([
+        expect.objectContaining({ type: SyncEntityType.AlbumV1 }),
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+      ]);
       await ctx.syncAckAll(auth, response);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
       await albumUserRepo.delete({ albumsId: album.id, usersId: auth.user.id });
       const newResponse = await ctx.syncStream(auth, [SyncRequestType.AlbumsV1]);
@@ -218,10 +225,11 @@ describe(SyncRequestType.AlbumsV1, () => {
           data: { albumId: album.id },
           type: SyncEntityType.AlbumDeleteV1,
         },
+        expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
       ]);
       await ctx.syncAckAll(auth, newResponse);
-      await expect(ctx.syncStream(auth, [SyncRequestType.AlbumsV1])).resolves.toEqual([]);
+      await ctx.assertSyncIsComplete(auth, [SyncRequestType.AlbumsV1]);
     });
   });
 });


@@ -24,7 +24,6 @@ describe(SyncRequestType.AssetExifsV1, () => {
     await ctx.newExif({ assetId: asset.id, make: 'Canon' });
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetExifsV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -57,10 +56,11 @@ describe(SyncRequestType.AssetExifsV1, () => {
         },
         type: SyncEntityType.AssetExifV1,
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetExifsV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetExifsV1]);
   });
   it('should only sync asset exif for own user', async () => {
@@ -72,7 +72,10 @@ describe(SyncRequestType.AssetExifsV1, () => {
     const { session } = await ctx.newSession({ userId: user2.id });
     const auth2 = factory.auth({ session, user: user2 });
-    await expect(ctx.syncStream(auth2, [SyncRequestType.AssetExifsV1])).resolves.toHaveLength(1);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetExifsV1])).resolves.toHaveLength(0);
+    await expect(ctx.syncStream(auth2, [SyncRequestType.AssetExifsV1])).resolves.toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetExifV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetExifsV1]);
   });
 });


@@ -26,7 +26,6 @@ describe(SyncEntityType.AssetFaceV1, () => {
     const { assetFace } = await ctx.newAssetFace({ assetId: asset.id, personId: person.id });
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetFacesV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -44,10 +43,11 @@ describe(SyncEntityType.AssetFaceV1, () => {
         }),
         type: 'AssetFaceV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetFacesV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetFacesV1]);
   });
   it('should detect and sync a deleted asset face', async () => {
@@ -58,7 +58,6 @@ describe(SyncEntityType.AssetFaceV1, () => {
     await personRepo.deleteAssetFace(assetFace.id);
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetFacesV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -67,10 +66,11 @@ describe(SyncEntityType.AssetFaceV1, () => {
         },
         type: 'AssetFaceDeleteV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetFacesV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetFacesV1]);
   });
   it('should not sync an asset face or asset face delete for an unrelated user', async () => {
@@ -82,11 +82,18 @@ describe(SyncEntityType.AssetFaceV1, () => {
     const { assetFace } = await ctx.newAssetFace({ assetId: asset.id });
     const auth2 = factory.auth({ session, user: user2 });
-    expect(await ctx.syncStream(auth2, [SyncRequestType.AssetFacesV1])).toHaveLength(1);
-    expect(await ctx.syncStream(auth, [SyncRequestType.AssetFacesV1])).toHaveLength(0);
+    expect(await ctx.syncStream(auth2, [SyncRequestType.AssetFacesV1])).toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetFaceV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetFacesV1]);
     await personRepo.deleteAssetFace(assetFace.id);
-    expect(await ctx.syncStream(auth2, [SyncRequestType.AssetFacesV1])).toHaveLength(1);
-    expect(await ctx.syncStream(auth, [SyncRequestType.AssetFacesV1])).toHaveLength(0);
+    expect(await ctx.syncStream(auth2, [SyncRequestType.AssetFacesV1])).toEqual([
+      expect.objectContaining({ type: SyncEntityType.AssetFaceDeleteV1 }),
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetFacesV1]);
   });
 });


@@ -26,7 +26,6 @@ describe(SyncEntityType.AssetMetadataV1, () => {
     await assetRepo.upsertMetadata(asset.id, [{ key: AssetMetadataKey.MobileApp, value: { iCloudId: 'abc123' } }]);
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetMetadataV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -37,10 +36,11 @@ describe(SyncEntityType.AssetMetadataV1, () => {
         },
         type: 'AssetMetadataV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetMetadataV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetMetadataV1]);
   });
   it('should update asset metadata', async () => {
@@ -51,7 +51,6 @@ describe(SyncEntityType.AssetMetadataV1, () => {
     await assetRepo.upsertMetadata(asset.id, [{ key: AssetMetadataKey.MobileApp, value: { iCloudId: 'abc123' } }]);
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetMetadataV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -62,6 +61,7 @@ describe(SyncEntityType.AssetMetadataV1, () => {
         },
         type: 'AssetMetadataV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
@@ -79,10 +79,11 @@ describe(SyncEntityType.AssetMetadataV1, () => {
         },
         type: 'AssetMetadataV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, updatedResponse);
-    await expect(ctx.syncStream(auth, [SyncRequestType.AssetMetadataV1])).resolves.toEqual([]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetMetadataV1]);
   });
 });
@@ -95,7 +96,6 @@ describe(SyncEntityType.AssetMetadataDeleteV1, () => {
     await assetRepo.upsertMetadata(asset.id, [{ key: AssetMetadataKey.MobileApp, value: { iCloudId: 'abc123' } }]);
     const response = await ctx.syncStream(auth, [SyncRequestType.AssetMetadataV1]);
-    expect(response).toHaveLength(1);
     expect(response).toEqual([
       {
         ack: expect.any(String),
@@ -106,6 +106,7 @@ describe(SyncEntityType.AssetMetadataDeleteV1, () => {
         },
         type: 'AssetMetadataV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
     await ctx.syncAckAll(auth, response);
@@ -121,6 +122,7 @@ describe(SyncEntityType.AssetMetadataDeleteV1, () => {
         },
         type: 'AssetMetadataDeleteV1',
       },
+      expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
     ]);
   });
 });


@@ -40,7 +40,6 @@ describe(SyncEntityType.AssetV1, () => {
 });
 const response = await ctx.syncStream(auth, [SyncRequestType.AssetsV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -64,10 +63,11 @@ describe(SyncEntityType.AssetV1, () => {
 },
 type: 'AssetV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
-await expect(ctx.syncStream(auth, [SyncRequestType.AssetsV1])).resolves.toEqual([]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
 });
 it('should detect and sync a deleted asset', async () => {
@@ -77,7 +77,6 @@ describe(SyncEntityType.AssetV1, () => {
 await assetRepo.remove(asset);
 const response = await ctx.syncStream(auth, [SyncRequestType.AssetsV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -86,10 +85,11 @@ describe(SyncEntityType.AssetV1, () => {
 },
 type: 'AssetDeleteV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
-await expect(ctx.syncStream(auth, [SyncRequestType.AssetsV1])).resolves.toEqual([]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
 });
 it('should not sync an asset or asset delete for an unrelated user', async () => {
@@ -100,11 +100,17 @@ describe(SyncEntityType.AssetV1, () => {
 const { asset } = await ctx.newAsset({ ownerId: user2.id });
 const auth2 = factory.auth({ session, user: user2 });
-expect(await ctx.syncStream(auth2, [SyncRequestType.AssetsV1])).toHaveLength(1);
-expect(await ctx.syncStream(auth, [SyncRequestType.AssetsV1])).toHaveLength(0);
+expect(await ctx.syncStream(auth2, [SyncRequestType.AssetsV1])).toEqual([
+expect.objectContaining({ type: SyncEntityType.AssetV1 }),
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
 await assetRepo.remove(asset);
-expect(await ctx.syncStream(auth2, [SyncRequestType.AssetsV1])).toHaveLength(1);
-expect(await ctx.syncStream(auth, [SyncRequestType.AssetsV1])).toHaveLength(0);
+expect(await ctx.syncStream(auth2, [SyncRequestType.AssetsV1])).toEqual([
+expect.objectContaining({ type: SyncEntityType.AssetDeleteV1 }),
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
 });
 });
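The diffs above repeatedly swap `resolves.toEqual([])` for `ctx.assertSyncIsComplete(...)`, reflecting that an up-to-date stream now ends with a `SyncCompleteV1` item instead of being empty. A plausible shape for such a helper, sketched on a plain array rather than the real `SyncTestContext` (the type and error message here are assumptions, not the actual implementation in `test/medium.factory`):

```typescript
// Hypothetical sketch: after all acks, a sync stream should contain exactly
// one SyncCompleteV1 item and nothing else.
type SyncItem = { type: string; ack: string; data: unknown };

const assertSyncIsComplete = (response: SyncItem[]): void => {
  if (response.length !== 1 || response[0].type !== 'SyncCompleteV1') {
    // Listing the received types makes a failing test easy to diagnose.
    throw new Error(`expected only SyncCompleteV1, got [${response.map((i) => i.type).join(', ')}]`);
  }
};
```

Compared with asserting an empty array, this also catches the case where a client never receives the completion marker at all.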


@@ -22,7 +22,6 @@ describe(SyncEntityType.AuthUserV1, () => {
 const { auth, user, ctx } = await setup(await getKyselyDB());
 const response = await ctx.syncStream(auth, [SyncRequestType.AuthUsersV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -43,10 +42,11 @@ describe(SyncEntityType.AuthUserV1, () => {
 },
 type: 'AuthUserV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
-await expect(ctx.syncStream(auth, [SyncRequestType.AuthUsersV1])).resolves.toEqual([]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.AuthUsersV1]);
 });
 it('should sync a change and then another change to that same user', async () => {
@@ -55,7 +55,6 @@ describe(SyncEntityType.AuthUserV1, () => {
 const userRepo = ctx.get(UserRepository);
 const response = await ctx.syncStream(auth, [SyncRequestType.AuthUsersV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -65,6 +64,7 @@ describe(SyncEntityType.AuthUserV1, () => {
 }),
 type: 'AuthUserV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
@@ -72,7 +72,6 @@ describe(SyncEntityType.AuthUserV1, () => {
 await userRepo.update(user.id, { isAdmin: true });
 const newResponse = await ctx.syncStream(auth, [SyncRequestType.AuthUsersV1]);
-expect(newResponse).toHaveLength(1);
 expect(newResponse).toEqual([
 {
 ack: expect.any(String),
@@ -82,6 +81,7 @@ describe(SyncEntityType.AuthUserV1, () => {
 }),
 type: 'AuthUserV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 });
 });


@@ -0,0 +1,60 @@
+import { Kysely } from 'kysely';
+import { DateTime } from 'luxon';
+import { SyncEntityType, SyncRequestType } from 'src/enum';
+import { SyncCheckpointRepository } from 'src/repositories/sync-checkpoint.repository';
+import { DB } from 'src/schema';
+import { toAck } from 'src/utils/sync';
+import { SyncTestContext } from 'test/medium.factory';
+import { getKyselyDB } from 'test/utils';
+import { v7 } from 'uuid';
+
+let defaultDatabase: Kysely<DB>;
+
+const setup = async (db?: Kysely<DB>) => {
+  const ctx = new SyncTestContext(db || defaultDatabase);
+  const { auth, user, session } = await ctx.newSyncAuthUser();
+  return { auth, user, session, ctx };
+};
+
+beforeAll(async () => {
+  defaultDatabase = await getKyselyDB();
+});
+
+describe(SyncEntityType.SyncCompleteV1, () => {
+  it('should work', async () => {
+    const { auth, ctx } = await setup();
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
+  });
+
+  it('should detect an old checkpoint and send back a reset', async () => {
+    const { auth, session, ctx } = await setup();
+    const updateId = v7({ msecs: DateTime.now().minus({ days: 60 }).toMillis() });
+    await ctx.get(SyncCheckpointRepository).upsertAll([
+      {
+        type: SyncEntityType.SyncCompleteV1,
+        sessionId: session.id,
+        ack: toAck({ type: SyncEntityType.SyncCompleteV1, updateId }),
+      },
+    ]);
+    const response = await ctx.syncStream(auth, [SyncRequestType.AssetsV1]);
+    expect(response).toEqual([{ type: SyncEntityType.SyncResetV1, data: {}, ack: 'SyncResetV1|reset' }]);
+  });
+
+  it('should not send back a reset if the checkpoint is recent', async () => {
+    const { auth, session, ctx } = await setup();
+    const updateId = v7({ msecs: DateTime.now().minus({ days: 7 }).toMillis() });
+    await ctx.get(SyncCheckpointRepository).upsertAll([
+      {
+        type: SyncEntityType.SyncCompleteV1,
+        sessionId: session.id,
+        ack: toAck({ type: SyncEntityType.SyncCompleteV1, updateId }),
+      },
+    ]);
+    await ctx.assertSyncIsComplete(auth, [SyncRequestType.AssetsV1]);
+  });
+});


@@ -25,7 +25,6 @@ describe(SyncEntityType.MemoryToAssetV1, () => {
 await ctx.newMemoryAsset({ memoryId: memory.id, assetId: asset.id });
 const response = await ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -35,10 +34,11 @@ describe(SyncEntityType.MemoryToAssetV1, () => {
 },
 type: 'MemoryToAssetV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
-await expect(ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1])).resolves.toEqual([]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.MemoryToAssetsV1]);
 });
 it('should detect and sync a deleted memory to asset relation', async () => {
@@ -50,7 +50,6 @@ describe(SyncEntityType.MemoryToAssetV1, () => {
 await memoryRepo.removeAssetIds(memory.id, [asset.id]);
 const response = await ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1]);
-expect(response).toHaveLength(1);
 expect(response).toEqual([
 {
 ack: expect.any(String),
@@ -60,10 +59,11 @@ describe(SyncEntityType.MemoryToAssetV1, () => {
 },
 type: 'MemoryToAssetDeleteV1',
 },
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
 ]);
 await ctx.syncAckAll(auth, response);
-await expect(ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1])).resolves.toEqual([]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.MemoryToAssetsV1]);
 });
 it('should not sync a memory to asset relation or delete for an unrelated user', async () => {
@@ -74,11 +74,18 @@ describe(SyncEntityType.MemoryToAssetV1, () => {
 const { memory } = await ctx.newMemory({ ownerId: user2.id });
 await ctx.newMemoryAsset({ memoryId: memory.id, assetId: asset.id });
-expect(await ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1])).toHaveLength(0);
-expect(await ctx.syncStream(auth2, [SyncRequestType.MemoryToAssetsV1])).toHaveLength(1);
+expect(await ctx.syncStream(auth2, [SyncRequestType.MemoryToAssetsV1])).toEqual([
+expect.objectContaining({ type: SyncEntityType.MemoryToAssetV1 }),
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.MemoryToAssetsV1]);
 await memoryRepo.removeAssetIds(memory.id, [asset.id]);
-expect(await ctx.syncStream(auth, [SyncRequestType.MemoryToAssetsV1])).toHaveLength(0);
-expect(await ctx.syncStream(auth2, [SyncRequestType.MemoryToAssetsV1])).toHaveLength(1);
+expect(await ctx.syncStream(auth2, [SyncRequestType.MemoryToAssetsV1])).toEqual([
+expect.objectContaining({ type: SyncEntityType.MemoryToAssetDeleteV1 }),
+expect.objectContaining({ type: SyncEntityType.SyncCompleteV1 }),
+]);
+await ctx.assertSyncIsComplete(auth, [SyncRequestType.MemoryToAssetsV1]);
 });
 });

Some files were not shown because too many files have changed in this diff.