Three-stage pipeline (stage, embed, cluster) plus an orchestrator script
that imports a self-hosted Immich library and emits new facesets without
disturbing existing identity numbering:
- work/immich_stage.py (WSL): pages /search/metadata, parallel-fetches
  /faces?id= per asset, prefilters faces whose bbox short side (scaled
  to original-image coords) is >=90 px, downloads originals, and
  sha256-dedups against nl_full.npz and same-run staged files. An
  8-worker ThreadPoolExecutor runs the full /faces->filter->/original
  chain per asset; resumable via state.json. API URL and key come from
  IMMICH_URL / IMMICH_API_KEY env vars; the label->UUID map lives in
  work/immich/users.json (gitignored).
- work/embed_worker.py (Windows venv at C:\face_embed_venv): runs
  insightface.FaceAnalysis(buffalo_l) with the DmlExecutionProvider on
  an AMD Radeon Vega via onnxruntime-directml. Produces a cache file in
  the same .npz schema as sort_faces.cmd_embed (loadable via
  load_cache). ~7.5x end-to-end speedup over CPU; embeddings match CPU
  output numerically (cosine similarity 1.0000 across 8 sample faces).
- work/cluster_immich.py (WSL): mirrors cluster_osrc.py against an
  immich_<user>.npz. Builds existing identity centroids from canonical
  faceset_NNN/ dirs in facesets_swap_ready/, assigns faces to an
  existing identity at distance <=0.45 (dropping them from clustering),
  clusters the remainder at 0.55, applies refine gates, and hands off
  to cmd_export_swap. New facesets are numbered past the existing
  maximum.
- work/finalize_immich.sh: chains queue->Windows embed->cache copy->
cluster_immich, with logging.
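The stage-time prefilter above can be sketched as below. This is a
hypothetical helper, not the actual immich_stage.py code; the
boundingBoxX1..Y2 / imageWidth / imageHeight field names are assumptions
about the shape of Immich's asset-face response, and the 90 px threshold
is the face_short gate from the description.

```python
# Sketch (assumed field names): Immich reports face bounding boxes against a
# resized image, so the box is scaled back to original coordinates before the
# short-side >= 90 px check; sha256 is the dedup key for downloaded originals.
import hashlib

MIN_FACE_SHORT = 90  # minimum bbox short side, in original-image pixels


def face_passes(face: dict, orig_w: int, orig_h: int) -> bool:
    """Scale the bbox from the reported imageWidth/imageHeight coords to the
    original image, then require the short side to be >= MIN_FACE_SHORT."""
    sx = orig_w / face["imageWidth"]
    sy = orig_h / face["imageHeight"]
    w = (face["boundingBoxX2"] - face["boundingBoxX1"]) * sx
    h = (face["boundingBoxY2"] - face["boundingBoxY1"]) * sy
    return min(w, h) >= MIN_FACE_SHORT


def sha256_hex(data: bytes) -> str:
    """Content hash used to dedup downloads against prior caches and same-run files."""
    return hashlib.sha256(data).hexdigest()
```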
The 2026-04-26 run on https://fotos.computerliebe.org (Immich v2.7.2)
processed 53,842 admin-accessible assets, staged 10,261, embedded
19,462 face records on Vega DML in 64.6 min, matched 8,103 (42%) to
existing identities, and emitted 185 new facesets (faceset_026..264
with gaps). facesets_swap_ready/ went from 31 to 216 substantive
facesets.
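The DML-vs-CPU parity check mentioned above amounts to row-wise cosine
similarity between the two embedding sets. A minimal stand-alone sketch
(plain Python rather than the actual .npz caches):

```python
# Sketch of the parity spot-check: cosine similarity between a GPU-computed
# embedding and its CPU counterpart should be ~1.0 if the backends agree.
import math


def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return dot / (nu * nv)
```

In practice one would load both caches, pair rows by face key, and assert
every similarity rounds to 1.0000.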
Important caveat surfaced: /search/metadata's userIds filter is
silently ignored when the API key is bound to a different user, so
this run can't enumerate other users' libraries from the admin key.
A per-user API key would be required for nic.
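Because the filter fails silently, a cheap guard is to verify ownership on
the first returned page before trusting userIds. Hypothetical sketch; the
"ownerId" field name is an assumption about the search response shape.

```python
# Guard against the silent-ignore behavior: confirm every asset in a page
# actually belongs to the requested user before paging further.
def user_filter_honored(assets: list[dict], user_id: str) -> bool:
    """True iff every asset in the page is owned by the requested user."""
    return all(a.get("ownerId") == user_id for a in assets)
```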
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>