Download size: 84.04MB
immich_machine_learning | [11/16/25 21:28:53] INFO     Successfully saved to: /cache/ocr/PP-OCRv5_server/detection/model.onnx
immich_machine_learning | [11/16/25 21:28:53] INFO     Loading detection model 'PP-OCRv5_server' to memory
immich_machine_learning | [11/16/25 21:28:53] INFO     Setting execution providers to ['OpenVINOExecutionProvider', 'CPUExecutionProvider'], in descending order of preference
immich_machine_learning | [11/16/25 21:29:12] INFO     Downloading recognition model 'PP-OCRv5_server' to /cache/ocr/PP-OCRv5_server/recognition/model.onnx. This may take a while.
immich_machine_learning | [11/16/25 21:29:12] INFO     Initiating download: https://www.modelscope.cn/models/RapidAI/RapidOCR/resolve/v3.4.0/onnx/PP-OCRv5/rec/ch_PP-OCRv5_rec_server_infer.onnx
immich_machine_learning | [11/16/25 21:29:13] INFO     Download size: 80.66MB
immich_machine_learning | [11/16/25 21:29:22] INFO     Successfully saved to: /cache/ocr/PP-OCRv5_server/recognition/model.onnx
immich_machine_learning | [11/16/25 21:29:22] INFO     Loading recognition model 'PP-OCRv5_server' to memory
immich_machine_learning | [11/16/25 21:29:22] INFO     Setting execution providers to ['OpenVINOExecutionProvider', 'CPUExecutionProvider'], in descending order of preference
immich_machine_learning | [11/16/25 21:29:25] INFO     Using engine_name: onnxruntime
immich_machine_learning | 2025-11-16 21:29:43.926179199 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running OpenVINO-EP-subgraph_1 node. Name:'OpenVINOExecutionProvider_OpenVINO-EP-subgraph_1_0' Status Message: /onnxruntime/onnxruntime/core/providers/openvino/ov_interface.cc:243 void onnxruntime::openvino_ep::OVInferRequest::WaitRequest() [OpenVINO-EP] Wait Model Failed: Exception from src/inference/src/cpp/infer_request.cpp:245:
immich_machine_learning | Check 'layout.bytes_count() <= get_device_info().max_alloc_mem_size' failed at src/plugins/intel_gpu/src/runtime/ocl/ocl_engine.cpp:135:
immich_machine_learning | [GPU] Exceeded max size of memory object allocation: requested 5103511272 bytes, but max alloc size supported by device is 4294959104 bytes. Please try to reduce batch size or use lower precision.
immich_machine_learning | [11/16/25 21:29:43] ERROR    Exception in ASGI application
immich_machine_learning |
immich_machine_learning | ╭─────── Traceback (most recent call last) ───────╮
immich_machine_learning | │ /usr/src/immich_ml/main.py:177 in predict
immich_machine_learning | │    174 │   │   inputs = text
immich_machine_learning | │    175 │   else:
immich_machine_learning | │    176 │   │   raise HTTPException(400, "Either
immich_machine_learning | │ ❱  177 │   response = await run_inference(inputs
immich_machine_learning | │    178 │   return ORJSONResponse(response)
immich_machine_learning | │    179
immich_machine_learning | │    180
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/main.py:202 in run_inference
immich_machine_learning | │    199 │   │   response[entry["task"]] = output
immich_machine_learning | │    200 │
immich_machine_learning | │    201 │   without_deps, with_deps = entries
immich_machine_learning | │ ❱  202 │   await asyncio.gather(*[_run_inference
immich_machine_learning | │    203 │   if with_deps:
immich_machine_learning | │    204 │   │   await asyncio.gather(*[_run_infer
immich_machine_learning | │    205 │   if isinstance(payload, Image):
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/main.py:197 in _run_inference
immich_machine_learning | │    194 │   │   │   │   message = f"Task {entry[' output of {dep}"
immich_machine_learning | │    195 │   │   │   │   raise HTTPException(400,
immich_machine_learning | │    196 │   │   model = await load(model)
immich_machine_learning | │ ❱  197 │   │   output = await run(model.predict,
immich_machine_learning | │    198 │   │   outputs[model.identity] = output
immich_machine_learning | │    199 │   │   response[entry["task"]] = output
immich_machine_learning | │    200
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/main.py:215 in run
immich_machine_learning | │    212 │   if thread_pool is None:
immich_machine_learning | │    213 │   │   return func(*args, **kwargs)
immich_machine_learning | │    214 │   partial_func = partial(func, *args, *
immich_machine_learning | │ ❱  215 │   return await asyncio.get_running_loop
immich_machine_learning | │    216
immich_machine_learning | │    217
immich_machine_learning | │    218 async def load(model: InferenceModel) ->
immich_machine_learning | │
immich_machine_learning | │ /usr/local/lib/python3.11/concurrent/futures/thread.py:58 in run
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/models/base.py:60 in predict
immich_machine_learning | │     57 │   │   self.load()
immich_machine_learning | │     58 │   │   if model_kwargs:
immich_machine_learning | │     59 │   │   │   self.configure(**model_kwargs
immich_machine_learning | │ ❱   60 │   │   return self._predict(*inputs)
immich_machine_learning | │     61 │
immich_machine_learning | │     62 │   @abstractmethod
immich_machine_learning | │     63 │   def _predict(self, *inputs: Any, **mo
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/models/ocr/detection.py:70 in _predict
immich_machine_learning | │     67 │   │   w, h = inputs.size
immich_machine_learning | │     68 │   │   if w < 32 or h < 32:
immich_machine_learning | │     69 │   │   │   return self._empty
immich_machine_learning | │ ❱   70 │   │   out = self.session.run(None, {"x"
immich_machine_learning | │     71 │   │   boxes, scores = self.postprocess(
immich_machine_learning | │     72 │   │   if len(boxes) == 0:
immich_machine_learning | │     73 │   │   │   return self._empty
immich_machine_learning | │
immich_machine_learning | │ /usr/src/immich_ml/sessions/ort.py:51 in run
immich_machine_learning | │     48 │   │   input_feed: dict[str, NDArray[np.
immich_machine_learning | │     49 │   │   run_options: Any = None,
immich_machine_learning | │     50 │   ) -> list[NDArray[np.float32]]:
immich_machine_learning | │ ❱   51 │   │   outputs: list[NDArray[np.float32] run_options)
immich_machine_learning | │     52 │   │   return outputs
immich_machine_learning | │     53 │
immich_machine_learning | │     54 │   @property
immich_machine_learning | │
immich_machine_learning | │ /opt/venv/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:220 in run
immich_machine_learning | │    217 │   │   if not output_names:
immich_machine_learning | │    218 │   │   │   output_names = [output.name
immich_machine_learning | │    219 │   │   try:
immich_machine_learning | │ ❱  220 │   │   │   return self._sess.run(output
immich_machine_learning | │    221 │   │   except C.EPFail as err:
immich_machine_learning | │    222 │   │   │   if self._enable_fallback:
immich_machine_learning | │    223 │   │   │   │   print(f"EP Error: {err!s
immich_machine_learning | ╰─────────────────────────────────────────────────╯
immich_machine_learning | Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running OpenVINO-EP-subgraph_1 node. Name:'OpenVINOExecutionProvider_OpenVINO-EP-subgraph_1_0' Status Message: /onnxruntime/onnxruntime/core/providers/openvino/ov_interface.cc:243 void onnxruntime::openvino_ep::OVInferRequest::WaitRequest() [OpenVINO-EP] Wait Model Failed: Exception from src/inference/src/cpp/infer_request.cpp:245:
immich_machine_learning | Check 'layout.bytes_count() <= get_device_info().max_alloc_mem_size' failed at src/plugins/intel_gpu/src/runtime/ocl/ocl_engine.cpp:135:
immich_machine_learning | [GPU] Exceeded max size of memory object allocation: requested 5103511272 bytes, but max alloc size supported by device is 4294959104 bytes. Please try to reduce batch size or use lower precision.
immich_machine_learning |
immich_server | [Nest] 7  - 11/16/2025, 9:29:44 PM    WARN [Microservices:MachineLearningRepository] Machine learning request to "http://immich-machine-learning:3003" failed with status 500: Internal Server Error
immich_server | [Nest] 7  - 11/16/2025, 9:29:44 PM     LOG [Microservices:MachineLearningRepository] Machine learning server became unhealthy (http://immich-machine-learning:3003).
immich_server | [Nest] 7  - 11/16/2025, 9:29:44 PM   ERROR [Microservices:{"id":"cc0efd34-23e1-4343-a4be-ce335d503a38"}] Unable to run job handler (Ocr): Error: Machine learning request '{"ocr":{"detection":{"modelName":"PP-OCRv5_server","options":{"minScore":0.5,"maxResolution":736}},"recognition":{"modelName":"PP-OCRv5_server","options":{"minScore":0.8}}}}' failed for all URLs
immich_server | Error: Machine learning request '{"ocr":{"detection":{"modelName":"PP-OCRv5_server","options":{"minScore":0.5,"maxResolution":736}},"recognition":{"modelName":"PP-OCRv5_server","options":{"minScore":0.8}}}}' failed for all URLs
immich_server |     at MachineLearningRepository.predict (/usr/src/app/server/dist/repositories/machine-learning.repository.js:117:15)
immich_server |     at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
immich_server |     at async MachineLearningRepository.ocr (/usr/src/app/server/dist/repositories/machine-learning.repository.js:150:26)
immich_server |     at async OcrService.handleOcr (/usr/src/app/server/dist/services/ocr.service.js:52:28)
immich_server |     at async JobService.onJobRun (/usr/src/app/server/dist/services/job.service.js:199:30)
immich_server |     at async EventRepository.onEvent (/usr/src/app/server/dist/repositories/event.repository.js:91:13)
immich_server |     at async /usr/src/app/server/node_modules/.pnpm/bullmq@5.61.2/nod