We have reproduced RBD-638 consistently on a test environment (Octopus 15.2.7). The failure occurs only when the destination image lives on a remote pool (accessed via CephFS or a separate cluster). `rbd info` works, but `rbd export-diff` aborts with "No such file or directory".
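For reference, the reproduction looks roughly like this (pool, image, snapshot, and path names are placeholders, not the exact test setup):

```shell
# Works: metadata lookup against the remote pool succeeds.
rbd --pool remote-pool info backup-image

# Fails: export-diff against the same image aborts with ENOENT.
rbd export-diff --pool remote-pool backup-image@snap1 /mnt/cephfs/backup.diff
```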
Next steps: gather higher-verbosity logs, double-check client capabilities on the remote pool, and test the scenario without a CephFS mount (direct `--dest-pool` flag).
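As a starting point for the log and capability steps above, a sketch of the diagnostic commands (image, pool, and client names are placeholders):

```shell
# Re-run the failing command with high client-side verbosity
# and capture the output for the bug:
rbd export-diff --debug-rbd 20 --debug-ms 1 \
    --pool remote-pool backup-image@snap1 /tmp/backup.diff 2> rbd-638-debug.log

# Inspect the capabilities of the client identity used for the
# remote pool ("client.backup" is a placeholder name):
ceph auth get client.backup
```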
Summary: `rbd export-diff` blows up with ENOENT when the destination image is on a remote pool. The problem is reproducible, likely a CLI parsing / auth issue, and can be temporarily mitigated by copying the destination image locally or using a block-device mapping. Next steps focus on logs, capability checks, and upstream investigation.

Feel free to paste the table above directly into the bug, add any extra logs you have, and assign the appropriate owners. Happy debugging! 🚀

– Alice (QA)
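The two temporary mitigations mentioned above could look roughly like this (pool and image names are placeholders; untested against the failing setup):

```shell
# Mitigation 1: copy the image into a local pool first,
# then run export-diff against the local copy.
rbd cp remote-pool/backup-image local-pool/backup-image

# Mitigation 2: map the image as a block device and read it
# directly instead of going through export-diff.
rbd map remote-pool/backup-image   # prints the mapped device, e.g. /dev/rbd0
```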