# ADR-003: Neural Network Inference

`rust-port/wifi-densepose-rs/docs/adr/ADR-003-neural-network-inference.md`

## Status

Accepted
## Context

The WiFi-DensePose system requires neural network inference for its pose-estimation pipeline. We need to select an inference strategy that supports pre-trained models and multiple backends.
## Decision

We will implement a multi-backend inference engine with three selectable backends:

- **ONNX Runtime** (via the `ort` crate)
- **tch-rs**: PyTorch C++ bindings
- **candle**: a pure-Rust ML framework

All backends implement a common trait:

```rust
pub trait Backend: Send + Sync {
    fn load_model(&mut self, path: &Path) -> NnResult<()>;
    fn run(&self, inputs: HashMap<String, Tensor>) -> NnResult<HashMap<String, Tensor>>;
    fn input_specs(&self) -> Vec<TensorSpec>;
    fn output_specs(&self) -> Vec<TensorSpec>;
}
```
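Because every engine sits behind this trait, callers can hold a `Box<dyn Backend>` and swap engines without code changes. The sketch below illustrates that pattern; `Tensor`, `TensorSpec`, and `NnResult` are hypothetical stand-ins for the crate's real types, and `EchoBackend` is a toy engine that simply echoes its input.

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};

// Hypothetical stand-ins for the crate's real types, for illustration only.
type NnResult<T> = Result<T, String>;

#[derive(Clone, Debug, PartialEq)]
pub struct Tensor(pub Vec<f32>);

#[derive(Clone, Debug)]
pub struct TensorSpec {
    pub name: String,
    pub shape: Vec<usize>,
}

pub trait Backend: Send + Sync {
    fn load_model(&mut self, path: &Path) -> NnResult<()>;
    fn run(&self, inputs: HashMap<String, Tensor>) -> NnResult<HashMap<String, Tensor>>;
    fn input_specs(&self) -> Vec<TensorSpec>;
    fn output_specs(&self) -> Vec<TensorSpec>;
}

/// Toy backend that echoes its input; a real backend would wrap ort/tch/candle.
struct EchoBackend {
    model: Option<PathBuf>,
}

impl Backend for EchoBackend {
    fn load_model(&mut self, path: &Path) -> NnResult<()> {
        self.model = Some(path.to_path_buf());
        Ok(())
    }
    fn run(&self, inputs: HashMap<String, Tensor>) -> NnResult<HashMap<String, Tensor>> {
        if self.model.is_none() {
            return Err("model not loaded".to_string());
        }
        Ok(inputs) // identity "inference"
    }
    fn input_specs(&self) -> Vec<TensorSpec> {
        vec![TensorSpec { name: "csi".into(), shape: vec![1, 3, 3, 30] }]
    }
    fn output_specs(&self) -> Vec<TensorSpec> {
        vec![TensorSpec { name: "pose".into(), shape: vec![1, 17, 2] }]
    }
}

fn main() -> NnResult<()> {
    // Callers depend only on the trait object, not the concrete engine.
    let mut backend: Box<dyn Backend> = Box::new(EchoBackend { model: None });
    backend.load_model(Path::new("model.onnx"))?;

    let mut inputs = HashMap::new();
    inputs.insert("csi".to_string(), Tensor(vec![0.0; 4]));
    let outputs = backend.run(inputs)?;
    assert!(outputs.contains_key("csi"));
    Ok(())
}
```

The `Send + Sync` bounds let the same backend be shared across worker threads, which matters for a streaming CSI pipeline.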
Backends are selected at build time via Cargo features:

```toml
[features]
default = ["onnx"]
onnx = ["ort"]
tch-backend = ["tch"]
candle-backend = ["candle-core", "candle-nn"]
cuda = ["ort/cuda"]
tensorrt = ["ort/tensorrt"]
```
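This feature matrix implies a compile-time choice of default engine. A minimal sketch of how a factory might dispatch on the enabled features follows; the `default_backend` function and the string tags it returns are illustrative assumptions, not the crate's real API (real code would return a boxed `Backend`).

```rust
/// Pick the highest-priority backend that was compiled in.
/// Illustrative only: returns a tag instead of a real engine.
pub fn default_backend() -> Result<&'static str, String> {
    // Each arm exists only when its Cargo feature is enabled.
    #[cfg(feature = "onnx")]
    {
        return Ok("onnx");
    }
    #[cfg(feature = "tch-backend")]
    {
        return Ok("tch");
    }
    #[cfg(feature = "candle-backend")]
    {
        return Ok("candle");
    }
    // Reached only when the crate is built with no backend feature.
    Err("no inference backend compiled in".to_string())
}

fn main() {
    match default_backend() {
        Ok(name) => println!("using backend: {name}"),
        Err(e) => eprintln!("{e}"),
    }
}
```

Because `default = ["onnx"]`, an out-of-the-box build would take the ONNX arm, while `--no-default-features` surfaces the error at startup rather than at first inference.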