altametris.sara.yolo
====================

.. py:module:: altametris.sara.yolo

.. autoapi-nested-parse::

   YOLO - Object Detection and Instance Segmentation.

   Wrapper around Ultralytics YOLO (v8, v11).


Submodules
----------

.. toctree::
   :maxdepth: 1

   /autoapi/altametris/sara/yolo/callbacks/index
   /autoapi/altametris/sara/yolo/detector/index
   /autoapi/altametris/sara/yolo/exporter/index
   /autoapi/altametris/sara/yolo/model/index
   /autoapi/altametris/sara/yolo/trainer/index
   /autoapi/altametris/sara/yolo/utils/index


Classes
-------

.. autoapisummary::

   altametris.sara.yolo.YoloDetector
   altametris.sara.yolo.YoloExporter
   altametris.sara.yolo.YoloModel
   altametris.sara.yolo.YoloTrainer


Package Contents
----------------

.. py:class:: YoloDetector(model_path: Union[str, pathlib.Path], task: str = 'detect', device: str = 'auto', warmup: bool = True, **kwargs: Any)

   Bases: :py:obj:`altametris.sara.core.base_detector.BaseDetector`

   YOLO detector implementing BaseDetector interface.

   Provides inference capabilities with support for images, videos, and
   batch processing.

   :param model_path: Path to YOLO model weights
   :param task: YOLO task ("detect", "segment", "obb", "pose")
   :param device: Inference device
   :param warmup: Run warmup on initialization

   .. rubric:: Example

   >>> detector = YoloDetector(model_path="yolo11x.pt", device="cuda")
   >>> results = detector.predict("image.jpg", conf=0.5, imgsz=640)

   .. py:attribute:: SUPPORTED_TASKS
      :value: ['detect', 'segment', 'obb', 'pose', 'classify']

   .. py:attribute:: task
      :value: 'detect'

   .. py:attribute:: model
      :type: ultralytics.YOLO

   .. py:method:: _load_model(model_path: pathlib.Path, **kwargs: Any) -> None

      Load YOLO model from path.

      :param model_path: Path to model file
      :param \*\*kwargs: Additional loading arguments
      :raises ModelError: If model cannot be loaded

   .. py:method:: _validate_predict_params(conf: float, iou: float, imgsz: int) -> None

      Validate prediction parameters before inference.

      :param conf: Confidence threshold
      :param iou: IoU threshold
      :param imgsz: Image size
      :raises ConfigurationError: If parameters are invalid

   .. py:method:: predict(source: Union[str, pathlib.Path, numpy.typing.NDArray[numpy.uint8], PIL.Image.Image, List[Any]], conf: float = 0.25, iou: float = 0.45, imgsz: int = 640, verbose: bool = False, **kwargs: Any) -> Any

      Run inference on source.

      :param source: Input source (image path, array, PIL Image, or list)
      :param conf: Confidence threshold
      :param iou: IoU threshold for NMS
      :param imgsz: Input image size
      :param verbose: Verbose output
      :param \*\*kwargs: Additional inference arguments
      :returns: Ultralytics Results object(s)
      :raises ConfigurationError: If parameters (conf, iou, imgsz) are invalid
      :raises InferenceError: If prediction fails

      .. rubric:: Example

      >>> # Single image
      >>> results = detector.predict("image.jpg", conf=0.5)
      >>>
      >>> # Batch of images
      >>> results = detector.predict(["img1.jpg", "img2.jpg"], conf=0.5)
      >>>
      >>> # NumPy array
      >>> img: NDArray[np.uint8] = np.random.randint(0, 256, (640, 640, 3), dtype=np.uint8)
      >>> results = detector.predict(img, conf=0.5)

   .. py:method:: predict_batch(sources: List[Union[str, pathlib.Path, numpy.typing.NDArray[numpy.uint8]]], batch_size: int = 32, **kwargs: Any) -> List[Any]

      Run batch inference on multiple sources.

      :param sources: List of input sources
      :param batch_size: Batch size for processing
      :param \*\*kwargs: Additional inference arguments
      :returns: List of results for each source

      .. rubric:: Example

      >>> images = ["img1.jpg", "img2.jpg", "img3.jpg"]
      >>> results = detector.predict_batch(images, batch_size=8)

   .. py:method:: warmup(iterations: int = 3, imgsz: int = 640) -> None

      Warmup model with dummy inference.

      :param iterations: Number of warmup iterations
      :param imgsz: Image size for warmup

      .. rubric:: Example

      >>> detector.warmup(iterations=5, imgsz=640)

   .. py:method:: get_names() -> dict[int, str]

      Get class names mapping.

      :returns: Dictionary mapping class IDs to names

      .. rubric:: Example

      >>> names = detector.get_names()
      >>> print(names[0])  # 'person'

   .. py:property:: num_classes
      :type: int

      Get number of classes.

   .. py:method:: __repr__() -> str

      String representation.
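A minimal end-to-end sketch for :py:class:`YoloDetector` (the ``yolo11n.pt`` weight file, the image path, and the box post-processing are illustrative; it assumes ``predict`` returns the usual Ultralytics list of ``Results``, whose ``boxes`` expose ``cls``, ``conf`` and ``xyxy``):

.. code-block:: python

   from altametris.sara.yolo import YoloDetector

   # Placeholder weights and image; substitute your own files.
   detector = YoloDetector(model_path="yolo11n.pt", device="auto", warmup=True)
   results = detector.predict("image.jpg", conf=0.5, iou=0.45, imgsz=640)

   names = detector.get_names()
   for result in results:            # one Ultralytics Results object per image
       for box in result.boxes:      # per-detection view of the Boxes container
           cls_id = int(box.cls)
           score = float(box.conf)
           x1, y1, x2, y2 = box.xyxy[0].tolist()
           print(f"{names[cls_id]}: {score:.2f} at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")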
.. py:class:: YoloExporter(model_path: Union[str, pathlib.Path], output_dir: Optional[Union[str, pathlib.Path]] = None)

   Bases: :py:obj:`altametris.sara.core.base_exporter.BaseExporter`

   YOLO exporter implementing BaseExporter interface.

   Supports export to multiple formats via Ultralytics backend.

   :param model_path: Path to YOLO model weights
   :param output_dir: Output directory for exports

   .. rubric:: Example

   >>> exporter = YoloExporter(model_path="yolo11x.pt")
   >>> onnx_path = exporter.export(format="onnx", imgsz=640)
   >>> trt_path = exporter.export(format="tensorrt", imgsz=640, half=True)

   .. py:attribute:: SUPPORTED_FORMATS
      :value: ['onnx', 'torchscript', 'tensorrt', 'coreml', 'tflite', 'engine']

   .. py:method:: export(format: str = 'onnx', imgsz: Union[int, tuple[int, int]] = 640, half: bool = False, dynamic: bool = False, simplify: bool = True, **kwargs: Any) -> pathlib.Path

      Export YOLO model to specified format.

      :param format: Export format
      :param imgsz: Input image size (int or (height, width))
      :param half: Use FP16 precision
      :param dynamic: Dynamic axes for ONNX
      :param simplify: Simplify ONNX model
      :param \*\*kwargs: Additional export arguments
      :returns: Path to exported model
      :raises ExportError: If export fails

      .. rubric:: Example

      >>> # Export to ONNX
      >>> exporter.export(format="onnx", imgsz=640, half=False)
      >>>
      >>> # Export to TensorRT
      >>> exporter.export(format="tensorrt", imgsz=640, half=True)

   .. py:method:: validate_export(export_path: pathlib.Path, test_image: Optional[Union[str, pathlib.Path]] = None, **kwargs: Any) -> bool

      Validate exported model.

      :param export_path: Path to exported model
      :param test_image: Optional test image for validation
      :param \*\*kwargs: Additional validation arguments
      :returns: True if validation passes
      :raises ExportError: If validation fails

      .. rubric:: Example

      >>> onnx_path = exporter.export(format="onnx")
      >>> exporter.validate_export(onnx_path, test_image="test.jpg")

   .. py:method:: export_to_tensorrt(imgsz: Union[int, tuple[int, int]] = 640, half: bool = True, workspace: float = 4.0, **kwargs: Any) -> pathlib.Path

      Export to TensorRT with optimized settings.

      :param imgsz: Input image size
      :param half: Use FP16 precision
      :param workspace: TensorRT workspace size (GB)
      :param \*\*kwargs: Additional TensorRT arguments
      :returns: Path to TensorRT engine

      .. rubric:: Example

      >>> trt_path = exporter.export_to_tensorrt(imgsz=640, half=True)

   .. py:method:: export_to_onnx(imgsz: Union[int, tuple[int, int]] = 640, dynamic: bool = True, simplify: bool = True, **kwargs: Any) -> pathlib.Path

      Export to ONNX with optimized settings.

      :param imgsz: Input image size
      :param dynamic: Dynamic axes for variable input size
      :param simplify: Simplify ONNX graph
      :param \*\*kwargs: Additional ONNX arguments
      :returns: Path to ONNX model

      .. rubric:: Example

      >>> onnx_path = exporter.export_to_onnx(imgsz=640, dynamic=True)

   .. py:method:: __repr__() -> str

      String representation.
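A complementary sketch: exporting to ONNX and smoke-testing the artifact with ``onnxruntime`` instead of :py:meth:`YoloExporter.validate_export` (the ``onnxruntime`` dependency, the placeholder weights, and the fixed ``1x3x640x640`` input shape are assumptions; a dynamic export would accept other sizes):

.. code-block:: python

   import numpy as np
   import onnxruntime as ort

   from altametris.sara.yolo import YoloExporter

   exporter = YoloExporter(model_path="yolo11n.pt")  # placeholder weights
   onnx_path = exporter.export_to_onnx(imgsz=640, dynamic=False, simplify=True)

   # Load the exported graph and run one dummy inference as a smoke test.
   session = ort.InferenceSession(str(onnx_path), providers=["CPUExecutionProvider"])
   input_name = session.get_inputs()[0].name
   dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
   outputs = session.run(None, {input_name: dummy})
   print([o.shape for o in outputs])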
.. py:class:: YoloModel(model_path: Optional[Union[str, pathlib.Path]] = None, version: str = '11', task: str = 'detect', size: str = 'x', pretrained: bool = False, device: str = 'auto', **kwargs: Any)

   Bases: :py:obj:`altametris.sara.core.base_model.BaseModel`

   YOLO model wrapper implementing BaseModel interface.

   Provides standardized interface for YOLO models from Ultralytics with
   support for multiple versions, tasks, and sizes.

   :param model_path: Path to existing model weights (optional)
   :param version: YOLO version ("11" or "v8")
   :param task: YOLO task ("detect", "segment", "obb", "pose", "classify")
   :param size: Model size ("n", "s", "m", "l", "x")
   :param pretrained: Load pretrained weights from Ultralytics
   :param device: Device for model ("cpu", "cuda", "mps", "auto")

   .. rubric:: Example

   >>> # Create from scratch
   >>> model = YoloModel(version="11", task="detect", size="x")
   >>>
   >>> # Load from weights
   >>> model = YoloModel(model_path="weights/best.pt")

   .. py:attribute:: SUPPORTED_VERSIONS
      :value: ['11', 'v8']

   .. py:attribute:: SUPPORTED_TASKS
      :value: ['detect', 'segment', 'obb', 'pose', 'classify']

   .. py:attribute:: SUPPORTED_SIZES
      :value: ['n', 's', 'm', 'l', 'x']

   .. py:attribute:: yolo
      :type: ultralytics.YOLO

   .. py:method:: _validate_init_params(version: str, task: str, size: str, model_path: Optional[Union[str, pathlib.Path]], pretrained: bool) -> None

      Validate initialization parameters.

   .. py:method:: _get_model_name(version: str, task: str, size: str) -> str

      Generate Ultralytics model name.

      :param version: YOLO version
      :param task: Task type
      :param size: Model size
      :returns: Model name string (e.g., "yolo11x-seg.pt")

   .. py:method:: forward(x: torch.Tensor, **kwargs: Any) -> Any

      Forward pass through YOLO model.

      :param x: Input tensor (B, C, H, W)
      :param \*\*kwargs: Additional forward arguments
      :returns: Model output

      .. rubric:: Example

      >>> x = torch.randn(1, 3, 640, 640)
      >>> output = model(x)

   .. py:method:: load_weights(path: Union[str, pathlib.Path], **kwargs: Any) -> None

      Load weights from file.

      :param path: Path to weights file
      :param \*\*kwargs: Additional loading arguments
      :raises ModelError: If weights cannot be loaded

   .. py:method:: save_weights(path: Union[str, pathlib.Path], **kwargs: Any) -> None

      Save model weights in Ultralytics checkpoint format.

      This method uses the native Ultralytics save() API which creates a
      complete checkpoint file compatible with YOLO loading.

      :param path: Path to save weights
      :param \*\*kwargs: Additional saving arguments

      .. rubric:: Example

      >>> model.save_weights("weights/my_model.pt")

      .. note::

         The saved file is a complete Ultralytics checkpoint that can be
         reloaded with YoloModel(model_path=path) or YOLO(path).

   .. py:property:: names
      :type: dict[int, str]

      Get class names mapping.

   .. py:property:: task
      :type: str

      Get model task.

   .. py:property:: version
      :type: str

      Get YOLO version.

   .. py:method:: __repr__() -> str

      String representation.
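The docstring of :py:meth:`YoloModel._get_model_name` above gives ``"yolo11x-seg.pt"`` as an example output. The sketch below is a hedged reconstruction of that naming convention, following the standard Ultralytics file names; the actual private implementation may differ:

.. code-block:: python

   # Illustrative reconstruction of the naming scheme implied by the docstring.
   TASK_SUFFIX = {
       "detect": "",
       "segment": "-seg",
       "obb": "-obb",
       "pose": "-pose",
       "classify": "-cls",
   }

   def model_name(version: str, task: str, size: str) -> str:
       # "11" -> "yolo11", "v8" -> "yolov8" (Ultralytics file naming)
       prefix = "yolo11" if version == "11" else "yolov8"
       return f"{prefix}{size}{TASK_SUFFIX[task]}.pt"

   assert model_name("11", "segment", "x") == "yolo11x-seg.pt"  # docstring example
   assert model_name("v8", "detect", "n") == "yolov8n.pt"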
.. py:class:: YoloTrainer(model: altametris.sara.yolo.model.YoloModel, device: str = 'auto', callbacks: Optional[list[altametris.sara.core.base_callback.BaseCallback]] = None)

   Bases: :py:obj:`altametris.sara.core.base_trainer.BaseTrainer`

   YOLO trainer implementing BaseTrainer interface.

   Wraps Ultralytics training with Altametris callback system and
   standardized configuration.

   :param model: YOLO model to train
   :param device: Training device
   :param callbacks: Training callbacks

   .. rubric:: Example

   >>> model = YoloModel(version="11", task="detect", size="n")
   >>> trainer = YoloTrainer(model=model, device="cuda")
   >>> results = trainer.train(
   ...     dataset_config="data.yaml",
   ...     epochs=100,
   ...     batch_size=16,
   ...     imgsz=640
   ... )

   .. py:attribute:: model
      :type: altametris.sara.yolo.model.YoloModel

   .. py:method:: _resolve_device_for_ultralytics() -> str

      Convert Altametris device format to Ultralytics-compatible format.

      Ultralytics does not support the 'auto' device; it requires an explicit
      device specification such as 'cpu', '0', '1', etc.

      Conversion rules:

      - 'auto' -> 'cpu' (if no GPU) or '0' (if GPU available)
      - 'cuda' -> '0' (first GPU)
      - 'cuda:0' -> '0' (extract GPU index)
      - 'cpu' -> 'cpu' (unchanged)

      :returns: Device string compatible with Ultralytics (e.g., 'cpu', '0')

      .. note::

         This method checks actual GPU availability using
         torch.cuda.device_count() to avoid errors when CUDA is installed
         but no GPU is present.
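   A standalone sketch of the conversion rules listed above, for illustration only (it mirrors the documented behaviour; it is not the private method itself):

   .. code-block:: python

      import torch

      def resolve_device(device: str) -> str:
          """Illustrative stand-in for the documented conversion rules."""
          if device == "auto":
              # Check real GPU availability, not just that CUDA is installed.
              return "0" if torch.cuda.device_count() > 0 else "cpu"
          if device == "cuda":
              return "0"                      # first GPU
          if device.startswith("cuda:"):
              return device.split(":", 1)[1]  # "cuda:1" -> "1"
          return device                       # "cpu" passes through unchanged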
   .. py:method:: _register_ultralytics_callbacks() -> None

      Bridge between Altametris callbacks and Ultralytics callbacks.

      Registers hooks into the Ultralytics training lifecycle that forward
      events to the Altametris callback manager. This ensures callbacks are
      triggered automatically during Ultralytics training.

      Ultralytics hooks registered:

      - on_train_start: Called when training begins
      - on_train_epoch_end: Called after each training epoch
      - on_val_end: Called after validation
      - on_train_end: Called when training completes

      .. note::

         This method is called automatically during __init__. Manual callback
         invocations in train() are not needed.

   .. py:method:: load_config(config_path: Union[str, pathlib.Path]) -> Dict[str, Any]

      Load and parse YAML configuration file.

      Supports both dataset configuration files (data.yaml) and training
      parameter files (train-params.yaml).

      :param config_path: Path to YAML configuration file
      :returns: Dictionary containing parsed configuration
      :raises TrainingError: If config file not found or invalid YAML

      .. rubric:: Example

      >>> # Load dataset config
      >>> dataset_config = trainer.load_config("data/detect/config.yml")
      >>> print(dataset_config['names'])
      {0: 'Chassis A', 1: 'Chassis C', ...}
      >>> # Load training params
      >>> train_params = trainer.load_config("data/detect/train-parameter-detect.yaml")
      >>> epochs = train_params['train']['epochs']

   .. py:method:: train(dataset_config: Union[str, pathlib.Path, Dict[str, Any]], epochs: int = 100, batch_size: int = 16, imgsz: int = 640, save_dir: Optional[Union[str, pathlib.Path]] = None, **kwargs: Any) -> dict[str, Any]

      Train YOLO model.

      :param dataset_config: Path to dataset YAML configuration or dict
      :param epochs: Number of training epochs
      :param batch_size: Batch size
      :param imgsz: Input image size
      :param save_dir: Directory to save results
      :param \*\*kwargs: Additional Ultralytics training arguments
      :returns: Training results dictionary
      :raises TrainingError: If training fails

      .. rubric:: Example

      >>> # Using YAML path
      >>> results = trainer.train(
      ...     dataset_config="data.yaml",
      ...     epochs=10,
      ...     batch_size=8,
      ...     imgsz=640
      ... )
      >>> # Using dict config
      >>> config = trainer.load_config("train-params.yaml")
      >>> results = trainer.train(
      ...     dataset_config="data.yaml",
      ...     epochs=config['train']['epochs'],
      ...     batch_size=config['train']['batch_size']
      ... )

   .. py:method:: validate(dataset_config: Union[str, pathlib.Path], **kwargs: Any) -> dict[str, Any]

      Validate YOLO model.

      :param dataset_config: Path to validation dataset config
      :param \*\*kwargs: Additional validation arguments
      :returns: Validation metrics
      :raises TrainingError: If validation fails

      .. rubric:: Example

      >>> metrics = trainer.validate(dataset_config="val.yaml")

   .. py:method:: _extract_metrics(results: Any) -> dict[str, Any]

      Extract metrics from Ultralytics results.

      :param results: Ultralytics training/validation results
      :returns: Dictionary of metrics

   .. py:method:: resume_training(checkpoint_path: Union[str, pathlib.Path], **kwargs: Any) -> dict[str, Any]

      Resume training from checkpoint.

      :param checkpoint_path: Path to checkpoint file
      :param \*\*kwargs: Additional training arguments
      :returns: Training results

      .. rubric:: Example

      >>> results = trainer.resume_training("runs/train/exp/weights/last.pt")
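A hedged end-to-end sketch tying the classes of this package together, from training through validation to export (the dataset YAML, epoch count, and the ``runs/train/weights/best.pt`` path are placeholders; the actual weights location depends on ``save_dir`` and the Ultralytics run naming):

.. code-block:: python

   from altametris.sara.yolo import YoloExporter, YoloModel, YoloTrainer

   DATA = "data.yaml"  # placeholder Ultralytics-style dataset config

   model = YoloModel(version="11", task="detect", size="n", pretrained=True)
   trainer = YoloTrainer(model=model, device="auto")

   results = trainer.train(
       dataset_config=DATA, epochs=50, batch_size=16, imgsz=640, save_dir="runs/train"
   )
   metrics = trainer.validate(dataset_config=DATA)
   print(metrics)

   # Export the trained weights; the path below is illustrative and depends
   # on save_dir and the Ultralytics run directory naming.
   exporter = YoloExporter(model_path="runs/train/weights/best.pt")
   onnx_path = exporter.export_to_onnx(imgsz=640, dynamic=True)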