plugins/machine-learning-ops/commands/ml-pipeline.md
Design and implement a complete ML pipeline for: $ARGUMENTS
This workflow orchestrates multiple specialized agents to build a production-ready ML pipeline following modern MLOps best practices. The multi-agent approach ensures each aspect of the pipeline is handled by a domain expert:
<Task> subagent_type: data-engineer prompt: | Design the data architecture and ingestion strategy for: $ARGUMENTS Deliverables:
Data source audit and ingestion strategy:
Data quality framework:
Storage architecture:
Provide implementation code for critical components and integration patterns. </Task>
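As a concrete starting point for the data quality framework, a minimal, dependency-free batch gate. Column names, thresholds, and ranges here are illustrative assumptions, not part of the pipeline spec:

```python
def check_quality(rows, required_cols, max_null_rate=0.05, ranges=None):
    """rows: list of dicts, one per record. Returns a list of violation
    messages; an empty list means the batch passes the gate."""
    violations = []
    if not rows:
        return ["dataset is empty"]
    for col in required_cols:
        if col not in rows[0]:
            violations.append(f"missing column: {col}")
            continue
        # Null-rate check against the configured tolerance.
        null_rate = sum(1 for r in rows if r.get(col) is None) / len(rows)
        if null_rate > max_null_rate:
            violations.append(
                f"{col}: null rate {null_rate:.2%} exceeds {max_null_rate:.0%}")
        # Optional domain-range check, e.g. {"age": (0, 120)}.
        if ranges and col in ranges:
            lo, hi = ranges[col]
            bad = sum(1 for r in rows
                      if r.get(col) is not None and not lo <= r[col] <= hi)
            if bad:
                violations.append(f"{col}: {bad} value(s) outside [{lo}, {hi}]")
    return violations
```

A gate like this runs after ingestion and before feature computation, failing the batch (or routing it to quarantine) when the list is non-empty.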
<Task> subagent_type: data-scientist prompt: | Design feature engineering and model requirements for: $ARGUMENTS Using data architecture from: {phase1.data-engineer.output} Deliverables:
Feature engineering pipeline:
Model requirements:
Experiment design:
Include feature transformation code and statistical validation logic. </Task>
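The feature transformation step can be sketched as a leakage-safe fit/transform standardizer — statistics are learned on training data only, then reapplied to serving data. This is a stand-in for whatever transform library the data scientist selects:

```python
import math

class StandardScaler:
    """Map each feature column to zero mean and unit variance.
    fit() learns statistics from training data; transform() reuses
    them, so serving-time data cannot leak into the statistics."""

    def fit(self, X):
        n, dims = len(X), len(X[0])
        self.mean = [sum(row[j] for row in X) / n for j in range(dims)]
        # Guard: a constant column has std 0; fall back to 1.0
        # so transform() never divides by zero.
        self.std = [
            math.sqrt(sum((row[j] - self.mean[j]) ** 2 for row in X) / n) or 1.0
            for j in range(dims)
        ]
        return self

    def transform(self, X):
        return [
            [(row[j] - self.mean[j]) / self.std[j] for j in range(len(row))]
            for row in X
        ]
```

The same fitted object must be persisted alongside the model so training and serving apply identical statistics.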
<Task> subagent_type: ml-engineer prompt: | Build a comprehensive training system for: $ARGUMENTS
Training pipeline implementation:
Experiment tracking setup:
Model registry integration:
Provide complete training code with configuration management. </Task>
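A minimal sketch of the training loop with configuration management and experiment tracking. `RunTracker` is a hypothetical in-memory stand-in for a real tracker (an MLflow-style API, for instance), and the toy one-parameter model exists only to show the loop structure:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class TrainConfig:
    """Immutable run configuration; serialize with asdict() for logging."""
    lr: float = 0.1
    epochs: int = 50

class RunTracker:
    """Hypothetical experiment-tracker stub: records the config once
    and per-epoch metrics, mimicking log_param / log_metric calls."""
    def __init__(self, config):
        self.params = asdict(config)
        self.metrics = []

    def log(self, epoch, **metrics):
        self.metrics.append({"epoch": epoch, **metrics})

def train(config: TrainConfig):
    # Toy task: fit w in y = w * x to data generated by y = 2x,
    # via full-batch gradient descent on mean squared error.
    data = [(x, 2.0 * x) for x in [0.0, 1.0, 2.0, 3.0]]
    w = 0.0
    tracker = RunTracker(config)
    for epoch in range(config.epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= config.lr * grad
        loss = sum((w * x - y) ** 2 for x, y in data) / len(data)
        tracker.log(epoch, loss=loss)
    return w, tracker
```

In a real pipeline the config would come from a versioned file, the tracker from the experiment-tracking service, and the final model would be pushed to the registry keyed by run ID.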
<Task> subagent_type: python-pro prompt: | Optimize and productionize ML code from: {phase2.ml-engineer.output} Focus areas:
Code quality and structure:
Performance optimization:
Testing framework:
Deliver production-ready, maintainable code with full test coverage. </Task>
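For the testing framework, plain pytest-style assertion tests illustrate the target shape; `minmax` here is a hypothetical stand-in for real pipeline code under test:

```python
def minmax(values):
    """Scale values to [0, 1]; a constant input maps to all zeros."""
    lo, hi = min(values), max(values)
    span = hi - lo
    return [0.0 if span == 0 else (v - lo) / span for v in values]

# pytest collects bare test_* functions; each checks one behavior.

def test_minmax_bounds():
    out = minmax([3.0, -1.0, 7.0])
    assert min(out) == 0.0 and max(out) == 1.0

def test_minmax_constant_input():
    # Degenerate input must not divide by zero.
    assert minmax([5.0, 5.0]) == [0.0, 0.0]

def test_minmax_preserves_order():
    out = minmax([2.0, 1.0, 3.0])
    assert sorted(range(3), key=out.__getitem__) == [1, 0, 2]
```

Full coverage means this pattern applied to every transform, the training loop, and the serving path — including edge cases like empty and degenerate inputs.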
<Task> subagent_type: mlops-engineer prompt: | Design the deployment and serving infrastructure for: $ARGUMENTS Implementation requirements:
Model serving infrastructure:
Deployment strategies:
CI/CD pipeline:
Infrastructure as Code:
Provide complete deployment configuration and automation scripts. </Task>
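A framework-agnostic sketch of the serving layer's request handling — payload validation, inference, versioned response. In production this handler would sit behind an HTTP framework and load its model from the registry; the model callable here is a hypothetical placeholder:

```python
import json

class ModelServer:
    """Validate a JSON request body, run the model, return (status, body).
    Keeping the handler framework-agnostic makes it unit-testable
    without spinning up an HTTP server."""

    def __init__(self, model, version="v1"):
        self.model = model          # any callable: features -> score
        self.version = version      # surfaced in responses for traceability

    def handle(self, body: str) -> tuple[int, str]:
        try:
            payload = json.loads(body)
            features = payload["features"]
        except (json.JSONDecodeError, KeyError, TypeError):
            # Reject malformed input before it reaches the model.
            return 400, json.dumps(
                {"error": "expected JSON object with a 'features' list"})
        score = self.model(features)
        return 200, json.dumps(
            {"model_version": self.version, "score": score})
```

Echoing the model version in every response is what makes canary and blue-green rollouts auditable from the client side.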
<Task> subagent_type: kubernetes-architect prompt: | Design Kubernetes infrastructure for ML workloads from: {phase3.mlops-engineer.output} Kubernetes-specific requirements:
Workload orchestration:
Serving infrastructure:
Storage and data access:
Provide Kubernetes manifests and Helm charts for the entire ML platform. </Task>
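To stay in one example language, a sketch of the core Deployment manifest rendered from Python. Image names, ports, and resource limits are placeholder assumptions, and a real platform would template these through Helm values:

```python
def model_deployment(name, image, replicas=2, gpu=0):
    """Build a minimal apps/v1 Deployment object for a model server.
    Serialize with json.dumps or yaml.safe_dump to apply it."""
    limits = {"cpu": "1", "memory": "2Gi"}
    if gpu:
        # GPU scheduling assumes the NVIDIA device plugin is installed.
        limits["nvidia.com/gpu"] = str(gpu)
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"app": name}},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 8080}],
                        "resources": {"limits": limits},
                    }]
                },
            },
        },
    }
```

Generating manifests programmatically keeps labels and selectors consistent by construction, which is the usual source of Deployment/Service mismatches.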
<Task> subagent_type: observability-engineer prompt: | Design monitoring and observability for the production ML pipeline. Monitoring framework:
Model performance monitoring:
Data and model drift detection:
System observability:
Alerting and automation:
Cost tracking:
Deliver monitoring configuration, dashboards, and alert rules. </Task>
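Drift detection can be sketched with the Population Stability Index over binned feature values — comparing a live sample against the training-time reference. The thresholds in the docstring are a common rule of thumb, not a spec requirement:

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a reference sample and a live
    sample: PSI = sum((a_i - e_i) * ln(a_i / e_i)) over shared bins.
    Rule of thumb (tune per feature): < 0.1 stable, 0.1-0.25 moderate
    drift, > 0.25 significant drift."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0   # guard: all-equal samples

    def bin_fracs(sample):
        counts = [0] * bins
        for v in sample:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        n = len(sample)
        # Smooth empty bins so the log term stays finite.
        return [(c + 1e-4) / (n + 1e-4 * bins) for c in counts]

    e, a = bin_fracs(expected), bin_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

A monitoring job would compute this per feature on a schedule and fire an alert (or trigger retraining) when the index crosses the chosen threshold.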
Data Pipeline Success:
Model Performance:
Operational Excellence:
Development Velocity:
Cost Efficiency:
Upon completion, the orchestrated agents deliver a production-ready, end-to-end ML pipeline spanning data architecture, feature engineering, training, deployment, and monitoring.