A lightweight machine learning automation framework that handles the complexities of ML workflows through configuration. It provides an easy-to-use interface for training machine learning models while automatically managing optimization, visualization, and the training loop.
> [!IMPORTANT]
> This framework is designed for rapid ML prototyping and experimentation. For production deployments, please review the performance metrics and model validation sections carefully.

```python
from lightning_auto import AutoML
from config import get_classification_config

# Get a ready-made configuration template
config = get_classification_config()

# Initialize the engine and train
auto_ml = AutoML(config)
auto_ml.fit(train_data, val_data)
```
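
The quick start above assumes `train_data` and `val_data` already exist; their exact expected type is not documented here. A minimal sketch, assuming the framework accepts standard PyTorch `DataLoader`s and using dimensions that match the classification template shown further down:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical synthetic data: 10 features and 5 classes, matching
# the classification template's input_dim/output_dim.
X = torch.randn(200, 10)
y = torch.randint(0, 5, (200,))

# Assumption: AutoML.fit accepts plain PyTorch DataLoaders.
train_data = DataLoader(TensorDataset(X[:160], y[:160]), batch_size=32, shuffle=True)
val_data = DataLoader(TensorDataset(X[160:], y[160:]), batch_size=32)
```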
> [!TIP]
> Check the `examples/` directory for complete notebook demonstrations of common use cases.

- Automated Training: Simplified training process with minimal user intervention
- Configuration Templates: Easy-to-use configuration templates for common ML tasks
- Visualization: Built-in visualization tools for performance analysis
> [!NOTE]
> These metrics represent baseline performance and may vary based on your dataset and configuration.

- Training Loss: 1.6422
- Validation Loss: 1.6169
- Learning Rate: 0.000896
> [!NOTE]
> The class distribution plot shows:
>
> - Class 2 dominates with ~50 samples
> - Class 0 follows with ~27 samples
> - Class 4 has the least representation with ~20 samples
> - A clear class imbalance that may need addressing (see the weighting sketch below)
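
One standard remedy for this kind of imbalance is to weight the loss by inverse class frequency. The sketch below uses plain PyTorch; whether and how AutoML accepts a custom loss is an assumption, and the counts for classes 1 and 3 are hypothetical fill-ins:

```python
import torch

# Hypothetical labels echoing the distribution above (class 2 ~50,
# class 0 ~27, class 4 ~20; classes 1 and 3 are example values).
labels = torch.cat([
    torch.full((27,), 0), torch.full((35,), 1), torch.full((50,), 2),
    torch.full((30,), 3), torch.full((20,), 4),
])

counts = torch.bincount(labels, minlength=5).float()
weights = counts.sum() / (len(counts) * counts)  # inverse-frequency weights

# Standard PyTorch: weight the loss so rare classes count more.
loss_fn = torch.nn.CrossEntropyLoss(weight=weights)
```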
> [!NOTE]
> Key observations from the confusion matrix:
>
> - Strong diagonal pattern indicates good overall classification
> - Class 2 shows the highest confidence, with 10–14 correct predictions
> - Some misclassifications occur between neighboring classes
> - Class 3 shows room for improvement in discrimination
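
For anyone reproducing this figure, a matrix like the one described can be generated with scikit-learn; the labels below are random stand-ins, not the model's actual predictions:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

# Hypothetical true/predicted labels standing in for model output.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 5, size=150)
y_pred = np.where(rng.random(150) < 0.7, y_true, rng.integers(0, 5, size=150))

cm = confusion_matrix(y_true, y_pred, labels=range(5))
ConfusionMatrixDisplay(cm, display_labels=range(5)).plot()
plt.show()
```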
> [!NOTE]
> The loss plot reveals:
>
> - Training loss (blue) shows healthy fluctuation between 1.54 and 1.70
> - Validation loss (orange) remains stable around 1.62
> - No significant overfitting, as the validation loss stays flat
> - Good model convergence with occasional exploration spikes
> [!NOTE]
> The learning rate schedule demonstrates:
>
> - Smooth cosine decay from 2e-3 to 9e-4
> - Gradual learning rate reduction for fine-tuning
> - Proper annealing behavior for optimization stability
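
That decay curve is consistent with PyTorch's built-in cosine annealing scheduler. A minimal sketch, assuming the schedule is implemented with `torch.optim.lr_scheduler.CosineAnnealingLR` (whether AutoML uses this exact scheduler internally is an assumption):

```python
import torch

model = torch.nn.Linear(10, 5)  # stand-in model
optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)

# Cosine decay from 2e-3 toward eta_min over 30 epochs.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=30, eta_min=9e-4
)

for epoch in range(30):
    optimizer.step()  # a real training step would go here
    scheduler.step()
```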
```
automl/
├── lightning_auto.py   # Core AutoML engine
├── config.py           # Configuration templates
├── train.py            # Training script
├── WriterSide/         # Documentation
└── examples/           # Example notebooks (Coming soon!)
```
> [!WARNING]
> Always validate configuration parameters against your specific use case before training.

```python
config = {
    "model": {
        "type": "classification",
        "input_dim": 10,
        "output_dim": 5,
        "task": "classification"
    },
    "training": {
        "learning_rate": 0.002,
        "epochs": 30
    }
    # ... other parameters
}
```
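
In the spirit of the warning above, a small guard like the following (a hypothetical helper, not part of the framework) can catch malformed parameters before any training time is spent:

```python
def validate_config(config: dict) -> None:
    """Hypothetical helper: sanity-check a config before training."""
    model = config.get("model", {})
    training = config.get("training", {})

    for key in ("type", "input_dim", "output_dim", "task"):
        if key not in model:
            raise ValueError(f"missing model parameter: {key!r}")
    if model["input_dim"] <= 0 or model["output_dim"] <= 0:
        raise ValueError("input_dim and output_dim must be positive")
    if not 0 < training.get("learning_rate", 0.0) < 1:
        raise ValueError("learning_rate should be in (0, 1)")
    if training.get("epochs", 0) < 1:
        raise ValueError("epochs must be at least 1")

validate_config(config)
```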
> [!CAUTION]
> Before submitting large changes, please open an issue to discuss the proposed modifications.

- Create a new function in `config.py` (a completed example follows this list):

  ```python
  def get_custom_config():
      return {
          "model": {
              # model specifications
          },
          "training": {
              # training parameters
          }
      }
  ```
- Add documentation and example usage
- Submit a pull request
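
As a concrete illustration, a filled-in template might look like this (all values hypothetical, mirroring the keys of the classification example above):

```python
def get_custom_config():
    # Hypothetical values; the keys mirror the classification
    # template shown in the configuration section above.
    return {
        "model": {
            "type": "classification",
            "input_dim": 20,
            "output_dim": 3,
            "task": "classification"
        },
        "training": {
            "learning_rate": 0.001,
            "epochs": 50
        }
    }
```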
- For new features:
  - Fork the repository
  - Create a feature branch
  - Add tests
  - Submit a pull request
This project is licensed under the MIT License; see the LICENSE file for details.