A specialised Python library for Automated Machine Learning (AutoML) on longitudinal machine learning classification tasks, built upon GAMA.
🌟 Exciting Update: We're delighted to introduce the brand-new v0.1 documentation for Auto-Sklong! For a deep dive into the library's capabilities and features, please visit here.
🎉 PyPI release: Auto-Sklong is now published on PyPI, here!
Auto-Scikit-Longitudinal, also called Auto-Sklong, is an automated machine learning (AutoML) library designed to analyse longitudinal data (focussed on classification tasks as of today) using various search methods, namely Bayesian Optimisation via SMAC3, Asynchronous Successive Halving, Evolutionary Algorithms, and Random Search, via the General Automated Machine Learning Assistant (GAMA).

Auto-Sklong, built upon GAMA, offers a brand-new search space to tackle longitudinal machine learning classification problems, with a user-friendly interface similar to the popular Scikit-learn paradigm.

For further information, please visit the official documentation.
To install Auto-Sklong, take these two easy steps:

- ✅ Install the latest version of Auto-Sklong:

```bash
pip install Auto-Sklong
```

You could also install a different version of the library by specifying the version number, e.g. `pip install Auto-Sklong==0.0.1`. Refer to the Release Notes.
- 📦 [MANDATORY] Update the required dependencies (why? see here)

Auto-Sklong incorporates, via Sklong, a modified version of Scikit-Learn called Scikit-Lexicographical-Trees, which can be found at this PyPI link. This revised version guarantees compatibility with the unique features of Scikit-Longitudinal. Nevertheless, conflicts may occur with other dependencies in Auto-Sklong that also require Scikit-Learn. Follow one of the setups below to prevent any issues when running your project.
🫵 Simple Setup: Command Line Installation

Say you want to try Auto-Sklong in a very simple environment, such as without a proper pyproject.toml file (Poetry, PDM, etc.). Run the following command:

```bash
pip uninstall scikit-learn scikit-lexicographical-trees && pip install scikit-lexicographical-trees
```
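Once this is done, you can optionally sanity-check the environment. The snippet below is only an illustrative sketch, assuming Scikit-Lexicographical-Trees provides the usual `sklearn` import name in place of the stock Scikit-Learn distribution:

```python
# Illustrative sanity check (assumption: scikit-lexicographical-trees ships the
# usual `sklearn` import name in place of the stock scikit-learn distribution).
from importlib import metadata

try:
    print("scikit-lexicographical-trees:", metadata.version("scikit-lexicographical-trees"))
except metadata.PackageNotFoundError:
    print("scikit-lexicographical-trees is NOT installed")

try:
    print("scikit-learn:", metadata.version("scikit-learn"), "(unexpected, may conflict)")
except metadata.PackageNotFoundError:
    print("scikit-learn distribution absent, as expected after the swap")

import sklearn  # the import should still resolve, now provided by the fork
print("sklearn imported from:", sklearn.__file__)
```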
🫵 Project Setup: Using `PDM` (or any other such as `Poetry`, etc.)

Imagine you have a project managed by PDM or any other package manager; the example below demonstrates PDM. Nevertheless, the process is similar for Poetry and others: consult their documentation for instructions on excluding a package.

To prevent dependency conflicts, you can exclude Scikit-Learn by adding the following configuration to your pyproject.toml file:
```toml
[tool.pdm.resolution]
excludes = ["scikit-learn"]
```
This exclusion ensures Scikit-Lexicographical-Trees (used in place of Scikit-Learn) works seamlessly within your project.
We improved @PGijsbers' open-source GAMA initiative to propose a new search space that leverages our other newly-designed library, Scikit-Longitudinal (Sklong), in order to tackle longitudinal classification problems via Combined Algorithm Selection and Hyperparameter Optimisation (CASH optimisation). It is worth noting that this was previously not possible with GAMA or, to the best of our knowledge, any other AutoML library (refer nonetheless to the Related Projects section in the official documentation).

While GAMA offers a way to update the search space, we had to improve GAMA to support a couple of new features. It is also worth noting that in the coming months the current version of Auto-Sklong may evolve rapidly as the following pull requests on GAMA progress:
- ConfigSpace Technology Integration for Enhanced GAMA Configuration and Management 🥇
- Search Methods Enhancements to Avoid Duplicate Evaluated Pipelines 🥈 #211
- SMAC3 Bayesian Optimisation Integration [🆕 Search Method] 🥉 #212
As soon as we are able to publish those on GAMA, there will be a compatibility refactoring to align Auto-Sklong with the most recent version of GAMA. This section will then be removed accordingly.

For developers looking to contribute, please refer to the Contributing section of GAMA here and of Scikit-Longitudinal here.
Auto-Sklong is compatible with the following operating systems:

- MacOS
- Linux 🐧
- Windows 🪟: you are recommended to run the library within a Docker container under a Linux distribution.
To perform AutoML on your longitudinal analysis with Auto-Sklong, follow these two easy steps:

- First, load and prepare your dataset using the LongitudinalDataset class of Sklong.
- Second, use the GamaLongitudinalClassifier class of Auto-Sklong. After instantiating it, set up its hyperparameters or keep the defaults, then apply the popular fit, predict, and predict_proba methods in the same way Scikit-learn does, as shown in the example below. It will then automatically search for the best model and hyperparameters for your dataset.

Refer to the documentation for more information on the GamaLongitudinalClassifier class.
```python
from sklearn.metrics import classification_report
from scikit_longitudinal.data_preparation import LongitudinalDataset
from gama.GamaLongitudinalClassifier import GamaLongitudinalClassifier

# Load your longitudinal dataset
dataset = LongitudinalDataset('./stroke.csv')
dataset.load_data_target_train_test_split(
    target_column="class_stroke_wave_4",
)

# Pre-set or manually set your temporal dependencies
dataset.setup_features_group(input_data="elsa")

# Instantiate the AutoML system
automl = GamaLongitudinalClassifier(
    features_group=dataset.features_group(),
    non_longitudinal_features=dataset.non_longitudinal_features(),
    feature_list_names=dataset.data.columns,
)

# Run the AutoML system to find the best model and hyperparameters
automl.fit(dataset.X_train, dataset.y_train)

# Predictions and prediction probabilities
label_predictions = automl.predict(dataset.X_test)
probability_predictions = automl.predict_proba(dataset.X_test)

# Classification report
print(classification_report(dataset.y_test, label_predictions))

# Export a reproducible script of the champion model
automl.export_script()
```
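You can also experiment with the different search methods mentioned above. The sketch below is only illustrative and rests on assumptions: it supposes Auto-Sklong's GamaLongitudinalClassifier keeps GAMA's usual keyword arguments such as `search`, `max_total_time`, and `scoring`, and that GAMA's `gama.search_methods` module is available. Please check the official documentation for the exact names and the list of supported search methods (e.g. Bayesian Optimisation via SMAC3).

```python
# Illustrative sketch only: assumes GAMA-style keyword arguments (`search`,
# `max_total_time`, `scoring`) are exposed by GamaLongitudinalClassifier.
from gama.search_methods import RandomSearch  # AsyncEA and AsynchronousSuccessiveHalving also ship with GAMA
from gama.GamaLongitudinalClassifier import GamaLongitudinalClassifier

automl = GamaLongitudinalClassifier(
    features_group=dataset.features_group(),
    non_longitudinal_features=dataset.non_longitudinal_features(),
    feature_list_names=dataset.data.columns,
    search=RandomSearch(),   # swap the search method here (assumed keyword)
    max_total_time=3600,     # time budget in seconds (GAMA convention)
    scoring="roc_auc",       # any Scikit-learn scorer name (assumed keyword)
)
automl.fit(dataset.X_train, dataset.y_train)
```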
A paper has been submitted to a conference. In the meantime, to cite this repository, use the "How to cite?" button in the top-right corner of the repository, or open the following citation file: CITATION.cff.