RobustX

The increasing use of machine learning models to aid decision-making in high-stakes industries like finance and healthcare demands explainability to build trust. Counterfactual Explanations (CEs) provide valuable insights into model predictions by showing how slight changes to the input would lead to a different outcome. A key aspect of CEs is their robustness: the desired outcome should remain stable under minor alterations to the input, and a CE should continue to hold if the original model is later altered or replaced.

Despite the importance of robustness, there has been a lack of standardised tools to comprehensively evaluate and compare robust CE generation methods. RobustX addresses this gap: it is an open-source Python library for benchmarking the robustness of CE methods, providing a systematic framework for generating, evaluating, and comparing CEs with a focus on robustness, and enabling fair and effective benchmarking. The library is highly extensible, allowing custom models, datasets, and tools to be integrated, making it an essential tool for enhancing the reliability and interpretability of machine learning models in critical applications.

Features

  • Standardises the evaluation and benchmarking of robust CEs.

  • Supports multiple ML frameworks, including PyTorch, Keras, and scikit-learn.

  • Extensible to incorporate custom models, datasets, CE methods, and evaluation metrics (a minimal sketch of what a CE method does is given after this list).

  • Includes several robust CE generation algorithms (e.g., TRexNN, RNCE) and non-robust baselines (e.g., MCE, BLS).
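
To give a flavour of what a CE generation method does, below is a minimal, stand-alone sketch. It is purely illustrative and independent of the actual RobustX interfaces (the function name and arguments are hypothetical): it returns the candidate point closest to the input that the model assigns to the desired class, which is the basic idea behind nearest-neighbour CE methods.

import numpy as np

def nearest_positive_ce(x, predict, candidates):
    # x          : 1-D array, the instance to explain
    # predict    : callable mapping a 2-D array to 0/1 labels
    # candidates : 2-D array of reference points, e.g. the training data
    labels = np.asarray(predict(candidates))
    positives = candidates[labels == 1]
    if positives.size == 0:
        return None  # no valid counterfactual among the candidates
    distances = np.linalg.norm(positives - x, axis=1)
    return positives[np.argmin(distances)]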

Setup

The core required packages for working on RobustX are listed in environment.yml and requirements.txt.

To set up a new virtual environment with Conda, use environment.yml. This will install the required packages into a new environment called robustx.

conda env create -f environment.yml
conda activate robustx

Then, install the library in editable mode with:

pip install -e .

Alternatively, if using an existing Python environment, directly run pip install -e .

Note that the Gurobi optimizer is required to run the mixed integer programming-based methods. Gurobi offers free academic licenses.
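
A quick way to verify that Gurobi and a valid license are available (this assumes the gurobipy package is installed; it is only a sanity check, not part of RobustX):

import gurobipy as gp

try:
    gp.Model("license_check")  # creating a model checks both the installation and the license
    print("Gurobi is ready")
except gp.GurobiError as e:
    print(f"Gurobi is not available: {e}")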

Examples

A first example of RobustX is provided below.

# first prepare a task
from robustx.datasets.ExampleDatasets import get_example_dataset
from robustx.lib.models.pytorch_models.SimpleNNModel import SimpleNNModel
from robustx.lib.tasks.ClassificationTask import ClassificationTask

data = get_example_dataset("ionosphere")
data.default_preprocess()
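# feed-forward network: 34 inputs (the ionosphere dataset has 34 features), one hidden layer of 8 units, 1 output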
model = SimpleNNModel(34, [8], 1)
model.train(data.X, data.y)
task = ClassificationTask(model, data)

# specify the names of the methods and evaluations we want to use, run benchmarking
# This will find CEs for all instances predicted with the undesirable class (0) and compare the methods on the chosen evaluation metrics
from robustx.lib import default_benchmark
methods = ["KDTreeNNCE", "MCE", "MCER", "RNCE", "STCE", "PROPLACE"]
evaluations = ["Validity", "Distance", "Delta-robustness"]
default_benchmark(task, methods, evaluations, neg_value=0, column_name="target", delta=0.005)

which will produce an output similar to:

Method      Execution Time (s)  Validity  Distance  Delta-robustness
KDTreeNNCE  0.21686             1         5.76588   0.515625
MCE         3.44478             1         3.26922   0
MCER        137.563             1         5.14061   0.648438
RNCE        3.98173             1         6.03255   1
STCE        29.6889             1         6.86523   1
PROPLACE    12.9444             1         5.96721   1

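Roughly speaking, Validity is the fraction of generated CEs that actually attain the desired class, Distance is the average distance between each input and its CE (lower is better), and Delta-robustness is the fraction of CEs that remain valid when the model is perturbed by up to the given delta (0.005 above).
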
A demonstration of how to use the library is available in RobustX/demo and can be launched with:

conda activate robustx
cd RobustX/demo
streamlit run demo_main.py --theme.base="light"

Python notebooks demonstrating the usage of RobustX are available here.

The docs pages can be accessed by opening docs/build/html/index.html.

Contributors

License

RobustX is licensed under the MIT License. For more details, please refer to the LICENSE file in this repository.

robustx documentation

Work in progress. The documentation is generated automatically from the code.