MLInterpret.jl

A Meta Package for Machine Learning Interpretation
Author: AStupidBear


Installation

using Pkg
pkg"add MLInterpret"

Or try it without installation using Docker:

docker run -it --rm astupidbear/mli

Or build it from the Dockerfile:

url=https://raw.githubusercontent.com/AStupidBear/MLInterpret.jl/master/Dockerfile.py
python3 -c "$(curl $url)"

Usage

using MLInterpret
using PyCall
using PyCallUtils
using PandasLite
X = DataFrame(randn(Float32, 10000, 5))  # 10000 samples, 5 random features
y = (X[3] > 0) & (X[2] >= 0)             # binary target built from two of the columns
@from lightgbm imports LGBMRegressor     # import the Python LightGBM model via PyCallUtils
model = LGBMRegressor()
model.fit(X, y)

You can interpret any Python machine learning model that has a .predict method by calling

interpret(model, X, y)

If your model doesn't have a .predict method (for example, Julia models), you can still interpret its predictions directly:

ŷ = model.predict(X)
interpret(X, ŷ)
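
For instance, here is a minimal sketch with a pure-Julia predictor; the data and the threshold rule are only illustrative (not part of the package), and interpret(X, ŷ) is used exactly as above:

Xmat = randn(Float32, 10000, 5)          # raw features kept as a Julia matrix
X = DataFrame(Xmat)                      # same data, wrapped for interpret
julia_model(x) = Float32.(x[:, 3] .> 0)  # hypothetical Julia "model": a toy threshold rule
ŷ = julia_model(Xmat)                    # one prediction per row of X
interpret(X, ŷ)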

This will generate a folder mli in the current directory containing the following files (a sketch for listing them from Julia follows this list):

  • pdp.pdf: partial dependence plot
  • perturb_feaimpt.csv: feature importance computed by perturbation
  • shap.pdf: SHAP values
  • shap2.pdf: SHAP interaction values
  • surrogate_tree-*.pdf: surrogate tree
  • actual.pdf: actual plot
  • actual2.pdf: actual interaction plot
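
A quick way to check these outputs from the Julia session, using only the standard library (a sketch; it assumes interpret has already been run in the current directory):

# List every report file written into the default mli folder
for file in sort(readdir("mli"))
    println(joinpath("mli", file))
end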

MLI with H2O Driverless AI

Start DAI

docker run -d \
    --pid=host \
    --init \
    -u `id -u`:`id -g` \
    -p 12345:12345 \
    -v /dev/shm:/dev/shm \
    astupidbear/dai:1.7.0

You can get a trial license of H2O Driverless AI from H2O, then open http://127.0.0.1:12345/, log in, and enter your license.

Interpret

dai_interpret(X, y)

Open http://127.0.0.1:12345/, click MLI, and choose the topmost Interpreted Model.

MLI with Bayesian Rule List

Installation

using MLInterpret
MLInterpret.install_brl()

Interpret

sbrl_interpret(X, y)

A file named sbrl.txt will be created in your working directory.
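
The rule list is plain text, so a one-liner is enough to view it from Julia (a sketch assuming sbrl_interpret has already written the file):

println(read("sbrl.txt", String))   # print the learned Bayesian rule list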
