SHAP Analysis

This notebook uses SHAP (SHapley Additive exPlanations) values to interpret the relative importance of the features that contribute to the resulting predictions. Learn more about SHAP analysis in this TD Blog Post.
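To make the idea concrete, here is a minimal, self-contained sketch of what a Shapley value is: each feature's contribution is a weighted average of how much adding that feature changes the prediction, taken over all subsets of the other features. The linear model, baseline, and instance below are hypothetical stand-ins for illustration only, not the workflow's actual model or implementation.

```python
import math
from itertools import combinations

# Hypothetical toy model: linear prediction f(x) = w . x + b,
# standing in for any trained model.
w = [2.0, -1.0, 0.5]
b = 1.0

def predict(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

baseline = [0.0, 0.0, 0.0]   # reference input ("feature absent")
instance = [1.0, 3.0, -2.0]  # the row we want to explain

def value(subset):
    # Prediction with features in `subset` taken from the instance,
    # and all other features held at the baseline.
    x = [instance[i] if i in subset else baseline[i] for i in range(len(w))]
    return predict(x)

def shap_values(n):
    # Exact (brute-force) Shapley values over all feature subsets.
    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for k in range(len(others) + 1):
            for s in combinations(others, k):
                weight = (math.factorial(k) * math.factorial(n - k - 1)
                          / math.factorial(n))
                phi += weight * (value(set(s) | {i}) - value(set(s)))
        phis.append(phi)
    return phis

phi = shap_values(3)
print(phi)  # for a linear model, phi_i == w_i * (instance_i - baseline_i)
```

A key property (local accuracy) is that the contributions sum exactly to `predict(instance) - predict(baseline)`, which is why per-feature SHAP values can be read as an additive decomposition of a single prediction.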

The executed notebook includes sample SHAP visualizations.

Workflow Example

A sample workflow is available in Treasure Boxes.

```yaml
+explain_predictions_by_shap:
  ipynb>:
    notebook: shapley
    model_name: gluon_model         # model used for prediction
    input_table: ml_test.gluon_test # test data used for prediction
```

Parameters

| Parameter name | Parameter on Console | Description | Default Value |
|---|---|---|---|
| docker.task_name | Docker Task Mem | Task memory size. Available values are 64g, 128g (default), 256g, 384g, or 512g, depending on your contracted tier | 128g |
| model_name | Model Name | Prediction model name | - |
| input_table | Input Table | A TD table, specified as dbname.table_name | - |
| shared_model | Shared Model | A shared model UUID | None |
| sampling_threshold | Sampling Threshold | Threshold used for sampling; see the executed notebook for details | 10_000_000 |
| hide_table_contents | Hide Table Contents | Suppress showing table contents | false |
| explain_threshold | Explain Threshold | Number of rows for which Shapley values are explained | 200 |
| interpret_samples | Interpret Samples | Number of samples used to build the surrogate model that interprets predictions | 100 |
| export_shap_values | Export Shap Values | Export Shapley values to a TD table | None |
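Once per-row Shapley values have been exported (for example via `export_shap_values`), a common way to summarize them into a global feature ranking is the mean absolute SHAP value per feature. The sketch below uses made-up feature names and values; the actual schema of the exported TD table may differ.

```python
# Hypothetical per-row SHAP values, as they might look after export.
rows = [
    {"age": 0.8,  "income": -1.2, "tenure": 0.1},
    {"age": -0.4, "income": 2.0,  "tenure": 0.05},
    {"age": 0.6,  "income": -0.9, "tenure": -0.2},
]

features = rows[0].keys()

# Mean absolute SHAP value per feature: magnitude of contribution,
# regardless of whether it pushed predictions up or down.
importance = {f: sum(abs(r[f]) for r in rows) / len(rows) for f in features}

# Rank features from most to least influential.
ranking = sorted(importance.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

Taking the absolute value before averaging matters: a feature whose positive and negative contributions cancel across rows would otherwise look unimportant even though it strongly influences individual predictions.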