Python Library for Model Interpretation/Explanations

Overview

Skater

Skater is a unified framework for model interpretation that helps you build the interpretable machine learning systems often needed for real-world use cases (we are actively working toward enabling faithful interpretability for all forms of models). It is an open-source Python library designed to demystify the learned structures of a black-box model, both globally (inference based on the complete data set) and locally (inference about an individual prediction).

The project started as a research idea to find ways to enable better interpretability (preferably human interpretability) of predictive "black boxes", both for researchers and practitioners. The project is still in beta.

Install Skater

pip

    Option 1: without rule lists and without deep-interpreter
    pip install -U skater

    Option 2: without rule lists and with deep-interpreter
    1. Ubuntu: pip3 install --upgrade tensorflow (follow the instructions at https://www.tensorflow.org/install/ for
       details and best practices)
    2. sudo pip install keras
    3. pip install -U skater==1.1.2

    Option 3: with everything included (rule lists and deep-interpreter)
    1. conda install gxx_linux-64
    2. Ubuntu: pip3 install --upgrade tensorflow (follow the instructions at https://www.tensorflow.org/install/ for
       details and best practices)
    3. sudo pip install keras
    4. sudo pip install -U --no-deps --force-reinstall --install-option="--rl=True" skater==1.1.2

To get the latest changes, clone the repo and use the commands below to get started:


    1. conda install gxx_linux-64
    2. Ubuntu: pip3 install --upgrade tensorflow (follow the instructions at https://www.tensorflow.org/install/ for
       details and best practices)
    3. sudo pip install keras
    4. git clone the repo
    5. sudo python setup.py install --ostype=linux-ubuntu --rl=True

Testing

  1. If the repo is cloned: python skater/tests/all_tests.py
  2. If pip installed: python -c "from skater.tests.all_tests import run_tests; run_tests()"

Usage and Examples

See the examples folder for usage examples.
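
Before diving into the examples, here is a minimal sketch of the typical workflow, assembled from the snippets quoted in the issues below (Interpretation, InMemoryModel, and plot_feature_importance); treat it as illustrative rather than canonical.

    # Minimal sketch of the typical Skater workflow; assembled from snippets
    # elsewhere in this document, so treat it as illustrative, not canonical.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    from skater.core.explanations import Interpretation
    from skater.model import InMemoryModel

    # Train any black-box estimator.
    iris = load_iris()
    X, y = iris.data, iris.target
    rf = RandomForestClassifier(random_state=0).fit(X, y)

    # Wrap the *prediction function* (not the estimator) and load the data.
    model = InMemoryModel(rf.predict_proba, examples=X, target_names=iris.target_names)
    interpreter = Interpretation(training_data=X, feature_names=iris.feature_names)

    # Global interpretation: model-agnostic feature importance.
    interpreter.feature_importance.plot_feature_importance(model, ascending=True)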

Comments
  • Can't pip install 1.0.3

    This version is listed as latest on pypi, but I get this when I try to install it:

    Could not find a version that satisfies the requirement skater==1.0.3 (from versions: 1.0.0b10, 1.0.0b11, 1.0.1, 1.0.2, 1.1.0b1)

    1.0.2 installs just fine (and also installs by default with pip install skater)

    question 
    opened by CPapadim 17
  • Kernel dies when running plot_partial_dependence

    Hello,

    I'm currently having trouble running plot_partial_dependence for my XGBoost model.

    My kernel dies when I try to run the script, and this also happens if I run partial_dependence: "The kernel appears to have died. It will restart automatically."

    In terminal, it says: malloc: *** error for object 0x100007f87ba95b8c: pointer being freed was not allocated *** set a breakpoint in malloc_error_break to debug

    There doesn't seem to be an issue when I run plot_partial_dependence for my random forest model.

    Could someone please look into this? Thanks in advance for your help!

    bug question 
    opened by alay18 16
  • skater.util.exceptions.ModelError: Predict function must be callable

    Was checking out some examples in the documentation - https://datascienceinc.github.io/Skater/reference/interpretation.html#skater.core.global_interpretation.feature_importance.FeatureImportance.plot_feature_importance

        from skater.model import InMemoryModel
        from skater.core.explanations import Interpretation
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.datasets import load_iris

        X, y = load_iris(True)
        rf = RandomForestClassifier()
        rf.fit(X, y)
        model = InMemoryModel(rf, examples=X)
        interpreter = Interpretation()
        interpreter.load_data(X)
        interpreter.feature_importance.plot_feature_importance(model, ascending=True, ax=None)
    

    gave this error:

    Traceback (most recent call last):
      File "rough.py", line 52, in <module>
        main()
      File "rough.py", line 48, in main
        model_interpration_skater()
      File "rough.py", line 40, in model_interpration_skater
        model = InMemoryModel(rf, examples = X)
      File "<ABS_PATH>/lib/python2.7/site-packages/skater/model/local_model.py", line 75, in __init__
        raise(exceptions.ModelError("Predict function must be callable"))
    skater.util.exceptions.ModelError: Predict function must be callable
    
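    For reference, the other snippets quoted later in this document construct InMemoryModel around the prediction function rather than the estimator object itself, which is what the "must be callable" check expects. A hedged illustration of that usage (not this thread's confirmed resolution):

        # Pass the bound prediction method, not the fitted estimator, so that
        # InMemoryModel receives a callable as required by the check above.
        model = InMemoryModel(rf.predict_proba, examples=X)
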
    question 
    opened by Nithanaroy 12
  • Can't install on Python 3.7

    The installation fails on Python 3.7.3 on a machine running Ubuntu 18.04.3

    Building wheels for collected packages: wordcloud
      Building wheel for wordcloud (setup.py) ... error
      ERROR: Command errored out with exit status 1:
       command: /home/user/.pyenv/versions/3.7.3/envs/absa-py37/bin/python3.7 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"'; __file__='"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-vqaw8gcn --python-tag cp37
           cwd: /tmp/pip-install-h6dc_hhg/wordcloud/
      Complete output (86 lines):
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-3.7
      creating build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/__init__.py -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/color_from_image.py -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/tokenization.py -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/wordcloud.py -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/wordcloud_cli.py -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/stopwords -> build/lib.linux-x86_64-3.7/wordcloud
      copying wordcloud/DroidSansMono.ttf -> build/lib.linux-x86_64-3.7/wordcloud
      running build_ext
      building 'wordcloud.query_integral_image' extension
      creating build/temp.linux-x86_64-3.7
      creating build/temp.linux-x86_64-3.7/wordcloud
      gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/home/user/.pyenv/versions/3.7.3/envs/absa-py37/include -I/home/user/.pyenv/versions/3.7.3/include/python3.7m -c wordcloud/query_integral_image.c -o build/temp.linux-x86_64-3.7/wordcloud/query_integral_image.o
      wordcloud/query_integral_image.c: In function ‘__Pyx_ExceptionSave’:
      wordcloud/query_integral_image.c:14910:19: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           *type = tstate->exc_type;
                         ^~
      wordcloud/query_integral_image.c:14911:20: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           *value = tstate->exc_value;
                          ^~
      wordcloud/query_integral_image.c:14912:17: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           *tb = tstate->exc_traceback;
                       ^~
      wordcloud/query_integral_image.c: In function ‘__Pyx_ExceptionReset’:
      wordcloud/query_integral_image.c:14924:22: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tmp_type = tstate->exc_type;
                            ^~
      wordcloud/query_integral_image.c:14925:23: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tmp_value = tstate->exc_value;
                             ^~
      wordcloud/query_integral_image.c:14926:20: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tmp_tb = tstate->exc_traceback;
                          ^~
      wordcloud/query_integral_image.c:14927:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tstate->exc_type = type;
                 ^~
      wordcloud/query_integral_image.c:14928:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tstate->exc_value = value;
                 ^~
      wordcloud/query_integral_image.c:14929:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tstate->exc_traceback = tb;
                 ^~
      wordcloud/query_integral_image.c: In function ‘__Pyx_GetException’:
      wordcloud/query_integral_image.c:14972:22: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tmp_type = tstate->exc_type;
                            ^~
      wordcloud/query_integral_image.c:14973:23: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tmp_value = tstate->exc_value;
                             ^~
      wordcloud/query_integral_image.c:14974:20: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tmp_tb = tstate->exc_traceback;
                          ^~
      wordcloud/query_integral_image.c:14975:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tstate->exc_type = local_type;
                 ^~
      wordcloud/query_integral_image.c:14976:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tstate->exc_value = local_value;
                 ^~
      wordcloud/query_integral_image.c:14977:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tstate->exc_traceback = local_tb;
                 ^~
      wordcloud/query_integral_image.c: In function ‘__Pyx_ExceptionSwap’:
      wordcloud/query_integral_image.c:14999:22: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tmp_type = tstate->exc_type;
                            ^~
      wordcloud/query_integral_image.c:15000:23: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tmp_value = tstate->exc_value;
                             ^~
      wordcloud/query_integral_image.c:15001:20: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tmp_tb = tstate->exc_traceback;
                          ^~
      wordcloud/query_integral_image.c:15002:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
           tstate->exc_type = *type;
                 ^~
      wordcloud/query_integral_image.c:15003:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_value’; did you mean ‘curexc_value’?
           tstate->exc_value = *value;
                 ^~
      wordcloud/query_integral_image.c:15004:11: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
           tstate->exc_traceback = *tb;
                 ^~
      error: command 'gcc' failed with exit status 1
      ----------------------------------------
      ERROR: Failed building wheel for wordcloud
      Running setup.py clean for wordcloud
    Failed to build wordcloud
    Installing collected packages: wordcloud, Jinja2, soupsieve, beautifulsoup4
      Running setup.py install for wordcloud ... error
        ERROR: Command errored out with exit status 1:
         command: /home/user/.pyenv/versions/3.7.3/envs/absa-py37/bin/python3.7 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"'; __file__='"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-ejl6j_1_/install-record.txt --single-version-externally-managed --compile --install-headers /home/user/.pyenv/versions/3.7.3/envs/absa-py37/include/site/python3.7/wordcloud
             cwd: /tmp/pip-install-h6dc_hhg/wordcloud/
        Complete output (86 lines):
        running install
        [remainder identical to the bdist_wheel output above: the same gcc errors while building the wordcloud.query_integral_image extension]
        error: command 'gcc' failed with exit status 1
        ----------------------------------------
    ERROR: Command errored out with exit status 1: /home/user/.pyenv/versions/3.7.3/envs/absa-py37/bin/python3.7 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"'; __file__='"'"'/tmp/pip-install-h6dc_hhg/wordcloud/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-ejl6j_1_/install-record.txt --single-version-externally-managed --compile --install-headers /home/user/.pyenv/versions/3.7.3/envs/absa-py37/include/site/python3.7/wordcloud Check the logs for full command output.
    

    It installs just fine on Python 3.6.8. Is there any workaround for this?

    opened by zoj613 9
  • Add license to manifest and remove ds-lime requirement for conda-forge

    To get the package approved for conda-forge, we need to include the license file in the manifest. Also, there appears to be an orphaned requirement, ds-lime. I've removed it from the setup file. Is this correct?

    Cython does not appear to be a requirement either; can it also be dropped?

    opened by benvandyke 8
  • Interpretation using Tree Surrogates

    The idea of explaining a model's learned decision policies by extracting rules with a tree-based surrogate model is influenced by the work of Mark W. Craven, captured in the TREPAN algorithm. This PR implements functions to extract the rules as hierarchical "if-then-else" conditional statements. The results can be visualized or rendered as text for both global and local interpretation (a generic sketch of the surrogate idea follows the references below).

    Reference papers:

    1. Mark W. Craven's PhD thesis: http://ftp.cs.wisc.edu/machine-learning/shavlik-group/craven.thesis.pdf
    2. Extracting Tree-Structured Representations of Trained Networks: https://papers.nips.cc/paper/1152-extracting-tree-structured-representations-of-trained-networks.pdf
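
    For intuition, the sketch below illustrates the surrogate idea with plain scikit-learn rather than Skater's TreeSurrogate API: fit a shallow, interpretable decision tree to mimic the black-box model's predictions, then read the tree back as if-then-else rules and check its fidelity. It is only an illustration of the technique described in this PR, not the PR's implementation.

        # Generic surrogate-tree sketch (plain scikit-learn, not Skater's TreeSurrogate).
        from sklearn.datasets import load_iris
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.tree import DecisionTreeClassifier, export_text

        iris = load_iris()
        X, y = iris.data, iris.target
        black_box = RandomForestClassifier(random_state=0).fit(X, y)

        # The surrogate is trained on the black box's *predictions*, not the true labels.
        surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
        surrogate.fit(X, black_box.predict(X))

        # Hierarchical if-then-else rules, readable as a global explanation.
        print(export_text(surrogate, feature_names=list(iris.feature_names)))

        # Fidelity check: how often does the surrogate agree with the black box?
        print("fidelity:", (surrogate.predict(X) == black_box.predict(X)).mean())
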
    enhancement v1 new feature 
    opened by pramitchoudhary 7
  • Interpretation using Decision boundaries

    With this PR, users will be able to visualize the decision boundaries of an estimator (currently only for classification).

    This is a partial implementation. More work is needed to handle multi-dimensional data; that will be addressed in follow-up improvements.

    enhancement v1 new feature algorithm 
    opened by pramitchoudhary 6
  • Correction to work with Python 3

    I get a TooManyFeaturesError when using plot_partial_dependence with Python 3 that does not occur with Python 2. It seems to be caused by hasattr(feature_or_feature_pair, '__iter__') in the _check_features function (line 139 of partial_dependence.py). In Python 3, str defines __iter__, which it did not in Python 2. Replacing this check with not isinstance(feature_or_feature_pair, str) should correct this (see the sketch below).
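
    A minimal Python 3 sketch of the difference and the proposed check (the names mirror the issue above; this is only an illustration, not the project's actual patch):

        # On Python 3, str defines __iter__ (it did not on Python 2), so a lone
        # feature name looks iterable and can be mistaken for a feature *pair*.
        feature_or_feature_pair = "petal width"
        print(hasattr(feature_or_feature_pair, '__iter__'))   # True on Python 3

        # Proposed check from the issue: treat plain strings as single features.
        is_pair = (not isinstance(feature_or_feature_pair, str)
                   and hasattr(feature_or_feature_pair, '__iter__'))
        print(is_pair)   # False for a string, True for e.g. ("petal width", "petal length")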

    opened by viticano 6
  • OverflowError: Range exceeds array bounds

    I ran into this problem when trying to use Skater for the first time on a CatBoost classifier.

    Here is my code:

    from skater.core.explanations import Interpretation
    from skater.model import InMemoryModel
    
    interpreter = Interpretation()
    
    interpreter.load_data(X_train, feature_names=list(X_orig.columns))
    im_model = InMemoryModel(model.predict_proba,
                             examples=X_test, 
                             #target_names=y_train.unique(),
                             #unique_values=y_train.unique()
                            )
    plots = interpreter.feature_importance.plot_feature_importance(im_model, ascending=True)
    

    Here is the output:

    [2/7] features █████--------------- Time elapsed: 0 seconds
    ---------------------------------------------------------------------------
    OverflowError                             Traceback (most recent call last)
    <ipython-input-141-18afbd1218da> in <module>
    ----> 1 plots = interpreter.feature_importance.plot_feature_importance(im_model, ascending=True)
    
    ~\Anaconda3\lib\site-packages\skater\core\global_interpretation\feature_importance.py in plot_feature_importance(self, predict_fn, filter_classes, ascending, ax, progressbar)
        163             raise (MatplotlibDisplayError("Matplotlib unable to open display"))
        164 
    --> 165         importances = self.feature_importance(predict_fn, filter_classes=filter_classes, progressbar=progressbar)
        166 
        167         if ax is None:
    
    ~\Anaconda3\lib\site-packages\skater\core\global_interpretation\feature_importance.py in feature_importance(self, model_instance, ascending, filter_classes, progressbar)
         85 
         86             if self.data_set.feature_info[feature_id]['numeric']:
    ---> 87                 samples = self.data_set.generate_column_sample(feature_id, n_samples=n, method='stratified')
         88             else:
         89                 samples = self.data_set.generate_column_sample(feature_id, n_samples=n, method='random-choice')
    
    ~\Anaconda3\lib\site-packages\skater\data\datamanager.py in generate_column_sample(self, feature_id, n_samples, method)
        407             return self._generate_column_sample_random_choice(feature_id, n_samples=n_samples)
        408         elif method == 'stratified':
    --> 409             return self._generate_column_sample_stratified(feature_id, n_samples=n_samples)
        410         else:
        411             raise(NotImplementedError("Currenly we only support random-choice, stratified for "
    
    ~\Anaconda3\lib\site-packages\skater\data\datamanager.py in _generate_column_sample_stratified(self, feature_id, n_samples, n_bins)
        439         samples = []
        440         for window, n in zip(sample_windows, samples_per_bin):
    --> 441             samples.append(np.random.uniform(window[0], window[1], size=int(n)).tolist())
        442 
        443         return np.array(flatten(samples))
    
    mtrand.pyx in numpy.random.mtrand.RandomState.uniform()
    
    OverflowError: Range exceeds valid bounds
    
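    For context, the OverflowError at the bottom of the traceback comes from NumPy itself: np.random.uniform raises "Range exceeds valid bounds" whenever high - low overflows to infinity. A minimal reproduction independent of Skater (an illustration added here for context, not taken from this thread):

        import numpy as np

        # high - low = 2e308 overflows a 64-bit float to infinity, so NumPy
        # raises OverflowError("Range exceeds valid bounds").
        try:
            np.random.uniform(low=-1e308, high=1e308, size=3)
        except OverflowError as err:
            print(err)
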
    opened by lrq3000 5
  • force exception into using map instead of mapper when n_jobs==1

    The PartialDependence class currently attempts multiprocessing even when n_jobs == 1. This code, adapted from the FeatureImportance class, should prevent that (a generic sketch of the pattern follows below).
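
    A minimal, generic sketch of the pattern this PR describes (not the project's actual diff): fall back to the built-in map when n_jobs == 1 and only create a multiprocessing pool otherwise.

        from multiprocessing import Pool

        def _compute(args):
            feature, grid = args
            return feature, sum(grid)   # stand-in for the real per-feature work

        def run(jobs, n_jobs=1):
            # n_jobs == 1: plain map, no process pool (avoids multiprocessing entirely).
            if n_jobs == 1:
                return list(map(_compute, jobs))
            # n_jobs > 1: distribute the work across a pool of workers.
            with Pool(processes=n_jobs) as pool:
                return pool.map(_compute, jobs)

        if __name__ == "__main__":
            print(run([("f1", [1, 2]), ("f2", [3, 4])], n_jobs=1))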

    opened by alvinthai 5
  • Improvement: Improvement to partial_dependence function

    Too many if/else branches are ugly, error-prone, and difficult to extend. When time permits, let's address this; it will help reduce bugs. We should do it sooner rather than later, because as we move forward we will forget the details.

    enhancement v2 performance 
    opened by pramitchoudhary 5
  • Surrogate Tree Explainer "plot_global_decisions" fails to generate

    I tried to generate the surrogate tree explainer based on your example code from GitHub, but it fails.

    Below is the code:

    %matplotlib inline
    import matplotlib.pyplot
    import matplotlib.pyplot as plt
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn import datasets
    from sklearn import svm
    from skater.core.explanations import Interpretation
    from skater.model import InMemoryModel
    
    from skater.core.global_interpretation.tree_surrogate import TreeSurrogate
    from skater.util.dataops import show_in_notebook
    
    iris = datasets.load_iris()
    digits = datasets.load_digits()
    X = iris.data
    y = iris.target
    
    clf = RandomForestClassifier(random_state=0, n_jobs=-1)
    
    xtrain, xtest, ytrain, ytest = train_test_split(X,y,test_size=0.2, random_state=0)
    clf = clf.fit(xtrain, ytrain)
    
    y_pred=clf.predict(xtest)
    prob=clf.predict_proba(xtest)
    
    interpreter = Interpretation(
            training_data=xtrain, training_labels=ytrain, feature_names=iris.feature_names
        )
    pyint_model = InMemoryModel(
                clf.predict_proba,
                examples=xtrain,
                target_names=iris.target_names,
                unique_values=np.unique(ytrain).tolist(),
                feature_names=iris.feature_names,
            )
    
    surrogate_explainer = interpreter.tree_surrogate(oracle=pyint_model, seed=5)
    surrogate_explainer.fit(xtrain, ytrain)
    surrogate_explainer.plot_global_decisions(show_img=True)
    

    And this is the error generated from the code:

    2022-03-30 01:17:01,327 - skater.core.global_interpretation.tree_surrogate - INFO - post pruning applied ...
    2022-03-30 01:17:01,332 - skater.core.global_interpretation.tree_surrogate - INFO - Scorer used cross-entropy
    2022-03-30 01:17:01,342 - skater.core.global_interpretation.tree_surrogate - INFO - original score using base model 2.1094237467877998e-15
    2022-03-30 01:17:01,388 - skater.core.global_interpretation.tree_surrogate - INFO - Summary: childrens of the following nodes are removed []
    2022-03-30 01:17:01,392 - skater.core.global_interpretation.tree_surrogate - INFO - Done generating prediction using the surrogate, shape (120, 3)
    2022-03-30 01:17:01,398 - skater.core.global_interpretation.tree_surrogate - INFO - Done scoring, surrogate score 0.0; oracle score 0.033
    2022-03-30 01:17:01,401 - skater.core.global_interpretation.tree_surrogate - WARNING - impurity score: 0.033 of the surrogate model is higher than the impurity threshold: 0.01. The higher the impurity score, lower is the fidelity/faithfulness of the surrogate model
    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    [<ipython-input-13-dd8a23b7ccad>](https://localhost:8080/#) in <module>()
         39 surrogate_explainer = interpreter.tree_surrogate(oracle=pyint_model, seed=5)
         40 surrogate_explainer.fit(xtrain, ytrain)
    ---> 41 surrogate_explainer.plot_global_decisions(show_img=True)
    
    2 frames
    [/usr/local/lib/python3.7/dist-packages/skater/core/global_interpretation/tree_surrogate.py](https://localhost:8080/#) in plot_global_decisions(self, colors, enable_node_id, random_state, file_name, show_img, fig_size)
        399         """
        400         graph_inst = plot_tree(self.__model, self.__model_type, feature_names=self.feature_names, color_list=colors,
    --> 401                                class_names=self.class_names, enable_node_id=enable_node_id, seed=random_state)
        402         f_name = "interpretable_tree.png" if file_name is None else file_name
        403         graph_inst.write_png(f_name)
    
    [/usr/local/lib/python3.7/dist-packages/skater/core/visualizer/tree_visualizer.py](https://localhost:8080/#) in plot_tree(estimator, estimator_type, feature_names, class_names, color_list, colormap_reg, enable_node_id, coverage, seed)
        105         default_color = None
        106 
    --> 107     graph = _set_node_properites(estimator, estimator_type, graph, color_names=colors, default_color=default_color)
        108 
        109     # Set the color scheme for the edges
    
    [/usr/local/lib/python3.7/dist-packages/skater/core/visualizer/tree_visualizer.py](https://localhost:8080/#) in _set_node_properites(estimator, estimator_type, graph_instance, color_names, default_color)
         68         if node.get_name() not in ('node', 'edge'):
         69             if estimator_type == 'classifier':
    ---> 70                 value = values[int(node.get_name())][0]
         71                 # 1. Color only the leaf nodes, where one class is dominant or if it is a leaf node
         72                 # 2. For mixed population or otherwise set the default color
    
    ValueError: invalid literal for int() with base 10: '"\\n"'
    

    Please kindly take a look. Thank you!

    opened by christoferjulio3 0
  • How to install Skater in Kaggle kernel?!

    I try !pip install skater

    but get:

    `Collecting skater Downloading skater-1.1.2.tar.gz (96 kB) |████████████████████████████████| 96 kB 1.1 MB/s eta 0:00:01 Requirement already satisfied: scikit-learn>=0.18 in /opt/conda/lib/python3.7/site-packages (from skater) (0.23.2) Collecting scikit-image==0.14 Downloading scikit_image-0.14.0-cp37-cp37m-manylinux1_x86_64.whl (25.3 MB) |████████████████████████████████| 25.3 MB 4.4 MB/s eta 0:00:01 Requirement already satisfied: pandas>=0.22.0 in /opt/conda/lib/python3.7/site-packages (from skater) (1.1.2) Collecting ds-lime>=0.1.1.21 Downloading ds-lime-0.1.1.27.tar.gz (253 kB) |████████████████████████████████| 253 kB 59.8 MB/s eta 0:00:01 Requirement already satisfied: requests in /opt/conda/lib/python3.7/site-packages (from skater) (2.23.0) Requirement already satisfied: multiprocess in /opt/conda/lib/python3.7/site-packages (from skater) (0.70.10) Requirement already satisfied: dill>=0.2.6 in /opt/conda/lib/python3.7/site-packages (from skater) (0.3.2) Collecting wordcloud==1.3.1 Downloading wordcloud-1.3.1.tar.gz (169 kB) |████████████████████████████████| 169 kB 62.1 MB/s eta 0:00:01 Collecting joblib==0.11 Downloading joblib-0.11-py2.py3-none-any.whl (176 kB) |████████████████████████████████| 176 kB 60.3 MB/s eta 0:00:01 Collecting Jinja2==2.10 Downloading Jinja2-2.10-py2.py3-none-any.whl (126 kB) |████████████████████████████████| 126 kB 64.0 MB/s eta 0:00:01 Collecting pydotplus==2.0.2 Downloading pydotplus-2.0.2.tar.gz (278 kB) |████████████████████████████████| 278 kB 64.0 MB/s eta 0:00:01 Collecting bs4 Downloading bs4-0.0.1.tar.gz (1.1 kB) Requirement already satisfied: scipy>=0.19.1 in /opt/conda/lib/python3.7/site-packages (from scikit-learn>=0.18->skater) (1.4.1) Requirement already satisfied: threadpoolctl>=2.0.0 in /opt/conda/lib/python3.7/site-packages (from scikit-learn>=0.18->skater) (2.1.0) Requirement already satisfied: numpy>=1.13.3 in /opt/conda/lib/python3.7/site-packages (from scikit-learn>=0.18->skater) (1.18.5) Requirement already satisfied: pillow>=4.3.0 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (7.2.0) Requirement already satisfied: six>=1.10.0 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (1.14.0) Requirement already satisfied: dask[array]>=0.9.0 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (2.27.0) Requirement already satisfied: cloudpickle>=0.2.1 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (1.3.0) Requirement already satisfied: PyWavelets>=0.4.0 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (1.1.1) Requirement already satisfied: matplotlib>=2.0.0 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (3.2.1) Requirement already satisfied: networkx>=1.8 in /opt/conda/lib/python3.7/site-packages (from scikit-image==0.14->skater) (2.4) Requirement already satisfied: python-dateutil>=2.7.3 in /opt/conda/lib/python3.7/site-packages (from pandas>=0.22.0->skater) (2.8.1) Requirement already satisfied: pytz>=2017.2 in /opt/conda/lib/python3.7/site-packages (from pandas>=0.22.0->skater) (2019.3) Requirement already satisfied: chardet<4,>=3.0.2 in /opt/conda/lib/python3.7/site-packages (from requests->skater) (3.0.4) Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/lib/python3.7/site-packages (from requests->skater) (1.24.3) Requirement already satisfied: idna<3,>=2.5 in /opt/conda/lib/python3.7/site-packages (from requests->skater) (2.9) 
Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.7/site-packages (from requests->skater) (2020.6.20) Requirement already satisfied: MarkupSafe>=0.23 in /opt/conda/lib/python3.7/site-packages (from Jinja2==2.10->skater) (1.1.1) Requirement already satisfied: pyparsing>=2.0.1 in /opt/conda/lib/python3.7/site-packages (from pydotplus==2.0.2->skater) (2.4.7) Requirement already satisfied: beautifulsoup4 in /opt/conda/lib/python3.7/site-packages (from bs4->skater) (4.9.0) Requirement already satisfied: pyyaml in /opt/conda/lib/python3.7/site-packages (from dask[array]>=0.9.0->scikit-image==0.14->skater) (5.3.1) Requirement already satisfied: toolz>=0.8.2; extra == "array" in /opt/conda/lib/python3.7/site-packages (from dask[array]>=0.9.0->scikit-image==0.14->skater) (0.10.0) Requirement already satisfied: cycler>=0.10 in /opt/conda/lib/python3.7/site-packages (from matplotlib>=2.0.0->scikit-image==0.14->skater) (0.10.0) Requirement already satisfied: kiwisolver>=1.0.1 in /opt/conda/lib/python3.7/site-packages (from matplotlib>=2.0.0->scikit-image==0.14->skater) (1.2.0) Requirement already satisfied: decorator>=4.3.0 in /opt/conda/lib/python3.7/site-packages (from networkx>=1.8->scikit-image==0.14->skater) (4.4.2) Requirement already satisfied: soupsieve>1.2 in /opt/conda/lib/python3.7/site-packages (from beautifulsoup4->bs4->skater) (1.9.4) Building wheels for collected packages: skater, ds-lime, wordcloud, pydotplus, bs4 Building wheel for skater (setup.py) ... error ERROR: Command errored out with exit status 1: command: /opt/conda/bin/python3.7 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-5ieiwjh2/skater/setup.py'"'"'; file='"'"'/tmp/pip-install-5ieiwjh2/skater/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-x8i3xny_ cwd: /tmp/pip-install-5ieiwjh2/skater/ Complete output (363 lines): running bdist_wheel running build running build_py creating build creating build/lib creating build/lib/skater copying skater/init.py -> build/lib/skater copying skater/about.py -> build/lib/skater creating build/lib/skater/data copying skater/data/init.py -> build/lib/skater/data copying skater/data/datamanager.py -> build/lib/skater/data creating build/lib/skater/tests copying skater/tests/test_image_ops.py -> build/lib/skater/tests copying skater/tests/init.py -> build/lib/skater/tests copying skater/tests/test_partial_dependence.py -> build/lib/skater/tests copying skater/tests/test_scorer.py -> build/lib/skater/tests copying skater/tests/all_tests.py -> build/lib/skater/tests copying skater/tests/test_data.py -> build/lib/skater/tests copying skater/tests/test_feature_importance.py -> build/lib/skater/tests copying skater/tests/test_dnni.py -> build/lib/skater/tests copying skater/tests/test_validation.py -> build/lib/skater/tests copying skater/tests/test_rule_list.py -> build/lib/skater/tests copying skater/tests/arg_parser.py -> build/lib/skater/tests copying skater/tests/test_tree_surrogates.py -> build/lib/skater/tests copying skater/tests/test_lime.py -> build/lib/skater/tests copying skater/tests/test_model.py -> build/lib/skater/tests copying skater/tests/test_text_ops.py -> build/lib/skater/tests creating build/lib/skater/core copying skater/core/init.py -> build/lib/skater/core copying skater/core/model_interpreter.py -> build/lib/skater/core copying skater/core/validation.py -> 
build/lib/skater/core copying skater/core/explanations.py -> build/lib/skater/core creating build/lib/skater/model copying skater/model/local_model.py -> build/lib/skater/model copying skater/model/init.py -> build/lib/skater/model copying skater/model/deployed_model.py -> build/lib/skater/model copying skater/model/base.py -> build/lib/skater/model copying skater/model/scorer.py -> build/lib/skater/model creating build/lib/skater/util copying skater/util/image_ops.py -> build/lib/skater/util copying skater/util/text_ops.py -> build/lib/skater/util copying skater/util/init.py -> build/lib/skater/util copying skater/util/kernels.py -> build/lib/skater/util copying skater/util/logger.py -> build/lib/skater/util copying skater/util/dataops.py -> build/lib/skater/util copying skater/util/exceptions.py -> build/lib/skater/util copying skater/util/serialization.py -> build/lib/skater/util copying skater/util/user_defined_types.py -> build/lib/skater/util copying skater/util/plotting.py -> build/lib/skater/util copying skater/util/progressbar.py -> build/lib/skater/util copying skater/util/static_types.py -> build/lib/skater/util creating build/lib/skater/tests/util copying skater/tests/util/build_example.py -> build/lib/skater/tests/util copying skater/tests/util/init.py -> build/lib/skater/tests/util creating build/lib/skater/core/local_interpretation copying skater/core/local_interpretation/init.py -> build/lib/skater/core/local_interpretation copying skater/core/local_interpretation/text_interpreter.py -> build/lib/skater/core/local_interpretation creating build/lib/skater/core/global_interpretation copying skater/core/global_interpretation/init.py -> build/lib/skater/core/global_interpretation copying skater/core/global_interpretation/feature_importance.py -> build/lib/skater/core/global_interpretation copying skater/core/global_interpretation/base.py -> build/lib/skater/core/global_interpretation copying skater/core/global_interpretation/partial_dependence.py -> build/lib/skater/core/global_interpretation copying skater/core/global_interpretation/tree_surrogate.py -> build/lib/skater/core/global_interpretation creating build/lib/skater/core/visualizer copying skater/core/visualizer/text_relevance_visualizer.py -> build/lib/skater/core/visualizer copying skater/core/visualizer/init.py -> build/lib/skater/core/visualizer copying skater/core/visualizer/image_relevance_visualizer.py -> build/lib/skater/core/visualizer copying skater/core/visualizer/tree_visualizer.py -> build/lib/skater/core/visualizer creating build/lib/skater/core/local_interpretation/dnni copying skater/core/local_interpretation/dnni/init.py -> build/lib/skater/core/local_interpretation/dnni copying skater/core/local_interpretation/dnni/initializer.py -> build/lib/skater/core/local_interpretation/dnni copying skater/core/local_interpretation/dnni/deep_interpreter.py -> build/lib/skater/core/local_interpretation/dnni copying skater/core/local_interpretation/dnni/perturbation_relevance_scorer.py -> build/lib/skater/core/local_interpretation/dnni copying skater/core/local_interpretation/dnni/gradient_relevance_scorer.py -> build/lib/skater/core/local_interpretation/dnni creating build/lib/skater/core/local_interpretation/lime copying skater/core/local_interpretation/lime/lime_image.py -> build/lib/skater/core/local_interpretation/lime copying skater/core/local_interpretation/lime/init.py -> build/lib/skater/core/local_interpretation/lime copying skater/core/local_interpretation/lime/lime_tabular.py -> 
build/lib/skater/core/local_interpretation/lime copying skater/core/local_interpretation/lime/lime_text.py -> build/lib/skater/core/local_interpretation/lime creating build/lib/skater/core/global_interpretation/interpretable_models copying skater/core/global_interpretation/interpretable_models/brlc.py -> build/lib/skater/core/global_interpretation/interpretable_models copying skater/core/global_interpretation/interpretable_models/init.py -> build/lib/skater/core/global_interpretation/interpretable_models copying skater/core/global_interpretation/interpretable_models/bigdatabrlc.py -> build/lib/skater/core/global_interpretation/interpretable_models running egg_info writing skater.egg-info/PKG-INFO writing dependency_links to skater.egg-info/dependency_links.txt writing requirements to skater.egg-info/requires.txt writing top-level names to skater.egg-info/top_level.txt reading manifest file 'skater.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' writing manifest file 'skater.egg-info/SOURCES.txt' creating build/lib/skater/util/model_specific copying skater/util/model_specific/imagenet_label.json -> build/lib/skater/util/model_specific installing to build/bdist.linux-x86_64/wheel running install Checking .pth file support in build/bdist.linux-x86_64/wheel/ /opt/conda/bin/python3.7 -E -c pass TEST FAILED: build/bdist.linux-x86_64/wheel/ does NOT support .pth files bad install directory or PYTHONPATH

    You are attempting to install a package to a directory that is not on PYTHONPATH and which Python does not read ".pth" files from. The installation directory you specified (via --install-dir, --prefix, or the distutils default setting) was:

      build/bdist.linux-x86_64/wheel/
    

    and your PYTHONPATH environment variable currently contains:

      '/kaggle/lib/kagglegym:/kaggle/lib:/kaggle/input/mf-accelerator'
    

    Here are some of your options for correcting the problem:

    • You can choose a different installation directory, i.e., one that is on PYTHONPATH or supports .pth files

    • You can add the installation directory to the PYTHONPATH environment variable. (It must then also be on PYTHONPATH whenever you run Python and want to use the package(s) you are installing.)

    • You can set up the installation directory to support ".pth" files by using one of the approaches described here:

      https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations

    Please make the appropriate changes for your system and try again. running bdist_egg installing library code to build/bdist.linux-x86_64/egg running install_lib creating build/bdist.linux-x86_64/egg creating build/bdist.linux-x86_64/egg/skater copying build/lib/skater/init.py -> build/bdist.linux-x86_64/egg/skater creating build/bdist.linux-x86_64/egg/skater/data copying build/lib/skater/data/init.py -> build/bdist.linux-x86_64/egg/skater/data copying build/lib/skater/data/datamanager.py -> build/bdist.linux-x86_64/egg/skater/data copying build/lib/skater/about.py -> build/bdist.linux-x86_64/egg/skater creating build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_image_ops.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/init.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_partial_dependence.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_scorer.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/all_tests.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_data.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_feature_importance.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_dnni.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_validation.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_rule_list.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/arg_parser.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_tree_surrogates.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_lime.py -> build/bdist.linux-x86_64/egg/skater/tests copying build/lib/skater/tests/test_model.py -> build/bdist.linux-x86_64/egg/skater/tests creating build/bdist.linux-x86_64/egg/skater/tests/util copying build/lib/skater/tests/util/build_example.py -> build/bdist.linux-x86_64/egg/skater/tests/util copying build/lib/skater/tests/util/init.py -> build/bdist.linux-x86_64/egg/skater/tests/util copying build/lib/skater/tests/test_text_ops.py -> build/bdist.linux-x86_64/egg/skater/tests creating build/bdist.linux-x86_64/egg/skater/core creating build/bdist.linux-x86_64/egg/skater/core/local_interpretation creating build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/dnni/init.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/dnni/initializer.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/dnni/deep_interpreter.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/dnni/perturbation_relevance_scorer.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/dnni/gradient_relevance_scorer.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/dnni copying build/lib/skater/core/local_interpretation/init.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation creating build/bdist.linux-x86_64/egg/skater/core/local_interpretation/lime copying build/lib/skater/core/local_interpretation/lime/lime_image.py -> 
build/bdist.linux-x86_64/egg/skater/core/local_interpretation/lime copying build/lib/skater/core/local_interpretation/lime/init.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/lime copying build/lib/skater/core/local_interpretation/lime/lime_tabular.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/lime copying build/lib/skater/core/local_interpretation/lime/lime_text.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation/lime copying build/lib/skater/core/local_interpretation/text_interpreter.py -> build/bdist.linux-x86_64/egg/skater/core/local_interpretation copying build/lib/skater/core/init.py -> build/bdist.linux-x86_64/egg/skater/core copying build/lib/skater/core/model_interpreter.py -> build/bdist.linux-x86_64/egg/skater/core creating build/bdist.linux-x86_64/egg/skater/core/global_interpretation creating build/bdist.linux-x86_64/egg/skater/core/global_interpretation/interpretable_models copying build/lib/skater/core/global_interpretation/interpretable_models/brlc.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation/interpretable_models copying build/lib/skater/core/global_interpretation/interpretable_models/init.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation/interpretable_models copying build/lib/skater/core/global_interpretation/interpretable_models/bigdatabrlc.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation/interpretable_models copying build/lib/skater/core/global_interpretation/init.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation copying build/lib/skater/core/global_interpretation/feature_importance.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation copying build/lib/skater/core/global_interpretation/base.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation copying build/lib/skater/core/global_interpretation/partial_dependence.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation copying build/lib/skater/core/global_interpretation/tree_surrogate.py -> build/bdist.linux-x86_64/egg/skater/core/global_interpretation copying build/lib/skater/core/validation.py -> build/bdist.linux-x86_64/egg/skater/core copying build/lib/skater/core/explanations.py -> build/bdist.linux-x86_64/egg/skater/core creating build/bdist.linux-x86_64/egg/skater/core/visualizer copying build/lib/skater/core/visualizer/text_relevance_visualizer.py -> build/bdist.linux-x86_64/egg/skater/core/visualizer copying build/lib/skater/core/visualizer/init.py -> build/bdist.linux-x86_64/egg/skater/core/visualizer copying build/lib/skater/core/visualizer/image_relevance_visualizer.py -> build/bdist.linux-x86_64/egg/skater/core/visualizer copying build/lib/skater/core/visualizer/tree_visualizer.py -> build/bdist.linux-x86_64/egg/skater/core/visualizer creating build/bdist.linux-x86_64/egg/skater/model copying build/lib/skater/model/local_model.py -> build/bdist.linux-x86_64/egg/skater/model copying build/lib/skater/model/init.py -> build/bdist.linux-x86_64/egg/skater/model copying build/lib/skater/model/deployed_model.py -> build/bdist.linux-x86_64/egg/skater/model copying build/lib/skater/model/base.py -> build/bdist.linux-x86_64/egg/skater/model copying build/lib/skater/model/scorer.py -> build/bdist.linux-x86_64/egg/skater/model creating build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/image_ops.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/text_ops.py -> 
build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/init.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/kernels.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/logger.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/dataops.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/exceptions.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/serialization.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/user_defined_types.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/plotting.py -> build/bdist.linux-x86_64/egg/skater/util creating build/bdist.linux-x86_64/egg/skater/util/model_specific copying build/lib/skater/util/model_specific/imagenet_label.json -> build/bdist.linux-x86_64/egg/skater/util/model_specific copying build/lib/skater/util/progressbar.py -> build/bdist.linux-x86_64/egg/skater/util copying build/lib/skater/util/static_types.py -> build/bdist.linux-x86_64/egg/skater/util creating build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/not-zip-safe -> build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO copying skater.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO creating dist creating 'dist/skater-1.1.2-py3.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it removing 'build/bdist.linux-x86_64/egg' (and everything under it) Processing skater-1.1.2-py3.7.egg creating /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel/skater-1.1.2-py3.7.egg Extracting skater-1.1.2-py3.7.egg to /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel

    Installed /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel/skater-1.1.2-py3.7.egg Processing dependencies for skater==1.1.2 Searching for bs4 Reading https://pypi.org/simple/bs4/ Downloading https://files.pythonhosted.org/packages/10/ed/7e8b97591f6f456174139ec089c769f89a94a1a4025fe967691de971f314/bs4-0.0.1.tar.gz#sha256=36ecea1fd7cc5c0c6e4a1ff075df26d50da647b75376626cc186e2212886dd3a Best match: bs4 0.0.1 Processing bs4-0.0.1.tar.gz Writing /tmp/easy_install-aa7r89dg/bs4-0.0.1/setup.cfg Running bs4-0.0.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-aa7r89dg/bs4-0.0.1/egg-dist-tmp-vz5t14nm warning: install_lib: 'build/lib' does not exist -- no Python modules to install

    zip_safe flag not set; analyzing archive contents... Moving bs4-0.0.1-py3.7.egg to /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel

    Installed /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel/bs4-0.0.1-py3.7.egg Searching for pydotplus==2.0.2 Reading https://pypi.org/simple/pydotplus/ Downloading https://files.pythonhosted.org/packages/60/bf/62567830b700d9f6930e9ab6831d6ba256f7b0b730acb37278b0ccdffacf/pydotplus-2.0.2.tar.gz#sha256=91e85e9ee9b85d2391ead7d635e3d9c7f5f44fd60a60e59b13e2403fa66505c4 Best match: pydotplus 2.0.2 Processing pydotplus-2.0.2.tar.gz Writing /tmp/easy_install-noh09qix/pydotplus-2.0.2/setup.cfg Running pydotplus-2.0.2/setup.py -q bdist_egg --dist-dir /tmp/easy_install-noh09qix/pydotplus-2.0.2/egg-dist-tmp-bl7n4jro zip_safe flag not set; analyzing archive contents... Moving pydotplus-2.0.2-py3.7.egg to /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel

    Installed /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel/pydotplus-2.0.2-py3.7.egg Searching for Jinja2==2.10 Reading https://pypi.org/simple/Jinja2/ Downloading https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl#sha256=74c935a1b8bb9a3947c50a54766a969d4846290e1e788ea44c1392163723c3bd Best match: Jinja2 2.10 Processing Jinja2-2.10-py2.py3-none-any.whl Installing Jinja2-2.10-py2.py3-none-any.whl to /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel

    Installed /tmp/pip-install-5ieiwjh2/skater/build/bdist.linux-x86_64/wheel/Jinja2-2.10-py3.7.egg
    Searching for joblib==0.11 ... Installed joblib-0.11-py3.7.egg
    Searching for wordcloud==1.3.1 ... Processing wordcloud-1.3.1.tar.gz
    Running wordcloud-1.3.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-6vv1q_nx/wordcloud-1.3.1/egg-dist-tmp-1xtc6i0e
    wordcloud/query_integral_image.c: In function ‘__Pyx_ExceptionSave’:
    wordcloud/query_integral_image.c:14910:21: error: ‘PyThreadState {aka struct _ts}’ has no member named ‘exc_type’; did you mean ‘curexc_type’?
    (the same “has no member named ‘exc_type’ / ‘exc_value’ / ‘exc_traceback’” errors repeat for __Pyx_ExceptionReset, __Pyx_GetException and __Pyx_ExceptionSwap)
    error: Setup script exited with error: command 'gcc' failed with exit status 1

    ERROR: Failed building wheel for skater
    Building wheel for ds-lime (setup.py) ... done
    Building wheel for wordcloud (setup.py) ... error
    (the wordcloud wheel build fails with the same gcc errors shown above)
    ERROR: Failed building wheel for wordcloud
    Building wheel for pydotplus (setup.py) ... done
    Building wheel for bs4 (setup.py) ... done
    Successfully built ds-lime pydotplus bs4
    Failed to build skater wordcloud
    Installing collected packages: scikit-image, ds-lime, wordcloud, joblib, Jinja2, pydotplus, bs4, skater
    Running setup.py install for wordcloud ... error
    (same gcc errors again)
    Rolling back uninstall of wordcloud
    ERROR: Command errored out with exit status 1: ... wordcloud ... Check the logs for full command output.
    WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available. You should consider upgrading via the '/opt/conda/bin/python3.7 -m pip install --upgrade pip' command.

    opened by dimka11 1
  • Out of Memory Errors when skater is called

    Out of Memory Errors when skater is called

    As soon as I installed skater for this notebook, the Jupyter NB process started running out of memory.

    I have 16 GB RAM with a GPU, running a recently updated version of Windows 10 on a Dell Precision 7530.

    So it should not be having memory issues like this. The last time I tried Oracle's Skater library, about 2 years ago, I had similar issues.

    Predictive Analytics.ipynb
    Process SpawnPoolWorker-12:
    Process SpawnPoolWorker-1:
    Process SpawnPoolWorker-11:
    Traceback (most recent call last):
      File "c:\ProgramData\Anaconda3\envs\elyra_env\lib\site-packages\multiprocess\process.py", line 258, in _bootstrap
        self.run()
      File "c:\ProgramData\Anaconda3\envs\elyra_env\lib\site-packages\multiprocess\pool.py", line 108, in worker
        task = get()
      File "c:\ProgramData\Anaconda3\envs\elyra_env\lib\site-packages\multiprocess\queues.py", line 340, in get
        return _ForkingPickler.loads(res)
      File "c:\ProgramData\Anaconda3\envs\elyra_env\lib\site-packages\dill\_dill.py", line 525, in load
        obj = StockUnpickler.load(self)
      File "c:\ProgramData\Anaconda3\envs\elyra_env\lib\site-packages\skater\__init__.py", line 2, in <module>
        from .core.explanations import Interpretation
      ...
    ImportError: DLL load failed: The paging file is too small for this operation to complete.

    (Every spawned worker process fails the same way while re-importing skater, sklearn, scipy and pandas
    during unpickling of its task. The terminal errors across the workers are:)

    ImportError: DLL load failed: The paging file is too small for this operation to complete.
    MemoryError
    MemoryError: could not allocate 79688 bytes
    INTEL MKL ERROR: The paging file is too small for this operation to complete. mkl_intel_thread.1.dll.
    Intel MKL FATAL ERROR: Cannot load mkl_intel_thread.1.dll.
    AssertionError (multiprocess\connection.py, line 340, in _get_more_data: assert left > 0)
    [I 2021-08-28 23:54:04.811 ServerApp] Saving file at /Desktop/00-PythonWIP/!0-2021-Sarkar/Ch09/Predictive Analytics.ipynb
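
    A plausible mitigation, not part of the original report: the traceback shows Skater farming the computation out to a multiprocess worker pool, and every spawned worker re-imports skater/sklearn/scipy/pandas, which is what exhausts the Windows paging file. The sketch below keeps everything in one process; the toy data and model are stand-ins, and the n_samples/n_jobs arguments are assumed to be accepted by plot_partial_dependence.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from skater.core.explanations import Interpretation
    from skater.model import InMemoryModel

    # Toy data and model, purely illustrative
    X = pd.DataFrame({'age': np.random.randint(18, 90, 500),
                      'income': np.random.rand(500) * 1e5})
    y = np.random.randint(0, 2, 500)
    clf = RandomForestClassifier(n_estimators=10).fit(X, y)

    interpreter = Interpretation(training_data=X)
    model = InMemoryModel(clf.predict_proba, examples=X, model_type='classifier')

    # n_jobs=1 avoids spawning worker processes (each of which re-imports the
    # whole stack); a smaller n_samples further reduces peak memory.
    interpreter.partial_dependence.plot_partial_dependence(['age'], model,
                                                           n_samples=200, n_jobs=1)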
    
    
    opened by richlysakowski 0
  • Update PyPI because of outdated requirements

    Update PyPI because of outdated requirements

    The latest version of Skater released to PyPI is 1.1.2, but the changes made to the repo since then have diverged from the PyPI package. Some of these changes include loosening the requirements, which is really helpful when installing with pip and when using Skater together with other packages that have conflicting dependencies. I would be really thankful if you would update the PyPI package.
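
    A possible interim workaround, not from the issue and assuming a plain source install works on your platform (optional native components may still need their usual extra build steps), is to point pip at the GitHub repository directly:

    pip install git+https://github.com/oracle/Skater.git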

    opened by Batev 0
  • ValueError: Can't get rows for data of type <class 'pandas.core.series.Series'>

    ValueError: Can't get rows for data of type <class 'pandas.core.series.Series'>

    Dear all,

    I am using DecisionTreeClassifier with Skater. When I apply Skater, I get the error below: ValueError: Can't get rows for data of type <class 'pandas.core.series.Series'>. I checked that both x_train and x_test are pandas.core.frame.DataFrame, while y_train and y_test are numpy.ndarray. I do not know where the problem is. Any help, please? Thanks in advance.
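
    A minimal sketch of the usual cause, with hypothetical column names: the error is typically raised when a 1-D pandas Series (for example, a single column selected with single brackets) reaches Skater where a 2-D DataFrame or ndarray is expected.

    import pandas as pd

    x_train = pd.DataFrame({'f1': [1, 2, 3], 'f2': [4, 5, 6]})

    bad = x_train['f1']                    # Series -> "Can't get rows for data of type ... Series"
    good = x_train[['f1']]                 # double brackets keep a 2-D DataFrame
    also_good = x_train['f1'].to_frame()   # or convert the Series explicitly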

    opened by amaa11 1
  • ModuleNotFoundError: No module named 'skater.core.global_interpretation.tree_surrogate'

    ModuleNotFoundError: No module named 'skater.core.global_interpretation.tree_surrogate'

    I'm using Skater on Windows 64-bit with Python 3.6 in Anaconda. I'm trying to implement the example on sklearn's breast cancer dataset (https://github.com/oracle/Skater/blob/master/examples/explaining_classification_breast_cancer.ipynb). When I run the command 'from skater.core.global_interpretation.tree_surrogate import TreeSurrogate' I get the error ModuleNotFoundError: No module named 'skater.core.global_interpretation.tree_surrogate'. Does anyone know why?

    modulenotfounderror

    opened by aliceschi93 1
Releases(v1.1.2)
  • v1.1.2(Sep 21, 2018)

    • Support for Tree Surrogates for explanations (experimental)
    • Bug fixes
    • Updated notebook examples
    • Basic documentation updates to reflect the new changes (more work still needs to be done there)
    • Other bug fixes and improvements
    Source code(tar.gz)
    Source code(zip)
  • v1.1.1-b3(Jul 24, 2018)

    • Simplified installation to handle dependencies better
    • Convenience functions to support flipping image orientation (flip_orientation)
    • Updated examples on how to use Skater with XGBoost
    • Enabled support for image inference using occlusion, with the ability to specify window_size and step_size for perturbing the feature space
    • Other minor bug fixes and code clean-ups

    Credits: To all contributors who have helped in moving the project forward.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.1-b2(May 17, 2018)

    New Features:

    • Added a new interface, skater.core.local_interpretation.dnni.deep_interpreter.DeepInterpreter, for interpreting TensorFlow- and Keras-based models
    • Enabled support for interpreting DNNs using gradient-based e-LRP and Integrated Gradients through DeepInterpreter.explain
    • Added support for visualizing relevance/attribution scores when interpreting image and text inputs
      • skater.core.visualizer.image_relevance_visualizer.visualize
      • from skater.core.visualizer.text_relevance_visualizer import build_visual_explainer, show_in_notebook
    • User-friendly utility functions to generate simple yet effective conditional adversarial examples for image inputs
      • from skater.util.image_ops import load_image, show_image, normalize, add_noise, flip_pixels, image_transformation
      • from skater.util.image_ops import in_between, greater_than, greater_than_or_equal
    • More interactive notebook use cases for building and interpreting DNNs and for evaluating model stability/identifying blind spots
    • Updates to the documentation: https://datascienceinc.github.io/Skater/overview.html
    • New section summarizing notebook examples: https://datascienceinc.github.io/Skater/gallery.html
    • Other bug fixes

    Credits:

    • Special thanks to Marco Ancona (@marcoancona) for guidance in enabling this feature within Skater.
    • Thanks to all other contributors for helping move the library forward every day.
    Source code(tar.gz)
    Source code(zip)
  • v1.1.0(May 12, 2018)

    • This is a follow-up release adding experimental support for building rule-based models, enabling interpretability at both the global and the local scope.
    • Improved docs, with a Jupyter Notebook section highlighting the different supported algorithms.

    More improvements are planned for the subsequent release. Stay tuned.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.0-b1(Mar 15, 2018)

    1. Until now, Skater has been an interpretation engine enabling post-hoc model evaluation and interpretation. With this PR, Skater starts its journey toward supporting interpretable models. Rule-list algorithms are highly popular in the space of interpretable models because the trained models are represented as simple decision lists. In this release, we enable support for Bayesian Rule Lists (BRL). The probabilistic classifier (estimating P(Y=1|X) for each X) optimizes the posterior of a Bayesian hierarchical model over the pre-mined rules.

      Usage Example:

      from skater.core.global_interpretation.interpretable_models.brlc import BRLC
      import pandas as pd
      from sklearn.datasets.mldata import fetch_mldata
      from sklearn.model_selection import train_test_split

      input_df = fetch_mldata("diabetes")
      ...
      Xtrain, Xtest, ytrain, ytest = train_test_split(input_df, y, test_size=0.20, random_state=0)

      sbrl_model = BRLC(min_rule_len=1, max_rule_len=10, iterations=10000, n_chains=20, drop_features=True)
      # Train a model; the discretizer is enabled by default, so if you wish to exclude
      # features from discretization, list them via the undiscretize_feature_list parameter
      model = sbrl_model.fit(Xtrain, ytrain, bin_labels="default")
      
    2. Other minor bug fixes and documentation update

    Credits: Special thanks to Professor Cynthia Rudin, Hongyu Yang, and @tmadl (Tamas Madl) for helping enable this feature.

    Source code(tar.gz)
    Source code(zip)
  • v1.0.3(Oct 12, 2017)

    This release includes:

    • Various bug fixes and performance improvements
    • A new feature importance calculation method
    • Introduction of model scorers
    • Model types can now be specified explicitly

    Model Scoring

    Now, after you create a Skater model with:

    model = InMemoryModel(predict_fn, examples=examples, model_type="classifier")
    

    The model object now provides a .scorers API, which allows you to score predictions against training labels. Based on whether your model is a regressor, a classifier that returns labels, or a classifier that returns probabilities, scorers will automatically expose the scoring algorithms appropriate to your model. For instance, in the example above, we could do:

    model.scorers.f1(labels, model(X))
    model.scorers.cross_entropy(labels, model(X))
    

    If it were a regressor, we could do:

    model.scorers.mse(labels, model(X))
    

    Calling model.scorers.default(labels, model(X)) or simply model.scorers(labels, model(X)) will execute the default scorer for your model, which is:

    • regression: mean absolute error
    • classifier (probabilities): cross entropy
    • classifier (labels): f1

    Let us know if you'd like more scorers, or even better, feel free to make a PR to add more yourself!
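
    Putting the calls above together, a compact illustrative sketch (the dataset and estimator are arbitrary stand-ins; only the scorer calls shown in these notes are used):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import GradientBoostingClassifier
    from skater.model import InMemoryModel

    X, y = load_iris(return_X_y=True)
    clf = GradientBoostingClassifier().fit(X, y)

    # Wrapping the probability function exposes the probability-based scorers
    skater_model = InMemoryModel(clf.predict_proba, examples=X[:100], model_type="classifier")

    print(skater_model.scorers.cross_entropy(y, skater_model(X)))  # explicit scorer
    print(skater_model.scorers(y, skater_model(X)))                # default scorer for this model type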

    Feature Importance Calculation

    The default method of computing feature importance is to perturb each feature and observe how much those perturbations affect predictions.

    With the addition of model scoring, we now also provide a method based on observing changes in model scoring functions: the less accurate your model becomes when a feature is perturbed, the more important that feature is.

    To enable scoring-based feature importance, you must load training labels into your Interpretation object, like:

    interpreter = Interpretation(training_data=training_data, training_labels=training_labels)
    interpreter.feature_importance.plot_feature_importance(model, method='model-scoring')
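
    For completeness, a self-contained, illustrative version of the snippet above (the toy data and estimator are stand-ins):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from skater.core.explanations import Interpretation
    from skater.model import InMemoryModel

    X, y = load_iris(return_X_y=True)
    clf = RandomForestClassifier(n_estimators=50).fit(X, y)

    interpreter = Interpretation(training_data=X, training_labels=y)
    model = InMemoryModel(clf.predict_proba, examples=X[:100], model_type="classifier")

    # Importance is measured by how much the model's score degrades when a feature is perturbed
    interpreter.feature_importance.plot_feature_importance(model, method="model-scoring")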
    

    Explicit Model Types

    Originally, Skater tried to infer the type of your model from the types of predictions it made. Now, when you create a model, you can define this explicitly with the model_type and probability keyword arguments of the Skater model classes:

    model = InMemoryModel(predict_fn, model_type='classifier', probability=True)
    

    or

    model = InMemoryModel(predict_fn, model_type='regressor')
    
    Source code(tar.gz)
    Source code(zip)
  • v1.0.2(Jul 5, 2017)

  • v1.0.0b.12(Jun 15, 2017)

  • v1.0.0b.11(Jun 8, 2017)

  • v1.0.0b.10(May 19, 2017)

    • Beta release of the Model Interpretation library. This is the first public release of the library, so some of the APIs and functionality may still be experimental.
      • Global Interpretation - PDP, Variable Importance
      • Local Interpretation - improved LIME
    Source code(tar.gz)
    Source code(zip)
  • v1.0.0b.2(May 14, 2017)

  • v1.0.0b.1(May 12, 2017)

  • v1.0.0b(May 9, 2017)

    • Support for classifiers without probability scores
    • Support for categorical variables
    • Minor code restructuring
    • Column-wise stratified sampling support
    • Ability to filter results to a subset of classes
    Source code(tar.gz)
    Source code(zip)
  • v1.0.0-alpha(Mar 30, 2017)
