Spatial Single-Cell Analysis Toolkit

Overview

Single-Cell Image Analysis Package


Scimap is a scalable toolkit for analyzing spatial molecular data. The underlying framework is generalizable to spatial datasets mapped to XY coordinates. The package uses the anndata framework, making it easy to integrate with other popular single-cell analysis toolkits. It includes preprocessing, phenotyping, visualization, clustering, spatial analysis, and differential spatial testing. The Python-based implementation efficiently handles large datasets containing millions of cells.

Installation

We strongly recommend installing scimap in a fresh virtual environment.

# If you have conda installed
conda create --name scimap python=3.7
conda activate scimap

Install scimap directly into an activated virtual environment:

$ pip install scimap

After installation, the package can be imported as:

$ python
>>> import scimap as sm

Get Started

Detailed documentation of scimap functions and tutorials are available here.
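
For orientation, here is a minimal usage sketch that strings together calls appearing elsewhere on this page (sm.tl.spatial_lda and sm.pl.image_viewer). The file names, column names, and parameter values are placeholders; consult the documentation for the exact signatures.

import anndata as ad
import scimap as sm

# Load a single-cell table that has already been mapped to XY coordinates (placeholder file name)
adata = ad.read_h5ad('my_dataset.h5ad')

# Score the neighbourhood composition around each cell
# (arguments mirror the spatial_lda call quoted in the issues below)
adata = sm.tl.spatial_lda(adata, x_coordinate='X_centroid', y_coordinate='Y_centroid',
                          phenotype='phenotype', method='radius', radius=30,
                          label='spatial_lda')

# Overlay a categorical annotation on the registered image in napari (placeholder path and column name)
sm.pl.image_viewer(image_path='image.ome.tif', adata=adata,
                   overlay='phenotype', point_size=7, point_color='white')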

SCIMAP development is led by Ajit Johnson Nirmal at the Laboratory of Systems Pharmacology, Harvard Medical School.

Funding

This work is supported by NIH grant K99-CA256497.

Comments
  • sm.tl.spatial_lda stores result in adata.uns not adata.obs

    Hello,

    I am trying to run sm.tl.spatial_lda with my adata object, but for some reason the spatial_lda results are stored in adata.uns and not adata.obs. This is causing errors in the downstream clustering with sm.tl.cluster. Any help would be appreciated!

    I am working in Python v3.9.

    After running the code:

    adata= sm.tl.spatial_lda(adata, x_coordinate='X', y_coordinate='Y', phenotype='celltype', method='radius', radius=30, knn=10, imageid='UniqueID', num_motifs=10, random_state=0, subset=None, label='spatial_lda')

    adata

    AnnData object with n_obs × n_vars = 79308 × 34
        obs: 'Unnamed: 0', 'X', 'Y', 'Area', 'celltype', 'TLSType', 'UniqueID'
        uns: 'spatial_lda', 'spatial_lda_probability', 'spatial_lda_model'
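
    As a hedged aside for anyone hitting the same thing: if a downstream step expects the per-cell motif weights in adata.obs, they can be copied over manually. The sketch below assumes adata.uns['spatial_lda'] holds one row per cell in the same order as adata.obs (consistent with the keys shown above, but worth verifying):

    import pandas as pd

    # Hypothetical workaround: copy the per-cell LDA weights from .uns into .obs
    lda_weights = pd.DataFrame(adata.uns['spatial_lda'])
    lda_weights.index = adata.obs.index                  # assumes one row per cell, same order as adata.obs
    adata.obs = adata.obs.join(lda_weights.add_prefix('motif_'))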

    Scimap

    opened by marinabroz 12
  • strange spatial_LDA -> spatial_cluster results

    Hello again!

    When clustering (spatial_cluster) on spatial_LDA results, I am getting strange results, as below.

    I always get reasonable spatial_cluster results when training on a single ROI. But with as few as 2 ROIs, I start to get this artifactual-seeming result, visible as clusters forming vertical stripes in one or more of the ROIs.

    I have tried both 'knn' and 'radius' as spatial_LDA methods, with varying values of motifs, knn, and radius. The clustering method was always kmeans (leiden and phenograph always gave me 99 clusters even with the resolution set to 0.1, so I am actually not sure whether it is spatial_LDA or the clustering that is contributing to this).

    Conditions which promote the appearance of this "artifact":

    • more than one ROI trained together
    • radius larger than 30
    • smaller/more numerous cells

    Example of a "sensible" spatial clustering result:

    image

    When one additional ROI is trained together with it, using the same spatial_LDA and spatial_cluster parameters, that ROI becomes:

    image

    Some real structure is retained in the lower left corner, while the right side no longer makes sense...

    Any idea what could be causing this, or parameters to try which could mitigate?

    Thank you again!!

    opened by yerahko 5
  • Segmentation mask error in pl.image_viewer

    I am getting an error when I include a segmentation mask file:

    # Pre-subset the anndata to prevent an indexing error. Also, don't use `imageid` and `subset` args to image_viewer().
    #selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :]          # This doesn't work
    selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :].copy() # Making a copy makes it work
    
    sm.pl.image_viewer(image_path=img_path,
                       seg_mask=seg_mask,   # seg_mask=None,
                       #adata=adata, imageid='Unique_ID', subset = 'PT7.ROI_1',  # Don't do this, Use pre-subsetted data instead
                       adata=selected,
                       overlay='spatial_kmeans', 
                       channel_names=layers_order,
                       x_coordinate='X', y_coordinate='Y', flip_y=False, point_size=4)
    

    The segmentation mask file is readable, as in, I am able to display it in napari by running this chunk of code from sm.pl.image_viewer (minus the last two lines, which I commented out because they break):

    # Load the segmentation mask
    if seg_mask is not None:
        seg_m = tiff.imread(seg_mask)
        # if seg_m.shape[0] > 1:
        #     seg_m = seg_m[0]
    

    followed by:

    napari.view_image(seg_m)
    

    However, I get the error below when trying to open it with image_viewer().

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    Input In [198], in <cell line: 5>()
          1 # Pre- subset the anndata to prevent indexing error. Also, don't use imageid and subset args to image_viewer().
          2 #selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1'])]          # This doesn't work
          3 selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :].copy() # Making a copy makes it work
    ----> 5 viewer=sm.pl.image_viewer(image_path=img_path,
          6                    seg_mask=seg_path, 
          7                    #adata=adata, imageid='Unique_ID', subset = 'PT7.ROI_1',  # Don't do this, Use pre-subsetted data instead
          8                    adata=selected,
          9                    overlay='cellsimple2', # 'spatial_kmeans', # 
         10                    channel_names=layers_order,
         11                    x_coordinate='X', y_coordinate='Y', flip_y=False, point_size=4)
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/scimap/plotting/_image_viewer.py:184, in image_viewer(image_path, adata, overlay, flip_y, overlay_category, markers, channel_names, x_coordinate, y_coordinate, point_size, point_color, subset, imageid, seg_mask, **kwargs)
        182 # Add the seg mask
        183 if seg_mask is not None:
    --> 184     viewer.add_labels(seg_m, name='segmentation mask', visible=False)
        186 # Add phenotype layer function
        187 def add_phenotype_layer (adata, overlay, phenotype_layer,x,y,viewer,point_size,point_color):
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/components/viewer_model.py:4, in add_labels(self, data, num_colors, features, properties, color, seed, name, metadata, scale, translate, rotate, shear, affine, opacity, blending, rendering, depiction, visible, multiscale, cache, plane, experimental_clipping_planes)
          1 from __future__ import annotations
          3 import inspect
    ----> 4 import itertools
          5 import os
          6 import warnings
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/labels/labels.py:259, in Labels.__init__(self, data, num_colors, features, properties, color, seed, name, metadata, scale, translate, rotate, shear, affine, opacity, blending, rendering, depiction, visible, multiscale, cache, plane, experimental_clipping_planes)
        256 self._show_selected_label = False
        257 self._contour = 0
    --> 259 data = self._ensure_int_labels(data)
        260 self._color_lookup_func = None
        262 super().__init__(
        263     data,
        264     rgb=False,
       (...)
        284     experimental_clipping_planes=experimental_clipping_planes,
        285 )
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/labels/labels.py:554, in Labels._ensure_int_labels(self, data)
        552 def _ensure_int_labels(self, data):
        553     """Ensure data is integer by converting from bool if required, raising an error otherwise."""
    --> 554     looks_multiscale, data = guess_multiscale(data)
        555     if not looks_multiscale:
        556         data = [data]
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/image/_image_utils.py:76, in guess_multiscale(data)
         72 consistent = bool(np.all(sizes[:-1] > sizes[1:]))
         73 if np.all(sizes == sizes[0]):
         74     # note: the individual array case should be caught by the first
         75     # code line in this function, hasattr(ndim) and ndim > 1.
    ---> 76     raise ValueError(
         77         trans._(
         78             'Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays of single shape: {shape}',
         79             deferred=True,
         80             shape=shapes[0],
         81         )
         82     )
         83 if not consistent:
         84     raise ValueError(
         85         trans._(
         86             'Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays in incorrect order, shapes: {shapes}',
       (...)
         89         )
         90     )
    
    ValueError: Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays of single shape: ()
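
    One generic way to narrow this down (a hedged diagnostic sketch, independent of scimap): check what tifffile actually returns for this mask and make sure napari receives a single integer-typed label plane rather than a pyramid or sequence, since the error complains about "arrays of decreasing size". Variable names mirror the excerpt above; the mask path is a placeholder.

    import numpy as np
    import tifffile as tiff
    import napari

    seg_mask = 'segmentation.ome.tif'                   # placeholder for the path passed to image_viewer
    seg_m = tiff.imread(seg_mask)
    print(type(seg_m), getattr(seg_m, 'shape', None), getattr(seg_m, 'dtype', None))

    seg_m = np.squeeze(np.asarray(seg_m))               # drop singleton dimensions such as a leading (1, H, W)
    assert seg_m.ndim == 2, "expected a single 2-D label plane"
    napari.view_labels(seg_m.astype(np.int32))          # standalone check that napari accepts the mask as labels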
    
    

    Thank you!

    opened by yerahko 3
  • sm.pl.foldchange runs into KeyError

    Dear scimap developer,

    I am Jose, a biologist who now needs to perform multiplex image analysis. I have run MCMICRO and loaded the output CSV as an AnnData object. While trying to visualize the data, the foldchange function runs into a KeyError; I am probably doing something wrong. Attached: foldchange_KeyError.txt, conda list.txt

    opened by josenimo 3
  • sm.pl.image_viewer works but the Napari visualization won't pop up

    Hello everyone,

    I am new to bioinformatic analysis. Currently, I am working on spatial CODEX imaging analysis and I find this package very useful. However, I have encountered a problem using sm.pl.image_viewer: the code seems to run, but the Napari visualization won't pop up. This is the code I ran:

    import sys
    import os
    import anndata as ad
    import pandas as pd
    import scanpy as sc
    import seaborn as sns; sns.set(color_codes=True)

    # Import Scimap
    import scimap as sm

    # Set the working directory
    os.chdir("/Users/admin/Desktop/scimap package.CODEX.analysis/")

    # Load data
    adata = ad.read('tutorial_data_D20LN1_setting5.h5ad')
    adata
    AnnData object with n_obs × n_vars = 8957 × 24
        obs: 'X_centroid', 'Y_centroid', 'Area', 'MajorAxisLength', 'MinorAxisLength', 'Eccentricity', 'Solidity', 'Extent', 'Orientation', 'CellID', 'imageid'
        uns: 'all_markers', 'pca'
        obsm: 'X_pca'
        varm: 'PCs'

    image_path = '/Users/admin/Desktop/scimap package.CODEX.analysis/image.ome.tif'
    sm.pl.image_viewer(image_path, adata, overlay='leiden', overlay_category=None,
                       markers=['CD8'], imageid='imageid', seg_mask=None,
                       point_size=7, point_color='white')

    The data loaded into adata was an h5ad file.

    Any help I really appreciate.

    Best regards,

    Bugie
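
    One generic thing worth checking (a hedged suggestion, not specific to scimap): napari needs a running Qt event loop, and in some Jupyter or terminal setups the window only appears once that loop is started. A quick standalone test:

    import napari

    viewer = napari.Viewer()   # if no window appears here either, the problem is napari/Qt rather than scimap
    napari.run()               # starts the Qt event loop when running from a plain Python script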

    opened by bugie19 2
  • image_viewer and gate_finder do not work: type object 'SubControl' has no attribute 'SC_None'

    Hi,

    and thank you for developing scimap! Unfortunately, I have had problems with sm.pl.image_viewer and sm.pl.gate_finder, where I keep getting the same error: type object 'SubControl' has no attribute 'SC_None'. All the previous steps in the scimap tutorials have worked smoothly. Apparently the problem is in opening Napari. Two collaborators have kindly tried image_viewer and gate_finder on the same data and have not had any problems. However, I am the only one using a Mac (currently Monterey 12.4), so could this be a Mac-Napari issue?

    Here is a link to a jupyter notebook example, error message and to example image.

    Any help will be much appreciated.

    Best, Joona

    opened by SarkkinenJ 2
  • spatial_LDA with KNN neighborhoods

    Hi there,

    I am running into trouble when I try to run the function spatial_LDA using method='knn', i.e.: https://github.com/labsyspharm/scimap/blob/b957d8d771260947c4f783b62533c0d0c3c48e5a/scimap/tools/_spatial_lda.py#L96-L112

    I am getting the following error at line 112:

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    
          [1] for i in range(len(ind)):
    ----> [2]     ind[i] = [phenomap[letter] for letter in ind[i]]
    
    ValueError: invalid literal for int() with base 10: 'B'
    

    Using method='radius' works.
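
    For context, a minimal sketch of what the traceback suggests is happening (a hypothetical reconstruction, not the actual scimap code): with method='knn' the neighbour query returns an integer NumPy array, and writing phenotype strings back into that array triggers the int cast. Building a new list instead avoids the error:

    import numpy as np

    ind = np.array([[0, 1], [1, 0]])                    # integer neighbour indices, as returned by a kNN query
    phenomap = {0: 'B cell', 1: 'T cell'}               # hypothetical index -> phenotype mapping
    # ind[0] = [phenomap[k] for k in ind[0]]            # raises ValueError: invalid literal for int() with base 10
    neighbour_phenotypes = [[phenomap[k] for k in row] for row in ind]   # new list, no in-place cast into the int array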

    Thank you very much for your help!

    opened by yerahko 2
  • install errors

    Dear developer,

    I am just wondering whether there is any way to install scimap on an M1 MacBook? I got errors when running pip install scimap.

    Thanks in advance

    opened by sailseem 2
  • Bump nltk from 3.5 to 3.6.6

    Bumps nltk from 3.5 to 3.6.6.

    Changelog

    Sourced from nltk's changelog.

    Version 3.7 2022-02-09

    • Improve and update the NLTK team page on nltk.org (#2855, #2941)
    • Drop support for Python 3.6, support Python 3.10 (#2920)

    Version 3.6.7 2021-12-28

    • Resolve IndexError in sent_tokenize and word_tokenize (#2922)

    Version 3.6.6 2021-12-21

    • Refactor gensim.doctest to work for gensim 4.0.0 and up (#2914)
    • Add Precision, Recall, F-measure, Confusion Matrix to Taggers (#2862)
    • Added warnings if .zip files exist without any corresponding .csv files. (#2908)
    • Fix FileNotFoundError when the download_dir is a non-existing nested folder (#2910)
    • Rename omw to omw-1.4 (#2907)
    • Resolve ReDoS opportunity by fixing incorrectly specified regex (#2906)
    • Support OMW 1.4 (#2899)
    • Deprecate Tree get and set node methods (#2900)
    • Fix broken inaugural test case (#2903)
    • Use Multilingual Wordnet Data from OMW with newer Wordnet versions (#2889)
    • Keep NLTKs "tokenize" module working with pathlib (#2896)
    • Make prettyprinter to be more readable (#2893)
    • Update links to the nltk book (#2895)
    • Add CITATION.cff to nltk (#2880)
    • Resolve serious ReDoS in PunktSentenceTokenizer (#2869)
    • Delete old CI config files (#2881)
    • Improve Tokenize documentation + add TokenizerI as superclass for TweetTokenizer (#2878)
    • Fix expected value for BLEU score doctest after changes from #2572
    • Add multi Bleu functionality and tests (#2793)
    • Deprecate 'return_str' parameter in NLTKWordTokenizer and TreebankWordTokenizer (#2883)
    • Allow empty string in CFG's + more (#2888)
    • Partition tree.py module into tree package + pickle fix (#2863)
    • Fix several TreebankWordTokenizer and NLTKWordTokenizer bugs (#2877)
    • Rewind Wordnet data file after each lookup (#2868)
    • Correct init call for SyntaxCorpusReader subclasses (#2872)
    • Documentation fixes (#2873)
    • Fix levenstein distance for duplicated letters (#2849)
    • Support alternative Wordnet versions (#2860)
    • Remove hundreds of formatting warnings for nltk.org (#2859)
    • Modernize nltk.org/howto pages (#2856)
    • Fix Bleu Score smoothing function from taking log(0) (#2839)
    • Update third party tools to newer versions and removing MaltParser fixed version (#2832)
    • Fix TypeError: _pretty() takes 1 positional argument but 2 were given in sem/drt.py (#2854)
    • Replace http with https in most URLs (#2852)

    Thanks to the following contributors to 3.6.6 Adam Hawley, BatMrE, Danny Sepler, Eric Kafe, Gavish Poddar, Panagiotis Simakis, RnDevelover, Robby Horvath, Tom Aarsen, Yuta Nakamura, Mohaned Mashaly

    ... (truncated)

    Commits
    • 4862b09 updates for 3.6.6
    • 6b60213 Refactor gensim.doctest to work for gensim 4.0.0 and up (#2914)
    • 59aa3fb Fix decode error for bllip parser (#2897)
    • a28d256 Add Precision, Recall, F-measure, Confusion Matrix to Taggers (#2862)
    • 72d9885 Added warnings if .zip files exist without any corresponding .csv files. (#2908)
    • dea7b44 Fix FileNotFoundError when the download_dir is a non-existing nested fold...
    • abbe86b Undo #2909 due to unexpected test failure
    • c075dab Allow commits with /nocache to not use the cache (#2909)
    • d6d513d Renamed omw to omw-1.4 (#2907)
    • 2a50a3e Resolve ReDoS opportunity by fixing incorrectly specified regex (#2906)
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 1
  • Bump ipython from 7.19.0 to 7.31.1

    Bumps ipython from 7.19.0 to 7.31.1.

    Commits

    dependencies 
    opened by dependabot[bot] 1
  • Bump pillow from 8.0.1 to 9.0.1

    Bumps pillow from 8.0.1 to 9.0.1.

    Release notes

    Sourced from pillow's releases.

    9.0.1

    https://pillow.readthedocs.io/en/stable/releasenotes/9.0.1.html

    Changes

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [@​radarhere, @​hugovk]
    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]

    9.0.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.0.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.0.1 (2022-02-03)

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [radarhere, hugovk]

    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]

    9.0.0 (2022-01-02)

    • Restrict builtins for ImageMath.eval(). CVE-2022-22817 #5923 [radarhere]

    • Ensure JpegImagePlugin stops at the end of a truncated file #5921 [radarhere]

    • Fixed ImagePath.Path array handling. CVE-2022-22815, CVE-2022-22816 #5920 [radarhere]

    • Remove consecutive duplicate tiles that only differ by their offset #5919 [radarhere]

    • Improved I;16 operations on big endian #5901 [radarhere]

    • Limit quantized palette to number of colors #5879 [radarhere]

    • Fixed palette index for zeroed color in FASTOCTREE quantize #5869 [radarhere]

    • When saving RGBA to GIF, make use of first transparent palette entry #5859 [radarhere]

    • Pass SAMPLEFORMAT to libtiff #5848 [radarhere]

    • Added rounding when converting P and PA #5824 [radarhere]

    • Improved putdata() documentation and data handling #5910 [radarhere]

    • Exclude carriage return in PDF regex to help prevent ReDoS #5912 [hugovk]

    • Fixed freeing pointer in ImageDraw.Outline.transform #5909 [radarhere]

    ... (truncated)

    Commits
    • 6deac9e 9.0.1 version bump
    • c04d812 Update CHANGES.rst [ci skip]
    • 4fabec3 Added release notes for 9.0.1
    • 02affaa Added delay after opening image with xdg-open
    • ca0b585 Updated formatting
    • 427221e In show_file, use os.remove to remove temporary images
    • c930be0 Restrict builtins within lambdas for ImageMath.eval
    • 75b69dd Dont need to pin for GHA
    • cd938a7 Autolink CWE numbers with sphinx-issues
    • 2e9c461 Add CVE IDs
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 1
  • Bump certifi from 2021.10.8 to 2022.12.7

    Bumps certifi from 2021.10.8 to 2022.12.7.

    Commits

    dependencies 
    opened by dependabot[bot] 0
  • supplying additional groupings

    Just wondering if there is support for adding arguments to some functions to change the way the data is grouped, other than by 'imageid'. E.g. in tl.foldchange, 'from_group' could be another categorical variable in adata.obs, e.g. 'mutation1', and 'to_group' could be 'mutation2'. Similarly, tl.spatial_interaction is performed by 'imageid'; can these scores be further grouped by another adata.obs variable?
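
    In the meantime, one generic workaround (a hedged sketch; 'mutation1' and 'mutation2' are the hypothetical columns from the question above) is to build a combined categorical column in adata.obs and pass its name wherever a function expects a grouping column such as imageid:

    # Hypothetical composite grouping column built from two existing adata.obs categories
    adata.obs['mutation_group'] = (
        adata.obs['mutation1'].astype(str) + '_' + adata.obs['mutation2'].astype(str)
    ).astype('category')

    # e.g. sm.tl.spatial_interaction(adata, imageid='mutation_group', ...)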

    thanks

    opened by jamesMo84 1
  • Bump pillow from 9.1.1 to 9.3.0

    Bumps pillow from 9.1.1 to 9.3.0.

    Release notes

    Sourced from pillow's releases.

    9.3.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.3.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.3.0 (2022-10-29)

    • Limit SAMPLESPERPIXEL to avoid runtime DOS #6700 [wiredfool]

    • Initialize libtiff buffer when saving #6699 [radarhere]

    • Inline fname2char to fix memory leak #6329 [nulano]

    • Fix memory leaks related to text features #6330 [nulano]

    • Use double quotes for version check on old CPython on Windows #6695 [hugovk]

    • Remove backup implementation of Round for Windows platforms #6693 [cgohlke]

    • Fixed set_variation_by_name offset #6445 [radarhere]

    • Fix malloc in _imagingft.c:font_setvaraxes #6690 [cgohlke]

    • Release Python GIL when converting images using matrix operations #6418 [hmaarrfk]

    • Added ExifTags enums #6630 [radarhere]

    • Do not modify previous frame when calculating delta in PNG #6683 [radarhere]

    • Added support for reading BMP images with RLE4 compression #6674 [npjg, radarhere]

    • Decode JPEG compressed BLP1 data in original mode #6678 [radarhere]

    • Added GPS TIFF tag info #6661 [radarhere]

    • Added conversion between RGB/RGBA/RGBX and LAB #6647 [radarhere]

    • Do not attempt normalization if mode is already normal #6644 [radarhere]

    ... (truncated)

    Commits

    dependencies 
    opened by dependabot[bot] 0
  • Bump jupyter-core from 4.9.2 to 4.11.2

    Bumps jupyter-core from 4.9.2 to 4.11.2.

    Release notes

    Sourced from jupyter-core's releases.

    4.11.1

    What's Changed

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.11.0...4.11.1

    4.11.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.10.0...4.11.0

    4.10.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.9.2...4.10.0

    Changelog

    Sourced from jupyter-core's changelog.

    Changes in jupyter-core

    5.0.0

    (Full Changelog)

    Major Changes

    Prefer Environment Level Configuration

    We now make the assumption that if we are running in a virtual environment, we should prioritize the environment-level sys.prefix over the user-level paths. Users can opt out of this behavior by setting JUPYTER_PREFER_ENV_PATH, which takes precedence over our autodetection.

    Migrate to Standard Platform Directories

    In version 5, we introduce a JUPYTER_PLATFORM_DIRS environment variable to opt in to using more appropriate platform-specific directories. We raise a deprecation warning if the variable is not set. In version 6, JUPYTER_PLATFORM_DIRS will be opt-out. In version 7, we will remove the environment variable checks and old directory logic.

    Drop Support for Python 3.7

    We are dropping support for Python 3.7 ahead of its official end of life, to reduce maintenance burden as we add support for Python 3.11.

    Enhancements made

    Bugs fixed

    Maintenance and upkeep improvements

    Documentation

    Contributors to this release

    ... (truncated)

    Commits

    dependencies 
    opened by dependabot[bot] 0
  • Bump joblib from 1.1.0 to 1.2.0

    Bumps joblib from 1.1.0 to 1.2.0.

    Changelog

    Sourced from joblib's changelog.

    Release 1.2.0

    • Fix a security issue where eval(pre_dispatch) could potentially run arbitrary code. Now only basic numerics are supported. joblib/joblib#1327

    • Make sure that joblib works even when multiprocessing is not available, for instance with Pyodide joblib/joblib#1256

    • Avoid unnecessary warnings when workers and main process delete the temporary memmap folder contents concurrently. joblib/joblib#1263

    • Fix memory alignment bug for pickles containing numpy arrays. This is especially important when loading the pickle with mmap_mode != None as the resulting numpy.memmap object would not be able to correct the misalignment without performing a memory copy. This bug would cause invalid computation and segmentation faults with native code that would directly access the underlying data buffer of a numpy array, for instance C/C++/Cython code compiled with older GCC versions or some old OpenBLAS written in platform specific assembly. joblib/joblib#1254

    • Vendor cloudpickle 2.2.0 which adds support for PyPy 3.8+.

    • Vendor loky 3.3.0 which fixes several bugs including:

      • robustly forcibly terminating worker processes in case of a crash (joblib/joblib#1269);

      • avoiding leaking worker processes in case of nested loky parallel calls;

      • reliability spawn the correct number of reusable workers.

    Release 1.1.1

    • Fix a security issue where eval(pre_dispatch) could potentially run arbitrary code. Now only basic numerics are supported. joblib/joblib#1327
    Commits
    • 5991350 Release 1.2.0
    • 3fa2188 MAINT cleanup numpy warnings related to np.matrix in tests (#1340)
    • cea26ff CI test the future loky-3.3.0 branch (#1338)
    • 8aca6f4 MAINT: remove pytest.warns(None) warnings in pytest 7 (#1264)
    • 067ed4f XFAIL test_child_raises_parent_exits_cleanly with multiprocessing (#1339)
    • ac4ebd5 MAINT add back pytest warnings plugin (#1337)
    • a23427d Test child raises parent exits cleanly more reliable on macos (#1335)
    • ac09691 [MAINT] various test updates (#1334)
    • 4a314b1 Vendor loky 3.2.0 (#1333)
    • bdf47e9 Make test_parallel_with_interactively_defined_functions_default_backend timeo...
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 0
Releases (0.22.0)
  • 0.19.0 (Apr 3, 2022)

    • Included support for Apple M1 machines
    • Included support for native rendering of Zarr stored images using Napari: pl.image_viewer and pl.gate_finder

    Temporary workaround for installing on Apple M1 machines

    # create and load a new environment
    conda create --name scimap python=3.8 -y
    conda activate scimap
    
    # if you do not have xcode please install it
    xcode-select --install
    
    # if you do not have homebrew please install it
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    
    # if you do not have cmake install it
    brew install cmake
    
    # install h5py
    brew install hdf5@1.12.1
    export HDF5_DIR=/opt/homebrew/Cellar/hdf5/1.12.1_1/
    pip install --no-binary=h5py h5py
    
    # install llvmlite
    conda install llvmlite -y
    
    # install leidenalg
    pip install git+https://github.com/vtraag/leidenalg.git
    
    # install scimap
    pip install -U scimap
    
    # uninstall 
    conda remove llvmlite -y
    pip uninstall numba -y
    pip uninstall numpy -y
    
    # reinstall this specific version of llvmlite (ignore errors/warning)
    pip install -i https://pypi.anaconda.org/numba/label/wheels_experimental_m1/simple llvmlite
    
    # reinstall this specific version of numpy (ignore errors/warning)
    pip install numpy==1.22.3
    
    # reinstall this specific version of numba (ignore errors/warning)
    pip install -i https://pypi.anaconda.org/numba/label/wheels_experimental_m1/simple numba
    
  • 0.17.7 (Aug 5, 2021)

    Testing the new GitHub Action that automatically builds, tags, and pushes Docker container images. The action is triggered by new releases.

  • 0.17.2 (Jul 1, 2021)

  • 0.1.10 (Jun 29, 2020)

Owner
Laboratory of Systems Pharmacology @ Harvard
Reinventing the fundamental science underlying the development of new medicines and their use in individual patients.