Airbus Ship Detection Challenge

Overview


This is an open solution to the Airbus Ship Detection Challenge.

Our goals

We are building an entirely open solution to this competition. Specifically:

  1. Learning from the process - sharing updates about new ideas, code and experiments is the best way to learn data science. Our activity is especially useful for people who want to enter the competition but lack the appropriate experience.
  2. Encourage more Kagglers to start working on this competition.
  3. Deliver an open-source solution with no strings attached. The code is available in our GitHub repository 💻. This solution should establish a solid benchmark, as well as provide a good base for your custom ideas and experiments. We care about clean code 😃
  4. We are opening our experiments as well: everybody can have a live preview of our experiments, parameters, code, etc. Check: Airbus Ship Detection Challenge 📈 or the screenshot below.
Train and validation monitor 📊

Disclaimer

In this open-source solution you will find references to neptune.ml. It is a free platform for community users, which we use daily to keep track of our experiments. Please note that using neptune.ml is not necessary to proceed with this solution. You may run it as a plain Python script 🐍.

How to start?

Learn about our solutions

  1. Check Kaggle forum and participate in the discussions.
  2. See solutions below:
  link to code    CV      LB
  solution 1      0.541   0.573
  solution 2      0.661   0.679
  solution 3      0.694   0.696
  solution 4      0.722   0.703
  solution 5      0.719   0.725

Start experimenting with ready-to-use code

You can jump-start your participation in the competition by using our starter pack. The installation instructions below will guide you through the setup.

Installation (fast track)

  1. Clone the repository and install requirements (use Python 3.5): pip3 install -r requirements.txt
  2. Register at neptune.ml (if you wish to use it).
  3. Run an experiment based on U-Net:

Cloud

neptune account login

Create a project, say Ships (SHIP).

Go to neptune.yaml and change:

project: USERNAME/PROJECT_NAME

to your username and project name

Prepare metadata and overlayed target masks. This only needs to be done once:

neptune send --worker xs \
--environment base-cpu-py3 \
--config neptune.yaml \
prepare_metadata.py

They will be saved in the following locations:

  metadata_filepath: /output/metadata.csv
  masks_overlayed_dir: /output/masks_overlayed

From now on we will load the metadata by changing neptune.yaml to:

  metadata_filepath: /input/metadata.csv
  masks_overlayed_dir: /input/masks_overlayed

and adding the path to the experiment that generated the metadata (say SHIP-1) to every command, e.g. --input /SHIP-1/output/metadata.csv.
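
The overlayed target masks are derived from the competition's run-length-encoded ship annotations. If you want to inspect what this step produces, here is a minimal decoding sketch, assuming the standard Kaggle convention for this dataset: 768x768 images and 1-indexed (start, length) pairs running down the columns. The helper name rle_decode is ours, not necessarily what prepare_metadata.py uses internally.

  import numpy as np

  def rle_decode(rle_string, shape=(768, 768)):
      """Decode a Kaggle-style run-length encoding into a binary mask.

      rle_string holds space-separated (start, length) pairs, 1-indexed,
      counted down the columns (Fortran order).
      """
      mask = np.zeros(shape[0] * shape[1], dtype=np.uint8)
      if isinstance(rle_string, str) and rle_string.strip():
          numbers = np.array(rle_string.split(), dtype=int)
          starts, lengths = numbers[0::2] - 1, numbers[1::2]
          for start, length in zip(starts, lengths):
              mask[start:start + length] = 1
      return mask.reshape(shape, order='F')

Summing the decoded masks of all annotation rows that share an ImageId gives a per-image, overlayed target mask.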

Let's train the model by running main.py:

neptune send --worker m-2p100 \
--environment pytorch-0.3.1-gpu-py3 \
--config neptune.yaml \
--input /SHIP-1/output/metadata.csv \
--input /SHIP-1/output/masks_overlayed \
main.py 

The model will be saved in:

  experiment_dir: /output/experiment

and the submission.csv will be saved in /output/experiment/submission.csv
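
submission.csv uses the same run-length encoding, one row per detected ship. Below is a minimal encoder sketch for a single binary mask (again assuming the 768x768, column-major, 1-indexed convention); it illustrates the format only and is not the pipeline's actual post-processing code.

  import numpy as np

  def rle_encode(mask):
      """Encode a binary mask as space-separated (start, length) pairs,
      1-indexed and column-major, as expected in submission.csv."""
      pixels = mask.reshape(-1, order='F').astype(np.uint8)
      # pad with zeros so runs touching the image border are closed correctly
      padded = np.concatenate([[0], pixels, [0]])
      changes = np.where(padded[1:] != padded[:-1])[0] + 1
      starts, ends = changes[0::2], changes[1::2]
      return ' '.join('{} {}'.format(s, l) for s, l in zip(starts, ends - starts))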

You can easily use models trained during one experiment in other experiments. For example, when running evaluation we need to use the model folder from a previous experiment. We do that by:

changing main.py

  CLONE_EXPERIMENT_DIR_FROM = '/SHIP-2/output/experiment'

and running the following command:

neptune send --worker m-2p100 \
--environment pytorch-0.3.1-gpu-py3 \
--config neptune.yaml \
--input /SHIP-1/output/metadata.csv \
--input /SHIP-1/output/masks_overlayed \
--input /SHIP-2 \
main.py

Local

Log in to neptune if you want to use it:

neptune account login

Prepare metadata by running:

neptune run --config neptune.yaml prepare_metadata.py

Run training and inference by running main.py:

neptune run --config neptune.yaml main.py

You can always run it with pure Python 🐍:

python main.py 
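
main.py exposes a click-based command line interface (the invocations seen in the issues below use a train command with --pipeline_name and a --dev_mode flag). The snippet below is only a rough, hedged sketch of such an interface, not the repository's actual code; check main.py for the real commands and options.

  import click

  @click.group()
  def main():
      pass

  @main.command()
  @click.option('--pipeline_name', default='unet', help='which pipeline to run')
  @click.option('--dev_mode', is_flag=True,
                help='debug run on a small subset; being a flag, it takes no value')
  def train(pipeline_name, dev_mode):
      # in the repository this would delegate to something like
      # pipeline_manager.train(pipeline_name, dev_mode)
      click.echo('training {} (dev_mode={})'.format(pipeline_name, dev_mode))

  if __name__ == '__main__':
      main()

With an interface shaped like this, python main.py train --pipeline_name unet starts full training, and appending --dev_mode (with no value after it) switches to a quick debug run.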

Get involved

You are welcome to contribute your code and ideas to this open solution. To get started:

  1. Check the competition project on GitHub to see what we are working on right now.
  2. Express your interest in a particular task by writing a comment in that task, or by creating a new one with your fresh idea.
  3. We will get back to you quickly in order to start working together.
  4. Check CONTRIBUTING for some more information.

User support

There are several ways to seek help:

  1. Kaggle discussion is our primary way of communication.
  2. Submit an issue directly in this repo.
Comments
  • Executable 'prepare_metadata.py' does not exist.


    Hi, thank you for open-sourcing your solution.

    I am trying to use neptune but I am unable to generate metadata.csv as instructed. The file prepare_metadata.py seems to be missing?

    opened by tkuanlun350 2
  • enabling dev_mode


    Hi

    Using Python 3.5 (without neptune), the following works: python main.py -- train --pipeline_name unet

    However, I wish to debug using dev_mode and can't seem to pass the flag through: python main.py -- train --pipeline_name unet --dev_mode

    Error: No value provided for parameter 'dev_mode'

    I think the click option is_flag for dev_mode means we shouldn't need to pass any values, so I'm doing something wrong?

    Thanks

    opened by sovvo 2
  • Bump opencv-python from 3.4.0.12 to 3.4.7.28


    Bumps opencv-python from 3.4.0.12 to 3.4.7.28.

    Release notes

    Sourced from opencv-python's releases.

    3.4.7.28

    OpenCV version 3.4.7.

    3.4.6.27

    OpenCV version 3.4.6.

    3.4.5.20

    OpenCV version 3.4.5.

    Once some build issues are solved, next releases will be targeting OpenCV version 4.

    3.4.4.19

    OpenCV version 3.4.4.

    Thanks to Ivan Pozdeev for following fixes and enhancements: #135, #136, #141, #144, #145, #146, #147, #149, #150

    3.4.3.18

    OpenCV version 3.4.3.

    3.4.2.17

    Same as 3.4.2.16 but includes missing x86_64 Linux wheels. Thanks to Krassimir Valev for fixing the build matrix.

    3.4.2.16

    This release bumps OpenCV version to 3.4.2 and adds support for Python 3.7.

    ... (truncated)


    dependencies 
    opened by dependabot[bot] 1
  • Segmentation fault encountered in worker


    I've successfully prepared the masks and the metadata. However, when I run the training (as neptune run -x data --config configs/neptune.yaml main.py train --pipeline_name unet) I get the following error:

    2018-08-27 14-54-31 ships-detection >>> training
    /anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py:535: DtypeWarning: Columns (1) have mixed types. Specify dtype option on import or set low_memory=False.
      return callback(*args, **kwargs)
    2018-08-27 14:54:31 steppy >>> initializing Step xy_train...
    2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
    2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
    2018-08-27 14:54:31 steppy >>> Step xy_train initialized
    2018-08-27 14:54:31 steppy >>> initializing Step xy_inference...
    2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
    2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
    2018-08-27 14:54:31 steppy >>> Step xy_inference initialized
    2018-08-27 14:54:31 steppy >>> initializing Step loader...
    2018-08-27 14:54:31 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
    2018-08-27 14:54:31 steppy >>> done: initializing experiment directories
    2018-08-27 14:54:31 steppy >>> Step loader initialized
    /anaconda3/envs/neptune/lib/python3.5/site-packages/toolkit/pytorch_transformers/architectures/unet.py:22: UserWarning: Please make sure, that your input tensor's dimensions are divisible by (pool_stride ** repeat_blocks)
      warnings.warn("Please make sure, that your input tensor's dimensions are divisible by "
    2018-08-27 14:54:32 steppy >>> initializing Step unet...
    2018-08-27 14:54:32 steppy >>> initializing experiment directories under /Users/jonathan/devwork/open-solution-ship-detection/experiments
    2018-08-27 14:54:32 steppy >>> done: initializing experiment directories
    2018-08-27 14:54:32 steppy >>> Step unet initialized
    2018-08-27 14:54:32 steppy >>> cleaning cache...
    2018-08-27 14:54:32 steppy >>> cleaning cache done
    2018-08-27 14:54:32 steppy >>> Step xy_train, adapting inputs...
    2018-08-27 14:54:32 steppy >>> Step xy_train, transforming...
    2018-08-27 14:54:32 steppy >>> Step xy_inference, adapting inputs...
    2018-08-27 14:54:32 steppy >>> Step xy_inference, transforming...
    2018-08-27 14:54:32 steppy >>> Step loader, adapting inputs...
    2018-08-27 14:54:32 steppy >>> Step loader, transforming...
    2018-08-27 14:54:32 steppy >>> Step unet, adapting inputs...
    2018-08-27 14:54:32 steppy >>> Step unet, fitting and transforming...
    2018-08-27 14:54:32 steppy >>> initializing model weights...
    2018-08-27 14-54-32 ships-detection >>> starting training...
    2018-08-27 14-54-32 ships-detection >>> initial lr: 0.0001
    2018-08-27 14-54-32 ships-detection >>> epoch 0 ...
    ERROR: Unexpected segmentation fault encountered in worker.
    Traceback (most recent call last):
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/deepsense/neptune/job_wrapper.py", line 107, in <module>
        execute()
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/deepsense/neptune/job_wrapper.py", line 103, in execute
        execfile(job_filepath, job_globals)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/past/builtins/misc.py", line 82, in execfile
        exec_(code, myglobals, mylocals)
      File "main.py", line 89, in <module>
        main()
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 722, in __call__
        return self.main(*args, **kwargs)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 697, in main
        rv = self.invoke(ctx)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 1066, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 895, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/click/core.py", line 535, in invoke
        return callback(*args, **kwargs)
      File "main.py", line 27, in train
        pipeline_manager.train(pipeline_name, dev_mode)
      File "/Users/jonathan/devwork/open-solution-ship-detection/src/pipeline_manager.py", line 28, in train
        train(pipeline_name, dev_mode)
      File "/Users/jonathan/devwork/open-solution-ship-detection/src/pipeline_manager.py", line 77, in train
        pipeline.fit_transform(data)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 323, in fit_transform
        step_output_data = self._cached_fit_transform(step_inputs)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 443, in _cached_fit_transform
        step_output_data = self.transformer.fit_transform(**step_inputs)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/steppy/base.py", line 605, in fit_transform
        self.fit(*args, **kwargs)
      File "/Users/jonathan/devwork/open-solution-ship-detection/src/models.py", line 68, in fit
        for batch_id, data in enumerate(batch_gen):
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 275, in __next__
        idx, batch = self._get_batch()
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 254, in _get_batch
        return self.data_queue.get()
      File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/queues.py", line 335, in get
        res = self._reader.recv_bytes()
      File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 216, in recv_bytes
        buf = self._recv_bytes(maxlength)
      File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 407, in _recv_bytes
        buf = self._recv(4)
      File "/anaconda3/envs/neptune/lib/python3.5/multiprocessing/connection.py", line 379, in _recv
        chunk = read(handle, remaining)
      File "/anaconda3/envs/neptune/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 175, in handler
        _error_if_any_worker_fails()
    RuntimeError: DataLoader worker (pid 33362) is killed by signal: Unknown signal: 0.
    ERROR: Unexpected segmentation fault encountered in worker.
    ERROR: Unexpected segmentation fault encountered in worker.
    

    Do you have any ideas on this? Thank you very much.
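
    Editor's note: "DataLoader worker ... killed by signal" errors like the one above come from the PyTorch DataLoader's multiprocessing workers. A common workaround while debugging is to disable worker processes; the sketch below only illustrates the num_workers setting on a toy dataset and says nothing definitive about where this repository configures it (likely among the loader parameters in neptune.yaml).

      import torch
      from torch.utils.data import DataLoader, TensorDataset

      # toy stand-in dataset; in the real pipeline this is the image/mask loader
      dataset = TensorDataset(torch.zeros(8, 3, 256, 256), torch.zeros(8, 1, 256, 256))

      # num_workers=0 keeps data loading in the main process, which usually avoids
      # "segmentation fault encountered in worker" at the cost of slower batching
      loader = DataLoader(dataset, batch_size=4, shuffle=True, num_workers=0)
      for images, masks in loader:
          pass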

    opened by marketneutral 1
  • Local vs Cloud execution


    Hello, thank you for putting together this excellent baseline.

    I have two questions:

    1. What are these two directories (the others are self-explanatory):
      annotation_file:      /path/to/data
      masks_overlayed_dir:  /path/to/data
    
    2. When I execute neptune run --config configs/neptune.yaml main.py prepare_masks I get
    Calculated experiment snapshot size: 57.45 GB   
    Sending sources to server:   0%|                                   | 31.8M/57.4G [00:25<8:14:17, 1.94MB/s
    

    which means it is sending all the files to the cloud. Why is this happening? Isn't run intended to run everything locally?

    opened by marketneutral 1
  • Bump numpy from 1.21.0 to 1.22.0


    Bumps numpy from 1.21.0 to 1.22.0.

    Release notes

    Sourced from numpy's releases.

    v1.22.0

    NumPy 1.22.0 Release Notes

    NumPy 1.22.0 is a big release featuring the work of 153 contributors spread over 609 pull requests. There have been many improvements, highlights are:

    • Annotations of the main namespace are essentially complete. Upstream is a moving target, so there will likely be further improvements, but the major work is done. This is probably the most user visible enhancement in this release.
    • A preliminary version of the proposed Array-API is provided. This is a step in creating a standard collection of functions that can be used across applications such as CuPy and JAX.
    • NumPy now has a DLPack backend. DLPack provides a common interchange format for array (tensor) data.
    • New methods for quantile, percentile, and related functions. The new methods provide a complete set of the methods commonly found in the literature.
    • A new configurable allocator for use by downstream projects.

    These are in addition to the ongoing work to provide SIMD support for commonly used functions, improvements to F2PY, and better documentation.

    The Python versions supported in this release are 3.8-3.10, Python 3.7 has been dropped. Note that 32 bit wheels are only provided for Python 3.8 and 3.9 on Windows, all other wheels are 64 bits on account of Ubuntu, Fedora, and other Linux distributions dropping 32 bit support. All 64 bit wheels are also linked with 64 bit integer OpenBLAS, which should fix the occasional problems encountered by folks using truly huge arrays.

    Expired deprecations

    Deprecated numeric style dtype strings have been removed

    Using the strings "Bytes0", "Datetime64", "Str0", "Uint32", and "Uint64" as a dtype will now raise a TypeError.

    (gh-19539)

    Expired deprecations for loads, ndfromtxt, and mafromtxt in npyio

    numpy.loads was deprecated in v1.15, with the recommendation that users use pickle.loads instead. ndfromtxt and mafromtxt were both deprecated in v1.17 - users should use numpy.genfromtxt instead with the appropriate value for the usemask parameter.

    (gh-19615)

    ... (truncated)


    dependencies 
    opened by dependabot[bot] 0
  • Bump numpy from 1.14.0 to 1.21.0


    Bumps numpy from 1.14.0 to 1.21.0.

    Release notes

    Sourced from numpy's releases.

    v1.21.0

    NumPy 1.21.0 Release Notes

    The NumPy 1.21.0 release highlights are

    • continued SIMD work covering more functions and platforms,
    • initial work on the new dtype infrastructure and casting,
    • universal2 wheels for Python 3.8 and Python 3.9 on Mac,
    • improved documentation,
    • improved annotations,
    • new PCG64DXSM bitgenerator for random numbers.

    In addition there are the usual large number of bug fixes and other improvements.

    The Python versions supported for this release are 3.7-3.9. Official support for Python 3.10 will be added when it is released.

    Warning: there are unresolved problems compiling NumPy 1.21.0 with gcc-11.1.

    • Optimization level -O3 results in many wrong warnings when running the tests.
    • On some hardware NumPy will hang in an infinite loop.

    New functions

    Add PCG64DXSM BitGenerator

    Uses of the PCG64 BitGenerator in a massively-parallel context have been shown to have statistical weaknesses that were not apparent at the first release in numpy 1.17. Most users will never observe this weakness and are safe to continue to use PCG64. We have introduced a new PCG64DXSM BitGenerator that will eventually become the new default BitGenerator implementation used by default_rng in future releases. PCG64DXSM solves the statistical weakness while preserving the performance and the features of PCG64.

    See upgrading-pcg64 for more details.

    (gh-18906)

    Expired deprecations

    • The shape argument numpy.unravel_index cannot be passed as dims keyword argument anymore. (Was deprecated in NumPy 1.16.)

    ... (truncated)

    Commits
    • b235f9e Merge pull request #19283 from charris/prepare-1.21.0-release
    • 34aebc2 MAINT: Update 1.21.0-notes.rst
    • 493b64b MAINT: Update 1.21.0-changelog.rst
    • 07d7e72 MAINT: Remove accidentally created directory.
    • 032fca5 Merge pull request #19280 from charris/backport-19277
    • 7d25b81 BUG: Fix refcount leak in ResultType
    • fa5754e BUG: Add missing DECREF in new path
    • 61127bb Merge pull request #19268 from charris/backport-19264
    • 143d45f Merge pull request #19269 from charris/backport-19228
    • d80e473 BUG: Removed typing for == and != in dtypes
    • Additional commits viewable in compare view


    dependencies 
    opened by dependabot[bot] 0
  • Bump ipython from 6.3.1 to 7.16.3


    Bumps ipython from 6.3.1 to 7.16.3.


    dependencies 
    opened by dependabot[bot] 0
  • Bump opencv-python from 3.4.0.12 to 4.2.0.32


    Bumps opencv-python from 3.4.0.12 to 4.2.0.32.

    Release notes

    Sourced from opencv-python's releases.

    4.2.0.32

    OpenCV version 4.2.0.

    Changes:

    • macOS environment updated from xcode8.3 to xcode 9.4
    • macOS uses now Qt 5 instead of Qt 4
    • Nasm version updated to Docker containers
    • multibuild updated

    Fixes:

    • don't use deprecated brew tap-pin, instead refer to the full package name when installing #267
    • replace get_config_var() with get_config_vars() in setup.py #274
    • add workaround for DLL errors in Windows Server #264

    3.4.9.31

    OpenCV version 3.4.9.

    Changes:

    • macOS environment updated from xcode8.3 to xcode 9.4
    • macOS uses now Qt 5 instead of Qt 4
    • Nasm version updated to Docker containers
    • multibuild updated

    Fixes:

    • don't use deprecated brew tap-pin, instead refer to the full package name when installing #267
    • replace get_config_var() with get_config_vars() in setup.py #274
    • add workaround for DLL errors in Windows Server #264

    4.1.2.30

    OpenCV version 4.1.2.

    Changes:

    ... (truncated)


    dependencies 
    opened by dependabot[bot] 0
  • Bump pillow from 5.1.0 to 6.2.0


    Bumps pillow from 5.1.0 to 6.2.0.

    Release notes

    Sourced from pillow's releases.

    6.2.0

    https://pillow.readthedocs.io/en/stable/releasenotes/6.2.0.html

    6.1.0

    https://pillow.readthedocs.io/en/stable/releasenotes/6.1.0.html

    6.0.0

    No release notes provided.

    5.4.1

    No release notes provided.

    5.4.0

    No release notes provided.

    5.3.0

    No release notes provided.

    5.2.0

    No release notes provided.

    Changelog

    Sourced from pillow's changelog.

    6.2.0 (2019-10-01)

    • Catch buffer overruns #4104 [radarhere]

    • Initialize rows_per_strip when RowsPerStrip tag is missing #4034 [cgohlke, radarhere]

    • Raise error if TIFF dimension is a string #4103 [radarhere]

    • Added decompression bomb checks #4102 [radarhere]

    • Fix ImageGrab.grab DPI scaling on Windows 10 version 1607+ #4000 [nulano, radarhere]

    • Corrected negative seeks #4101 [radarhere]

    • Added argument to capture all screens on Windows #3950 [nulano, radarhere]

    • Updated warning to specify when Image.frombuffer defaults will change #4086 [radarhere]

    • Changed WindowsViewer format to PNG #4080 [radarhere]

    • Use TIFF orientation #4063 [radarhere]

    • Raise the same error if a truncated image is loaded a second time #3965 [radarhere]

    • Lazily use ImageFileDirectory_v1 values from Exif #4031 [radarhere]

    • Improved HSV conversion #4004 [radarhere]

    • Added text stroking #3978 [radarhere, hugovk]

    • No more deprecated bdist_wininst .exe installers #4029 [hugovk]

    • Do not allow floodfill to extend into negative coordinates #4017 [radarhere]

    ... (truncated)

    dependencies 
    opened by dependabot[bot] 0
  • Handle corrupted images


    Kaggle topic: https://www.kaggle.com/c/airbus-ship-detection/discussion/62921

    Image with examples: https://storage.googleapis.com/kaggle-forum-message-attachments/367968/10020/corrupted.png

    opened by kamil-kaczmarek 0
Releases: solution-2