Python implementation of "Elliptic Fourier Features of a Closed Contour"

Overview

PyEFD

A Python/NumPy implementation of a method for approximating a contour with a Fourier series, as described in [1].

Installation

pip install pyefd

Usage

Given a closed contour of a shape, generated by e.g. scikit-image or OpenCV, this package can fit a Fourier series approximating the shape of the contour.

General usage examples

This section describes the general usage patterns of pyefd.

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10)

The coefficients returned are the a_n, b_n, c_n and d_n of the following Fourier series representation of the shape.
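In the notation of [1], the truncated series for a contour of total arc length T is

x_N(t) = A_0 + \sum_{n=1}^{N} \left( a_n \cos\frac{2\pi n t}{T} + b_n \sin\frac{2\pi n t}{T} \right)
y_N(t) = C_0 + \sum_{n=1}^{N} \left( c_n \cos\frac{2\pi n t}{T} + d_n \sin\frac{2\pi n t}{T} \right)

where t is the accumulated arc length along the contour and A_0, C_0 are the locus (DC) components.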

The coefficients returned are by default normalized so that they are rotation and size-invariant. This can be overridden by calling:

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

Normalization can also be done afterwards:

from pyefd import normalize_efd
coeffs = normalize_efd(coeffs)
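A contour can also be rebuilt from the coefficients. The following is a minimal sketch, assuming the calculate_dc_coefficients and reconstruct_contour helpers are available (both appear in the issue discussions further down):

import numpy
from pyefd import (
    elliptic_fourier_descriptors,
    calculate_dc_coefficients,
    reconstruct_contour,
)

# A simple closed contour: a unit square, as an (N, 2) array of points
# whose last point repeats the first to close the polygon.
contour = numpy.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])

coeffs = elliptic_fourier_descriptors(contour, order=10)
a0, c0 = calculate_dc_coefficients(contour)  # the A_0 and C_0 locus components
reconstruction = reconstruct_contour(coeffs, locus=(a0, c0), num_points=300)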

OpenCV example

If you are using OpenCV to generate contours, this example shows how to connect it to pyefd.

import cv2 
import numpy
from pyefd import elliptic_fourier_descriptors

# Find the contours of a binary image using OpenCV.
contours, hierarchy = cv2.findContours(
    im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

# Iterate through all contours found and store each contour's
# elliptic Fourier descriptor coefficients.
coeffs = []
for cnt in contours:
    # Find the coefficients of all contours
    coeffs.append(elliptic_fourier_descriptors(
        numpy.squeeze(cnt), order=10))
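Note that the number of values returned by cv2.findContours differs between OpenCV releases (three in 3.x, two in 4.x). A version-agnostic variant of the call above, as a small sketch:

# cv2.findContours returns (image, contours, hierarchy) in OpenCV 3.x and
# (contours, hierarchy) in OpenCV 4.x; the last two values are what we need.
result = cv2.findContours(im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
contours, hierarchy = result[-2], result[-1]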

Using EFD as features

To use these as features, one can write a small wrapper function:

from pyefd import elliptic_fourier_descriptors

def efd_feature(contour):
    coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
    return coeffs.flatten()[3:]

If the coefficients are normalized, then coeffs[0, 0] = 1.0, coeffs[0, 1] = 0.0 and coeffs[0, 2] = 0.0, so they can be disregarded when using the elliptic Fourier descriptors as features.
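As a rough sanity check of this invariance, one can compare the features of a contour with a rotated and scaled copy of it. The snippet below reuses the efd_feature wrapper above; the test contour and the tolerance are arbitrary choices for illustration:

import numpy

# An arbitrary closed test contour, sampled so that no two consecutive
# points coincide (endpoint=False avoids duplicating the first point).
t = numpy.linspace(0, 2 * numpy.pi, 200, endpoint=False)
contour = numpy.stack(
    [numpy.cos(t) + 0.3 * numpy.cos(3 * t), numpy.sin(t) + 0.2 * numpy.sin(5 * t)],
    axis=1,
)

# A rotated (30 degrees) and scaled (factor 2.5) copy of the same contour.
angle = numpy.deg2rad(30.0)
rotation = numpy.array(
    [[numpy.cos(angle), -numpy.sin(angle)], [numpy.sin(angle), numpy.cos(angle)]]
)
transformed = 2.5 * contour @ rotation.T

# With normalize=True the two feature vectors should agree up to numerical error.
print(numpy.allclose(efd_feature(contour), efd_feature(transformed), atol=1e-6))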

See [1] for more technical details.

Testing

Run tests with pytest:

py.test tests.py

The tests include a single image from the MNIST dataset of handwritten digits ([2]) as a contour to use for testing.

Documentation

See ReadTheDocs.

References

[1]: Frank P Kuhl, Charles R Giardina, Elliptic Fourier features of a closed contour, Computer Graphics and Image Processing, Volume 18, Issue 3, 1982, Pages 236-258, ISSN 0146-664X, http://dx.doi.org/10.1016/0146-664X(82)90034-X.

[2]: LeCun et al. (1999): The MNIST Dataset Of Handwritten Digits

Comments
  • Vectorized contour reconstruction function

    Vectorized contour reconstruction function

    Hope to contribute some more to this project with an extracted contour reconstruction function. Refactored the tests accordingly. To compare reconstructed shapes I had to import a reliable Hausdorff distance function, for which the scipy package was added to the test requirements.

    opened by reinvantveer 4
  • fix x/y swapping and add demo

    fix x/y swapping and add demo

    Hi,

    I noticed that in some places apparently the x/y dimension was mixed up and I attempted to fix this. As a test and demo, I added a few geometric figures to showcase this method.

    Best regards, Jonathan

    enhancement 
    opened by jonathanschilling 3
  • Method not robust to random index ?

    Method not robust to random index ?

    Hello,

    I wanted to test your method. I do not really know how it works, but it seems that the order in which the points are indexed matters, as I get strange results when the array is indexed differently. Is there a way to resolve this?

    Below is an illustration of what I mean:

    [image: normal result when the points are correctly ordered]

    [image: abnormal result when the points are randomly ordered]

    opened by julienguegan 3
  • Bad reconstruction results

    Bad reconstruction results

    Hi, now I'm writing code that reconstructs the image from the EFD coefficients @hbldh

    img_1 = np.array(
        [
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 64, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 0, 0, 0, 0, 64, 127, 64, 64, 0, 0, 64, 191, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 127, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 127, 255, 255, 191, 64, 0, 0, 0, 0, 0, 64, 127, 127, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 64, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 64, 0, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 127, 0, 0, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 127, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 191, 255, 255, 255, 255, 127, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 127, 255, 255, 191, 64, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 0, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
        ]
    )
    
    img_1 = np.uint8(img_1)
    edges = cv2.Canny(img_1,100,200)
    contour_2 = []
    
    for i in range(edges.shape[0]):
        for j in range(edges.shape[1]):
            if edges[i,j] == 255:
              contour_2.append([i,j])
    contour_2 = np.array(contour_2)
    
    cv2.imwrite('test1.png',img_1)
    
    coeffs = pyefd.elliptic_fourier_descriptors(contour_2, order=10, normalize=False)
    
    contour_2 = pyefd.reconstruct_contour(coeffs, locus=(0, 0), num_points=300)
    
    for i in range(contour_1.shape[0]):
        tmp[int(round(contour_1[i][0]))][int(round(contour_1[i][1]))] = 255
    print(tmp.shape)
    cv2.imwrite('test2.png',tmp)
    

    However, the result is not the expected one. How can I fix my code to reconstruct the correct image?

    [images: test1, reconstruction of img_1 (test1.png); test2, reconstruction of edge; test3, reconstruction from coeffs (test2.png)]

    opened by MADONOKOUKI 2
  • Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Hi, I am sending my contour sequence to your function to compute its properties, using the OpenCV example in your readme file, but I get the following error. What is the reason?

    My code:

    import cv2 
    import numpy as np
    from pyefd import elliptic_fourier_descriptors
    
    def auto_canny(image, sigma=0.33):
    	# compute the median of the single channel pixel intensities
    	v = np.median(image)
    	# apply automatic Canny edge detection using the computed median
    	lower = int(max(0, (1.0 - sigma) * v))
    	upper = int(min(255, (1.0 + sigma) * v))
    	edged = cv2.Canny(image, lower, upper)
    	# return the edged image
    	return edged
    def efd_feature(contour):
        coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
        return coeffs.flatten()[3:]
    img = cv2.imread('C:/Users/Ogeday/image.jpg')
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    retval,th = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV +cv2.THRESH_OTSU)
    cv2.imshow("thresolded",th);
    
    canny=auto_canny(th);
    
    cv2.imshow("cannied",canny);
    # Find the contours of a binary image using OpenCV.
    contours, hierarchy = cv2.findContours(canny, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    
    # Iterate through all contours found and store each contour's 
    # elliptical Fourier descriptor's coefficients.
    coeffs = []
    for cnt in contours:
        # Find the coefficients of all contours
        coeffs.append(elliptic_fourier_descriptors(np.squeeze(cnt), order=10))
    
    efd=efd_feature(contours);
    print(efd);
    
    opened by OgedayOztekin 2
  • pyefd for 3D points

    pyefd for 3D points

    Hi!

    I wondered if I could use pyefd for generating the contour from 3D data points, where x, y, and z are the coordinates of a generic point. Do you have any suggestions?

    I really appreciate any help you can provide!

    opened by dalbenzioG 1
  • Feature request: normalize_efd function that also outputs angle and scale

    Feature request: normalize_efd function that also outputs angle and scale

    Thank you very much for this beautiful piece of software. For my purposes it would be great to also get the normalization angle and scale, in order to store them alongside the descriptor for future lookups. Would it be possible to have an analogous function to normalize_efd that outputs those values and the normalized descriptor as a tuple?

    enhancement 
    opened by geloescht 1
  • Release/v1.5.0

    Release/v1.5.0

    Version 1.5.0

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.
    opened by hbldh 0
  • Create Dependabot config file

    Create Dependabot config file

    :wave: Dependabot is moving natively into GitHub! This pull request migrates your configuration from Dependabot.com to a config file, using the new syntax. When you merge this pull request, we'll swap out dependabot-preview (me) for a new dependabot app, and you'll be all set!

    With this change, you'll now use the Dependabot page in GitHub, rather than the Dependabot dashboard, to monitor your version updates. Dependabot is now configured exclusively using config files.

    If you've got any questions or feedback for us, please let us know by creating an issue in the dependabot/dependabot-core repository.

    Learn more about the relaunch of Dependabot

    Please note that regular @dependabot commands do not work on this pull request.

    :robot::yellow_heart:

    dependencies 
    opened by dependabot-preview[bot] 0
  • Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/.

    You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

    View the update logs.

    opened by dependabot-preview[bot] 0
  • Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files.

    As a result, Dependabot couldn't update your dependencies.

    The error Dependabot encountered was:

    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    [pipenv.exceptions.ResolutionFailure]:       req_dir=requirements_dir
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 726, in resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       req_dir=req_dir,
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 480, in actually_resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       resolved_tree = resolver.resolve()
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 395, in resolve
    [pipenv.exceptions.ResolutionFailure]:       raise ResolutionFailure(message=str(e))
    [pipenv.exceptions.ResolutionFailure]:       pipenv.exceptions.ResolutionFailure: ERROR: ERROR: Could not find a version that matches black
    [pipenv.exceptions.ResolutionFailure]:       Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    [pipenv.exceptions.ResolutionFailure]: Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
      First try clearing your dependency cache with $ pipenv lock --clear, then try the original command again.
     Alternatively, you can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
      Hint: try $ pipenv lock --pre if it is a pre-release dependency.
    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    
    ['Traceback (most recent call last):\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 501, in create_spinner\n    yield sp\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 649, in venv_resolve_deps\n    c = resolve(cmd, sp)\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 539, in resolve\n    sys.exit(c.return_code)\n', 'SystemExit: 1\n']
    

    If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

    You can mention @dependabot in the comments below to contact the Dependabot team.

    opened by dependabot-preview[bot] 0
  • Contour chain approximation

    Contour chain approximation "simple" is buggy or numerically unstable

    Description

    I was running Fourier descriptor extraction on contours that naturally contain long straight lines. I used cv.CHAIN_APPROX_SIMPLE as usual but was getting weird results, as if the method does not converge:

    [image]

    I tried storing the contour with cv.CHAIN_APPROX_NONE instead and it fixed the problem for all of my cases:

    [image]

    Minimal setup to reproduce:

    import numpy as np
    import cv2 as cv
    import matplotlib.pyplot as plt
    import pyefd

    img = np.zeros((100,100), dtype=np.uint8)
    img = cv.rectangle(img, (25,25), (75,75), (255,255,255), -1)
    cnt, h = cv.findContours(img,cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1,2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    
    img = np.zeros((100,100), dtype=np.uint8)
    img = cv.rectangle(img, (25,25), (75,75), (255,0,0), -1)
    cnt, h = cv.findContours(img,cv.RETR_EXTERNAL, cv.CHAIN_APPROX_NONE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1,2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    

    I get: [two result images]

    opened by MikeTkachuk 0
  • RuntimeWarning: invalid value encountered in true_divide

    RuntimeWarning: invalid value encountered in true_divide

    A specific contour leads to a warning and to NaN values due to division by zero.

    from pyefd import elliptic_fourier_descriptors
    import numpy as np
    
    contour = np.array([(0.0007365261134166801, 0.0008592751780890362), (0.0011385481809349507, 0.0005073326831297464), (0.0016015060818268534, 0.00024058327913523136), (0.002107608603590938, 6.927799610623175e-05), (0.002637406510141327, 0.0), (0.003170539965043462, 3.5411605355473164e-05), (0.0036865209486098838, 0.00017415196403836042), (0.0036865209486098838, 0.00017415196403836042), (0.003301593851628093, 0.0011941724608851567), (0.003301593851628093, 0.0011941724608851567), (0.0029920052614881287, 0.001110928245675824), (0.002672125188546981, 0.0010896812824625624), (0.002354246444616681, 0.0011312480801257685), (0.002050584931558297, 0.0012340312499438122), (0.0017728101910231553, 0.001394080892339833), (0.001531596950512193, 0.0016052463893156954), (0.0013362148995842427, 0.001859412769243729), (0.0011941724608850457, 0.0021468125606828314), (0.001110928245675491, 0.0024564011508226846), (0.0010896812824621183, 0.0027762812237640544), (0.0011312480801258795, 0.003094159967693799), (0.001234031249943368, 0.0033978214807524054), (0.001394080892340055, 0.003675596221287547), (0.0016052463893154734, 0.003916809461798509), (0.00185941276924384, 0.004112191512726571), (0.0021468125606826094, 0.004254233951425768), (0.0017618854637007075, 0.005274254448272675), (0.0012828858113027586, 0.005037517050440643), (0.0008592751780888142, 0.0047118802988938), (0.0005073326831298575, 0.004309858231375752), (0.0002405832791353424, 0.003846900330483627), (6.927799610623175e-05, 0.0033407978087195422), (0.0, 0.0028109999021695975), (3.5411605355584186e-05, 0.0022778664472672405), (0.0001741519640382494, 0.0017618854637008186), (0.00041088936187017033, 0.0012828858113032027), (0.0007365261134166801, 0.0008592751780890362)])
    y = elliptic_fourier_descriptors(contour, order=3, normalize=False)
    print(y)
    

    will give the following output :

    [[nan nan nan nan]
     [nan nan nan nan]
     [nan nan nan nan]]
    /usr/local/lib/python3.7/dist-packages/pyefd.py:67: RuntimeWarning: invalid value encountered in true_divide
      a = consts * np.sum((dxy[:, 0] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:68: RuntimeWarning: invalid value encountered in true_divide
      b = consts * np.sum((dxy[:, 0] / dt) * d_sin_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:69: RuntimeWarning: invalid value encountered in true_divide
      c = consts * np.sum((dxy[:, 1] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:70: RuntimeWarning: invalid value encountered in true_divide
      d = consts * np.sum((dxy[:, 1] / dt) * d_sin_phi, axis=1)


    Any idea how to fix this, or how to work around it?
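    One observation: the contour above contains consecutive duplicate points (for example (0.0036865209486098838, 0.00017415196403836042) appears twice in a row), which produces zero-length segments, dt = 0, and hence the division-by-zero NaNs. A possible workaround, sketched here rather than an official fix, is to drop points that coincide with their predecessor before computing the descriptors:

    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    def drop_duplicate_points(contour):
        # Remove points that are identical to their predecessor (zero-length segments).
        contour = np.asarray(contour)
        keep = np.ones(len(contour), dtype=bool)
        keep[1:] = np.any(np.diff(contour, axis=0) != 0, axis=1)
        return contour[keep]

    # Reusing the contour array defined above.
    y = elliptic_fourier_descriptors(drop_duplicate_points(contour), order=3, normalize=False)
    print(y)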

    opened by ghost 3
  • Descriptors not consistent across cycled contour indices

    Descriptors not consistent across cycled contour indices

    Description

    I am trying to create invariant descriptors for the same silhouettes at different rotation angles.

    What I Did

    Created rotated copies of the same picture, ran skimage.measure.find_contours() on them to extract contours, and ran pyefd.elliptic_fourier_descriptors(normalize=True) on the results. Expected output: equal descriptors (within some margin of error) for the differently rotated copies. Actual output: the results are only sometimes equal.

    Unfortunately my code is spread over several source files and depends on data, so I cannot easily share an example of what I am actually doing. But here is a function that, when inserted into tests.py, will result in a failed test:

    def test_normalizing_4():
        contour_2 = np.roll(contour_1[:-1,:], 40, axis=0)
        contour_2 = np.append(contour_2, [contour_2[0]], axis=0)
        c1 = pyefd.elliptic_fourier_descriptors(contour_1, normalize=True)
        c2 = pyefd.elliptic_fourier_descriptors(contour_2, normalize=True)
        np.testing.assert_almost_equal(c1, c2, decimal=12)
    

    The reason for this behaviour is actually mentioned in the original paper in chapter 5.1 and figure 8: For every shape there are two possible classifications, each rotated along one of the two semi-major axes (rotated 180 degrees from each other). It seems like pyefd chooses one of them based on the location of the first point in the contour.

    There might be two solutions to this: either return both classifications, or choose one of them (more) consistently by examining the higher harmonic content of the descriptor. Note that the (near-)circular case outlined in chapter 5.2 of the paper also exists, so returning multiple descriptors and normalisation parameters might be required anyway for contours with rotational symmetry.

    bug enhancement help wanted 
    opened by geloescht 2
Releases(v1.6.0)
  • v1.6.0(Dec 9, 2021)

    Version 1.6.0 (2021-12-09)

    Added

    • Added a demo for 3D surfaces with cylindrical symmetries. (examples/example1.py)

    Fixes

    • Fixes incorrectly plotted curves when no imshow has been called.
    • Fixes ugly coefficient calculation code.
  • v1.5.1(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v.1.5.1-2(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v1.4.1(Sep 28, 2020)

  • v0.1.0(Feb 9, 2016)

Owner
Henrik Blidh
Mathematician, Python programmer and Pointless Projecteer.