Typical: Fast, simple, & correct data-validation using Python 3 typing.

Overview

typical: Python's Typing Toolkit


Introduction

Typical is a library devoted to runtime analysis, inference, validation, and enforcement of Python types, PEP 484 Type Hints, and custom user-defined data-types.

It provides a high-level Functional API and Object API to suit nearly any occasion.

Getting Started

Installation is as simple as pip install -U typical. For more installation options to make typical even faster, see the Install section in the documentation.

Help

The latest documentation is hosted at python-typical.org.

Starting with version 2.0, all documentation is hand-crafted markdown, and versioned documentation can be found in typical's Git repo. (Versioned documentation directly on our domain is still in the works.)

A Typical Use-Case

The decorator that started it all:

typic.al(...)

import typic


@typic.al
def hard_math(a: int, b: int, *c: int) -> int:
    return a + b + sum(c)

hard_math(1, "3")
#> 4


@typic.al(strict=True)
def strict_math(a: int, b: int, *c: int) -> int:
    return a + b + sum(c)

strict_math(1, 2, 3, "4")
#> Traceback (most recent call last):
#>  ...
#> typic.constraints.error.ConstraintValueError: Given value <'4'> fails constraints: (type=int, nullable=False, coerce=False)
  

Typical has both a high-level Object API and high-level Functional API. In general, any method registered to one API is also available to the other.

The Object API

from typing import Iterable

import typic


@typic.constrained(ge=1)
class ID(int):
    ...


@typic.constrained(max_length=280)
class Tweet(str):
    ...


@typic.klass
class Tweeter:
    id: ID
    tweets: Iterable[Tweet]
    

json = '{"id":1,"tweets":["I don\'t understand Twitter"]}'
t = Tweeter.transmute(json)

print(t)
#> Tweeter(id=1, tweets=["I don't understand Twitter"])

print(t.tojson())
#> '{"id":1,"tweets":["I don\'t understand Twitter"]}'

Tweeter.validate({"id": 0, "tweets": []})
#> Traceback (most recent call last):
#>  ...
#> typic.constraints.error.ConstraintValueError: Given value <0> fails constraints: (type=int, nullable=False, coerce=False, ge=1)

The Functional API

import dataclasses
from typing import Iterable

import typic


@typic.constrained(ge=1)
class ID(int):
    ...


@typic.constrained(max_length=280)
class Tweet(str):
    ...


@dataclasses.dataclass # or typing.TypedDict or typing.NamedTuple or annotated class...
class Tweeter:
    id: ID
    tweets: Iterable[Tweet]


json = '{"id":1,"tweets":["I don\'t understand Twitter"]}'
protocol = typic.protocol(Tweeter)

t = protocol.transmute(json)  # or typic.transmute()
print(t)
#> Tweeter(id=1, tweets=["I don't understand Twitter"])

print(protocol.tojson(t))
#> '{"id":1,"tweets":["I don\'t understand Twitter"]}'

protocol.validate({"id": 0, "tweets": []})  # or typic.validate()
#> Traceback (most recent call last):
#>  ...
#> typic.constraints.error.ConstraintValueError: Tweeter.id: value <0> fails constraints: (type=int, nullable=False, coerce=False, ge=1)

Changelog

See our Releases.

Comments
  • Wrong behavior when Union with constrained types

    • typical version: 2.6.3
    • Python version: 3.8.5
    • Operating System: CentOS 8

    Description

    In this version (2.6.3), typical can handle a Union if the given argument is an instance of one of the types in the Union, but it does not support subtypes; checking types with issubclass or isinstance could solve this. When the Union contains constrained types, however, the transmute function does not behave the way we would hope.

    import typic, typing
    
    PositiveFloat = typic.constrained(float, gt=0)
    PositiveInt = typic.constrained(int, gt=0)
    
    
    proto = typic.protocol(typing.Union[PositiveInt, PositiveFloat])
    
    proto.transmute(PositiveFloat(2.1))  # 2.1
    proto.transmute(2.1)  # 2
    isinstance(2.1, PositiveFloat)  # False
    issubclass(PositiveFloat, float)  # True
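
    For illustration, here is a minimal sketch of the kind of membership check being suggested (union_member_for is a hypothetical helper, not typical's actual resolution logic, and it assumes constrained types subclass their builtin base, as the issubclass check above shows):

    import typic

    PositiveFloat = typic.constrained(float, gt=0)
    PositiveInt = typic.constrained(int, gt=0)


    def union_member_for(value, members):
        """Pick the union member the value is, or directly maps to, an instance of."""
        for member in members:
            # The value is already an instance of the constrained type itself.
            if isinstance(value, member):
                return member
        for member in members:
            # Fall back to the builtin the constrained type derives from
            # (issubclass(PositiveFloat, float) is True, as shown above).
            for builtin in (bool, int, float, str, bytes):
                if issubclass(member, builtin) and type(value) is builtin:
                    return member
        return None


    member = union_member_for(2.1, (PositiveInt, PositiveFloat))
    print(member is PositiveFloat)  # True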
    
    opened by xbanke 8
  • Coercion to Union does not result in error but returns passed in value -- Union could be expanded

    It's stated in the README:

    The following annotations are special forms which cannot be resolved: Union Any Because these signal an unclear resolution, Typical will ignore this flavor of annotation ...

    However, when I tried to coerce some values to Unions, I didn't expect Typical to silently ignore the Union annotation (e.g. it returns a str when a str is passed in, even though none of the Union members are str). Since the annotation is ignored, I can't rely on passing my value through coercion to actually get the correct type back. I think this is a major limitation and a source of errors.

    from typing import Union
    import typic
    
    
    # Case A. Incorrect. Expected: ValueError
    print(typic.coerce("foo", Union[float, int]))  # foo
    
    from enum import Enum
    
    class MyEnum(Enum):
      a = "foo"
      b = "bar"
    
    # Case C1. Correct: Union with None works.
    print(typic.coerce("foo", Union[MyEnum, None]))  # MyEnum.a
    print(typic.coerce(None, Union[MyEnum, None]))  # None
    
    # Case C1. Incorrect. Expected: MyEnum.a
    print(typic.coerce("foo", Union[MyEnum, int]))  # foo
    print(typic.coerce("foo", Union[MyEnum, bool]))  # foo
    
    class Sentinel:
      pass
    
    # Case B. Incorrect. Expected: MyEnum.a
    print(typic.coerce("foo", Union[MyEnum, Sentinel]))  # foo
    
    # Case C1: Correct: The value is an instance of one of the types
    print(typic.coerce(1, Union[bool, int]))  # 1
    print(typic.coerce(False, Union[bool, int]))  # False
    
    # Case B. Incorrect. Expected: True (since it can be coerced to bool, but not to int)
    print(typic.coerce("asdf", Union[bool, int]))  # asdf
    
    # Case C2. Expected: TypeError, since it could be coerced to both
    print(typic.coerce("1", Union[bool, int]))  # 1
    
    
    

    Would it really be problematic to handle coercing to Union in some cases?

    I can imagine scenarios where coercion is unambiguous, and one scenario where it is ambiguous and an error could be raised:

    • A. If the value cannot be coerced to any of the types: raise one of the coercion errors (e.g. the last)
    • B. If the value can be coerced to only one of the types (the others may raise ValueError/TypeError): return that coerced value
    • C. If the value can be coerced to multiple of the types:
      • C1: If the value is an instance of one of the union's types, then "coerce" it to that type -- coerce in this context means just perform any constraint checks that there might be (This coercion is already handled for Union[T, None])
      • C2: Otherwise: Raise a TypeError (this is the ambiguous scenario)

    What do you think?

    In my mind, it might look something like this:

    from typing import Any, Iterable, Type
    from typic import coerce
    
    # Sentinel value
    class Empty:
        pass
    
    
    _empty = Empty()
    
    
    def coerce_union(value: Any, union: Iterable[Type]) -> Any:
        coerced_value = _empty
    
        for typ in union:
            # If the value is already an instance of one of the types in the Union, coerce using that type
            # Presumably, if the type was constrained, coercing could raise errors here, but that would be expected.
            if isinstance(value, typ):
                return coerce(value, typ)
        error = None
    
        for typ in union:
            try:
                new_coerced_value = coerce(value, typ)
            except (ValueError, TypeError) as e:
                # Store the error and continue
                error = e
                continue
            if coerced_value is not _empty:
                raise TypeError("Ambiguous coercion: the value could be coerced to multiple types in the Union")
            coerced_value = new_coerced_value
        if coerced_value is not _empty:
            return coerced_value
        # The value couldn't be coerced to any of the types in the Union
        raise error
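
    Under those rules, usage of the sketch above would be expected to play out like this (illustrative only, reusing MyEnum, Sentinel, and coerce_union from the snippets in this comment; this is not typical's current behavior):

    coerce_union("foo", (MyEnum, Sentinel))  # Case B: only MyEnum succeeds -> MyEnum.a
    coerce_union(False, (bool, int))         # Case C1: already a bool -> False
    coerce_union("1", (bool, int))           # Case C2: ambiguous -> raises TypeError
    coerce_union("foo", (float, int))        # Case A: nothing succeeds -> re-raises the last error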
    
    enhancement 
    opened by syastrov 8
  • Schema generation: Name objects after their type rather than the field name where they were referenced

    This helps to allow definitions to be reused. Previously, there would be two definitions (named First and Second) rather than one.

    It's also the behavior I originally expected; I wondered why my definitions got a different name than that of, e.g., the dataclass being referenced.

    However, this change is backwards-incompatible. I'm not sure whether there should be an option somewhere to configure the schema builder's behavior (perhaps a callable which returns the name for an object could be provided). If so, I'm not sure where the appropriate place, API-wise, for such an option would be. Another option would be to make defname not a staticmethod, so you could easily subclass SchemaBuilder and override defname.

    There is also the issue that the name could conflict with another object of the same name from a different module (which can also happen with the previous naming behavior). In that case, either an error should be thrown, or perhaps the module name should be prepended to the type's __name__?
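
    Purely as a sketch of the subclassing idea, assuming SchemaBuilder exposed defname as an overridable instance method (both the import path and the signature below are guesses, not typical's actual API):

    from typic.schema import SchemaBuilder  # import path assumed


    class ModuleQualifiedSchemaBuilder(SchemaBuilder):
        def defname(self, obj, name=None):  # hypothetical hook signature
            # Name definitions after the referenced type itself, qualified by its
            # module so that same-named types from different modules don't collide.
            return f"{obj.__module__}.{obj.__qualname__}".replace(".", "_")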

    What do you think?

    enhancement 
    opened by syastrov 7
  • Not expected behavior for datetime with strict option.

    • typical version: 2.0.0
    • Python version: 3.8.2
    • Operating System: macOS Catalina

    I expected that when using strict with datetime, only datetime objects would be accepted and no type coercion would be performed. But the following examples do not behave as I expected; I expected to get an exception in both cases.

    >>> import typic
    >>> import pendulum
    >>> typic.transmute(typic.Strict[pendulum.DateTime], '2020-04-09') 
    DateTime(2020, 4, 9, 0, 0, 0, tzinfo=Timezone('UTC'))
    

    or similarly

    >>> @typic.klass(strict=True)
    ... class Test:
    ...     date: pendulum.DateTime
    >>> Test('2020-04-09') 
    Test(date=DateTime(2020, 4, 9, 0, 0, 0, tzinfo=Timezone('UTC')))
    
    enhancement 
    opened by kfollesdal 6
  • Still not working with PEP 604 Union Operators

    • typical version: 2.4.0
    • Python version: 3.10.0b2
    • Operating System: Linux

    import typic
    
    @typic.al
    def foo(bar: str | int):
        pass
    
    

    This would raise the following error:

    Traceback (most recent call last):
      File "/home/lsongzhi/code/python/pydemo/main.py", line 4, in <module>
        def foo(bar: str | int):
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/api.py", line 384, in typed
        return _typed(_cls_or_callable) if _cls_or_callable is not None else _typed
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/api.py", line 378, in _typed
        return wrap(obj, delay=delay, strict=strict)  # type: ignore
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/api.py", line 172, in wrap
        protocols(func)
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/serde/resolver.py", line 782, in protocols
        resolved = self.resolve(
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/serde/resolver.py", line 709, in resolve
        resolved = self._resolve_from_annotation(anno, namespace=namespace)
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/serde/resolver.py", line 524, in _resolve_from_annotation
        deserializer, validator = self.des.factory(
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/serde/des.py", line 866, in factory
        deserializer = self._build_des(annotation, key, namespace)
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/serde/des.py", line 709, in _build_des
        anno_name = get_unique_name(origin)
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/util.py", line 255, in get_unique_name
        return f"{get_name(obj)}_{id(obj)}".replace("-", "_")
      File "/home/lsongzhi/.pyenv/versions/3.10.0b2/lib/python3.10/site-packages/typic/util.py", line 236, in get_name
        return obj.__name__
    AttributeError: 'types.Union' object has no attribute '__name__'. Did you mean: '__ne__'?
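
    As a possible sidestep while the PEP 604 syntax is unsupported, the equivalent typing.Union spelling may resolve (a sketch only, not verified against 3.10.0b2):

    from typing import Union

    import typic


    @typic.al
    def foo(bar: Union[str, int]):
        pass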
    
    opened by songzhi 5
  • bug: RecursionError when typic.klass used in metaclass if slots = True

    • typical version: 2.0.19
    • Python version: 3.7.4
    • Operating System:

    Description

    import typic
    
    class FooMeta(type):
        def __new__(mcs, name, bases, namespace):
            cls = super().__new__(mcs, name, bases, namespace)
            ...
            cls = typic.klass(cls, slots=True)
            ...
            return cls
    
    class Foo(metaclass=FooMeta):   # RecursionError
        a: int
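
    For what it's worth, a minimal sketch that sidesteps the recursion by applying the decorator after class creation rather than inside the metaclass (this gives up doing the wrapping in FooMeta itself):

    import typic


    class Foo:
        a: int


    Foo = typic.klass(Foo, slots=True)  # wrap once, outside any metaclass __new__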
    

    
    wontfix 
    opened by xbanke 4
  • JSON schema generated for date field is invalid upon calling `validate`

    It seems like typical doesn't convert the StringFormat.DATE enum field into a primitive (e.g. str), which fastjsonschema doesn't like.

    >>> import typic
    >>> from datetime import date
    >>> from dataclasses import dataclass
    >>> @typic.al
    ... @dataclass
    ... class Foo:
    ...   foo: date
    ...
    >>> typic.schema(Foo)
    ObjectSchemaField(title='Foo', description='Foo(foo: datetime.date)', properties={'foo': StrSchemaField(format=<StringFormat.DATE: 'date'>)}, additionalProperties=False, required=('foo',))
    >>> typic.schema(Foo).validate({})
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File ".../venv/lib/python3.7/site-packages/typic/schema/field.py", line 194, in validate
        return self.validator(obj)
      File ".../venv/lib/python3.7/site-packages/typic/util.py", line 226, in __get__
        cache[attrname] = self.func(instance)
      File ".../venv/lib/python3.7/site-packages/typic/schema/field.py", line 189, in validator
        return fastjsonschema.compile(self.asdict())
      File ".../venv/lib/python3.7/site-packages/fastjsonschema/__init__.py", line 167, in compile
        exec(code_generator.func_code, global_state)
      File "<string>", line 5
        raise JsonSchemaException("data must be object", value=data, name="data", definition={'type': 'object', 'title': 'Foo', 'description': 'Foo(foo: datetime.date)', 'properties': {'foo': {'type': 'string', 'format': <StringFormat.DATE: 'date'>}}, 'additionalProperties': False, 'required': ['foo'], 'definitions': {}}, rule='type')
                                                                                                                                                                                                                             ^
    SyntaxError: invalid syntax
    
    bug 
    opened by syastrov 4
  • Switch to own implementation from dataclasses

    Heyo! I really like your library! But when I saw the docs, I found that typical uses dataclasses from the standard library, which will, of course, decrease performance because they are slow. What do you think about reinventing them for more performance? :)

    opened by prostomarkeloff 4
  • Coercion from None to str

    I didn't expect that None would be coerced to a string for a field marked as str. Can that behavior be disabled?

    >>> import typic
    >>> from dataclasses import dataclass
    >>> @typic.al
    ... @dataclass
    ... class Y:
    ...   a: str
    ... 
    >>> Y(None)
    Y(a='None')
    

    The same behavior happens if you try to coerce other types:

    >>> Y(dict)
    Y(a="<class 'dict'>")
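
    A sketch of one way to opt out of this permissive coercion, using the strict flag shown in the README (the expected failure is assumed, not verified on this version):

    >>> import typic
    >>> from dataclasses import dataclass
    >>> @typic.al(strict=True)
    ... @dataclass
    ... class Y:
    ...     a: str
    ...
    >>> Y(None)  # expected to fail validation instead of returning Y(a='None')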
    
    question 
    opened by syastrov 4
  • float should not be converted to int with Union[float, int]

    • typical version: 2.6.2
    • Python version: 3.8.5
    • Operating System: CentOS 8

    Description

    import typic, typing
    
    
    @typic.klass
    class Foo:
        a: typing.Union[float, int] = 5
    
            
    Foo(0.3)  # Foo(a=0)
    
    opened by xbanke 3
  • Using typical with a class that has a custom from_dict method

    • typical version: 2.0.29
    • Python version: 3.8.3
    • Operating System: Linux

    Description

    I have a class with a custom from_dict class method that does some stuff and then calls transmute. However, I note that the deserializer used by transmute creates and then uses a from_dict class method internally. Because I have defined my own method of the same name, which in turn calls transmute, I obviously hit the recursion limit.

    Simple example:

    import typic
    from dataclasses import dataclass


    @typic.al
    @dataclass
    class Person:
        name: str
        age: int
    
        @classmethod
        def from_dict(cls, dict):
            return cls.transmute(dict)
    
    
    person = Person.from_dict({"name": "Jack", "age": 10})
    

    This gives (full stack trace omitted due to its obvious size):

      ...
      File "<typical generated deserializer__283563495926319966>", line 7, in deserializer__283563495926319966
      File "/home/arranhs/Desktop/typical/typical.py, line 20, in from_dict
        return cls.transmute(dict)
      File "<typical generated deserializer__283563495926319966>", line 3, in deserializer__283563495926319966
    RecursionError: maximum recursion depth exceeded in __instancecheck__
    

    And the generated deserializer:

    def deserializer__283563495926319966(val):
        _, val = __eval(val) if isinstance(val, (str, bytes)) else (False, val)
        vtype = val.__class__
        if vtype is Person_38601264:
            return val
        val = Person_38601264.from_dict(val)
        return val
    

    What I Did

    Naturally, the simple solution (and I suppose a good-enough one) is to rename my from_dict method. Unfortunately for me, this is a breaking change and I would prefer to avoid it.

    Is there a way to get around this? The only path I can see so far would be to fork the repo and rename from_dict to _from_dict. Would this break anything? Could this feasibly and easily be done in this repo to avoid issues for others, since this is a pretty common function name? If not, maybe I can help add a documentation warning, as it took me quite a while to understand the root cause.
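
    For reference, the rename workaround would look roughly like this (parse is a hypothetical replacement name; typical's generated from_dict is then left untouched):

    import typic
    from dataclasses import dataclass


    @typic.al
    @dataclass
    class Person:
        name: str
        age: int

        @classmethod
        def parse(cls, data):  # hypothetical name replacing the colliding from_dict
            return cls.transmute(data)


    person = Person.parse({"name": "Jack", "age": 10})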

    On a side note, I also have a to_dict method, and I note typical implements this as well, alongside from_dict. So far I have not noticed any issues with my method overriding yours, but maybe there is something I am missing. Could this cause issues?

    opened by ahobsonsayers 3
  • Question on your approach to strict mode

    Thank you so much for the work that you've done with this library. I haven't gotten to actually use it yet (so I may be missing important details, and would appreciate knowing that), and I'm extremely impressed. It is taking the right approach to design in ways that have really caused me trouble with other typing and serialization libraries, or that have caused me to just not be happy enough to want to use them.

    The principles you lay out as guiding principles are spot on. By leaning into the standard tooling to provide extensibility over those tools rather than working a different parallel path, or even in opposition to them, we're better able to gradually improve our code. Bit by bit, making it better, as we learn and grow, without having to risk an entire rewrite to get the benefits.


    When I think about working with Python instead of against it, which is a key feature of your work in typical for dataclasses and other built-in types that I'm very impressed by, I find the default non-strict mode to be a very notable departure from that principle. Python doesn't do implicit type coercion, and that has been an intentional and deliberate design choice of the language for as long as I've known it, and probably since its inception.

    The non-strict mode default turns that design principle absolutely on its head. You've provided strict mode, and that helps us work around the concern, but the philosophy is still reversed from the Pythonic first principles. If I care to preserve this principle, I'm left with a variety of (easy to use) options, where I have to constantly re-affirm that I agree with this Pythonic first principle.

    The global strict mode to solve this is a non-starter for non-trivial applications, because it breaks assumptions other libraries might very reasonably be making about the state of that. I want everyone to use typical, so I want that concern to be a common thing to encounter. IMO, it is a misfeature to even offer that API, it's far too powerful of a foot gun.

    All that said, easy and loose type coercion is extremely valuable. You had an eye toward this with your initial definition of @typic.al. That is a great tool, and I don't wish for you to take it out of the toolbox. However, I think it should be a different tool than the typing and serialization layer's defaults.


    Personally, I find this core philosophical difference between typical as it currently exists and the principles that I believe have guided its design to be so significant that it's worth having an entirely separate API, if needed, that defaults to strict mode. There are a few approaches I can see to doing this, depending on what you want typical to be.

    1. Do nothing, because the current design is what you intend typical to be.
    2. Cut a major, breaking release of typical, fixing and reversing this default.
    3. Release new APIs in this package that are reliably strict by default.
    4. Create a new package in this distribution with strict by default APIs.

    Gut reactions to which of these is best might be further informed by these additional considerations:

    • If the current design is the right choice for typical, I can see myself considering releasing another distribution package to PyPI, that perhaps works with typical under the hood, but exposes the strict mode by default.
    • A new package in this distribution could be typical, matching the distribution package name. This could leave the cute typic.al shortcut names defaulting to the non-strict mode, which may be preferable to many people, and allow others to choose the strict APIs, which might have more no-frills, business-mode names like the ones attrs ended up adding.
    • I'm sure we can find a nice keyword for non-strict mode, less boring than "non-strict". magic=True, maybe? Magic is cool, as long as you've asked for it. friendly? autoconvert?

    If you got through all that, I hope that it came through in the intended spirit of gratitude and deference. I greatly appreciate that you've released this to the world, and that I get to see it. Still, this is your project, and it is and should be what you say it is, and I respect that.

    What would you like to be the future of strict mode in typical? Do you agree with me that it's critical enough to warrant one of these significant options to allow for really changing the default mode? Or is that just not what you want typical to be? Have I perhaps missed something important?

    opened by ryanhiebert 2
  • Is there a way to use choices with values and labels when generating schemas?

    A lot of Python libraries/frameworks support the use of choices as a collection of <value>/<label> tuples (or even an enum, for example http.HTTPStatus). Is there a specific way to add the labels for such a collection to the generated schema?
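
    To make the question concrete, here is an illustrative sketch (the names are made up, and it assumes enum-typed fields are schematized like any other field):

    import enum
    from dataclasses import dataclass

    import typic


    class Status(enum.IntEnum):
        OK = 200          # value 200, label "OK"
        NOT_FOUND = 404   # value 404, label "NOT_FOUND"


    @typic.al
    @dataclass
    class Response:
        status: Status


    print(typic.schema(Response))  # where would the human-readable labels appear?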

    I'm considering using typical as a replacement for pydantic, and I already asked a similar question for pydantic here.

    Thanks in advance

    opened by dodumosu 2
  • Version conflict with the pinned typing-extensions

    • typical version: 2.8.0
    • Python version: 3.8.13
    • Operating System: Linux - Redhat EL7

    Description

    I am starting to use typical, and recently installed my package, which also has a dependency on "rich". "rich" requires "typing-extensions" > 4, while "typical" pins "typing-extensions" to 3.10.0.2.

    What I Did

    Searched "typing-extensions" in the "typical" github repository and it isn't used in any code - so there must be a deeper reason why "typical" needs the pinned version.

    
    opened by timcera 2
  • Bump numpy from 1.21.4 to 1.22.0

    Bumps numpy from 1.21.4 to 1.22.0.

    Release notes

    Sourced from numpy's releases.

    v1.22.0

    NumPy 1.22.0 Release Notes

    NumPy 1.22.0 is a big release featuring the work of 153 contributors spread over 609 pull requests. There have been many improvements, highlights are:

    • Annotations of the main namespace are essentially complete. Upstream is a moving target, so there will likely be further improvements, but the major work is done. This is probably the most user visible enhancement in this release.
    • A preliminary version of the proposed Array-API is provided. This is a step in creating a standard collection of functions that can be used across applications such as CuPy and JAX.
    • NumPy now has a DLPack backend. DLPack provides a common interchange format for array (tensor) data.
    • New methods for quantile, percentile, and related functions. The new methods provide a complete set of the methods commonly found in the literature.
    • A new configurable allocator for use by downstream projects.

    These are in addition to the ongoing work to provide SIMD support for commonly used functions, improvements to F2PY, and better documentation.

    The Python versions supported in this release are 3.8-3.10, Python 3.7 has been dropped. Note that 32 bit wheels are only provided for Python 3.8 and 3.9 on Windows, all other wheels are 64 bits on account of Ubuntu, Fedora, and other Linux distributions dropping 32 bit support. All 64 bit wheels are also linked with 64 bit integer OpenBLAS, which should fix the occasional problems encountered by folks using truly huge arrays.

    Expired deprecations

    Deprecated numeric style dtype strings have been removed

    Using the strings "Bytes0", "Datetime64", "Str0", "Uint32", and "Uint64" as a dtype will now raise a TypeError.

    (gh-19539)

    Expired deprecations for loads, ndfromtxt, and mafromtxt in npyio

    numpy.loads was deprecated in v1.15, with the recommendation that users use pickle.loads instead. ndfromtxt and mafromtxt were both deprecated in v1.17 - users should use numpy.genfromtxt instead with the appropriate value for the usemask parameter.

    (gh-19615)

    ... (truncated)


    dependencies 
    opened by dependabot[bot] 1
  • Bump django from 2.2.25 to 2.2.28

    Bumps django from 2.2.25 to 2.2.28.

    Commits
    • 5c33000 [2.2.x] Bumped version for 2.2.28 release.
    • 29a6c98 [2.2.x] Fixed CVE-2022-28347 -- Protected QuerySet.explain(**options) against...
    • 2c09e68 [2.2.x] Fixed CVE-2022-28346 -- Protected QuerySet.annotate(), aggregate(), a...
    • 8352b98 [2.2.x] Added stub release notes for 2.2.28.
    • 2801f29 [2.2.x] Reverted "Fixed forms_tests.tests.test_renderers with Jinja 3.1.0+."
    • e03648f [2.2.x] Fixed forms_tests.tests.test_renderers with Jinja 3.1.0+.
    • 9d13d8c [2.2.x] Fixed typo in release notes.
    • 047ece3 [2.2.x] Added CVE-2022-22818 and CVE-2022-23833 to security archive.
    • 2427b2f [2.2.x] Post-release version bump.
    • e541f2d [2.2.x] Bumped version for 2.2.27 release.
    • Additional commits viewable in compare view


    dependencies 
    opened by dependabot[bot] 1
  • Use orjson for other libraries benchmark

    The current benchmark is not fair to libraries like pydantic or marshmallow, because they use the standard json library while typical uses orjson, which is a lot faster.

    opened by wyfo 0
Releases (v2.8.0)
  • v2.8.0(Dec 23, 2021)

    What's Changed

    • Improved Serialization by @seandstewart in https://github.com/seandstewart/typical/pull/190

    Full Changelog: https://github.com/seandstewart/typical/compare/v2.7.11...v2.8.0

  • v2.7.11(Dec 15, 2021)

    What's Changed

    • CPython 3.9.8+ and 3.10.1+ have new logic for type resolution of ForwardRefs which caused a regression in our runtime type analysis. This change adds handling for this new logic.

    Full Changelog: https://github.com/seandstewart/typical/compare/v2.7.10...v2.7.11

  • v2.7.10(Dec 15, 2021)

    This release fixes a regression in our serializer factory which failed to account for optional/nullable enum types when dumping to a primitive, leading to an AttributeError when attempting to access the enum value.

  • v2.7.9(Nov 3, 2021)

    What's Changed

    • Fix Decimal serdes by @qhelix7 in https://github.com/seandstewart/typical/pull/189

    New Contributors

    • @qhelix7 made their first contribution in https://github.com/seandstewart/typical/pull/189

    Full Changelog: https://github.com/seandstewart/typical/compare/v2.7.8...v2.7.9

  • v2.7.8(Nov 2, 2021)

    sqlite3.Row objects are C extension types which meet the contract for a mapping, but do not evaluate as a subclass of the Mapping generic. This change adds them to our list of "mapping-compliant" types.

  • v2.7.6(Nov 1, 2021)

    NamedTuple objects were improperly treated as a simple builtin subtype when coercing user-defined types, which resulted in unexpected behavior. This fixes deserialization logic to use the standard translator protocol for named tuples.

  • v2.7.5(Oct 15, 2021)

    Derived classes of a typic.klass object were improperly recognized as a simple Iterable.

    This caused a recursion error when attempting to serialize these objects, since the __iter__ magic method relies upon the iterator factory, but the iterator factory simply called the __iter__ magic method.

    Resolves #185

  • v2.7.4(Oct 11, 2021)

  • v2.7.3(Oct 8, 2021)

    Fixes:

    • Support collections.deque as Array constraint (resolves #181)
    • Fix union constraints generation on stable py3.10 release.

    Misc:

    • Extend DSN support for sqlite urls.
  • v2.7.2(Oct 5, 2021)

    Fixes

    • Nested mappings with exactly two keys could be improperly deserialized (#179)
    • Early return with stdlib get_tojson meant we weren't properly bootstrapping docs.

    Improvements

    • asyncpg.Record objects are now recognized as a Mapping type.
  • v2.7.1(Sep 28, 2021)

    Enhancements

    • typical will now default to orjson for json serialization and deserialization if it is installed. This provides up to 3X performance boost, but has different behavior and output than ujson or stdlib json.
    • Callable class instances are now treated as function types when wrapped (thank you @xbanke, #173)

    Fixes

    • Expose the always parameter to typic.klass, since we are raising warnings about its usage (#177)
  • v2.7.0(Sep 21, 2021)

    Features

    • Wrapped routines and frozen dataclasses now coerce inputs up to 10x faster than before (on-par with wrapped classes or bound protocols) (#175).
    • More descriptive and correct type-hints for the public API.
    • Type-hinting with an abstract base class (ABC) will now result in validation against that type rather than coercion (#170)

    Fixes

    • Resolves issue where we'd sometimes fail to clean up the repr of subscripted generics (#170)
    • Resolves issue where we failed to locate the correct target type for a value within a Union of overlapping types (#170)
    • Resolves issue where we'd iterate over the fields of a non-iterable, primitive type when coercing to a collection, rather than raise a TypeError as expected (#174).
  • v2.6.3(Sep 1, 2021)

    Fixes

    • Add intelligent instance-checks for unions (resolves #167)
    • Make field.default positional (resolves #168)
    • Add __version__ to typic.__init__ and add bumpver to manage multiple versions (resolves #168)
  • v2.6.2(Aug 26, 2021)

    Fixes

    • This release resolves an issue when building serializers for Unions of pandas collection types (e.g., DataFrame, Series) which prevented proper compilation (#166)

    Changes

    • Tagged Unions must be tagged with "public" fields (e.g., no fields starting with _ are considered).
  • v2.6.1(Aug 26, 2021)

  • v2.6.0(Aug 24, 2021)

    Features

    • Support un-tagged, "Generic" Unions (e.g., Union[int, str]) (#19, docs)
    • Basic support for Callable types (#159)

    Misc

    • Re-worked handler dispatch in deserializer builder.
    • More idiomatic implementation of delayed SerdeProtocols
  • v2.5.0(Aug 19, 2021)

    Features

    • This release adds support for Python @ 3.10.0rc1 (#162)

    Bugfixes

    • Fix resolution of env-var aliases.
    • Use lazy iterator factory when we can't guess the type.
  • v2.4.2(Jul 21, 2021)

    Bugfixes

    • The new serialization protocol didn't properly check whether the type was a TypedDict.
    • Translation and iteration now work as expected against TypedDict types.

    Misc

    • Allow __slots__ to indicate fields which can be used during iteration and translation.
  • v2.4.1(Jul 20, 2021)

  • v2.4.0(Jul 14, 2021)

    Features

    • Improve serialization performance by 3x (#161)
    • Add iterate(...) to top-level API (#161, docs)
    • Add support for "downcasting" an object to a naive collection or iterator in translator (#161)
    • Add support for PEP 585 builtin generics and PEP 604 Union Operators (#161)
    • Advanced type hints for SerdeProtocol using Generics and Protocols, to support SerdeProtocol[int], etc. (#158)

    Bugfixes

    • Fix support for NamedTuples in translation and iteration (#161)
  • v2.3.3(Jul 2, 2021)

    Previously, if a default value was provided for a field in a settings class, we wouldn't look up the field in the environment. This was counter-intuitive. Now, if a default value is provided, we defer to the environment before using the provided default.
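
    A small sketch of the behavior described above (this assumes the typic.settings decorator and resolution from same-named, upper-cased environment variables; both details are assumptions, and the names are illustrative):

    import os

    import typic

    os.environ["DATABASE_URL"] = "postgres://prod-db/app"


    @typic.settings
    class AppSettings:
        database_url: str = "sqlite:///local.db"


    print(AppSettings().database_url)  # as of 2.3.3, the environment wins over the class default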

  • v2.3.2(Jul 2, 2021)

  • v2.3.1(Jul 2, 2021)

  • v2.3.0(Jun 8, 2021)

  • 2.2.0(Jun 7, 2021)

    Features

    • Add UPPER-KEBAB-CASE and UPPER.DOT.CASE (#153) as field-name output options.
    • Add support for custom encoders and decoders (#156) (Documentation here and here).
    • Updated translator to support SQLAlchemy 1.4.

    Chores

    • Cleaned up some dead code-paths.
    • Cleaned up some unwieldy interfaces.
  • v2.1.3(Apr 27, 2021)

    Types defined within the local namespace of a function could not be resolved if they were forward references. This change introduces support for these types. (Resolves #150)

  • v2.1.2(Mar 3, 2021)

    When determining whether a Union of types has a tag which may be used in deserialization, member descriptors should not be considered valid.

    Additionally, drop usage of guard_recursion() in the slotted() decorator.

  • v2.1.1(Feb 10, 2021)
