Make your functions return something meaningful, typed, and safe!

Overview






Features

  • Brings functional programming to Python land
  • Provides a bunch of primitives to write declarative business logic
  • Enforces better architecture
  • Fully typed with annotations and checked with mypy, PEP561 compatible
  • Adds emulated Higher Kinded Types support
  • Provides type-safe interfaces to create your own data-types with enforced laws
  • Has a bunch of helpers for better composition
  • Pythonic and pleasant to write and to read 🐍
  • Supports functions and coroutines, framework agnostic
  • Easy to start: has lots of docs, tests, and tutorials

Quickstart right now!

Installation

pip install returns

You are also required to configure mypy correctly and install our plugin to fix this existing issue:

# In setup.cfg or mypy.ini:
[mypy]
plugins =
  returns.contrib.mypy.returns_plugin

We also recommend using the same mypy settings we use.
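For example, a reasonably strict baseline could add something like this to the same [mypy] section (only a sketch; the canonical options live in this project's own setup.cfg):

# Extra strictness on top of the plugin configuration above:
disallow_untyped_defs = True
warn_redundant_casts = True
warn_unused_ignores = True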

Make sure you know how to get started, check out our docs! Try our demo.

Contents

Maybe container

None is called the worst mistake in the history of Computer Science.

So, what can we do to check for None in our programs? You can use the builtin Optional type and write a lot of if some is not None: conditions. But having null checks here and there makes your code unreadable.

user: Optional[User]
discount_program: Optional['DiscountProgram'] = None

if user is not None:
    balance = user.get_balance()
    if balance is not None:
        credit = balance.credit_amount()
        if credit is not None and credit > 0:
            discount_program = choose_discount(credit)

Or you can use Maybe container! It consists of Some and Nothing types, representing existing state and empty (instead of None) state respectively.

from typing import Optional
from returns.maybe import Maybe, maybe

@maybe  # decorator to convert existing Optional[int] to Maybe[int]
def bad_function() -> Optional[int]:
    ...

maybe_number: Maybe[float] = bad_function().bind_optional(
    lambda number: number / 2,
)
# => Maybe will return Some[float] only if there's a non-None value
#    Otherwise, will return Nothing

You can be sure that the .bind_optional() method won't be called for Nothing. Forget about None-related errors forever!

We can also bind an Optional-returning function over a container. To achieve this, we use the .bind_optional() method, as the tiny sketch below shows.
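Here is a minimal sketch of how .bind_optional() behaves on its own (the numbers are arbitrary):

from returns.maybe import Maybe, Nothing, Some

assert Maybe.from_optional(10).bind_optional(lambda n: n + 1) == Some(11)
assert Maybe.from_optional(10).bind_optional(lambda n: None) == Nothing
assert Maybe.from_optional(None).bind_optional(lambda n: n + 1) == Nothing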

And this is what your initial code looks like after refactoring:

user: Optional[User]

# Type hint here is optional, it only helps the reader:
discount_program: Maybe['DiscountProgram'] = Maybe.from_optional(
    user,
).bind_optional(  # This won't be called if `user is None`
    lambda real_user: real_user.get_balance(),
).bind_optional(  # This won't be called if `real_user.get_balance()` is None
    lambda balance: balance.credit_amount(),
).bind_optional(  # And so on!
    lambda credit: choose_discount(credit) if credit > 0 else None,
)

Much better, isn't it?
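When we finally need a raw value back (to render a template, for example), we can leave the container explicitly. A minimal sketch, assuming some fallback object defined elsewhere:

default_program: 'DiscountProgram'  # hypothetical fallback value

chosen_program = discount_program.value_or(default_program)
# => either the computed `DiscountProgram` or the fallback, never `None`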

RequiresContext container

Many developers use some kind of dependency injection in Python. And usually it is based on the idea that there's some kind of container and an assembly process.

The functional approach is much simpler!

Imagine that you have a Django-based game, where you award users with points for each guessed letter in a word (unguessed letters are marked as '.'):

from django.http import HttpRequest, HttpResponse
from words_app.logic import calculate_points

def view(request: HttpRequest) -> HttpResponse:
    user_word: str = request.POST['word']  # just an example
    points = calculate_points(user_word)
    ...  # later you show the result to user somehow

# Somewhere in your `words_app/logic.py`:

def calculate_points(word: str) -> int:
    guessed_letters_count = len([letter for letter in word if letter != '.'])
    return _award_points_for_letters(guessed_letters_count)

def _award_points_for_letters(guessed: int) -> int:
    return 0 if guessed < 5 else guessed  # minimum 5 points possible!

Awesome! It works, users are happy, your logic is pure and awesome. But, later you decide to make the game more fun: let's make the minimal accountable letters threshold configurable for an extra challenge.

You can just do it directly:

def _award_points_for_letters(guessed: int, threshold: int) -> int:
    return 0 if guessed < threshold else guessed

The problem is that _award_points_for_letters is deeply nested. You would then have to pass threshold through the whole call stack, including calculate_points and all the other functions that might be on the way. All of them would have to accept threshold as a parameter! That is not useful at all! Large code bases will suffer a lot from this change.
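In other words, every function on the way starts to look roughly like this (just a sketch of the problem, not a recommendation):

def calculate_points(word: str, threshold: int) -> int:  # one extra parameter...
    guessed_letters_count = len([letter for letter in word if letter != '.'])
    return _award_points_for_letters(guessed_letters_count, threshold)

# ...and every caller, all the way up to the `view`, now has to fetch and forward `threshold` too.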

Ok, you can use django.settings (or something similar) directly in your _award_points_for_letters function, and ruin your pure logic with framework-specific details. That's ugly!

Or you can use RequiresContext container. Let's see how our code changes:

from django.conf import settings
from django.http import HttpRequest, HttpResponse
from words_app.logic import calculate_points

def view(request: HttpRequest) -> HttpResponse:
    user_word: str = request.POST['word']  # just an example
    points = calculate_points(user_word)(settings)  # passing the dependencies
    ...  # later you show the result to user somehow

# Somewhere in your `words_app/logic.py`:

from typing_extensions import Protocol
from returns.context import RequiresContext

class _Deps(Protocol):  # we rely on abstractions, not direct values or types
    WORD_THRESHOLD: int

def calculate_points(word: str) -> RequiresContext[int, _Deps]:
    guessed_letters_count = len([letter for letter in word if letter != '.'])
    return _award_points_for_letters(guessed_letters_count)

def _award_points_for_letters(guessed: int) -> RequiresContext[int, _Deps]:
    return RequiresContext(
        lambda deps: 0 if guessed < deps.WORD_THRESHOLD else guessed,
    )

And now you can pass your dependencies in a really direct and explicit way. And have the type-safety to check what you pass to cover your back. Check out RequiresContext docs for more. There you will learn how to make '.' also configurable.
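As a bonus, the logic becomes trivial to test: any object with the right attribute can play the role of the dependencies. A small sketch, using SimpleNamespace as a stand-in:

from types import SimpleNamespace

fake_deps = SimpleNamespace(WORD_THRESHOLD=3)
assert calculate_points('.ert.s')(fake_deps) == 4  # four guessed letters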

We also have RequiresContextResult for context-related operations that might fail. And also RequiresContextIOResult and RequiresContextFutureResult.

Result container

Please, make sure that you are also aware of Railway Oriented Programming.

Straight-forward approach

Consider this code that you can find in any Python project.

import requests

def fetch_user_profile(user_id: int) -> 'UserProfile':
    """Fetches UserProfile dict from foreign API."""
    response = requests.get('/api/users/{0}'.format(user_id))
    response.raise_for_status()
    return response.json()

Seems legit, does it not? It also seems like pretty straightforward code to test. All you need is to mock requests.get to return the structure you need.

But there are hidden problems in this tiny code sample that are almost impossible to spot at first glance.

Hidden problems

Let's have a look at the exact same code, but with all the hidden problems explained.

import requests

def fetch_user_profile(user_id: int) -> 'UserProfile':
    """Fetches UserProfile dict from foreign API."""
    response = requests.get('/api/users/{0}'.format(user_id))

    # What if we try to find user that does not exist?
    # Or network will go down? Or the server will return 500?
    # In this case the next line will fail with an exception.
    # We need to handle all possible errors in this function
    # and do not return corrupt data to consumers.
    response.raise_for_status()

    # What if we have received invalid JSON?
    # Next line will raise an exception!
    return response.json()

Now, all (probably all?) problems are clear. How can we be sure that this function will be safe to use inside our complex business logic?

We really cannot be sure! We will have to create lots of try and except cases just to catch the expected exceptions. Our code will become complex and unreadable with all this mess!

Or we can go with a top-level except Exception: case to catch literally everything. But this way we would end up catching unwanted exceptions too. This approach can hide serious problems from us for a long time.
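For illustration only, here is roughly what the defensive version tends to look like (a sketch, not a recommendation; the exact exception types depend on your requests version):

from typing import Optional

import requests

def fetch_user_profile(user_id: int) -> Optional['UserProfile']:
    try:
        response = requests.get('/api/users/{0}'.format(user_id))
        response.raise_for_status()
        return response.json()
    except (requests.RequestException, ValueError):
        # Every caller now has to remember to check for `None`:
        return None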

Pipe example

import requests
from returns.result import Result, safe
from returns.pipeline import flow
from returns.pointfree import bind

def fetch_user_profile(user_id: int) -> Result['UserProfile', Exception]:
    """Fetches `UserProfile` TypedDict from foreign API."""
    return flow(
        user_id,
        _make_request,
        bind(_parse_json),
    )

@safe
def _make_request(user_id: int) -> requests.Response:
    # TODO: we are not yet done with this example, read more about `IO`:
    response = requests.get('/api/users/{0}'.format(user_id))
    response.raise_for_status()
    return response

@safe
def _parse_json(response: requests.Response) -> 'UserProfile':
    return response.json()

Now we have a clean, safe, and declarative way to express our business needs:

  • We start by making a request that might fail at any moment,
  • Then we parse the response if the request was successful,
  • And then we return the result.

Now, instead of returning regular values we return values wrapped inside a special container, thanks to the @safe decorator. It will return Success[YourType] or Failure[Exception]. And it will never throw an exception at us!
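A tiny sketch of what @safe does on its own (divide is just an illustration):

from returns.result import Success, safe

@safe
def divide(first: float, second: float) -> float:
    return first / second

assert divide(4, 2) == Success(2.0)
assert isinstance(divide(1, 0).failure(), ZeroDivisionError)  # no exception raised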

We also use flow and bind functions for handy and declarative composition.

This way we can be sure that our code won't break in random places due to some implicit exception. Now we control all parts and are prepared for the explicit errors.

We are not yet done with this example, let's continue to improve it in the next chapter.

IO container

Let's look at our example from another angle. All its functions look like regular ones: it is impossible to tell whether they are pure or impure at first sight.

It leads to a very important consequence: we start to mix pure and impure code together. We should not do that!

When these two concepts are mixed, we suffer really badly when testing or reusing the code. Almost everything should be pure by default. And we should explicitly mark the impure parts of the program.

That's why we have created the IO container to mark impure functions that never fail.

These impure functions use random, current datetime, environment, or console:

import random
import datetime as dt
from typing import Callable

from returns.io import IO, impure

def get_random_number() -> IO[int]:  # or use `@impure` decorator
    return IO(random.randint(1, 10))  # isn't pure, because random

now: Callable[[], IO[dt.datetime]] = impure(dt.datetime.now)

@impure
def return_and_show_next_number(previous: int) -> int:
    next_number = previous + 1
    print(next_number)  # isn't pure, because does IO
    return next_number

Now we can clearly see which functions are pure and which ones are impure. This helps us a lot in building large applications, unit testing your code, and composing business logic together.
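Pure code can still transform such values without leaving the container. A small sketch, reusing get_random_number from above:

from returns.io import IO

doubled: IO[int] = get_random_number().map(lambda number: number * 2)
# The result is still `IO[int]`: the impurity stays visible in the types.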

Troublesome IO

As it was already said, we use IO when we handle functions that do not fail.

What if our function can fail and is impure? Like requests.get() we had earlier in our example.

Then we have to use a special IOResult type instead of a regular Result. Let's find the difference:

  • Our _parse_json function always returns the same result (hopefully) for the same input: you can either parse valid JSON or fail on an invalid one. That's why we return a pure Result, there's no IO inside
  • Our _make_request function is impure and can fail. Try to send two similar requests with and without an internet connection. The result will be different for the same input. That's why we must use IOResult here: it can fail and has IO

So, in order to fulfill our requirement and separate pure code from impure code, we have to refactor our example.

Explicit IO

Let's make our IO explicit!

import requests
from returns.io import IOResult, impure_safe
from returns.result import safe
from returns.pipeline import flow
from returns.pointfree import bind_result

def fetch_user_profile(user_id: int) -> IOResult['UserProfile', Exception]:
    """Fetches `UserProfile` TypedDict from foreign API."""
    return flow(
        user_id,
        _make_request,
        # before: def (Response) -> UserProfile
        # after safe: def (Response) -> ResultE[UserProfile]
        # after bind_result: def (IOResultE[Response]) -> IOResultE[UserProfile]
        bind_result(_parse_json),
    )

@impure_safe
def _make_request(user_id: int) -> requests.Response:
    response = requests.get('/api/users/{0}'.format(user_id))
    response.raise_for_status()
    return response

@safe
def _parse_json(response: requests.Response) -> 'UserProfile':
    return response.json()

And later we can use unsafe_perform_io somewhere at the top level of our program to get the pure (or "real") value.
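A minimal sketch of that last step for a plain IO value:

from returns.io import IO
from returns.unsafe import unsafe_perform_io

assert unsafe_perform_io(IO(1)) == 1  # only at the very edge of the program!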

As a result of this refactoring session, we know everything about our code:

  • Which parts can fail,
  • Which parts are impure,
  • How to compose them in a smart, readable, and typesafe manner.

Future container

There are several issues with async code in Python:

  1. You cannot call an async function from a sync one
  2. Any unexpectedly thrown exception can ruin your whole event loop
  3. Ugly composition with lots of await statements

Future and FutureResult containers solve these issues!

Mixing sync and async code

The main feature of Future is that it allows you to run async code while maintaining a sync context. Let's see an example.

Let's say we have two functions, the first one returns a number and the second one increments it:

async def first() -> int:
    return 1

def second():  # How can we call `first()` from here?
    return first() + 1  # Boom! Don't do this. We illustrate a problem here.

If we try to just run first(), we will create an unawaited coroutine. It won't return the value we want.

But if we try to run await first(), then we would need to make second async as well. And sometimes that is not possible for various reasons.

However, with Future we can "pretend" to call async code from sync code:

from returns.future import Future

def second() -> Future[int]:
    return Future(first()).map(lambda num: num + 1)

Without touching our first async function or making second async we have achieved our goal. Now, our async value is incremented inside a sync function.

However, Future still needs to be executed inside a proper event loop:

import anyio  # or asyncio, or any other lib

# We can then pass our `Future` to any library: asyncio, trio, curio.
# And use any event loop: regular, uvloop, even a custom one, etc
assert anyio.run(second().awaitable) == 2

As you can see, Future allows you to work with async functions from a sync context, and to mix these two realms together. Use raw Future for operations that cannot fail or raise exceptions. Pretty much the same logic we had with our IO container.

Async code without exceptions

We have already covered how Result works for both pure and impure code. The main idea is: we don't raise exceptions, we return them. This is especially critical in async code, because a single exception can ruin all our coroutines running in a single event loop.

We have a handy combination of Future and Result containers: FutureResult. Again, this is exactly like IOResult, but for impure async code. Use it when your Future might have problems: like HTTP requests or filesystem operations.

You can easily turn any wild throwing coroutine into a calm FutureResult:

import anyio
from returns.future import future_safe
from returns.io import IOFailure

@future_safe
async def raising():
    raise ValueError('Not so fast!')

ioresult = anyio.run(raising().awaitable)  # all `Future`s return IO containers
assert ioresult == IOFailure(ValueError('Not so fast!'))  # True

Using FutureResult will keep your code safe from exceptions. You can always await a FutureResult (or execute it inside an event loop) to get a sync IOResult instance and work with it in a sync manner.
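For example, inside another coroutine we can simply await it; a sketch, assuming that awaiting a FutureResult yields an IOResult instead of raising:

from returns.io import IOResult

async def consume() -> None:
    io_result: IOResult = await raising()  # no try/except needed here
    ...  # work with `io_result` as usual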

Better async composition

Previously, you had to do quite a lot of awaiting while writing async code:

async def fetch_user(user_id: int) -> 'User':
    ...

async def get_user_permissions(user: 'User') -> 'Permissions':
    ...

async def ensure_allowed(permissions: 'Permissions') -> bool:
    ...

async def main(user_id: int) -> bool:
    # Also, don't forget to handle all possible errors with `try / except`!
    user = await fetch_user(user_id)  # We will await each time we use a coro!
    permissions = await get_user_permissions(user)
    return await ensure_allowed(permissions)

Some people are fine with this imperative style, but others don't like it. The problem is that previously there was no choice.

But now, you can do the same thing in functional style! With the help of Future and FutureResult containers:

import anyio
from returns.future import FutureResultE, future_safe
from returns.io import IOSuccess, IOFailure

@future_safe
async def fetch_user(user_id: int) -> 'User':
    ...

@future_safe
async def get_user_permissions(user: 'User') -> 'Permissions':
    ...

@future_safe
async def ensure_allowed(permissions: 'Permissions') -> bool:
    ...

def main(user_id: int) -> FutureResultE[bool]:
    # We can now turn `main` into a sync function, it does not `await` at all.
    # We also don't care about exceptions anymore, they are already handled.
    return fetch_user(user_id).bind(get_user_permissions).bind(ensure_allowed)

correct_user_id: int  # has required permissions
banned_user_id: int  # does not have required permissions
wrong_user_id: int  # does not exist

# We can have correct business results:
assert anyio.run(main(correct_user_id).awaitable) == IOSuccess(True)
assert anyio.run(main(banned_user_id).awaitable) == IOSuccess(False)

# Or we can have errors along the way:
assert anyio.run(main(wrong_user_id).awaitable) == IOFailure(
    UserDoesNotExistError(...),
)

Or even something really fancy:

from returns.pointfree import bind
from returns.pipeline import flow

def main(user_id: int) -> FutureResultE[bool]:
    return flow(
        fetch_user(user_id),
        bind(get_user_permissions),
        bind(ensure_allowed),
    )

Later we can also refactor our logical functions to be sync and to return FutureResult.
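For example, a step that does not actually need to await anything could become a plain sync function that just lifts an already known value. A sketch (has_permission_flag is a made-up helper):

from returns.future import FutureResult, FutureResultE

def ensure_allowed_sync(permissions: 'Permissions') -> FutureResultE[bool]:
    # No real IO here, we only wrap a value we already have:
    return FutureResult.from_value(has_permission_flag(permissions))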

Lovely, isn't it?

More!

Want more? Go to the docs! Or try our demo. Or read these articles:

Do you have an article to submit? Feel free to open a pull request!


Drylabs maintains dry-python and helps those who want to use it inside their organizations.

Read more at drylabs.io

Comments
  • Improve Failure traceability

    dry-rb has a cool feature: tracing failures https://dry-rb.org/gems/dry-monads/1.3/tracing-failures/ We need something similar to improve debugging experience. It might be included into the core, or just into some contrib/ package.

    enhancement help wanted 
    opened by sobolevn 33
  • Add Future[A] that helps with async code


    I am lately writing a lot of asynchronous code (using asyncio mostly) and while async/await is fairly straight forward and nice to work with, it hinders writing modular code with small functions and reuse of said functions. This is imho due to two issues in particular:

    1. regular functions def foo(... and coroutines async def bar(... have different calling semantics. For a regular function you just do foo(...) to get your result back while for coroutines you need to await the result, i.e. use await bar(...)
    2. coroutines don't compose well (without pouring await all over your code)

    Let's define a couple simple functions (both regular and coroutines) to illustrate these issues:

    def double(x): 
        return 2*x
    
    async def delay(x):
        return x
    
    async def double_async(x):
        return 2*x
    

    Cases:

    1. Composing regular functions just for reference - easy:

      double(double(1))
      
    2. Composing regular and coroutine functions V1 - not so bad:

      await double_async(double(1))
      
    3. Composing regular and coroutine functions V2 - awkward with the await thrown into the middle:

      double(await double_async(1))
      
    4. Composing two coroutine functions - awaits galore:

      await double_async(await double_async(1))
      

    To ease this pain I propose a (monad-like) container that implements map and bind so that functions can be chained nicely

    class Promise:
        
        def __init__(self, coro):
            self._coro = coro
            
        def map(self, f):
            async def coro():
                x = await self._coro
                return f(x)
            return Promise(coro())
        
        def bind(self, f):
            async def coro():
                x = await self._coro
                return await f(x)
            return Promise(coro())
        
        def __await__(self):
            return self._coro.__await__()
    

    Usage

    • Wrap a coroutine (function defined with async def) to create an instance of Promise.
      p = Promise(delay(1))
      await p  # get the result of the wrapped coroutine
      
    • Call a regular function on an instance of Promise
      await Promise(delay(1)).map(double)
      
    • Call a coroutine on an instance of Promise
      await Promise(delay(1)).bind(double_async)
      

    Extended example

    Since the examples above don't look too intimidating, behold:

    await (
        double_async(
            double(
                await double_async(
                    await delay(1)
                )
            )
        )
    )
    

    vs.:

    await (
        Promise(delay(1))
        .bind(double_async)
        .map(double)
        .bind(double_async)
    )
    

    Feedback greatly appreciated!

    enhancement help wanted 
    opened by stereobutter 31
  • Typing issue with chained Result


    Take the following two functions as an example of two (or more) functions with unique failure modes (each function may fail in their own way)

    def fraction(x: float ) -> Result[float, ZeroDivisionError]:
        try: return Success(1/x)
        except ZeroDivisionError as e: return Failure(e)
    
    def log_of_arg_minus_one(x: float) -> Result[float, ValueError]:
        try: return Success(log(1-x))
        except ValueError as e: return Failure(e)
    

    then a computation like fraction(x).bind(log_of_arg_minus_one) has potentially three different outcomes:

    1. the happy path where both functions return a Success, i.e. fraction(x) returns Success(y) where y is a float and log_of_arg_minus_one(y) returns Success(z) where z is also a float; example fraction(2).bind(log_of_arg_minus_one)
    2. the failure path of fraction , i.e. Failure(ZeroDivisionError); example: fraction(0).bind(log_of_arg_minus_one)
    3. the failure path of log_of_arg_minus_one, i.e. Failure(ValueError); example: fraction(1).bind(log_of_arg_minus_one)

    Sadly this seems not to type correctly:

    FailureCases = Union[ZeroDivisionError, ValueError]
    
    z: Result[float, FailureCases] = fraction(1).bind(log_of_arg_minus_one)
    
    # Argument 1 to "bind" of "Result" has incompatible type 
    # "Callable[[float], Result[float, ValueError]]"; 
    # expected "Callable[[float], Result[float, Union[ZeroDivisionError, ValueError]]]"
    

    The following also doesn't work:

    z: Result[float, Exception] = fraction(1).bind(log_of_arg_minus_one)
    
    # Argument 1 to "bind" of "Result" has incompatible type 
    # "Callable[[float], Result[float, ValueError]]"; 
    # expected "Callable[[float], Result[float, Exception]]"
    

    I don't know if this is an issue with bind (that might be fixable with a plugin) or just that Result[S, F] is not properly covariant with respect to F, i.e. Result[X, A] is considered a subtype of Result[X, B] when A is a subtype of B.

    enhancement 
    opened by stereobutter 25
  • WIP: fixing compositions in Maybe


    I don't expect it to be merged (bc it breaks compatibility), just sharing my thoughts on how proper Maybe should work

    1. Some(None) is as normal as [[]] and [None]. The fact that it doesn't make sense doesn't mean that users shouldn't be allowed to create it
    2. There is one vital property of map that's violated in this library. Map should ALWAYS preserve the structure. Some(x).map(any_function) should always be equal to Some(y). Just like a list of 5 elements mapped over some function should still be a list of 5 elements
    3. All practical problems with this are solved by allowing bind (because it's natural for bind to do that) to promote non-maybe values to Optional[] and then to Maybe[] using the usual conversion

    Closes #488 Closes #501

    opened by nurumaik 24
  • Activate `fail_on_warning` on RTD

    Now we're ignoring the Sphinx warnings, and some of these warnings are about "broken" references, so we need to configure our ReadTheDocs build to fail on warnings. The default value is false (see the reference).

    Another point here is that we got a lot of warnings like this:

    WARNING: Cannot resolve forward reference in type annotations of "returns.interfaces.specific.result.ResultLikeN.bind_result": name 'Result' is not defined
    

    It's because our configuration is set to disable the TYPE_CHECKING flag as we can see here

    refs #547

    documentation 
    opened by thepabloaguilar 21
  • Enables Pattern Matching support for `Result` container


    Checklist

    • [X] I have double checked that there are no unrelated changes in this pull request (old patches, accidental config files, etc)
    • [X] I have updated the documentation for the changes I have made
    • [X] I have added my changes to the CHANGELOG.md
    opened by thepabloaguilar 15
  • We have a problem with @pipeline and unwrap()


    from typing import TYPE_CHECKING
    from returns import Failure, pipeline, Result
    
    @pipeline
    def test(x: int) -> Result[int, str]:
        res = Failure(bool).unwrap()
        return Failure('a')
    
    if TYPE_CHECKING:
        reveal_type(test(1))
        # => Revealed type is 'returns.result.Result[builtins.int, builtins.str]'
    
    print(test(1))
    # => <Failure: <class 'bool'>>
    

    I am pretty sure, that we will have to change how @pipeline works. This is also related to #89

    bug 
    opened by sobolevn 14
  • Unwrapping failures erases exception type information


    Let's say I unwrap a failure and want to catch it using try-except.

    The only way to do so currently is:

    from returns.result import Result
    from returns.functions import safe
    from returns.primitives.exceptions import UnwrapFailedError
    
    @safe
    def foo(i) -> Result[int, Exception]:
      if i == 0:
        raise ValueError("problem")
      return i + 5
    
    try:
      result = foo('a') # Will raise TypeError
      # do stuff
      result.unwrap()
    except UnwrapFailedError as e:
      try:
        raise e.halted_container.failure() from e
      except ValueError:
        print("Don't use zero")
      except TypeError:
        print("Dont use strings")
    

    This makes exception handling code more complex than it should be.

    Naturally you could have also used:

    @safe
    def handle_err(e) -> Result[None, Exception]:
      if isinstance(e, ValueError):
        print("Don't use zero")
      elif isinstance(e, TypeError):
        print("Dont use strings")
      else:
        raise e from e
    
    try:
      foo('a').rescue(handle_err).unwrap()
    except UnwrapFailedError:
      print("Unhandled exception!!!")
    

    But the error handling code above simply feels unpythonic and awkward.

    Rescuing from an error fits the usecase where you want to log the error and proceed or any other generic error handling situation.

    You could make rescue() accept a mapping from exception types to callables and call the appropriate error handler when an exception occurred, but I don't see why you'd have to do so.

    It seems to me that the UnwrapFailedError exception is not necessary and hinders code readability. The try-except clause was created for such situations. We should use it.

    opened by thedrow 13
  • returns.assert_trace doesn't work with @safe-wrapped functions


    First of all, thank you! I really appreciate your work and your help on the Telegram group :)

    Bug report

    What's wrong

    returns.assert_trace fails to "intercept" containers created from @safe decorators.

    # works
    def func_1(n) -> Result[int, Exception]:
        if n == 0:
            return Failure(Exception())
        return Success(n + 1)
    
    # doesn't work
    @safe
    def func_1(n) -> int:
        if n == 0:
            raise Exception()
        return n + 1
    
    
    def test_func_1(returns: ReturnsAsserts):
        with returns.assert_trace(Success, func_1):
            flow(1, func_1)
    

    The test with the wrapped function returns

    FAILED returns_test.py::test_func_1 - Failed: No container Success was created
    

    I tried doing this, but no difference:

    def test_func_1(returns: ReturnsAsserts):
        with returns.assert_trace(Success, safe(func_1)):
            flow(1, safe(func_1))
    
    

    How it should be

    The above test should pass

    System information

    • python version: 3.8.0
    • returns version: 0.15.0
    • pytest version (if any): 6.2.2
    bug 
    opened by ftruzzi 12
  • Consider adding `.cond` to `Result` container


    Either provides a way to avoid the boilerplate below:

    def func():
        if something:
            return Success(...)
        return Failure(...)
    

    It provides the .cond method; with the Result container it would be something like this:

    def func():
        return Result.cond(
            my_condition,   # The boolean value
            success_value,  # If boolean is `True` the return will be Success(success_value)
            failure_value,  # If boolean is `False` the return will be Failure(failure_value)
        )
    
    opened by thepabloaguilar 12
  • Consider implementing __str__ / __repr__ for suitable containers


    Bug report

    What's wrong

    When writing test cases for a method that returns Maybe[T], I found that pytest reports a failure as follows which is not very readable:

    <returns.maybe._Some object at 0x7fddfe56c140> != <returns.maybe._Some object at 0x7fddfe5aea00>

    How it should be

    It would be much better if Some implemented __str__ (and maybe also __repr__) so that it could look like below:

    <Some(10) != Some(20)>

    System information

    • python version: 3.8
    • returns version: 0.14.0
    bug help wanted good first issue hacktoberfest 
    opened by mysticfall 11
  • build(deps-dev): bump slotscheck from 0.16.0 to 0.16.2


    Bumps slotscheck from 0.16.0 to 0.16.2.

    Changelog

    Sourced from slotscheck's changelog.

    0.16.2 (2022-12-29)

    • Don't flag Protocol classes as needing slots if strict require-subclass option is enabled.

    0.16.1 (2022-11-21)

    • Don't flag TypedDict subclasses as missing slots (#120).
    Commits
    • 9fc82b2 Merge pull request #131 from ariebovenberg/protocols
    • 2abe53a require-subclass option now doesn't flag protocol classes (#130)
    • 215cf2a Merge pull request #122 from ariebovenberg/dependabot/pip/sphinx-click-approx...
    • 37a502d Update sphinx-click requirement from ~=4.3.0 to ~=4.4.0
    • e23154f fix date in changelog
    • af1c122 Merge pull request #121 from ariebovenberg/typeddict
    • 3cbcdd8 fix false positive for typeddicts
    • 4ffe482 Merge pull request #119 from ariebovenberg/dependabot/pip/mypy-0.991
    • e87649d Bump mypy from 0.990 to 0.991
    • 708335d Merge pull request #118 from ariebovenberg/dependabot/pip/tox-3.27.1
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies python 
    opened by dependabot[bot] 0
  • latest python + mypy + returns panicking on simple flow invocation


    Bug report

    What's wrong

    I have been unable to typecheck any pipe or flow results yet. Here's a very simple example:

    from returns.pipeline import flow
    
    
    def test_flow() -> None:
        func = lambda x: x + 1
        return flow(1, func, func, func)
    
    print(test_flow())
    
    ❯ python test_mypy_returns_plugin.py
    4
    
    ❯ mypy --show-traceback test_mypy_returns_plugin.py
    test_mypy_returns_plugin.py:6: error: INTERNAL ERROR -- Please try using mypy master on GitHub:
    https://mypy.readthedocs.io/en/stable/common_issues.html#using-a-development-mypy-build
    Please report a bug at https://github.com/python/mypy/issues
    version: 0.991
    Traceback (most recent call last):
      File "mypy/checkexpr.py", line 4657, in accept
      File "mypy/checkexpr.py", line 410, in visit_call_expr
      File "mypy/checkexpr.py", line 530, in visit_call_expr_inner
      File "mypy/checkexpr.py", line 1181, in check_call_expr_with_callee_type
      File "mypy/checkexpr.py", line 1264, in check_call
      File "mypy/checkexpr.py", line 1466, in check_callable_call
      File "mypy/checkexpr.py", line 1004, in apply_function_plugin
      File "/home/bill/.cache/pypoetry/virtualenvs/pareto-gaming-6nwDpNVq-py3.11/lib/python3.11/site-packages/returns/contrib/mypy/_features/flow.py", line 50, in analyze
        ).from_callable_sequence(
          ^^^^^^^^^^^^^^^^^^^^^^^
      File "/home/bill/.cache/pypoetry/virtualenvs/pareto-gaming-6nwDpNVq-py3.11/lib/python3.11/site-packages/returns/contrib/mypy/_typeops/inference.py", line 122, in from_callable_sequence
        analyze_call(
      File "/home/bill/.cache/pypoetry/virtualenvs/pareto-gaming-6nwDpNVq-py3.11/lib/python3.11/site-packages/returns/contrib/mypy/_typeops/analtype.py", line 56, in analyze_call
        return_type, checked_function = checker.check_call(
                                        ^^^^^^^^^^^^^^^^^^^
    TypeError: 'arg_messages' is an invalid keyword argument for check_call()
    test_mypy_returns_plugin.py:6: : note: use --pdb to drop into pdb
    

    How it should be

    I'm very new to the library, so I don't know precisely what should be happening, but the docs definitely indicate that the plugin should type flow, pipe and others.

    System information

    • python version: 3.11.1

    • returns version: 0.19.0

    • mypy version: 0.991

    • hypothesis version (if any): n/a

    • pytest version (if any): n/a

    bug 
    opened by billwanjohi 0
  • build(deps-dev): bump attrs from 22.1.0 to 22.2.0


    Bumps attrs from 22.1.0 to 22.2.0.

    Release notes

    Sourced from attrs's releases.

    22.2.0

    Highlights

    It's been a lot busier than the changelog indicates, but a lot of the work happened under the hood (like some impressive performance improvements). But we've still got one big new feature that's worthy of the holidays:

    Fields now have an alias argument that allows you to set the field's name in the generated __init__ method. This is especially useful for those who aren't fans of attrs's behavior of stripping underscores from private attribute names.

    Special Thanks

    This release would not be possible without my generous sponsors! Thank you to all of you making sustainable maintenance possible! If you would like to join them, go to https://github.com/sponsors/hynek and check out the sweet perks!

    Above and Beyond

    Variomedia AG (@​variomedia), Tidelift (@​tidelift), Sentry (@​getsentry), HiredScore (@​HiredScore), FilePreviews (@​filepreviews), and Daniel Fortunov (@​asqui).

    Maintenance Sustainers

    @​rzijp, Adam Hill (@​adamghill), Dan Groshev (@​si14), Tamir Bahar (@​tmr232), Adi Roiban (@​adiroiban), Magnus Watn (@​magnuswatn), David Cramer (@​dcramer), Moving Content AG (@​moving-content), Stein Magnus Jodal (@​jodal), Iwan Aucamp (@​aucampia), ProteinQure (@​ProteinQure), Jesse Snyder (@​jessesnyder), Rivo Laks (@​rivol), Thomas Ballinger (@​thomasballinger), @​medecau, Ionel Cristian Mărieș (@​ionelmc), The Westervelt Company (@​westerveltco), Philippe Galvan (@​PhilippeGalvan), Birk Jernström (@​birkjernstrom), Jannis Leidel (@​jezdez), Tim Schilling (@​tim-schilling), Chris Withers (@​cjw296), and Christopher Dignam (@​chdsbd).

    Not to forget 2 more amazing humans who chose to be generous but anonymous!

    Full Changelog

    Backwards-incompatible Changes

    • Python 3.5 is not supported anymore. #988

    Deprecations

    • Python 3.6 is now deprecated and support will be removed in the next release. #1017

    Changes

    • attrs.field() now supports an alias option for explicit __init__ argument names.

      Get __init__ signatures matching any taste, peculiar or plain! The PEP 681 compatible alias option can be used to override private attribute name mangling, or add other arbitrary field argument name overrides. #950

    • attrs.NOTHING is now an enum value, making it possible to use with e.g. typing.Literal. #983

    • Added missing re-import of attr.AttrsInstance to the attrs namespace. #987

    • Fix slight performance regression in classes with custom __setattr__ and speedup even more. #991

    • Class-creation performance improvements by switching performance-sensitive templating operations to f-strings.

      You can expect an improvement of about 5% -- even for very simple classes. #995

    ... (truncated)

    Changelog

    Sourced from attrs's changelog.

    22.2.0 - 2022-12-21

    Backwards-incompatible Changes

    • Python 3.5 is not supported anymore. #988

    Deprecations

    • Python 3.6 is now deprecated and support will be removed in the next release. #1017

    Changes

    • attrs.field() now supports an alias option for explicit __init__ argument names.

      Get __init__ signatures matching any taste, peculiar or plain! The PEP 681 compatible alias option can be used to override private attribute name mangling, or add other arbitrary field argument name overrides. #950

    • attrs.NOTHING is now an enum value, making it possible to use with e.g. typing.Literal. #983

    • Added missing re-import of attr.AttrsInstance to the attrs namespace. #987

    • Fix slight performance regression in classes with custom __setattr__ and speedup even more. #991

    • Class-creation performance improvements by switching performance-sensitive templating operations to f-strings.

      You can expect an improvement of about 5% -- even for very simple classes. #995

    • attrs.has() is now a TypeGuard for AttrsInstance. That means that type checkers know a class is an instance of an attrs class if you check it using attrs.has() (or attr.has()) first. #997

    • Made attrs.AttrsInstance stub available at runtime and fixed type errors related to the usage of attrs.AttrsInstance in Pyright. #999

    • On Python 3.10 and later, call abc.update_abstractmethods() on dict classes after creation. This improves the detection of abstractness. #1001

    • attrs's pickling methods now use dicts instead of tuples. That is safer and more robust across different versions of a class. #1009

    • Added attrs.validators.not_(wrapped_validator) to logically invert wrapped_validator by accepting only values where wrapped_validator rejects the value with a ValueError or TypeError (by default, exception types configurable). #1010

    • The type stubs for attrs.cmp_using() now have default values. #1027

    • To conform with PEP 681, attr.s() and attrs.define() now accept unsafe_hash in addition to hash. #1065

    Commits
    • a9960de Prepare 22.2.0
    • 566248a Don't linkcheck tree links
    • 0f62805 Make towncrier marker independent from warning
    • b9f35eb Fix minor stub issues (#1072)
    • 4ad4ea0 Use MyST-native include
    • 519423d Use MyST-native doctest blocks in all MD
    • 403adab Remove stray file
    • 6957e4a Use new typographic branding in the last rst file, too
    • 1bb2864 Convert examples.rst to md
    • c1c24cc Convert glossary.rst to md
    • Additional commits viewable in compare view

    dependencies python 
    opened by dependabot[bot] 0
  • Fixes dependencies security issues


    I have made things!

    Checklist

    • [X] I have double checked that there are no unrelated changes in this pull request (old patches, accidental config files, etc)
    • [X] I have created at least one test case for the changes I have made
    • [X] I have updated the documentation for the changes I have made
    • [X] I have added my changes to the CHANGELOG.md
    opened by thepabloaguilar 1
  • build(deps-dev): bump hypothesis from 6.57.1 to 6.61.0


    Bumps hypothesis from 6.57.1 to 6.61.0.

    Release notes

    Sourced from hypothesis's releases.

    Hypothesis for Python - version 6.61.0

    This release improves our treatment of database keys, which are based on (among other things) the source code of your test function. We now post-process this source to ignore decorators, comments, trailing whitespace, and blank lines - so that you can add "@example()"s or make some small no-op edits to your code without preventing replay of any known failing or covering examples.

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.60.1

    This patch updates our vendored list of top-level domains, which is used by the provisional "domains()" strategy.

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.60.0

    This release improves Hypothesis' ability to resolve forward references in type annotations. It fixes a bug that prevented "builds()" from being used with pydantic models that possess updated forward references. See issue #3519.

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.59.0

    The "@​example(...)" decorator now has a ".via()" method, which future tools will use to track automatically-added covering examples (issue #3506).

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.58.2

    This patch updates our vendored list of top-level domains, which is used by the provisional "domains()" strategy.

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.58.1

    This patch shifts "hypothesis[lark]" from depending on the old lark-parser package to the new lark package. There are no code changes in Hypothesis, it's just that Lark got a new name on PyPI for version 1.0 onwards.

    The canonical version of these notes (with links) is on readthedocs.

    Hypothesis for Python - version 6.58.0

    "register_random()" has used "weakref" since 6.27.1 - 2021-11-22, allowing the "Random"-compatible objects to be garbage-collected when there are no other references remaining in order to avoid memory leaks. We now raise an error or emit a warning when this seems likely

    ... (truncated)

    Commits
    • 226028d Bump hypothesis-python version to 6.61.0 and update changelog
    • 49d2d81 Merge pull request #3523 from Zac-HD/decorator-free-database-key
    • 6e04f03 Bump hypothesis-python version to 6.60.1 and update changelog
    • 8cea3f1 Merge pull request #3528 from HypothesisWorks/create-pull-request/patch
    • 022df2d Fix autoupdate tools for new tox
    • abff9f2 Update pinned dependencies
    • ccd3231 Improve database keys
    • 3b3568b Tweak DJANGO_COLORS handling
    • 4c3ec36 Tune test for stability
    • 032375e Tidy up nocover pragmas
    • Additional commits viewable in compare view

    dependencies python 
    opened by dependabot[bot] 0
  • build(deps-dev): bump flake8-pyi from 22.10.0 to 22.11.0


    Bumps flake8-pyi from 22.10.0 to 22.11.0.

    Release notes

    Sourced from flake8-pyi's releases.

    22.11.0

    Bugfixes:

    • Specify encoding when opening files. Prevents UnicodeDecodeError on Windows when the file contains non-CP1252 characters. Contributed by Avasam.
    • Significant changes have been made to the Y041 check. Previously, Y041 flagged "redundant numeric unions" (e.g. float | int, complex | float or complex | int) in all contexts outside of type aliases. This was incorrect. PEP 484 only specifies that type checkers should treat int as an implicit subtype of float in the specific context of parameter annotations for functions and methods. Y041 has therefore been revised to only emit errors on "redundant numeric unions" in the context of parameter annotations.

    Other changes:

    • Support running with flake8 v6.
    Changelog

    Sourced from flake8-pyi's changelog.

    22.11.0

    Bugfixes:

    • Specify encoding when opening files. Prevents UnicodeDecodeError on Windows when the file contains non-CP1252 characters. Contributed by Avasam.
    • Significant changes have been made to the Y041 check. Previously, Y041 flagged "redundant numeric unions" (e.g. float | int, complex | float or complex | int) in all contexts outside of type aliases. This was incorrect. PEP 484 only specifies that type checkers should treat int as an implicit subtype of float in the specific context of parameter annotations for functions and methods. Y041 has therefore been revised to only emit errors on "redundant numeric unions" in the context of parameter annotations.

    Other changes:

    • Support running with flake8 v6.
    Commits

    dependencies python 
    opened by dependabot[bot] 0
Releases (0.19.0)
  • 0.19.0 (Mar 13, 2022)

  • 0.18.0 (Dec 31, 2021)

    New Year Release! 🎄

    Features

    • Now requires typing_extensions>=4.0
    • Now requires mypy>=0.930
    • Removes plugin for @safe, @maybe, @future, etc. Because we now use ParamSpec type to properly type decorators

    Bugfixes

    • Fixes __slots__ not being set properly in containers and their base classes
    • Fixes patching of containers in pytest plugin not undone after each test
  • 0.17.0 (Oct 5, 2021)

    Features

    • Python3.10 support
    • Enables Pattern Matching support for Result containers
    • Enables Pattern Matching support for Maybe container
    • Enables Pattern Matching support for IOResult container
    • Improves hypothesis plugin, now we detect when type cannot be constructed and give a clear error message
  • 0.16.0 (Mar 26, 2021)

    Features

    • Makes _Nothing a singleton
    • Refactor flow function to be faster

    Bugfixes

    • Fixes that assert_trace was not catching containers from @safe-wrapped functions

    Misc

    • Fixes typos in documentation
  • 0.15.0 (Oct 21, 2020)

    Features

    • Adds Higher Kinded Types partial support

    • Breaking: drops python3.6 support

    • Breaking: makes our mypy plugin not optional, but required!

    • Breaking: changes all RequiresContext-based type arguments order, previously we used to specify _EnvType as the first type argument, now it is the last one. This is done to respect new HKT rules

    • Breaking: renames .rescue to .lash

    • Breaking: removes all old interfaces from primitives/interfaces.py, use new typeclasses instead

    • Breaking: Maybe is fully reworked to be lawful

    • Breaking: removes value_or pointfree method, because it is impossible to express with HKT

    • Breaking: removes .value_or, .unwrap, and .failure methods from FutureResult and RequiresContext-based types, because we do require these methods to raise an exception on failure, but these methods were lazy and did not raise the required exception

    • Breaking: changes how is_successful is typed: now we allow any Unwrappable interface instances there, including custom ones

    • Breaking: changes UnwrapFailedError constructor, now it does accept an Unwrappable instance instead of a BaseContainer

    • Breaking: removes .fix method from all containers, also removes fix pointfree function

    • Breaking: Removes coalesce function, because it is impossible to properly type it

    • Breaking: Removes all Context* based types with .ask() method, use new .ask() methods on the Reader-based containers

    • Breaking: Now Future and FutureResult can be awaited multiple times

    • Breaking: Removes .unify() method from several containers, use unify() pointfree function instead

    • Breaking: Removes .from_iterable method from all containers, instead adds better iterables support, we now have returns.iterables module with Fold helper

    • Breaking: Renames property empty to no_args of all RequiresContext-based classes

    • Adds new public interfaces: see returns.interfaces

    • Adds methods package with several helpful things inside

    • Adds FutureSuccess and FutureFailure unit functions to be similar to Result and IOResult

    • Adds .swap method to Result, IOResult, FutureResult, and other result based containers

    • Adds .modify_env method to all RequiresContext* types

    • Adds .rescue to Maybe

    • Adds .equals methods to types that can be compared directly: Result, Maybe, IO, IOResult

    • Adds missing from_requires_context_future_result to RequiresContext

    • Adds .from_optional and .bind_optional to Maybe container

    • Adds __slots__ to UnwrapFailedError with halted_container

    • Changes flatten to work with KindN and any possible container

    • Adds a helper to test traces to our pytest plugin

    • Adds cond function to pointfree and methods packages

    • Adds compose_result HKT method and pointfree function

    • Adds unify HKT pointfree function

    • Adds bimap pointfree function

    • Adds unwrap_or_failure function to methods package

    • Adds collect_trace helper function for better development experience

    • Adds hypothesis integration and pre-defined "monad laws as values"

    • Adds assert_equal method to our pytest plugin

    Bugfixes

    • Breaking: fixes serious typing issue and changes how flow works
    • Breaking: fixes serious typing issue and changes how pipe works, now it has a hard limit of 20 parameters
    • Fixes that RequiresContextFutureResult was not supported by pytest plugin
    • Fixes incorrect partial behaviour in an edge case, #618
    • Fixes that .apply method of IOResult was working incorrectly, it was returning IOFailure(2) as a result of IOFailure(1).apply(IOFailure(2))
    • Fixes bug that safe(tap(...)) was revealing invalid types sometimes

    Misc

    • Adds a lot of new typetests
    • Ensures that all math laws are now checked for all types
    • Changes docs structure, adds new Interfaces, HKT, and Methods pages
    • Changes __str__ method in BaseContainer class to __repr__ method
    • Adds Quickstart guide
  • 0.14.0(Jun 7, 2020)

    Special thanks to:

    • @orsinium
    • @thedrow
    • @thepabloaguilar
    • and other contributors (of code and ideas) for making this release possible. You are awesome!

    Announcement: https://sobolevn.me/2020/06/how-async-should-have-been

    Features

    • Breaking: renames mypy plugin from decorator_plugin to returns_plugin because of a complete rewrite and lots of new features

    • Breaking: changes @safe, @impure, impure_safe, @maybe semantics: they do not work with async functions anymore; now you are forced to use Future and its helpers to work with async functions

    • Breaking: renames Maybe.new to Maybe.from_value. Because all our other containers support this protocol. Only Maybe was different, sorry for that!

    • Breaking: renames .from_success() to .from_value(), there's no need for two separate methods

    • Breaking: renames .from_successful_io() to .from_io(), there's no need for two separate methods

    • Breaking: renames .from_successful_context() to .from_context(), there's no need for two separate methods

    • Breaking: since we now support the .apply() method, the *_squash converters are no longer needed and are removed

    • Breaking: renames Instanceable to Applicative

    • Breaking: changes .from_io and .from_failed_io of IOResult to return Any instead of NoReturn unfilled type

    • Breaking: removes .lift and .lift_* methods from all containers, use map_, bind_result, bind_io, and other pointfree helpers instead

    • Breaking: removes @pipeline function. It was a mistake: it does not work with mixed container types, it does not type failures properly, it does not work with IO and Future, and it forces you to write imperative code in a functional codebase. Use flow instead

    • Adds typed partial and curry mypy plugins!

    • Adds typed flow plugin, now it can accept any number of arguments and it has excellent type inference (see the sketch after this list)

    • Adds typed pipe plugin, now it can accept any number of arguments and it has good type inference

    • Adds managed pipeline function that is useful for working with stateful computations

    • Adds typed map_, fix, and alt pointfree functions

    • Adds typed bind_result, bind_io, bind_ioresult, bind_context, bind_context_result, bind_future, bind_async, and bind_awaitable pointfree functions

    • Adds typed bind_async_future and bind_async_future_result pointfree functions

    • Adds typed unify pointfree function

    • Adds typed apply pointfree function

    • Adds typed value_or pointfree function

    • Adds pytest plugin with the ability to test error handling

    • Adds Future container to easily work with async functions

    • Adds FutureResult container to easily work with async functions that might fail

    • Adds RequiresContextFutureResult container

    • Adds ReaderFutureResult alias for RequiresContextFutureResult

    • Adds RequiresContextFutureResultE and ReaderFutureResultE aliases

    • Adds Future, FutureResult and RequiresContextFutureResult support for all existing pointfree functions

    • Adds bind_io method to IOResult

    • Adds bind_io method to RequiresContextIOResult

    • Adds or_else method to Maybe

    • Adds .from_io and .from_failed_io to RequiresContextIOResult

    • Syncs naming in from_* methods, now all parameters are named inner_value

    • Adds not_ composition helper

    • Adds flatten support for Future, FutureResult and RequiresContextFutureResult

    • Adds __copy__ and __deepcopy__ magic methods to Immutable class

    • Speeds up is_successful function

    • Makes all Context context helpers abstract, so you cannot create new instances of this class, also adds __slots__ to these classes

    • Improves RequiresContext* types with NoDeps where it is logically true
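
    A minimal sketch of the flow function that the new typed plugin infers (the values are purely illustrative):

    from returns.pipeline import flow

    # flow passes the first argument through all the given functions, left to right:
    assert flow(1, lambda x: x + 1, lambda x: x * 2) == 4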

    Bugfixes

    • Fixes that @safe decorator was generating incorrect signatures for functions with Any
    • Fixes that .rescue() of RequiresContextResult was returning Any
    • Fixes that .rescue() of RequiresContextIOResult was returning Any
    • Fixes that RequiresContextResult and RequiresContextIOResult were not final
    • Fixes that ImmutableStateError was not a subclass of AttributeError
    • Fixes that IOResult was not showing str representation of wrapped inner_value

    Misc

    • Replaces pytest-asyncio with anyio plugin, now we test compatibility with any IO stack: asyncio, trio, curio
    • Updates lots of dependencies
    • Adds lots of new tests
    • Updates lots of docs
    • Removes "IO marker" name from docs in favor for "IO container", it is not special at all. Why would we call it differently?
  • 0.13.0(Feb 2, 2020)

    Announcing article: https://sobolevn.me/2020/02/typed-functional-dependency-injection

    Features

    • Breaking: renames join to flatten, sorry!

    • Breaking: renames box to bind and moves it to returns.pointfree

    • Breaking: removes Maybe.rescue and Maybe.fix methods

    • Breaking: renames io_squash to squash_io and moves it to returns.converters

    • Breaking: moves all interfaces from returns.primitives.container to returns.primitives.interfaces

    • Adds rescue pointfree function

    • Adds ResultE alias for Result[..., Exception]

    • Adds RequiresContext container and Context helper class

    • Adds RequiresContext support for bind pointfree function

    • Adds RequiresContext support for flatten function

    • Adds RequiresContextResult container

    • Adds RequiresContextResultE alias

    • Adds ReaderResult and ReaderResultE aliases for RequiresContextResult[..., ..., Exception]

    • Adds RequiresContextResult support for bind and rescue

    • Adds RequiresContextResult support for flatten

    • Adds IOResult helper to work better with IO[Result[a, b]] (see the sketch after this list)

    • Adds IOResultE alias for IOResult[a, Exception]

    • Adds IOResult support for bind

    • Adds IOResult support for flatten

    • Adds IOResult support for @pipeline

    • Adds IOResult support for coalesce

    • Adds IOResult support for is_successful

    • Adds RequiresContextIOResult container

    • Adds RequiresContextIOResultE alias

    • Adds ReaderIOResult and ReaderIOResultE aliases for RequiresContextIOResult[..., ..., Exception]

    • Adds RequiresContextIOResult support for bind and rescue

    • Adds RequiresContextIOResult support for flatten

    • Adds Result.lift, Maybe.lift, RequiresContext.lift, and RequiresContextResult.lift functions in addition to IO.lift

    • Adds Immutable primitive type

    • Adds Unitable protocol and .from_success() and .from_failure() methods for all Result related classes

    • Adds Instanceable protocol and .from_value() method for IO and RequiresContext

    • Adds flow function, which is similar to pipe

    • Adds swap converter for Result and IOResult

    • Adds squash_context function to squash RequiresContext similar to IO
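
    A minimal sketch of the new IOResult container (the verify function here is purely illustrative):

    from returns.io import IOResult, IOSuccess, IOFailure

    def verify(user_id: int) -> IOResult[int, str]:
        # IOSuccess and IOFailure mark impure success and failure explicitly:
        if user_id > 0:
            return IOSuccess(user_id)
        return IOFailure('invalid user id')

    assert verify(1) == IOSuccess(1)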

    Bugfixes

    • Now Success and Failure (both io and pure) return Any and not NoReturn
    • Fixes how flatten works, also adds more tests and docs about Failure case
    • Fixes Unwrappable type being parametrized with only one TypeVar
    • Changes Success and Failure to return Any instead of NoReturn

    Misc

    • Updates poetry version in travis
    • Improves pipe docs regarding the lambda and Generic problem
    • Improves docs in several places
    • Now examples in docs try to be doctests where possible
    • Changes how tests are checked with mypy in CI
  • 0.12.0(Dec 21, 2019)

    Features

    • Breaking: now @pipeline requires a container type when created: @pipeline(Result) or @pipeline(Maybe)
    • Maybe and Result now have success_type and failure_type aliases
    • Adds Result.unify utility method for better error type composition
    • We now support dry-python/classes as a first-class citizen
    • Adds io_squash to squash several IO containers into one container with a tuple inside, currently works with up to 9 containers at a time
    • Adds untap function, which converts the return type to None

    Bugfixes

    • Fixes that containers were not usable with multiprocessing
    • Changes the inheritance order, now BaseContainer is the first child
    • Fixes that Nothing had incorrect docstrings

    Misc

    • Now the generated package is protected
    • Updates poetry to 1.0
  • 0.11.0(Aug 29, 2019)

    returns@0.11

    Features

    • Breaking: now pipe() does not take the value as its first argument; instead you call it as pipe(f1, f2, f3, f4)(value) (see the sketch after this list)
    • Breaking: dropped everything from returns/__init__.py, because we now have quite a lot of stuff
    • Breaking: dropped support of zero argument functions for Nothing.fix
    • Breaking: dropped support of zero argument functions for Nothing.rescue
    • Maybe now has .failure() to match the same API as Result
    • Adds identity function
    • Adds tap function
    • Now pipe allows up to 8 steps
    • Adds coalesce_result and coalesce_maybe converters
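
    A minimal sketch of the new pipe() calling convention (illustrative steps, using the current import path):

    from returns.pipeline import pipe

    # pipe composes the steps first and applies the value afterwards:
    transform = pipe(
        lambda x: x + 1,
        lambda x: x * 2,
    )
    assert transform(1) == 4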

    Bugfixes

    • Fixes that code inside .fix and .rescue of Maybe might be called twice

    Misc

    • Now all methods have doctests
    • Updates docs about Success and _Success, Failure and _Failure
    • Updates docs about @pipeline
    • Typechecks async functions and decorators inside typesafety/ tests
  • 0.10.0(Aug 18, 2019)

    Features

    • Breaking: python>=3.7,<=3.7.2 are not supported anymore, because of a bug inside typing module
    • Breaking: Now bind does not change the type of an error
    • Breaking: Now rescue does not change the type of a value
    • Breaking: Renames map_failure to alt
    • Adds box() function to box a function for direct container composition: it turns a -> Container[b] into Container[a] -> Container[b] (see the sketch after this list)
    • Adds IO.lift() function to lift a -> a to IO[a] -> IO[a]
    • Adds pipe() function to pipeline.py
    • Adds __hash__() magic methods to all containers
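
    A minimal sketch of what box() does; since box was later renamed to bind (see the 0.13.0 notes), the example uses the current name and import path:

    from returns.pointfree import bind
    from returns.result import Result, Success

    def to_string(arg: int) -> Result[str, Exception]:
        return Success(str(arg))

    # bind turns `int -> Result[str, Exception]`
    # into `Result[int, Exception] -> Result[str, Exception]`:
    assert bind(to_string)(Success(1)) == Success('1')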

    Bugfixes

    • Changes Any to NoReturn in Success and Failure
    • Now all type parameters in Result, Maybe, and IO are covariant

    Misc

    • Massive docs rewrite
    • Updates mypy version
    • Updates wemake-python-styleguide and introduces nitpick
    • Updates pytest-plugin-mypy, all tests now use yml
  • 0.9.0(Jul 1, 2019)

    Features

    • Provides a bunch of primitive interfaces to write your own containers
    • Adds .map_failure() method
    • Adds join() function to join nested containers (see the sketch after this list)
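
    A minimal sketch of join(); since it was later renamed to flatten (see the 0.13.0 notes), the example uses the current name and import path:

    from returns.converters import flatten
    from returns.maybe import Some

    # flatten removes exactly one level of container nesting:
    assert flatten(Some(Some(1))) == Some(1)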

    Bugfixes

    • Fixes type of Maybe.fix and Maybe.rescue to work with both lambda: 1 and lambda _: 1

    Misc

    • Improves README
  • 0.8.0(Jun 17, 2019)

    Features

    • Reintroduces the Maybe monad, typed!
    • Introduces converters from one type to another
    • Adds mypy plugin to type decorators
    • Complete rewrite of Result types
    • Partial API change, now Success and Failure are not types, but functions
    • New internal types introduced: FixableContainer and ValueUnwrapContainer

    Bugfixes

    • Fixes issue when you could return IO container from Result.bind
    • Fixes @pipeline return type

    Misc

    • Reapplied all types to .py files
    • Improved docs about IO and Container concept
    • Adds docs about container composition
    • Moves from Alpha to Beta
  • 0.7.0(Jun 11, 2019)

    Features

    • Adds IO marker (see the sketch after this list)
    • Adds unsafe module with unsafe functions
    • Changes how functions are located inside the project
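
    A minimal sketch of the IO marker (illustrative values):

    from returns.io import IO

    # IO marks a value as coming from an impure source,
    # while .map keeps working with the value inside:
    assert IO(1).map(lambda x: x + 1) == IO(2)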

    Bugfixes

    • Fixes container type in @pipeline
    • Now is_successful is public
    • Now raise_exception is public

    Misc

    • Changes how str() function works for container types
    • Total rename to "container" in the source code
  • 0.6.0(Jun 7, 2019)

  • 0.5.0(Jun 1, 2019)

    Features

    • Adds compose helper function (see the sketch after this list)
    • Adds public API to import returns
    • Adds raise_exception helper function
    • Adds full traceback to .unwrap()
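
    A minimal sketch of the new compose helper (illustrative functions):

    from returns.functions import compose

    # compose(first, second) pipes the result of `first` into `second`:
    float_then_str = compose(float, str)
    assert float_then_str(1) == '1.0'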

    Misc

    • Updates multiple dev-dependencies, including mypy
    • Now search in the docs is working again
    • Relicenses this project to BSD
    • Fixes copyright notice in the docs
  • 0.4.0(Feb 4, 2019)

    Features

    • Moves all types to .pyi files
    • Renames all classes according to new naming pattern
    • HUGE improvement of types
    • Renames fmap to map
    • Renames do_notation to pipeline, moves it to functions.py
    • Renames ebind to rescue
    • Renames efmap to fix
    • Renames Monad to Container
    • Removes Maybe monad, since typing does not have NonNullable type
  • 0.3.1(Feb 2, 2019)

  • 0.3.0(Feb 2, 2019)

    The project is renamed to returns and moved to dry-python org.

    Features

    • Adds .pyi files for all modules, to enable mypy support for 3rd party users
  • 0.2.0(Jan 30, 2019)

    Features

    • Adds Maybe monad
    • Adds immutability and __slots__ to all monads
    • Adds methods to work with failures
    • Adds safe decorator to convert exceptions to Either monad (see the sketch after this list)
    • Adds is_successful() function to detect if your result is a success
    • Adds failure() method to unwrap values from failed monads
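
    A minimal sketch of the safe decorator together with is_successful(); it uses the current import paths, which differ from the 0.2.0 ones:

    from returns.result import safe
    from returns.pipeline import is_successful

    @safe  # wraps raised exceptions into a Failure container
    def divide(first: int, second: int) -> float:
        return first / second

    assert is_successful(divide(4, 2))
    assert not is_successful(divide(1, 0))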

    Bugfixes

    • Changes the type of .bind method for Success monad
    • Changes how equality works, so now Failure(1) != Success(1)
    • Changes how new instances are created on unused methods

    Misc

    • Improves docs
  • 0.1.1(Jan 28, 2019)

  • 0.1.0(Jan 28, 2019)
