HTTP traffic mocking and testing made easy in Python

Overview

pook is a versatile, expressive and hackable utility library for HTTP traffic mocking and expectations in Python. Heavily inspired by gock.

To get started, see the documentation, how it works, FAQ or examples.

Features

  • Simple, expressive and fluent API.
  • Provides both Pythonic and chainable DSL API styles.
  • Full-featured HTTP response definitions and expectations.
  • Matches any HTTP protocol primitive (URL, method, query params, headers, body...).
  • Full regular expression support for mock expectation matching.
  • Supports most popular HTTP clients via interceptor adapters.
  • Configurable volatile, persistent or TTL limited mocks.
  • Works with any testing framework/engine (unittest, pytest, nosetests...).
  • First-class JSON & XML support for both matching and responses (see the sketch after this list).
  • Supports JSON Schema body matching.
  • Works in both runtime and testing environments.
  • Can be used as decorator and/or via context managers.
  • Supports real networking mode with optional traffic filtering.
  • Map/filter mocks easily for generic or custom mock expectations.
  • Custom user-defined mock matcher functions.
  • Simulated raised error exceptions.
  • Network delay simulation (only available for aiohttp).
  • Pluggable and hackable API.
  • Customizable HTTP traffic mock interceptor engine.
  • Supports third-party mocking engines, such as mocket.
  • Fits well for painless test doubles.
  • Does not support WebSocket traffic mocking.
  • Works with Python 2.7+ and 3.0+ (including PyPy).
  • Lightweight: just two small dependencies, for JSON Schema validation and XML tree comparison.
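
As a quick taste of the matching and response DSL listed above, here is a minimal sketch assembled from the examples later in this README (the URL and payload are illustrative):

import pook
import requests

pook.on()

# Match only POST requests to this URL that carry the given JSON body,
# and expire the mock after two matching calls.
(pook.post('http://server.com/api', json={'query': 'foo'}, times=2)
    .reply(201)
    .json({'created': True}))

res = requests.post('http://server.com/api', json={'query': 'foo'})
assert res.status_code == 201
assert res.json() == {'created': True}

pook.off()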

Supported HTTP clients

pook can work with multiple mock engines, but it ships with a built-in one by default, which currently supports traffic mocking for the following HTTP clients:

  • urllib3
  • requests
  • aiohttp
  • urllib / http.client

More HTTP clients can be supported progressively.

Note: only recent HTTP client package versions were tested.

Installation

Using the pip package manager (requires pip 1.8+):

pip install --upgrade pook

Or install the latest sources from GitHub:

pip install -e git+https://github.com/h2non/pook.git#egg=pook

Getting started

See the ReadTheDocs documentation.

API

See the annotated API reference documentation.

Examples

See the examples documentation for full-featured code and use-case examples.

Basic mocking:

import pook
import requests

@pook.on
def test_my_api():
    mock = pook.get('http://twitter.com/api/1/foobar', reply=404, response_json={'error': 'not found'})

    resp = requests.get('http://twitter.com/api/1/foobar')
    assert resp.status_code == 404
    assert resp.json() == {"error": "not found"}
    assert mock.calls == 1

Using the chainable API DSL:

import pook
import requests

@pook.on
def test_my_api():
    mock = (pook.get('http://twitter.com/api/1/foobar')
              .reply(404)
              .json({'error': 'not found'}))

    resp = requests.get('http://twitter.com/api/1/foobar')
    assert resp.json() == {"error": "not found"}
    assert mock.calls == 1

Using the decorator:

import pook
import requests

@pook.get('http://httpbin.org/status/500', reply=204)
@pook.get('http://httpbin.org/status/400', reply=200)
def fetch(url):
    return requests.get(url)

res = fetch('http://httpbin.org/status/400')
print('#1 status:', res.status_code)

res = fetch('http://httpbin.org/status/500')
print('#2 status:', res.status_code)

Simple unittest integration:

import pook
import unittest
import requests


class TestUnitTestEngine(unittest.TestCase):

    @pook.on
    def test_request(self):
        pook.get('server.com/foo').reply(204)
        res = requests.get('http://server.com/foo')
        self.assertEqual(res.status_code, 204)

    def test_request_with_context_manager(self):
        with pook.use():
            pook.get('server.com/bar', reply=204)
            res = requests.get('http://server.com/bar')
            self.assertEqual(res.status_code, 204)
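
The same pattern works unchanged under pytest; a minimal sketch mirroring the unittest example above:

import pook
import requests


@pook.on
def test_simple_pook_request():
    pook.get('server.com/foo').reply(204)
    res = requests.get('http://server.com/foo')
    assert res.status_code == 204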

Using the context manager for isolated HTTP traffic interception blocks:

import pook
import requests

# Enable HTTP traffic interceptor
with pook.use():
    pook.get('http://httpbin.org/status/500', reply=204)

    res = requests.get('http://httpbin.org/status/500')
    print('#1 status:', res.status_code)

# Interception-free HTTP traffic
res = requests.get('http://httpbin.org/status/200')
print('#2 status:', res.status_code)
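
The real networking mode mentioned in the feature list lets unmatched traffic reach the actual network; a minimal sketch, assuming the enable_network() / disable_network() helpers described in the documentation:

import pook
import requests

pook.on()

# Let requests that match no mock pass through to the real network
# (the docs also describe optional traffic filtering by host).
pook.enable_network()

# This request matches the mock and never leaves the process.
pook.get('http://httpbin.org/status/500', reply=204)
res = requests.get('http://httpbin.org/status/500')
print('mocked status:', res.status_code)

pook.disable_network()
pook.off()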

Example using the mocket Python library as the underlying mock engine:

import pook
import requests
from mocket.plugins.pook_mock_engine import MocketEngine

# Use mocket library as underlying mock engine
pook.set_mock_engine(MocketEngine)

# Explicitly enable pook HTTP mocking (optional)
pook.on()

# Target server URL to mock out
url = 'http://twitter.com/api/1/foobar'

# Define your mock
mock = pook.get(url,
                reply=404, times=2,
                headers={'content-type': 'application/json'},
                response_json={'error': 'foo'})

# Run first HTTP request
requests.get(url)
assert mock.calls == 1

# Run second HTTP request
res = requests.get(url)
assert mock.calls == 2

# Assert response data
assert res.status_code == 404
assert res.json() == {'error': 'foo'}

# Explicitly disable pook (optional)
pook.off()

Example using the Hy language (a Lisp dialect for Python):

(import [pook])
(import [requests])

(defn request [url &optional [status 404]]
  (doto (.mock pook url) (.reply status))
  (let [res (.get requests url)]
    (. res status_code)))

(defn run []
  (with [(.use pook)]
    (print "Status:" (request "http://server.com/foo" :status 204))))

;; Run test program
(defmain [&args] (run))

Development

Clone the repository:

git clone [email protected]:h2non/pook.git

Install dependencies:

pip install -r requirements.txt -r requirements-dev.txt

Alternatively, install dependencies via make:

make install

Lint code:

make lint

Run tests:

make test

Generate documentation:

make htmldocs

License

MIT - Tomas Aparicio

Comments
  • What about using Mocket as core library?

    Hi there, I am the author of mocket and I think our projects are really similar and well connected. What about joining forces? I think mocket is still a great idea, but currently I am the only one still working on it (we were two friends/colleagues when we presented it at EuroPython 2013, and another colleague helped us fix some issues). My idea is basically to have mocket as a pook engine. What do you think about it? Thanks in advance, Giorgio

    https://github.com/mocketize/python-mocket

    enhancement 
    opened by mindflayer 62
  • aiohttp mocking binary content

    I am refactoring one of my projects from requests to aiohttp as the HTTP client library and my existing tests are failing. I found two issues when mocking aiohttp. The first is URL handling: when I write a test for the requests library, pook requires the full URL to match

    url = 'http://data.alexa.com/data?cli=10&dat=snbamz&url=http%3A%2F%2Ftest.org'
    

    when I switch from requests to aiohttp, pook stops matching the full URL and now needs the URL without query parameters

    url = 'http://data.alexa.com/data'
    

    The second issue is with content mocking: when I use requests I mock binary content

        mock_response_content = b'<?xml version="1.0" encoding="UTF-8"?>\r\n\r\n<!-- Need more Alexa data?  Find our APIs here: https://aws.amazon.com/alexa/ -->\r\n<ALEXA VER="0.9" URL="test.org/" HOME="0" AID="=" IDN="test.org/">...</ALEXA>'
    

    with aiohttp this code now throws an exception, while with requests it does not. Here is the exception:

    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    natrix\plugins\alexa_data\alexa_data.py:20: in perform_scan
        async with self.http_session.get(DATA_ALEXA_API_URL, params=alexa_api_params) as resp:
    env\lib\site-packages\aiohttp\client.py:529: in __aenter__
        self._resp = yield from self._coro
    env\lib\site-packages\pook\interceptors\aiohttp.py:124: in handler
        data=data, headers=headers, **kw)
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    
    self = <pook.interceptors.aiohttp.AIOHTTPInterceptor object at 0x051331D0>
    _request = <function ClientSession._request at 0x04578B28>
    session = <aiohttp.client.ClientSession object at 0x04E968D0>, method = 'GET'
    url = 'http://data.alexa.com/data', data = None, headers = []
    kw = {'allow_redirects': True, 'params': {'cli': '10', 'dat': 'snbamz', 'url': 'http://test.org'}}
    req = Request(
      method=GET,
      headers=HTTPHeaderDict({}),
      body=None,
      url=http://data.alexa.com/data,
      query={},
    )
    mock = Mock(
      matches=1,
      times=0,
      persist=False,
      matchers=MatcherEngine([
        MethodMatcher(GET),
        URLMatcher(http:...ARITY URL="test.org/" TEXT="2285614" SOURCE="panel"/><REACH RANK="1897255"/><RANK DELTA="+608439"/></SD></ALEXA>'
      )
    )
    res = Response(
        headers=HTTPHeaderDict({}),
        status=200,
        body=b'<?xml version="1.0" encoding="UTF-8"?>\r\n\r\n<!-...OPULARITY URL="test.org/" TEXT="2285614" SOURCE="panel"/><REACH RANK="1897255"/><RANK DELTA="+608439"/></SD></ALEXA>'
    )
    _res = <ClientResponse(http://data.alexa.com/data) [200 OK]>
    <CIMultiDictProxy()>
    
    
        @asyncio.coroutine
        def _on_request(self, _request, session, method, url,
                        data=None, headers=None, **kw):
            # Create request contract based on incoming params
            req = Request(method)
            req.headers = headers or {}
            req.body = data
        
            # Expose extra variadic arguments
            req.extra = kw
        
            # Compose URL
            req.url = str(url)
        
            # Match the request against the registered mocks in pook
            mock = self.engine.match(req)
        
            # If cannot match any mock, run real HTTP request if networking
            # or silent model are enabled, otherwise this statement won't
            # be reached (an exception will be raised before).
            if not mock:
                return _request(session, method, url,
                                data=data, headers=headers, **kw)
        
            # Simulate network delay
            if mock._delay:
                yield from asyncio.sleep(mock._delay / 1000)  # noqa
        
            # Shortcut to mock response
            res = mock._response
        
            # Aggregate headers as list of tuples for interface compatibility
            headers = []
            for key in res._headers:
                headers.append((key, res._headers[key]))
        
            # Create mock equivalent HTTP response
            _res = HTTPResponse(req.method, self._url(urlunparse(req.url)))
        
            # response status
            _res.version = (1, 1)
            _res.status = res._status
            _res.reason = http_reasons.get(res._status)
            _res._should_close = False
        
            # Add response headers
            _res.raw_headers = tuple(headers)
            _res.headers = multidict.CIMultiDictProxy(
                multidict.CIMultiDict(headers)
            )
        
            # Define `_content` attribute with an empty string to
            # force do not read from stream (which won't exists)
            _res._content = ''
            if res._body:
    >           _res._content = res._body.encode('utf-8', errors='replace')
    E           AttributeError: 'bytes' object has no attribute 'encode'
    
    env\lib\site-packages\pook\interceptors\aiohttp.py:110: AttributeError
    

    with string content there is no exception and everything works as expected
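
    A bytes-safe guard along these lines would avoid the AttributeError (a hedged sketch with a hypothetical helper name, not pook's actual fix):

    def to_content(body):
        # Only encode when the mocked body is text; pass bytes through untouched.
        if isinstance(body, str):
            return body.encode('utf-8', errors='replace')
        return body or b''

    assert to_content('<ALEXA VER="0.9"/>') == b'<ALEXA VER="0.9"/>'
    assert to_content(b'<ALEXA VER="0.9"/>') == b'<ALEXA VER="0.9"/>'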

    bug 
    opened by sky-code 7
  • fix(urllib3): interceptor is never really disabled

    The urllib3 interceptor is never really disabled - see the committed unit test.

    My assumption is that this is because of the two patches: the native one and the requests one presumably point to the same function. When the first mock is created it points to the original, while at the time the second mock is created, it points to the already-mocked version.

    The fix is to stop the mocks in reverse order.
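
    Roughly, the same effect can be reproduced with plain unittest.mock (an illustrative sketch, not pook's code; the names are hypothetical):

    from types import SimpleNamespace
    from unittest import mock

    # Stand-in for the function that both the urllib3 and requests
    # interceptors end up patching.
    target = SimpleNamespace(request=lambda: 'real')

    p1 = mock.patch.object(target, 'request', lambda: 'mock-1')
    p2 = mock.patch.object(target, 'request', lambda: 'mock-2')
    p1.start()  # saves the real function
    p2.start()  # saves mock-1, not the real function

    # Stopping in creation order (p1 then p2) would leave mock-1 installed;
    # stopping in reverse (LIFO) order unwinds cleanly back to the original.
    p2.stop()
    p1.stop()
    assert target.request() == 'real'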

    opened by kvalev 6
  • Add activate_async decorator

    If you apply pook.activate to an async def function you will get RuntimeWarning: coroutine 'xxx' was never awaited. With this patch you can use pook.activate on async def functions without any warnings.

    an example:

    @pook.activate
    async def test_anything():
        # setup mock
        url = 'http://google.com/'
        pook.get(url, reply=200, response_body='...')
    
        no_problem = await actual_method_that_do_http_request()
        assert True == no_problem
    
    opened by sky-code 6
  • @pook.activate or @pook.on seem to skip pytest tests decorated with @pytest.mark.asyncio

    Hello and thanks for this cool library.

    I am trying to use it in a test suite where we use pytest and @pytest.mark.asyncio

    I am currently trying something like this:

    
    @pytest.mark.asyncio
    @pook.on
    async def some_test():
      with pook.post("somedomain.com/some_path/", reply=200):
        # do actual request using await & assertions
        pass
    

    The test some_test will be flagged as skipped by pytest.

    Is this expected behaviour or a well-known issue?

    question 
    opened by lmammino 5
  • Always use overwritten response

    Not sure if this is a bug, so this PR is trying to start a discussion.

    It looks like this "or" check will always ignore the latter part, since self._response is never None (from __init__ it will at least get the default value Response()).

    The proposed solution: if reply is called, then Mock should always use the provided args as the response, instead of checking _response.

    My use case is that I have a callback function that uses reply to modify the response dynamically. Maybe there's a better way to do that.

    Thank you!

    opened by qiao-meng-zefr 5
  • Fix py27 compatibility

    Considering the first example from the README:

    import pook
    import requests
    
    @pook.on
    def test_my_api():
        mock = pook.get('http://twitter.com/api/1/foobar', reply=404, response_json={'error': 'not found'})
    
        resp = requests.get('http://twitter.com/api/1/foobar')
        assert resp.status_code == 404
        assert resp.json() == {"error": "not found"}
        assert mock.calls == 1
    
    test_my_api()
    

    Running it under Python 2.7.* results in the following traceback:

    Traceback (most recent call last):
      File "example1.py", line 1, in <module>
        import pook
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/__init__.py", line 1, in <module>
        from .api import *  # noqa
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/api.py", line 5, in <module>
        from .engine import Engine
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/engine.py", line 5, in <module>
        from .mock_engine import MockEngine
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/mock_engine.py", line 1, in <module>
        from .interceptors import interceptors
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/interceptors/__init__.py", line 2, in <module>
        from .urllib3 import Urllib3Interceptor
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/interceptors/urllib3.py", line 5, in <module>
        from .http import URLLIB3_BYPASS
      File "/usr/local/var/pyenv/versions/pook-py27/lib/python2.7/site-packages/pook/interceptors/http.py", line 10, in <module>
        from unittest import mock
    ImportError: cannot import name mock
    

    The problem is caused by the following code section from pook.interceptors.http:

    # Support Python 2/3
    try:
        import mock
    except:
        from unittest import mock
    

    mock is available as unittest.mock from Python 3.3 onwards; for Python 2.6/2.7 the backport should be installed: https://pypi.python.org/pypi/mock

    So, this PR updates requirements.txt with mock~=2.0.0 and an appropriate environment marker so that make install is handled properly. The same is done for setup.py.

    opened by pavdmyt 5
  • Switch to Python >= 3.5 and fix latest aiohttp compatibility

    Fixes #82. In doing so it was necessary to fix everything that used Python 2 and anything older than 3.5 (in order to use async def). This required updating some dependencies.

    This appears to work, however I'm not able to run the full test suite as the Nose library, used to run the nose_suite tests, is not compatible with the latest Python versions: https://github.com/nose-devs/nose/issues/1118

    Potential solutions: only run the nose tests on Python versions up to the latest one supported by Nose (which has been unmaintained for 6 years); I believe that is Python 3.8 or 3.9, but I need to dig a little more (or use trial and error) to confirm which. Alternatively, remove the nose test suite and make no compatibility guarantees for the unmaintained testing library.

    @h2non do you have an inclination about which you prefer or any other alternative approaches to take?

    opened by sarayourfriend 3
  • aiohttp mock only works when using request methods as context managers, but not when awaited

    Hello, thank you for the lovely project! It makes HTTP mocking really pleasant to work with.

    I noticed a small issue with the implementation of the aiohttp mock: it appears to only work when using the request methods as context managers, but not when their results are awaited.

    This gist has example code to show this in a pytest unit test: https://gist.github.com/sarayourfriend/44a33e7397849939156247457d8bb773

    Here is the full output of running the tests:

    ➜  aiohttp-example pdm run pytest
    /home/sara/.local/lib/python3.10/site-packages/requests/__init__.py:102: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (5.0.0)/charset_normalizer (2.0.12) doesn't match a supported version!
      warnings.warn("urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "
    =================================== test session starts ===================================
    platform linux -- Python 3.10.8, pytest-7.2.0, pluggy-1.0.0
    rootdir: /home/sara/projects/aiohttp-example
    plugins: anyio-3.6.2, asyncio-0.20.2
    asyncio: mode=strict
    collected 2 items                                                                         
    
    main_test.py .F                                                                     [100%]
    
    ======================================== FAILURES =========================================
    _________________________________ test_make_request_await _________________________________
    
        async def test_make_request_await():
            pook.head(URL).reply(200)
        
            pook.on()
    >       res = await make_request_await()
    
    main_test.py:36: 
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    main_test.py:19: in make_request_await
        response = await session.head(URL)
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    
    self = <aiohttp.client._RequestContextManager object at 0x7f7cb9dfc9d0>
    
        def __await__(self) -> Generator[Any, None, _RetType]:
    >       ret = self._coro.__await__()
    E       AttributeError: 'generator' object has no attribute '__await__'. Did you mean: '__init__'?
    
    .venv/lib/python3.10/site-packages/aiohttp/client.py:1134: AttributeError
    ==================================== warnings summary =====================================
    .venv/lib/python3.10/site-packages/pook/interceptors/aiohttp.py:48
      /home/sara/projects/aiohttp-example/.venv/lib/python3.10/site-packages/pook/interceptors/aiohttp.py:48: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
        def read(self, n=-1):
    
    .venv/lib/python3.10/site-packages/pook/interceptors/aiohttp.py:80
      /home/sara/projects/aiohttp-example/.venv/lib/python3.10/site-packages/pook/interceptors/aiohttp.py:80: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
        def _on_request(self, _request, session, method, url,
    
    main_test.py::test_make_request_ctx
    main_test.py::test_make_request_await
      /home/sara/projects/aiohttp-example/.venv/lib/python3.10/site-packages/pook/interceptors/aiohttp.py:153: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
        def handler(session, method, url, data=None, headers=None, **kw):
    
    -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
    ================================= short test summary info =================================
    FAILED main_test.py::test_make_request_await - AttributeError: 'generator' object has no attribute '__await__'. Did you mean: '__init...
    ========================= 1 failed, 1 passed, 4 warnings in 0.09s =========================
    

    As you can see, test_make_request_await fails, whereas the other test case, which uses the request method as a context manager, passes fine.

    It looks like this comes down to an issue with the return value of the _request mock that pook is applying: it must be a coroutine but is currently a generator. I'm not sure exactly why this is the case, because the mock handler is wrapped in @asyncio.coroutine.

    I've managed to patch this locally by removing the usage of the asyncio.coroutine decorator and replacing the methods with regular async def methods. I'll open a PR with these changes later this week.
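
    The gist of that change, roughly (an illustrative sketch, not the actual patch):

    import asyncio

    # A generator-based coroutine created with @asyncio.coroutine (deprecated
    # since Python 3.8) is a plain generator without __await__, which is what
    # breaks `await session.head(URL)` above. A native coroutine defined with
    # `async def` supports __await__, so both awaiting and `async with` work.
    async def handler(session, method, url, **kw):
        return (method, str(url))

    print(asyncio.run(handler(None, 'HEAD', 'http://server.com/foo')))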

    opened by sarayourfriend 3
  • Pook 1.0.1 Error of JSON module (Py2 only)

    Version: Pook==1.0.1 Environment: Python 2.7

    Demo code

    # demo.py
    import pook, requests, json
    pook.on()
    pook.post('http://x.com', json={"a": "a"}, reply=300)
    requests.post('http://x.com', data=json.dumps({"a": "a"}))
    pook.off()
    

    Error message:

    => Detailed matching errors:
    JSONMatcher: 'module' object has no attribute 'dumps'
    

    Error source code: (screenshot omitted)

    Debugging

    After a bit of debugging in the runtime console, I found out that the json module shown above was not actually the builtin json module, but rather pook's own pook.matchers.json module:

    >>> json
    <module 'pook.matchers.json' from '/Users/someone/virtualenv/venv2/lib/python2.7/site-packages/pook/matchers/json.pyc'>
    >>> json.dumps
    AttributeError: 'module' object has no attribute 'dumps'
    
    >>> import ujson
    >>> ujson.dumps({1:1})
    '{"1":1}'
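
    A likely cause, and the classic Python 2 fix (a hedged sketch, not necessarily the actual change): inside the pook.matchers package an implicit relative import makes "import json" resolve to the sibling module pook/matchers/json.py instead of the standard library.

    # pook/matchers/<some module>.py -- Python 2 compatibility sketch
    from __future__ import absolute_import  # make "import json" skip sibling modules

    import json  # now resolves to the standard library module

    print(json.dumps({'a': 'a'}))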
    
    opened by solomonxie 3
  • Mocking chunked responses

    This PR extends pook's API to allow configuring a chunked response (only supported for urllib3).

    Example:

    import pook
    import urllib3
    
    # Mock HTTP traffic only in the given context
    with pook.use():
        (pook.get('httpbin.org/chunky')
            .reply(200)
            .body(['returned', 'as', 'chunks'], chunked=True))
    
        # Intercept request
        http = urllib3.PoolManager()
        r = http.request('GET', 'httpbin.org/chunky')
        print('Chunks:', list(r.read_chunked()))
    
    • [x] implementation
    • [x] tests
    • [x] documentation
    opened by kvalev 3
  • Importing pook fails on Python 3.11

    Trying to use Pook on Python 3.11 fails:

    /usr/local/lib/python3.11/site-packages/pook/activate_async.py:2: in <module>
        from asyncio import iscoroutinefunction, coroutine
    E   ImportError: cannot import name 'coroutine' from 'asyncio' (/usr/local/lib/python3.11/asyncio/__init__.py)
    

    It was removed in 3.11 https://github.com/python/cpython/pull/26369

    opened by martydill 1
  • Disable PookNoMatch attempted matching output

    Hello,

    Whenever I run into the classic PookNoMatches error, it seems like pook tries to show the "closest" existing mock that is similar to the request that was just made. 99.9% of the time, the "closest" mock is not even remotely similar to what I just made, and my terminal basically fills up with text that is more confusing than helpful. My questions are:

    1. Why is this a feature?
    2. Does pook come with functionality to disable this or do I have to implement this myself?

    Thanks, Henry

    opened by henry-nextroll 1
  • Fixing #72 issue

    This PR fixes the #72 issue. A detailed description is available in the last commit.

    I've achieved it by keeping track of the real module path when we do the patching. For example, when patching:

    • a.b.c
    • d.e.f

    Somehow a.b.c can be equal to d.e.f, so when we try to patch the second object we should check whether it was already patched, and if it was, skip it.
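
    The idea, roughly (an illustrative sketch with hypothetical helper names, not the PR's code): resolve each dotted path to the object it points at, and skip any path whose target has already been patched.

    from importlib import import_module

    def resolve(path):
        # Resolve a dotted path such as 'os.path.join' to the actual object.
        module_path, attr = path.rsplit('.', 1)
        return getattr(import_module(module_path), attr)

    def dedupe_patch_targets(paths):
        # Keep only one dotted path per underlying object, so the same
        # function never gets patched twice under two different names.
        seen, targets = set(), []
        for path in paths:
            obj = resolve(path)
            if id(obj) not in seen:
                seen.add(id(obj))
                targets.append(path)
        return targets

    # On POSIX systems 'os.path.join' and 'posixpath.join' are the same function.
    print(dedupe_patch_targets(['os.path.join', 'posixpath.join']))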

    opened by libbkmz 1
  • pook.use() context manager is still active after use, even in follow-up testcases

    Hi, thanks for pook, it's a very handy library that I use in my pytest tests.

    However, I think I found a bug (using pook v1.0.1, Python 3.8.2, requests 2.21.0, Windows 10): when using pook.use() as a context manager to mock a requests.get() call, pook does not seem to remove the patching afterwards. It also stays active across any follow-up testcases. I can reproduce it with the following pytest test:

    def test_pookUse_contextManager_isDisabledAfterUse():
        url = 'http://foo.bar'
        with pytest.raises(requests.exceptions.ConnectionError):
            requests.get(url)
        respJson = {"resp": "ok"}
        with pook.use():
            pook.get(url).reply().json(respJson)
            resp = requests.get(url)
            assert resp.json() == respJson
            assert pook.isdone()
        assert not pook.isactive()
        with pytest.raises(requests.exceptions.ConnectionError):
            requests.get(url)
    

    Pytest output:

    tests\integration\engines_test.py:14 (test_pookUse_contextManager_isDisabledAfterUse)
    def test_pookUse_contextManager_isDisabledAfterUse():
            url = 'http://foo.bar'
            with pytest.raises(requests.exceptions.ConnectionError):
                requests.get(url)
            respJson = {"resp": "ok"}
            with pook.use():
                pook.get(url).reply().json(respJson)
                resp = requests.get(url)
                assert resp.json() == respJson
                assert pook.isdone()
            assert not pook.isactive()
            with pytest.raises(requests.exceptions.ConnectionError):
    >           requests.get(url)
    
    tests\integration\engines_test.py:27: 
    
    [...]
    
                # Raise no matches exception
    >           raise PookNoMatches(msg)
    E           pook.exceptions.PookNoMatches: pook error!
    E           
    E           => Cannot match any mock for the following request:
    E           ==================================================
    E           Method: GET
    E           URL: http://foo.bar:80/
    E           Headers: HTTPHeaderDict({'User-Agent': 'python-requests/2.21.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'})
    E           ==================================================
    
    pook\engine.py:443: PookNoMatches
    

    I put the testcase as the first one executed in all modules to avoid any effects from previous testcases (I put it at the top of tests/integration/engines_test.py). I could not yet figure out what the issue is. Could you check whether it is reproducible on your side as well? Thank you.

    bug enhancement 
    opened by MoleMan1024 6
  • Issues using with urllib

    When using pook with urllib, an exception is thrown because the response doesn't have a code attribute. I uncovered this while trying to mock a Prometheus push gateway endpoint with the prometheus client library. Simplified code snippet below:

    Exception raised:

    self = <urllib.request.HTTPErrorProcessor object at 0x7f79100e3c18>
    request = <urllib.request.Request object at 0x7f79100e3748>
    response = <http.client.HTTPResponse object at 0x7f7910608048>
    
        def http_response(self, request, response):
    >       code, msg, hdrs = response.code, response.msg, response.info()
    E       AttributeError: 'HTTPResponse' object has no attribute 'code'
    

    Test code snippet:

        @pook.on
        def test_pook(self):
            mock = pook.put('http://prometheus_test', reply=201, response_json={'test': 'test'})
    
            # resp = requests.put('http://prometheus_test')
            do_request(
                url='http://prometheus_test', method='PUT', timeout=30, data=b'',
            )()
    
            # assert resp.status_code == 200
            assert mock.calls == 1
    
    def do_request(url, method, timeout, data):
            request = Request(url, data=data)
            request.get_method = lambda: method
            resp = build_opener(HTTPHandler).open(request, timeout=timeout)
            if resp.code >= 400:
                raise IOError("error talking to pushgateway: {0} {1}".format(
                    resp.code, resp.msg))
    
    help wanted 
    opened by timgentonzo 4