RedisTimeSeries Python client

Overview


redistimeseries-py


Deprecation notice

As of redis-py 4.0.0 this library is deprecated. Its features have been merged into redis-py. Please install it either from PyPI or from the repo.
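
For reference, a minimal sketch of the equivalent calls through redis-py 4.x (assuming a local Redis instance with the module loaded; ts() is redis-py's bundled RedisTimeSeries command group):

import redis

r = redis.Redis(host='localhost', port=6379)   # assumed local instance
ts = r.ts()                                    # RedisTimeSeries commands bundled with redis-py >= 4.0
ts.create('migrated', labels={'Time': 'Series'})
ts.add('migrated', 1, 1.12)
print(ts.get('migrated'))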


redistimeseries-py is a package that gives developers easy access to the RedisTimeSeries module. The package extends redis-py's interface with the RedisTimeSeries API.

Installation

$ pip install redistimeseries
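
To point the client at a specific Redis instance, a quick sketch (the host and port here are assumptions for a local setup; keyword arguments are handed to the underlying redis-py connection):

from redistimeseries.client import Client
rts = Client(host='localhost', port=6379)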

Development

  1. Create a virtualenv to manage your Python dependencies, and ensure it's active: virtualenv -v venv
  2. Install poetry to manage your dependencies: pip install poetry
  3. Install the dependencies: poetry install

tox runs all tests as its default target. Running tox by itself will run the unit tests. Ensure you have a running Redis instance with the RedisTimeSeries module loaded.
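
If you are unsure whether the module is loaded, a quick check from Python (a sketch, assuming a default local instance) is to list the loaded modules and look for a 'timeseries' entry:

import redis

# MODULE LIST returns one entry per loaded module; RedisTimeSeries reports its name as 'timeseries'
print(redis.Redis().execute_command('MODULE', 'LIST'))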

API

The complete documentation of RedisTimeSeries's commands can be found at RedisTimeSeries's website.

Usage example

# Simple example
from redistimeseries.client import Client
rts = Client()
rts.create('test', labels={'Time':'Series'})
rts.add('test', 1, 1.12)
rts.add('test', 2, 1.12)
rts.get('test')
rts.incrby('test', 1)                                                  # increase the latest value by 1 (TS.INCRBY)
rts.range('test', 0, -1)                                               # raw samples over the full range
rts.range('test', 0, -1, aggregation_type='avg', bucket_size_msec=10)  # averaged into 10ms buckets
rts.range('test', 0, -1, aggregation_type='sum', bucket_size_msec=10)  # summed into 10ms buckets
rts.info('test').__dict__
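
# Cross-series queries by label (a sketch; 'Time=Series' matches the labels set above)
rts.mrange(0, -1, filters=['Time=Series'])   # range over every series carrying that label
rts.mget(filters=['Time=Series'])            # latest sample of every matching series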

# Example with rules
rts.create('source', retention_msecs=40)        # source series with a 40ms retention period
rts.create('sumRule')
rts.create('avgRule')
rts.createrule('source', 'sumRule', 'sum', 20)  # compact source into sumRule: SUM over 20ms buckets
rts.createrule('source', 'avgRule', 'avg', 15)  # compact source into avgRule: AVG over 15ms buckets
rts.add('source', '*', 1)                       # '*' lets the server assign the timestamp
rts.add('source', '*', 2)
rts.add('source', '*', 3)
rts.get('sumRule')
rts.get('avgRule')
rts.info('sumRule').__dict__
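
# A compaction rule can be dropped later (a sketch using the keys from above)
rts.deleterule('source', 'sumRule')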

Further notes on back-filling time series

Since RedisTimeSeries 1.4 we've added the ability to back-fill time series, with different duplicate policies.

The default behavior is to block updates to the same timestamp; you can control this via the duplicate_policy argument. See the duplicate policy documentation for details.

Below is an example of the LAST duplicate policy, in which duplicate timestamps are overridden with the latest value:

from redistimeseries.client import Client
rts = Client()
rts.create('last-upsert', labels={'Time':'Series'}, duplicate_policy='last')
rts.add('last-upsert', 1, 10.0)
rts.add('last-upsert', 1, 5.0)
# should output [(1, 5.0)]
print(rts.range('last-upsert', 0, -1))
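
# The same pattern applies to the other policies; a sketch with MAX, which keeps the larger value:
rts.create('max-upsert', labels={'Time':'Series'}, duplicate_policy='max')
rts.add('max-upsert', 1, 10.0)
rts.add('max-upsert', 1, 5.0)
# should output [(1, 10.0)]
print(rts.range('max-upsert', 0, -1))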

License

BSD 3-Clause

Comments
  • ResponseError: unknown command `TS.CREATE`, with args beginning with: `source`, `RETENTION`,

    ResponseError: unknown command `TS.CREATE`, with args beginning with: `source`, `RETENTION`,

    In [6]: rts.create('source', retention_msecs=40)
    ---------------------------------------------------------------------------
    ResponseError                             Traceback (most recent call last)
    <ipython-input-6-0d98d59077cf> in <module>()
    ----> 1 rts.create('source', retention_msecs=40)
    
    e:\redistimeseries-py\redistimeseries\client.py in create(self, key, retention_msecs, labels)
        135         self.appendLabels(params, labels)
        136
    --> 137         return self.execute_command(self.CREATE_CMD, *params)
        138
        139     def alter(self, key, retention_msecs=None, labels={}):
    
    C:\ProgramData\Anaconda3\lib\site-packages\redis\client.py in execute_command(self, *args, **options)
        837         try:
        838             conn.send_command(*args)
    --> 839             return self.parse_response(conn, command_name, **options)
        840         except (ConnectionError, TimeoutError) as e:
        841             conn.disconnect()
    
    C:\ProgramData\Anaconda3\lib\site-packages\redis\client.py in parse_response(self, connection, command_name, **options)
        851         "Parses a response from the Redis server"
        852         try:
    --> 853             response = connection.read_response()
        854         except ResponseError:
        855             if EMPTY_RESPONSE in options:
    
    C:\ProgramData\Anaconda3\lib\site-packages\redis\connection.py in read_response(self)
        716
        717         if isinstance(response, ResponseError):
    --> 718             raise response
        719         return response
        720
    
    ResponseError: unknown command `TS.CREATE`, with args beginning with: `source`, `RETENTION`, `40`,
    
    In [7]:
    
    invalid 
    opened by yutiansut 18
  • key error  'memoryUsage'

    key error 'memoryUsage'

    In [6]: rts.info(3)
    {'lastTimestamp': 0, 'retentionTime': 0, 'chunkCount': 1, 'maxSamplesPerChunk': 360, 'labels': [[b'Redis', b'Labs']], 'sourceKey': None, 'rules': []}

    KeyError                                  Traceback (most recent call last)
    in ()
    ----> 1 rts.info(3)

    E:\redistimeseries-py\redistimeseries\client.py in info(self, key)
        263     def info(self, key):
        264         """Gets information of key"""
    --> 265         return self.execute_command(self.INFO_CMD, key)
        266
        267     def queryindex(self, filters):

    C:\ProgramData\Anaconda3\lib\site-packages\redis\client.py in execute_command(self, *args, **options)
        837         try:
        838             conn.send_command(*args)
    --> 839             return self.parse_response(conn, command_name, **options)
        840         except (ConnectionError, TimeoutError) as e:
        841             conn.disconnect()

    C:\ProgramData\Anaconda3\lib\site-packages\redis\client.py in parse_response(self, connection, command_name, **options)
        857             raise
        858         if command_name in self.response_callbacks:
    --> 859             return self.response_callbacks[command_name](response, **options)
        860         return response
        861

    E:\redistimeseries-py\redistimeseries\client.py in __init__(self, args)
         23         self.sourceKey = response['sourceKey']
         24         self.chunkCount = response['chunkCount']
    ---> 25         self.memory_usage = response['memoryUsage']
         26         self.total_samples = response['totalSamples']
         27         self.labels = list_to_dict(response['labels'])

    KeyError: 'memoryUsage'

    opened by yutiansut 9
  • Bug in renaming TS key

    Bug in renaming TS key

    I ran into the following situation (while using the Python library).

    Calling the following (via Python commands):

    TS.CREATE sensor2 LABELS sensor_id 2
    TS.ADD sensor2 1548149180000 26
    RENAME sensor2 sendor22
    TS.MRANGE - + WIDTHLABELS FILTER sensor_id=2
    

    The result of MRANGE is returned with the old key (sensor2) instead of the new one (sensor22)

    bug 
    opened by cloud-rocket 5
  • Response of TS.ADD is incorrectly parsed

    Response of TS.ADD is incorrectly parsed

    I am getting the following error message when trying to add an entry to the time series:

    AttributeError: 'long' object has no attribute 'encode'
    

    The command which I am using is:

    rts.add(timeseries, ts, v)
    

    I took a quick look and it seems that the return value is expected to be a string (i.e. OK) but it is a number (i.e. the timestamp itself). Here is the full stack trace:

    Traceback (most recent call last):
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/1-load.py", line 52, in <module>
        load_to_db(con, "prices.json", prices_ts)
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/1-load.py", line 40, in load_to_db
        con.add(timeseries, ts, v)
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/venv/lib/python2.7/site-packages/redistimeseries/client.py", line 140, in add
        return self.execute_command(self.ADD_CMD, *params)
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/venv/lib/python2.7/site-packages/redis/client.py", line 775, in execute_command
        return self.parse_response(connection, command_name, **options)
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/venv/lib/python2.7/site-packages/redis/client.py", line 795, in parse_response
        return self.response_callbacks[command_name](response, **options)
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/venv/lib/python2.7/site-packages/redis/client.py", line 314, in bool_ok
        return nativestr(response) == 'OK'
      File "/Users/david/Drive/Local/Users/david/Projects/Git/redislabs-training/enablement-timeseries/src/bitcoin-courses/venv/lib/python2.7/site-packages/redis/_compat.py", line 90, in nativestr
        return x if isinstance(x, str) else x.encode('utf-8', 'replace')
    
    bug 
    opened by nosqlgeek 5
  • Pubsub integration

    Pubsub integration

    What would be the best way to approach event-driven integration with Redis TS?

    I.e.

    • I would like one process to push data to multiple different time series streams (i.e. temperature / pressure with timestamps)
    • All time-series streams are timestamped together (temperature and pressure are receiving the same timestamp every time)
    • Other processes should do something when new values are available (e.g. to display it, process it, etc.).

    Should this mechanism be addressed separately from TS data processing or is there any better way?

    Thanks
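
    A sketch of one possible pattern (assuming the client accepts an existing redis-py connection via conn, and the channel name here is arbitrary): write the samples, then publish the shared timestamp so consumers can react without polling.

    import redis
    from redistimeseries.client import Client

    r = redis.Redis()
    rts = Client(conn=r)

    def push_sample(ts, temperature, pressure):
        rts.add('temperature', ts, temperature)
        rts.add('pressure', ts, pressure)
        r.publish('ts-updates', ts)   # subscribers re-read the new samples by timestamp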

    enhancement 
    opened by cloud-rocket 4
  • Redis 5.0.3 crashed by signal: 11 (Redistimeseries.so with compaction policy set)

    Redis 5.0.3 crashed by signal: 11 (Redistimeseries.so with compaction policy set)

    Redis frequently crashes when the compaction policy is enabled.

    loadmodule /opt/redis/redistimeseries.so DUPLICATE_POLICY Last COMPACTION_POLICY sum:300s:90d;count:300s:90d;sum:1800s:180d;count:1800s:180d

    When the module is loaded without COMPACTION_POLICY, it is seamless.

    === REDIS BUG REPORT START: Cut & paste starting from here ===
    14861:M 29 Oct 2020 07:45:30.469 # Redis 5.0.3 crashed by signal: 11
    14861:M 29 Oct 2020 07:45:30.469 # Crashed running the instruction at: 0x7fa241f7cb05
    14861:M 29 Oct 2020 07:45:30.469 # Accessing address: (nil)
    14861:M 29 Oct 2020 07:45:30.469 # Failed assertion:  (:0)

    ------ STACK TRACE ------
    EIP:
    /opt/redis/redistimeseries.so(Uncompressed_UpsertSample+0x125)[0x7fa241f7cb05]

    Backtrace:
    /usr/bin/redis-server 0.0.0.0:6379(logStackTrace+0x44)[0x55df2b955c54]
    /usr/bin/redis-server 0.0.0.0:6379(sigsegvHandler+0xb5)[0x55df2b9563d5]
    /lib64/libpthread.so.0(+0x12dd0)[0x7fa2413e5dd0]
    /opt/redis/redistimeseries.so(Uncompressed_UpsertSample+0x125)[0x7fa241f7cb05]
    /opt/redis/redistimeseries.so(SeriesUpsertSample+0xda)[0x7fa241f85ada]
    /opt/redis/redistimeseries.so(SeriesUpsertSample+0x3a8)[0x7fa241f85da8]
    /opt/redis/redistimeseries.so(+0xcad2)[0x7fa241f80ad2]
    /opt/redis/redistimeseries.so(TSDB_add+0x162)[0x7fa241f82d42]
    /usr/bin/redis-server 0.0.0.0:6379(RedisModuleCommandDispatcher+0x66)[0x55df2b983946]
    /usr/bin/redis-server 0.0.0.0:6379(call+0x93)[0x55df2b90e853]
    /usr/bin/redis-server 0.0.0.0:6379(processCommand+0x49e)[0x55df2b90fcfe]
    /usr/bin/redis-server 0.0.0.0:6379(processInputBuffer+0x181)[0x55df2b91fb01]
    /usr/bin/redis-server 0.0.0.0:6379(aeProcessEvents+0x119)[0x55df2b9081f9]
    /usr/bin/redis-server 0.0.0.0:6379(aeMain+0x2b)[0x55df2b90865b]
    /usr/bin/redis-server 0.0.0.0:6379(main+0x3de)[0x55df2b90523e]
    /lib64/libc.so.6(__libc_start_main+0xf3)[0x7fa2410346a3]
    /usr/bin/redis-server 0.0.0.0:6379(_start+0x2e)[0x55df2b90558e]

    ------ INFO OUTPUT ------

    Server

    redis_version:5.0.3 redis_git_sha1:00000000 redis_git_dirty:0 redis_build_id:28849dbea6f07cc8 redis_mode:standalone os:Linux 4.18.0-80.el8.x86_64 x86_64 arch_bits:64 multiplexing_api:epoll atomicvar_api:atomic-builtin gcc_version:8.3.1 process_id:14861 run_id:4dfba04838528974a4243b5231d8363e1c62a677 tcp_port:6379 uptime_in_seconds:1 uptime_in_days:0 hz:10 configured_hz:10 lru_clock:10136410 executable:/usr/bin/redis-server config_file:/etc/redis.conf

    Clients

    connected_clients:1 client_recent_max_input_buffer:0 client_recent_max_output_buffer:0 blocked_clients:0

    Memory

    used_memory:30728624 used_memory_human:29.31M used_memory_rss:36397056 used_memory_rss_human:34.71M used_memory_peak:30728624 used_memory_peak_human:29.31M used_memory_peak_perc:100.56% used_memory_overhead:1120656 used_memory_startup:791304 used_memory_dataset:29607968 used_memory_dataset_perc:98.90% allocator_allocated:30647824 allocator_active:30879744 allocator_resident:36335616 total_system_memory:8193073152 total_system_memory_human:7.63G used_memory_lua:37888 used_memory_lua_human:37.00K used_memory_scripts:0 used_memory_scripts_human:0B number_of_cached_scripts:0 maxmemory:0 maxmemory_human:0B maxmemory_policy:noeviction allocator_frag_ratio:1.01 allocator_frag_bytes:231920 allocator_rss_ratio:1.18 allocator_rss_bytes:5455872 rss_overhead_ratio:1.00 rss_overhead_bytes:61440 mem_fragmentation_ratio:1.19 mem_fragmentation_bytes:5838216 mem_not_counted_for_evict:522 mem_replication_backlog:0 mem_clients_slaves:0 mem_clients_normal:49694 mem_aof_buffer:522 mem_allocator:jemalloc-5.1.0 active_defrag_running:0 lazyfree_pending_objects:0

    Persistence

    loading:0 rdb_changes_since_last_save:12046 rdb_bgsave_in_progress:0 rdb_last_save_time:1603971929 rdb_last_bgsave_status:ok rdb_last_bgsave_time_sec:-1 rdb_current_bgsave_time_sec:-1 rdb_last_cow_size:0 aof_enabled:1 aof_rewrite_in_progress:0 aof_rewrite_scheduled:0 aof_last_rewrite_time_sec:-1 aof_current_rewrite_time_sec:-1 aof_last_bgrewrite_status:ok aof_last_write_status:ok aof_last_cow_size:0 aof_current_size:2840711 aof_base_size:2839964 aof_pending_rewrite:0 aof_buffer_length:0 aof_rewrite_buffer_length:0 aof_pending_bio_fsync:0 aof_delayed_fsync:0

    Stats

    total_connections_received:1 total_commands_processed:4 instantaneous_ops_per_sec:0 total_net_input_bytes:969 total_net_output_bytes:48 instantaneous_input_kbps:0.00 instantaneous_output_kbps:0.00 rejected_connections:0 sync_full:0 sync_partial_ok:0 sync_partial_err:0 expired_keys:0 expired_stale_perc:0.00 expired_time_cap_reached_count:0 evicted_keys:0 keyspace_hits:9 keyspace_misses:0 pubsub_channels:0 pubsub_patterns:0 latest_fork_usec:0 migrate_cached_sockets:0 slave_expires_tracked_keys:0 active_defrag_hits:0 active_defrag_misses:0 active_defrag_key_hits:0 active_defrag_key_misses:0

    Replication

    role:master connected_slaves:0 master_replid:7cd9b1523a1994fde5ed6227465506d441c0a7bd master_replid2:0000000000000000000000000000000000000000 master_repl_offset:0 second_repl_offset:-1 repl_backlog_active:0 repl_backlog_size:1048576 repl_backlog_first_byte_offset:0 repl_backlog_histlen:0

    CPU

    used_cpu_sys:0.021736 used_cpu_user:0.213450 used_cpu_sys_children:0.000000 used_cpu_user_children:0.000000

    Commandstats

    cmdstat_info:calls=1,usec=18,usec_per_call=18.00 cmdstat_ts.add:calls=3,usec=494,usec_per_call=164.67

    Cluster

    cluster_enabled:0

    Keyspace

    db0:keys=5340,expires=0,avg_ttl=0

    ------ CLIENT LIST OUTPUT ------ id=10913 addr=172.17.21.2:59224 fd=8 name= age=0 idle=0 flags=N db=0 sub=0 psub=0 multi=-1 qbuf=245 qbuf-free=32523 obl=0 oll=0 omem=0 events=r cmd=ts.add

    ------ CURRENT CLIENT INFO ------ id=10913 addr=172.17.21.2:59224 fd=8 name= age=0 idle=0 flags=N db=0 sub=0 psub=0 multi=-1 qbuf=245 qbuf-free=32523 obl=0 oll=0 omem=0 events=r cmd=ts.add argv[0]: 'TS.ADD' argv[1]: 'billing:2088' argv[2]: '1602565827000' argv[3]: '1' argv[4]: 'RETENTION' argv[5]: '86400000' argv[6]: 'LABELS' argv[7]: 'agentname' argv[8]: 'Laura Jackson' argv[9]: 'agentskillname' argv[10]: 'SK1589' argv[11]: 'region' argv[12]: 'N. Virginia' argv[13]: 'server' argv[14]: 'Server1' argv[15]: 'shift' argv[16]: '1' 14861:M 29 Oct 2020 07:45:30.470 # key 'billing:2088' found in DB containing the following object: 14861:M 29 Oct 2020 07:45:30.470 # Object type: 5 14861:M 29 Oct 2020 07:45:30.470 # Object encoding: 0 14861:M 29 Oct 2020 07:45:30.470 # Object refcount: 1

    ------ REGISTERS ------ 14861:M 29 Oct 2020 07:45:30.470 # RAX:00007fa23d660380 RBX:0000000000000000 RCX:00007fa240000000 RDX:0000000000000002 RDI:00007fffc394abf0 RSI:00007fffc394abcc RBP:00007fa23d660380 RSP:00007fffc394ab70 R8 :00000175205d8040 R9 :0000000000000000 R10:00007fa241f8f838 R11:00007fa24119df20 R12:0000000000000002 R13:00000175205d8040 R14:ffffffffffffffff R15:00007fffc394ac88 RIP:00007fa241f7cb05 EFL:0000000000010246 CSGSFS:002b000000000033 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7f) -> 000055df2b982105 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7e) -> 00007fa23d75e150 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7d) -> 000055df2c5746b0 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7c) -> 00007fa23d660380 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7b) -> 000000002b915e7a 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab7a) -> 0000000000000078 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab79) -> 00007fa240c16100 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab78) -> 0000000000000001 14861:M 29 Oct 2020 07:45:30.470 # (00007fffc394ab77) -> 3ff0000000000000 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab76) -> 0000000000000078 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab75) -> 00007fa241f85ada 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab74) -> ffffffffffffffff 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab73) -> 00000175205d8040 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab72) -> 0000000000000002 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab71) -> 00007fa241f8e780 14861:M 29 Oct 2020 07:45:30.471 # (00007fffc394ab70) -> 000055df2c5747f0

    ------ FAST MEMORY TEST ------ 14861:M 29 Oct 2020 07:45:30.471 # Bio thread for job type #0 terminated 14861:M 29 Oct 2020 07:45:30.471 # Bio thread for job type #1 terminated 14861:M 29 Oct 2020 07:45:30.471 # Bio thread for job type #2 terminated *** Preparing to test memory region 55df2bc82000 (2252800 bytes) *** Preparing to test memory region 55df2c496000 (1622016 bytes) *** Preparing to test memory region 7fa23c400000 (2097152 bytes) *** Preparing to test memory region 7fa23c780000 (36175872 bytes) *** Preparing to test memory region 7fa23eb05000 (2621440 bytes) *** Preparing to test memory region 7fa23ed86000 (8388608 bytes) *** Preparing to test memory region 7fa23f587000 (8388608 bytes) *** Preparing to test memory region 7fa23fd88000 (8388608 bytes) *** Preparing to test memory region 7fa240800000 (8388608 bytes) *** Preparing to test memory region 7fa2413cf000 (16384 bytes) *** Preparing to test memory region 7fa2415ef000 (16384 bytes) *** Preparing to test memory region 7fa241f8f000 (32768 bytes) *** Preparing to test memory region 7fa241fac000 (4096 bytes) .O.O.O.O.O.O.O.O.O.O.O.O.O Fast memory test PASSED, however your memory can still be broken. Please run a memory test for several hours if possible.

    ------ DUMPING CODE AROUND EIP ------ Symbol: Uncompressed_UpsertSample (base: 0x7fa241f7c9e0) Module: /opt/redis/redistimeseries.so (base 0x7fa241f74000) $ xxd -r -p /tmp/dump.hex /tmp/dump.bin $ objdump --adjust-vma=0x7fa241f7c9e0 -D -b binary -m i386:x86-64 /tmp/dump.bin

    14861:M 29 Oct 2020 07:45:30.765 # dump of function (hexdump of 421 bytes): 4156415541545553488b6f10c706000000004c8b07448b4d10490fbfd94885db0f84ff000000488b450831d2488b084939c87720e9f60000000f1f80000000004863ca48c1e104488b0c084c39c10f837c0000004883c2014839da75e34989dd49c1e5044c39c174764989f6488b75184489ca4989fc4889f148c1e9044839ca751e4889c7488b05741101004883c61048897518ff108b5510488945084989d14a8d34284839da7757f3410f6f0c244183c10131c00f110e44894d1041c706010000005b5d415c415d415ec30f1f40004989d54889d349c1e5044c39c1758a4c01e8f20f104708f20f11400831c05b5d415c415d415ec3660f1f840000000000488d7b014829da48c1e70448c1e204488d3c38e8d8faffff488b7508448b4d104c01eeeb84488b0425000000000f0b74b14c8945004531ed31dbe92affffff9041554989fdbf0100000041544989d4555389f3be200000004883ec08488b052d100100ff104889c54c892889581c83e3017407418b5d1083eb01895d084d85e4741931ffe867f9fffff30f6f00410f110424488b401049894424104883c4084889e85b5d41

    === REDIS BUG REPORT END. Make sure to include from START to END. ===

    bug 
    opened by udayshankarv 3
  • `INFO` seems to be throwing error against `master` branch of `RedisTimeSeries`

    `INFO` seems to be throwing error against `master` branch of `RedisTimeSeries`

    Just upgraded RedisTimeSeries to commit e8030d14e7b8e186518b1b551cfa3cde261bf3c3 and am now seeing this when calling client.info() with redistimeseries-py:

        data = self._mclient.info(key)
      File "/opt/venv/lib/python3.7/site-packages/redistimeseries/client.py", line 301, in info
        return self.execute_command(self.INFO_CMD, key)
      File "/opt/venv/lib/python3.7/site-packages/redis-3.4.1-py3.7.egg/redis/client.py", line 901, in execute_command
      File "/opt/venv/lib/python3.7/site-packages/redis-3.4.1-py3.7.egg/redis/client.py", line 921, in parse_response
      File "/opt/venv/lib/python3.7/site-packages/redistimeseries/client.py", line 29, in __init__
        self.maxSamplesPerChunk = response['maxSamplesPerChunk']
    KeyError: 'maxSamplesPerChunk'
    

    Ran INFO on a key in my RedisTimeSeries and am now seeing this response:

     1) totalSamples
     2) (integer) 57
     3) memoryUsage
     4) (integer) 4474
     5) firstTimestamp
     6) (integer) 1598402024391
     7) lastTimestamp
     8) (integer) 1598402629741
     9) retentionTime
    10) (integer) 86400000
    11) chunkCount
    12) (integer) 1
    13) chunkSize
    14) (integer) 4096
    15) labels
    16) 1) 1) "element"
           2) "monitor"
        2) 1) "type"
           2) "process"
        3) 1) "container"
           2) "x86_64"
        4) 1) "device"
           2) "default"
        5) 1) "language"
           2) "Python"
        6) 1) "version"
           2) "1.7.2"
        7) 1) "level"
           2) "INFO"
        8) 1) "subtype0"
           2) "3912:kworker/9:2"
        9) 1) "subtype1"
           2) "cpu_user"
    17) sourceKey
    18) (nil)
    19) rules
    20) (empty array)
    

    Perhaps this response has changed? This was previously working OK against release 1.2.7

    bug 
    opened by dpipemazo 3
  • Getting error 'redis.exceptions.ResponseError: TSDB: Timestamp cannot be older than the latest timestamp in the time series'

    Getting error 'redis.exceptions.ResponseError: TSDB: Timestamp cannot be older than the latest timestamp in the time series'

    When I am executing the add command inside a loop I get an error. Here is the traceback:

    Traceback (most recent call last):
      File "demo.py", line 5, in <module>
        rts.add('demo','*',1000)
      File "/Users/ave/Desktop/platform-charter/lib/python3.7/site-packages/redistimeseries/client.py", line 186, in add
        return self.execute_command(self.ADD_CMD, *params)
      File "/Users/ave/Desktop/platform-charter/lib/python3.7/site-packages/redis/client.py", line 901, in execute_command
        return self.parse_response(conn, command_name, **options)
      File "/Users/ave/Desktop/platform-charter/lib/python3.7/site-packages/redis/client.py", line 915, in parse_response
        response = connection.read_response()
      File "/Users/ave/Desktop/platform-charter/lib/python3.7/site-packages/redis/connection.py", line 756, in read_response
        raise response
    redis.exceptions.ResponseError: TSDB: Timestamp cannot be older than the latest timestamp in the time series

    And here is the code:

    from redistimeseries.client import Client
    rts = Client()
    rts.create('demo')
    while rts.info('demo').chunkCount == 1:
        rts.add('demo','*',1000)

    Any kind of help is appreciated. Thanks in advance.

    enhancement 
    opened by shubhamivane 3
  • List index out of range in parse_m_get() for empty series.

    List index out of range in parse_m_get() for empty series.

    I use Redis v5.0.5 + RedisTimeSeries module v.1.2.2

    Now the TS.MGET return array will contain key1, labels1, lastsample1, ..., keyN, labelsN, lastsampleN if the series contains some data points. But if a series is empty, then TS.MGET in my case returns:

    1) "analog-key1"
    2) 1) 1) "type"
        2) "analog"
    3) (empty list or set)
    

    The function parse_m_get() can fail as soon as it encounters an empty series, because item[2][0] and item[2][1] will refer to an empty list and will throw a "list index out of range" exception.

    In version 1.0GA, the RedisTimeSeries module would return the following for an empty series when executing TS.MGET:

    1. "analog-key1"     2) 1) 1) "type"           2) "analog"     3) (integer) 0     4) "0"

    For my project, I changed the behavior of the parse_m_get function to maintain compatibility with existing code:

    def parse_m_get(response):
        res = []
        for item in response:
            print(item)
            res.append(
                {
                    nativestr(item[0]) : [list_to_dict(item[1]), 
                    item[2][0] if item[2] else 0,
                    float(item[2][1]) if item[2] else '0']
                    }
                )
        return res
    

    I can make a pull request if this solution is correct.

    bug 
    opened by AlKorochkin 3
  • Add does not update the key and value for a same timestamp

    Add does not update the key and value for a same timestamp

    For my requirement, I was using the RedisTimeSeries module and I have been getting this error saying that

    TSDB: Timestamp cannot be older than the latest timestamp in the time series

    I want to update the Values in the key for the same timestamp by using the command

    TS.ADD

    I hope this is an issue as the value should be updated for the same timestamp and not throw an error.

    opened by harishkumarr95 3
  • TSDB: wrong toTimestamp when calling range('key', 0, -1)

    TSDB: wrong toTimestamp when calling range('key', 0, -1)

    redistimeseries-py version: 1.4.3 redis-py version: 3.5.3

    I'm instantiating a redistimeseries client via rts = Client(redis.Redis(host="XXX", port=6379, password="XXX")) and subsequently calling a few TS commands.

    This command, however, fails: rts.range('key', 0, -1). The following stack trace is thrown:

        return self.parse_response(conn, command_name, **options)
      File "/usr/local/lib/python3.6/dist-packages/redis/client.py", line 915, in parse_response
        response = connection.read_response()
      File "/usr/local/lib/python3.6/dist-packages/redis/connection.py", line 756, in read_response
        raise response
    redis.exceptions.ResponseError: TSDB: wrong toTimestamp
    

    Not sure if this can be addressed in this library as it seems like it's an issue with redis-py? The equivalent range command executed directly in redis-cli performs as intended.

    opened by gmax0 2
  • Failing to access redistimeseries features in the Redis-py package

    Failing to access redistimeseries features in the Redis-py package

    Version: Redis-py 4.14

    Platform: Python 3.8.3 on MacOS Big Sur

    Description: According to https://github.com/RedisTimeSeries/redistimeseries-py#user-content-deprecation-notice, the features from the redistimeseries package have been merged into redis-py; however, we are failing to run such commands using the package version stated above.

    Steps to reproduce the error

    import redis
    r = redis.Redis(host='localhost', port=6379, db=0)
    r.create("test_key_name")

    Result: AttributeError: 'Redis' object has no attribute 'create'

    @AvitalFineRedis

    help wanted 
    opened by dilonne 1
  • Support `DUPLICATE_POLICY` on `ts.add`

    Support `DUPLICATE_POLICY` on `ts.add`

    TS.ADD supports setting DUPLICATE_POLICY as well as/independently from ON_DUPLICATE. The two have separate behaviors -- DUPLICATE_POLICY sets the default for the key if TS.ADD does an implicit create, and ON_DUPLICATE overrides the existing default for the key on a particular add. It would be nice to support both.

    Relevant issue from RedisTimeSeries to add documentation for this support: https://github.com/RedisTimeSeries/RedisTimeSeries/issues/889

    opened by dpipemazo 0
  • how to store a dictionary? and retrieve using range command

    how to store a dictionary? and retrieve using range command

    Hi,

    I am using Redis for the first time, so I don't have a lot of understanding of how this works in Python. I have a dictionary with multiple keys along with a time key. I need to save the data in Redis and then retrieve multiple data points using the range function.

    data = {'timestamp': 124567889, 'datetime' :9/27/2018 9:04, 'sensor1': 34, 'sensor2': 56, 'sensor3': 90,'sensor4' : 12}

    Issue 1:

    how can I save this using the Redis time series module?

    Currently I am creating a separate time series for every sensor:

    rts.create(sensor1, labels={'time':'series'}, duplicate_policy='last')
    rts.create(sensor2, labels={'time':'series'}, duplicate_policy='last')

    and then adding individual values to each time series:

    rts.add(sensor1, unixtime, 34)
    rts.add(sensor2, unixtime, 56)

    I will have more than 100 different values in the next stage and my current method takes some time to save the data. So is there an easier way where I can save all values together, without having to create and save each value individually?
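
    A sketch of one way to batch this (assuming each series has already been created as above; madd wraps TS.MADD and takes (key, timestamp, value) tuples):

    rts.madd([('sensor1', unixtime, 34), ('sensor2', unixtime, 56)])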

    Issue 2:

    Also, I am trying to retrieve the data using the range function in the following manner:

    rts.range('datetime', start_time, end_time)

    but when the date is returned it just says '9.0' instead of the complete date '9/27/2018 9:04'.

    any help would be much appreciated.

    Thank You.

    opened by shan4usmani 1
Releases(v1.4.5)
  • v1.4.5(Mar 1, 2022)

  • v1.4.4(Nov 17, 2021)

    Changes

    • Pinning redis to 3.5.3 and updating minor version (#117)
    • Adding deprecation notice (#115)
    • Remove assertion of error message (#106)

    Features

    • Add aggregation ALIGN option to TS.RANGE/REVRANGE and TS.MRANGE/MREVRANGE (#103)
    • Add GROUPBY and REDUCE support for TS.MRANGE and TS.MREVRANGE (#99)
    • Add support for TS.DEL (#98)
    • Add SELECTED_LABELS support for TS.MRANGE and TS.MREVRANGE (#101)
    • add support for Filter by ts and value (#90)
  • v1.4.3(Oct 13, 2020)

    This is a maintenance release for redistimeseries-py version 1.4. Update urgency: Low

    Maintenance

    • Using github actions to automatically publish to PyPi (#74)
  • v1.4.2(Oct 5, 2020)

    This is a maintenance release for redistimeseries-py version 1.4. Update urgency: Low

    Maintenance details:

    • Fixed executing workflows for a git tag (#72)
  • v1.4.1(Oct 2, 2020)

    This is a maintenance release for redistimeseries-py version 1.4. Update urgency: Low

    Maintenance details:

    • Provide the proper list of supported Python versions in the setup classifiers (#68)
  • v1.4.0(Sep 21, 2020)

    Changes

    • Added support for DUPLICATE_POLICY / ON_DUPLICATE keywords on TS.CREATE and TS.ADD (#66)
    • Added support for DUPLICATE_POLICY in TS.INFO response (#66)
    • Added support for TS.REVRANGE and TS.MREVRANGE (#62)
    • Added support to specify the chunk size in Bytes of a time-series on create, add, incrby, and decrby methods (#61)
    • Allow user to pass connection on client setup (#57)

    Fixes

    • Fixed chunkSize addition to TS.INFO ( backwards compatible ) (#59)
  • v0.8.0(Feb 19, 2020)

    • Uncompressed chunks supported #42
    • 0 time sample now included #43
    • Update get/mget return value + with_labels #44
    • Add timestamp, remove time bucket from incr/decrby #47
    • Fix issue 49 get/mget with empty series #50
  • v0.7.0(Jan 14, 2020)

  • v0.6.2(Jan 5, 2020)

  • v0.6.1(Dec 26, 2019)

  • v0.6.0(Dec 20, 2019)

  • v0.5.0(Dec 16, 2019)

    • Update client.py add more aggregation functions ('std.p', 'std.s', 'var.p', 'var.s') (#18)
    • Change return type of range values to float from string (#26)
  • v0.2.0(May 7, 2019)

Owner
Time Series database over Redis by Redis