Ransomware leak site monitoring

RansomWatch

RansomWatch is a ransomware leak site monitoring tool. It scrapes the entries on various ransomware leak sites, stores the data in a SQLite database, and sends notifications via Slack or Discord when a new victim shows up or when a victim is removed.

Configuration

In config_vol/, copy config.sample.yaml to config.yaml and add the following (a sample layout is sketched after this list):

  • Leak site URLs. I decided not to make this list public to avoid giving these sites even more notoriety, so if you have the URLs, add them in. If not, this tool isn't for you.
  • Notification destinations. RansomWatch currently supports notifying via the following:
    • Slack: Follow these instructions to add a new app to your Slack workspace and add the webhook URL to the config.
    • Discord: Follow these instructions to add a new app to your Discord server and add the webhook URL to the config.
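For reference, a config.yaml skeleton might look something like the following. This is a hypothetical sketch: the top-level slack key and its workspace layout are inferred from the tracebacks in the issues below, config.sample.yaml remains the authoritative schema, and every value here is a placeholder.

# Hypothetical sketch only; copy the real structure from config.sample.yaml.
sites:
  conti: http://example1234567890.onion/   # placeholder; real leak site URLs are not published

slack:
  my-workspace:
    webhook: https://hooks.slack.com/services/XXX/YYY/ZZZ

discord:
  my-server:
    webhook: https://discord.com/api/webhooks/XXX/YYY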

Additionally, there are a few environment variables you may need to set:

  • RW_DB_PATH: Path for the SQLite database to use
  • RW_CONFIG_PATH: Path to the config.yaml file

These are both set in the provided docker-compose.yml.
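Outside of Docker, set them in your shell (as in the command-line example below). Inside the app they are presumably read along these lines (a minimal sketch, not the actual startup code):

import os

# Hypothetical sketch of how the app consumes these variables at startup.
DB_PATH = os.environ["RW_DB_PATH"]          # e.g. /db/ransomwatch.db
CONFIG_PATH = os.environ["RW_CONFIG_PATH"]  # e.g. /config/config.yaml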

Usage

This is intended to be run in Docker via a cron job, at whatever interval you choose.

First, build the container: docker-compose build app

Then, add it to your crontab. Example crontab entry (running every 8 hours):

0 */8 * * * cd /path/to/ransomwatch && docker-compose up --abort-on-container-exit

If you'd prefer, you can use the image published on Docker Hub (captaingeech/ransomwatch) instead, with a docker-compose.yml that looks something like this:

version: "3"

services:
  app:
    image: captaingeech/ransomwatch:latest
    depends_on:
      - proxy
    volumes:
      - ./db_vol:/db
      - ./config_vol:/config
    environment:
      PYTHONUNBUFFERED: 1
      RW_DB_PATH: /db/ransomwatch.db
      RW_CONFIG_PATH: /config/config.yaml

  proxy:
    image: captaingeech/tor-proxy:latest

This can also be run via the command line, but that requires you to have your own Tor proxy (with the control service) running. Example execution:

$ RW_DB_PATH=./db_vol/ransomwatch.db RW_CONFIG_PATH=./config_vol/config.yaml python3 src/ransomwatch.py
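The bundled captaingeech/tor-proxy image opens a SOCKS listener on 9050 and the control service on 9051 (see the logs in the issues below). If you run your own Tor daemon instead, a minimal torrc would need roughly the following (a sketch, assuming those default ports):

# Minimal torrc sketch: SOCKS proxy plus control service.
SocksPort 0.0.0.0:9050
ControlPort 0.0.0.0:9051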

Example Slack Messages

Slack notification for new victim

Slack notification for removed victim

Slack notification for site down

Slack notification for an error

The messages sent to Discord are very similar in style and identical in content.

Leak Site Implementations

The following leak sites are (planned to be) supported:

  • Conti
  • MAZE
  • Egregor
  • Sodinokibi/REvil
  • DoppelPaymer (captcha; probably won't be supported for a while)
  • NetWalker
  • Pysa
  • Avaddon
  • DarkSide
  • CL0P
  • Nefilim
  • Mount Locker
  • Suncrypt
  • Everest
  • Ragnarok
  • Ragnar_Locker
  • BABUK LOCKER
  • Pay2Key
  • Cuba
  • RansomEXX
  • Ranzy Locker
  • Astro Team
  • LV

If there are other leak sites you want implemented, feel free to open a PR or DM me on Twitter, @captainGeech42.

Comments
  • Pysa timestamp format change

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/pysa.py", line 38, in scrape_victims
        published_dt = datetime.strptime(
      File "/usr/local/lib/python3.9/_strptime.py", line 568, in _strptime_datetime
        tt, fraction, gmtoff_fraction = _strptime(data_string, format)
      File "/usr/local/lib/python3.9/_strptime.py", line 349, in _strptime
        raise ValueError("time data %r does not match format %r" %
    ValueError: time data '22/03/21' does not match format '%m/%d/%y'
    
    opened by captainGeech42 4
  • Something broken with REvil

    app_1    | 2021/04/20 18:36:25 [ERROR] Got an error while scraping REvil, notifying
    app_1    | 2021/04/20 18:36:25 [ERROR] Error sending Discord notification (400): {"embeds": ["0"]}
    app_1    | 2021/04/20 18:36:25 [ERROR] Failed to send error notification to Discord guild "test-discord"
    app_1    | 2021/04/20 18:36:25 [ERROR] Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | http.client.RemoteDisconnected: Remote end closed connection without response
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
    app_1    |     resp = conn.urlopen(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
    app_1    |     retries = retries.increment(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 532, in increment
    app_1    |     raise six.reraise(type(error), error, _stacktrace)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 734, in reraise
    app_1    |     raise value.with_traceback(tb)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/app/ransomwatch.py", line 52, in main
    app_1    |     s.scrape_victims()
    app_1    |   File "/app/sites/revil.py", line 62, in scrape_victims
    app_1    |     r = p.get(f"{self.url}?page={i}", headers=self.headers)
    app_1    |   File "/app/net/proxy.py", line 101, in get
    app_1    |     return self.session.get(*args, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    app_1    |     return self.request('GET', url, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    app_1    |     resp = self.send(prep, **send_kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    app_1    |     r = adapter.send(request, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    app_1    |     raise ConnectionError(err, request=request)
    app_1    | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    | 2021/04/20 18:36:25 [INFO] Finished all sites, exiting
    

    not sure what's going on. similar error w/ slack
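    The RemoteDisconnected at the bottom of the chain means the Tor circuit (or the onion service itself) dropped the connection mid-request, and the Discord 400 on top of it looks like the error-notification payload was malformed. Transient drops like this could probably be absorbed with session-level retries; a hypothetical sketch, not part of the codebase:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    # Retry transient connection drops with exponential backoff before
    # giving up and sending an error notification.
    retry = Retry(total=3, backoff_factor=2)
    session = requests.Session()
    session.mount("http://", HTTPAdapter(max_retries=retry))
    session.mount("https://", HTTPAdapter(max_retries=retry))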

    bug 
    opened by captainGeech42 3
  • Conti - Scraping Error

    Describe the bug

    Error Message Below:

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/conti.py", line 56, in scrape_victims
        last_li = page_list.find_all("li")[-1]
    AttributeError: 'NoneType' object has no attribute 'find_all'

    To Reproduce

    This error has happened several times over the last 24 hours while ransomwatch has been run on a cron job.

    Expected behavior

    Parse the contents of the Conti site with no errors, or have additional error handling built in to handle this error.
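    That handling could be as simple as guarding the soup.find() result before chaining into find_all(); a hypothetical sketch (the selector is a placeholder, not the real one from sites/conti.py):

    page_list = soup.find("ul", class_="pagination")  # placeholder selector
    if page_list is None:
        raise RuntimeError("Conti: page list not found; site layout may have changed")
    last_li = page_list.find_all("li")[-1]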

    Environment

    • OS: Ubuntu 20.04
    • How you are running it: Docker via a cron job (following the README's recommended setup)

    opened by GRIT-5ynax 2
  • Dockerhub image out of date

    Running the Dockerhub image results in:

    app_1 | Traceback (most recent call last):
    app_1 |   File "/app/ransomwatch.py", line 98, in <module>
    app_1 |     NotificationManager.send_error_notification(f"Non-scraping failure", tb, fatal=True)
    app_1 |   File "/app/notifications/manager.py", line 30, in send_error_notification
    app_1 |     for workspace, params in Config["slack"].items():
    app_1 | KeyError: 'slack'

    It works if the image is built locally.

    bug 
    opened by nhova 2
  • New sites

    • [x] Ranzy
    • [x] Astro
    • [x] Pay2Key
    • [x] Cuba
    • [x] RansomEXX
    • [x] Mount Locker
    • [x] Ragnarok
    • [ ] Ragnar Locker
    • [x] Suncrypt
    • [x] Everest
    • [x] Nefilim
    • [x] CL0P
    • [x] Pysa
    opened by captainGeech42 2
  • New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    This pull request adds support for BLACKMATTER, ARVIN, EL COMETA, LORENZ, XING, LOCKBIT.

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc.
    • [x] The data going into the DB is properly parsed and is accurate
    enhancement 
    opened by x-originating-ip 1
  • cl0p scraper broken

    Describe the bug Cl0p scraper out of date

    Logs

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/cl0p.py", line 21, in scrape_victims
        victim_list = soup.find("div", class_="collapse-section").find_all("li")
    AttributeError: 'NoneType' object has no attribute 'find_all'
    

    should probably just update this to the v3 site as well

    bug 
    opened by captainGeech42 1
  • Enhance pysa datetimes processing (#50)

    Describe the changes

    Adds some logic to pysa.py to process the datetimes better. Exception handling has also been added to avoid crashing the script.
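    A lenient parser along these lines would cover this (a hypothetical sketch using the dateparser library, which the v1.2 changelog says the project later adopted for date parsing; the DMY order is an assumption based on the failing '22/03/21' value):

    import logging
    import dateparser

    def parse_pysa_timestamp(raw: str):
        # Pysa dates have shown up in inconsistent formats (even 00/00/00),
        # so parse leniently and tolerate failure instead of crashing.
        dt = dateparser.parse(raw, settings={"DATE_ORDER": "DMY"})
        if dt is None:
            logging.warning("couldn't parse timestamp: %s", raw)
        return dt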

    Related issue(s)

    #50

    How was it tested?

    Before: scraping failed at some point if pysa was defined in the YAML config file (see related issue).

    Now:

    • [x] scraping works
    • [x] dates look plausible (we don't know the true values, so we can only judge that they look reasonable)
    • [x] the script no longer crashes, thanks to the try/except handling
    opened by biligonzales 1
  • Handle missing notifications element in the yaml config file (#52)

    Describe the changes

    Made minor changes to manager.py so that it does not complain if we do not want to configure notifications. Basically, the presence of the notifications element in the YAML config is now tested.
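    The guard might look roughly like this (a hypothetical sketch; the real change is in the PR diff):

    # Iterate workspaces only when the section exists, so an absent or empty
    # notifications config no longer raises KeyError: 'slack'.
    for workspace, params in (Config.get("slack") or {}).items():
        send_slack_notification(workspace, params)  # hypothetical helper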

    Related issue(s)

    #52

    How was it tested?

    • [x] Docker started with an empty notifications element
    • [x] Docker started without any notifications element
    opened by biligonzales 1
  • Unable to run without configured notifications

    Currently, the notifications section of config.yaml must be present and configured to avoid errors at runtime. It would be great to be able to leave the notifications section empty (or omit it from the YAML config entirely).

    opened by biligonzales 1
  • Conti: scraper fixed (#73)

    Describe the changes

    Fixed the Conti scraper to use the newsList JavaScript object, because the HTML elements were no longer available.
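    Extracting structured data from an inline script block might look roughly like this (a hypothetical sketch; the variable name newsList comes from the description above, everything else is assumed):

    import json
    import re

    # Pull the newsList array out of an inline <script> block. A JS object
    # literal isn't always valid JSON, so this assumes JSON-compatible data.
    match = re.search(r"newsList\s*=\s*(\[.*?\])\s*;", html, re.DOTALL)
    if match is None:
        raise RuntimeError("Conti: newsList not found; site layout may have changed")
    victims = json.loads(match.group(1))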

    Related issue(s)

    This fixes issue #73

    How was it tested?

    1. Add Conti url to config.yaml
    2. Run docker-compose build app
    3. Run docker-compose up --abort-on-container-exit
    4. Conti results are pushed again in the database

    Checklist for a new scraper (delete if N/A)

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc. (there was one logging.debug there, I left it as it was)
    • [x] The data going into the DB is properly parsed and is accurate
    opened by biligonzales 0
  • Lockbit scraper fixed (now uses playwright) #74

    Describe the changes

    Lockbit 2.0 now uses a DDoS protection mechanism, so the regular HTTP GET method no longer works.

    As a workaround, I have implemented Microsoft's Playwright library, which behaves as if a proper browser made the request.

    Summary of the changes:

    1. lockbit.py: replaced the use of requests by playwright
    2. requirements.txt: added playwright
    3. Dockerfile: added playwright chromium support as well as required libraries.

    I have also upgraded the base image at the top of the Dockerfile from python3.9-buster to python3.10-bullseye.
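    The retrieval change boils down to something like the following (a sketch; the proxy address and wait condition are assumptions, not the PR's exact code):

    from playwright.sync_api import sync_playwright

    def fetch_page(url: str) -> str:
        with sync_playwright() as p:
            # Headless Chromium through the Tor SOCKS proxy, so the request
            # looks like a real browser and passes the DDoS check.
            browser = p.chromium.launch(proxy={"server": "socks5://proxy:9050"})
            page = browser.new_page()
            page.goto(url, wait_until="networkidle")
            html = page.content()
            browser.close()
        return html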

    Related issue(s)

    It fixes Issue #74

    Note that the scraping engine for Lockbit has been left untouched, as it still works perfectly; only the web page retrieval method has been altered.

    How was it tested?

    • [x] docker-compose build app
    • [x] docker-compose up --abort-on-container-exit
    • [x] Checked that Lockbit entries have been inserted into the database
    opened by biligonzales 3
  • new victims monitoring is broken, alert only when sites are down

    Describe the bug

    The app doesn't alert when new victims are added to the ransomware sites (we noticed that new victims are being added on some of the sites); we only get alerts when the sites are down.

    Expected behavior

    The app alerts when new victims are added to the ransomware sites being monitored.

    Logs

    Starting ransomwatch_proxy_1 ... done
    Starting ransomwatch_app_1 ... done
    Attaching to ransomwatch_proxy_1, ransomwatch_app_1
    proxy_1 | Feb 07 14:50:31.819 [notice] Tor 0.4.5.7 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1i, Zlib 1.2.11, Liblzma 5.2.5, Libzstd 1.4.5 and Unknown N/A as libc.
    proxy_1 | Feb 07 14:50:31.822 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://www.torproject.org/download/download#warning
    proxy_1 | Feb 07 14:50:31.822 [notice] Read configuration file "/etc/tor/torrc".
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Socks listener on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Socks listener connection (ready) on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Control listener on 0.0.0.0:9051
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Control listener connection (ready) on 0.0.0.0:9051
    app_1 | 2022/02/07 14:50:33 [INFO] Initializing
    app_1 | 2022/02/07 14:50:33 [INFO] Found 30 sites
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Avaddon
    app_1 | 2022/02/07 14:50:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Conti
    app_1 | 2022/02/07 14:50:38 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:48 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] Finished Conti
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for DarkSide
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for REvil
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for Babuk
    app_1 | 2022/02/07 14:51:50 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:51 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] Finished Babuk
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Ranzy
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Astro
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Pay2Key
    app_1 | 2022/02/07 14:51:53 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:54 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] Finished Pay2Key
    app_1 | 2022/02/07 14:51:54 [INFO] Starting process for Cuba
    app_1 | 2022/02/07 14:51:57 [INFO] This is the first scrape for Cuba, no victim notifications will be sent
    app_1 | 2022/02/07 14:51:57 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:08 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] Finished Cuba
    app_1 | 2022/02/07 14:52:08 [INFO] Starting process for RansomEXX
    app_1 | 2022/02/07 14:52:10 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:13 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] Finished RansomEXX
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Mount
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnarok
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnar
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Suncrypt
    app_1 | 2022/02/07 14:52:15 [INFO] This is the first scrape for Suncrypt, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:15 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:17 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] Finished Suncrypt
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Everest
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Nefilim
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Cl0p
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Pysa
    app_1 | 2022/02/07 14:52:19 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:23 [WARNING] couldn't parse timestamp: 00/00/00
    app_1 | /usr/local/lib/python3.9/site-packages/dateparser/date_parser.py:35: PytzUsageWarning: The localize method is no longer necessary, as this time zone supports the fold attribute (PEP 495). For more details on migrating to a PEP 495-compliant implementation, see https://pytz-deprecation-shim.readthedocs.io/en/latest/migration.html
    app_1 |   date_obj = stz.localize(date_obj)
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:24 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] Finished Pysa
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Hive
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lockbit
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Xing
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lorenz
    app_1 | 2022/02/07 14:52:26 [INFO] This is the first scrape for Lorenz, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:26 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:27 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] Finished Lorenz
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for ElCometa
    app_1 | 2022/02/07 14:52:27 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for Arvin
    app_1 | 2022/02/07 14:52:30 [INFO] This is the first scrape for Arvin, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:30 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:33 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] Finished Arvin
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Blackmatter
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Avoslocker
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for LV
    app_1 | 2022/02/07 14:52:35 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:37 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] Finished LV
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for Marketo
    app_1 | 2022/02/07 14:52:37 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for LockData
    app_1 | 2022/02/07 14:52:40 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:42 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] Finished LockData
    app_1 | 2022/02/07 14:52:42 [INFO] Starting process for Rook
    app_1 | 2022/02/07 14:52:42 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:42 [INFO] Finished all sites, exiting

    Environment

    • OS: Ubuntu 20.04.3
    • How you are running it: Docker with a cron job
    opened by Deventual 1
  • Victim removal detection doesn't work properly when onion changes

    Victim removal detection currently keys on the full URL, which includes the onion domain. One side effect is that whenever the onion address for a site changes, all of the victims are considered removed and then new on the next scrape, which is problematic.

    Change this to just use the URI + site ID.
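    Something like this would keep the key stable across onion rotations (a hypothetical sketch):

    from urllib.parse import urlparse

    def victim_key(site_id: str, victim_url: str) -> str:
        # Key on the URI path plus the site ID, dropping the onion hostname,
        # so a rotated onion address doesn't mark every victim removed/new.
        return f"{site_id}:{urlparse(victim_url).path}"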

    bug 
    opened by captainGeech42 0
  • LOCKBIT 2.0 Support

    Site Info (no URL)

    LOCKBIT 2.0 was released some time ago. It should be confirmed whether the scraper works with the new site, or the module should be rewritten.

    Is the site currently online? Yes

    opened by wersas1 5
Releases (v1.2)
  • v1.2 (Dec 4, 2021)

    This release fixes a few different bugs on the following scrapers:

    • Ragnar
    • Lorenz
    • Pysa
    • Arvin

    What's Changed

    • fixed #79 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/80
    • fixed #76 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/81
    • fixed #77, changed dateparsing to use lib by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/82
    • changed arvin date parsing to use lib (fixes #75) by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/83

    Full Changelog: https://github.com/captainGeech42/ransomwatch/compare/v1.1...v1.2

  • v1.1 (Dec 2, 2021)

    Ransomwatch v1.1

    This release adds support for many new sites and includes a critical security update. For details on the security update, see here.

    Supported Sites

    This release supports the following shame sites:

    • Conti
    • Sodinokibi/REvil
    • Pysa
    • Avaddon
    • DarkSide
    • CL0P
    • Nefilim
    • Mount Locker
    • Suncrypt
    • Everest
    • Ragnarok
    • Ragnar_Locker
    • BABUK LOCKER
    • Pay2Key
    • Cuba
    • RansomEXX
    • Ranzy Locker
    • Astro Team
    • BlackMatter
    • Arvin
    • El_Cometa
    • Lorenz
    • Xing
    • Lockbit
    • AvosLocker
    • LV
    • Marketo
    • Lockdata
  • v1.0 (Apr 18, 2021)

    v1.0 Ransomwatch Release

    This initial version of Ransomwatch supports the following sites:

    • Conti
    • REvil/Sodinokibi
    • Avaddon
    • DarkSide

    This release supports notifying via:

    • Slack Webhooks

    More sites/notification capabilities will be added over time. However, this release has been tested in a production capacity and should be suitable to start collections.

    If you find any bugs or run into any problems, please open an issue to help improve Ransomwatch.

Owner
Zander Work
@osusec / @OSU-SOC