SIEM Logstash parsing for more than a hundred technologies

Overview

LogIndexer Pipeline

Logstash Parsing Configurations for Elasticsearch SIEM and Open Distro for Elasticsearch SIEM

Why this project exists

The overhead of implementing Logstash parsing and applying Elastic Common Schema (ECS) across audit, security, and system logs can be a large drawback when using Elasticsearch as a SIEM (Security Information and Event Management). The Cargill SIEM team has spent significant time on developing quality Logstash parsing processors for many well-known log vendors and wants to share this work with the community. In addition to Logstash processors, we have also included log collection programs for API-based log collection, as well as the setup scripts used to generate our pipeline-to-pipeline architecture.

Quick start Instructions

"Quick start" mostly depends on how your Logstash configuration is set up. If you have your own setup already established, it might be best to use the processors that apply to your organization's log collection (found in the "config" directory). If you are seeking to use the architecture in this repo, consult the README found in the build_scripts directory. We will be adding an elaborate setup guide soon.
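For orientation, the pipeline-to-pipeline layout the build scripts generate can be sketched in a pipelines.yml along these lines (the pipeline IDs and paths here are illustrative, not the exact ones the scripts emit):

```
# pipelines.yml -- illustrative sketch of a pipeline-to-pipeline layout
- pipeline.id: input_syslog
  path.config: "/etc/logstash/conf.d/input_syslog.conf"
- pipeline.id: processor_cisco_aci
  path.config: "/etc/logstash/conf.d/processors/syslog_audit_cisco.aci.conf"
- pipeline.id: enrichments
  path.config: "/etc/logstash/conf.d/enrichments/*.conf"
```

Inputs forward events with the pipeline output plugin (`pipeline { send_to => [...] }`) and downstream pipelines receive them with `pipeline { address => ... }`.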

Contributions

We welcome and encourage individual contributions to this repo. Please see the Contribution.md guide in the root of the repo. Please note that we reserve the right to close pull requests or issues that appear to be out of scope for our project, or for other reasons not specified.

Questions, Comments & Expected Level of Attention

Please create an issue and someone will try to respond to it within 5 business days. However, it should be noted that while we will try to revisit the repository semi-regularly, we are not beholden to this response time (life happens). We welcome other individuals' answers and input as well.

Licensing

Apache-2.0

Comments
  • improved cisco ACI processor

    Improved the Cisco ACI processor with the following changes:

    1. simplified grok parsing
    2. removed complex logic used to detect event and error messages
    3. fixed broken parsing of the hostname of the device sending the logs
    4. [tmp][rule] does NOT represent a username; it is instead the event.reason as described by Cisco: "The action or condition that caused the event, such as a component failure or a threshold crossing."

    Sample messages used for testing:

    <186>Dec 08 21:20:20.614 ABC-DCA-NPRD-ACILEF-104 %LOG_LOCAL7-2-SYSTEM_MSG [F0532][raised][interface-physical-down][critical][sys/phys-[eth1/47]/phys/fault-F0532] Port is down, reason being suspended(no LACP PDUs)(connected), used by EPG on node 104 of fabric ACI Fabric1 with hostname ABC-DCA-NPRD-ACILEF-104
    
    <190>Nov 24 18:20:53.237 ABC-DCB-ACIAPC-003 %LOG_LOCAL7-6-SYSTEM_MSG [E4206143][transition][info][fwrepo/fw-aci-apic-dk9.5.2.6e] Firmware aci-apic-dk9.5.2.6e created
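    For context, the bracketed header in those messages can be pulled apart with a grok along these lines (a sketch with illustrative field names, not the actual pattern in this PR; note the second sample carries one fewer bracketed token, hence the optional group):

    ```
    grok {
      match => {
        "message" => "\[%{DATA:[tmp][code]}\]\[%{DATA:[tmp][lifecycle]}\](?:\[%{DATA:[event][reason]}\])?\[%{DATA:[log][level]}\]\[%{DATA:[tmp][dn]}\] %{GREEDYDATA:[tmp][description]}"
      }
    }
    ```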
    
    opened by anubisg1 3
  • [Help / Documentation] - how to classify incoming syslog messages

    As per the title, how would we classify incoming syslog messages so that they end up in the proper processor pipeline?

    Let's take a common use case where the network has Cisco IOS routers and switches, Cisco ACI, Cisco WLC and ISE, then Checkpoint firewalls, F5 load balancers, etc.

    Generally those devices would all send logs to the syslog server on port 514, but how would we classify where each message is coming from in order to send it to the specific processor?

    Are we supposed to set up a different input queue for each processor (for example, a different port on the syslog server, so that, for example, ACI goes to 192.168.10.10 port 5514 while Checkpoint goes to port 5515)?

    Or is there an IP filter that says: if the source IP is X, send to the ACI processor; if Y, send to the Checkpoint processor?

    Or what other options are there?
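    For what it's worth, both approaches mentioned above can be expressed in Logstash; a rough sketch of each (ports, addresses, and pipeline names are made up for illustration):

    ```
    # Option A: one listening port per technology, forwarded pipeline-to-pipeline
    input  { syslog { port => 5514 } }   # ACI devices log here
    output { pipeline { send_to => ["cisco_aci_processor"] } }

    # Option B: a single port 514, routed on the sender's address
    output {
      if [host][ip] == "10.0.0.21"      { pipeline { send_to => ["cisco_aci_processor"] } }
      else if [host][ip] == "10.0.0.22" { pipeline { send_to => ["checkpoint_processor"] } }
    }
    ```

    Whether the sender address lands in [host][ip] or another field depends on the input plugin and its ECS compatibility mode.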

    question 
    opened by anubisg1 2
  • host split enrichment error

    For certain hostnames, the host split enrichment is causing the pipeline to be blocked until grok times out.

    [2022-06-10T15:54:58,563][WARN ][org.logstash.plugins.pipeline.PipelineBus][processor] Attempted to send event to 'enrichments' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.
    [2022-06-10T15:57:22,451][WARN ][logstash.filters.grok ][enrichments] Timeout executing grok '^(?<[host][tmp]>.*?)\.(?<[host][domain]>.*?)$' against field '[host][hostname]' with value 'abc-name123-xyz.domain.com'!
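    For reference, a pattern that anchors on the first dot instead of using lazy wildcards avoids the pathological backtracking (a sketch, not the fix that was shipped):

    ```
    grok {
      match => { "[host][hostname]" => "^(?<[host][tmp]>[^.]+)\.(?<[host][domain]>.+)$" }
      tag_on_failure => ["host_split_failure"]
      timeout_millis => 1000
    }
    ```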

    opened by nnovaes 2
  • Fix deprecation warnings

    User Story - details

    For translate, we should use source and target instead of field and destination. On boot, Logstash 15 shows these warnings:

    [2021-11-09T16:53:33,518][WARN ][logstash.filters.translate] You are using a deprecated config setting "destination" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `target` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    [2021-11-09T16:53:33,519][WARN ][logstash.filters.translate] You are using a deprecated config setting "field" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `source` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    

    At least up to Logstash 13, the new option names are not supported, so let's make this change when we upgrade.
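    Once on a version where the new names are supported, the change itself is mechanical (the field names and dictionary path below are illustrative):

    ```
    # before (deprecated option names)
    translate {
      field           => "[destination][port]"
      destination     => "[destination][port_name]"
      dictionary_path => "/etc/logstash/lookups/ports.yml"
    }

    # after
    translate {
      source          => "[destination][port]"
      target          => "[destination][port_name]"
      dictionary_path => "/etc/logstash/lookups/ports.yml"
    }
    ```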

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    << Any related code here... >>
    
    opened by KrishnanandSingh 2
  • native vlan mismatch and other improvements

    Description

    • Parsing for Native VLAN mismatch error messages 2021-10-14T13:28:06.497Z {name=abc.com} <188>132685: Oct 14 21:28:07.975 GMT: %CDP-4-NATIVE_VLAN_MISMATCH: Native VLAN mismatch discovered on FastEthernet0/1 (1), with xyz GigabitEthernet1/0/1 (36).
    • Lowercase [actual_msg] field
    • fix typo on timestamp
    • add the timezone to [tmp][devicetimestamp]
    • removed the old parser code for native vlan mismatch
    • removed a catch all condition in the old parser
    • lowercase [rule.category]
    opened by nnovaes 2
  • Feature Request: Add known applications + risk score field based off destination.port fields

    User Story - details

    As a SIEM engineer, I want to know the applications associated with the destination.port field. This will allow me to quickly identify potential applications communicating on the session and also the risk of the traffic I'm observing.

    Tasks

    • Create a port lookup translation.
    • Add risk category score to application (scale of 1-10 or severity name).

    Examples:

    3389 -> Remote Desktop Protocol (high risk)
    22 -> Secure Shell (high risk)
    3306 -> MySQL (medium risk)
    6881-6889 -> BitTorrent (high risk)
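    A translate filter could implement the lookup roughly like this (field names and the inline dictionary are illustrative; a dictionary_path file would scale better for a full port list, and destination.port may need to be a string for the keys to match):

    ```
    translate {
      source => "[destination][port]"
      target => "[destination][application]"
      exact  => true
      dictionary => {
        "3389" => "Remote Desktop Protocol (high risk)"
        "22"   => "Secure Shell (high risk)"
        "3306" => "MySQL (medium risk)"
      }
      fallback => "unknown"
    }
    ```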
    
    opened by ryanpodonnell1 2
  • Cisco IOS (cisco.router and cisco.switch) new rules

    Description

    New parsing rules for cisco.router and cisco.switch. The old version of this processor needs some rework; however, there are functioning bits of it that I have preserved, since they mostly work. The new rules provide a good foundation for future "full" parsing and also cover BGP and interface up/down messages. The lookup database for the translate filters is static.

    opened by nnovaes 2
  • Update syslog_log_security_sdwan.app.conf

    Description

    These updates correct the assignment of Versa fields to the ECS model. They also add back Versa-specific fields that do not map to ECS into a separate [labels][all] field that works like tags. I couldn't find a clean way to implement it without using the add_tag command, so I saved the event tags to another field and then restored them afterwards.

    @Akhila-Y please review as well.

    opened by nnovaes 1
  • added space, testing new IDE

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added new ECS fields to .csv file

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added missing fields for coverage reporting to aws cloudtrail

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • [[enrichments]>worker22] ruby - Ruby exception occurred: no implicit conversion of nil into String

    Describe the bug

    [ERROR] 2022-11-26 07:54:05.540 [[enrichments]>worker14] ruby - Ruby exception occurred: no implicit conversion of nil into String {:class=>"TypeError", :backtrace=>["(ruby filter code):68:in `block in filter_method'", "org/jruby/RubyArray.java:1865:in `each'", "(ruby filter code):67:in `block in filter_method'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:96:in `inline_script'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:89:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178:in `block in multi_filter'", "org/jruby/RubyArray.java:1865:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:301:in `block in start_workers'"]}
    

    X-Reference issues

    (Cross reference any user stories that this bug might be affecting)

    Steps To Reproduce

    Start the enrichment pipeline. I'm using Logstash 8.5.2.

    Expected behavior

    No error should be seen.

    Additional context

    The following components in enrichment make use of the ruby filter, but I don't understand which one is the culprit:

    ./02_ecs_data_type.conf
    ./04_timestamp.conf
    ./11_related_hosts.conf
    ./12_related_user.conf
    ./13_related_ip.conf
    ./14_related_hash.conf
    ./16_related_mac.conf
    ./93_mitre.conf
    ./94_remove_empty_n_truncate.conf
    
    bug wontfix 
    opened by anubisg1 2
  • cisco processor fails because of missing hostname and lowercase date

    I'm working with syslog_audit_cisco.switch.conf and I found the following issues:

    1. The syslog message format is assumed here https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L52 to be
      # {timesdtamp} {facility} {severity} {mnemonic} {description}
      # seq no:timestamp: %facility-severity-MNEMONIC:description
    

    In reality, most people would configure "logging origin-id hostname", which changes the log format into

      # {hostname} {timesdtamp} {facility} {severity} {mnemonic} {description}
      # seq no: hostname: timestamp: %facility-severity-MNEMONIC:description
    
    2. The parser at line https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L33 is modifying the hostname field before that field is parsed (maybe this is assumed to come from Kafka, instead of being taken from the logs?).

    3. At line https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L48 the message is converted to lower case, but that causes date parse failures later on, because of the case mismatch.

    "Nov 17 11:44:46.490 UTC" matches, but when I have "nov 17 11:44:46.490 utc" it fails on the date parsing here: https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L77

    Sample log entry for reference:

    <14>4643: Switch-core01: Nov 17 11:44:46.490 UTC: %LINK-3-UPDOWN: Interface GigabitEthernet1/0/27, changed state to up
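    A grok that tolerates the optional origin-id hostname might look roughly like this (a sketch with illustrative field names, assuming the syslog priority has already been stripped; CISCOTIMESTAMP and TZ are standard grok patterns):

    ```
    grok {
      match => {
        "message" => "^(?:%{POSINT:[tmp][seq]}: )?(?:%{HOSTNAME:[host][hostname]}: )?%{CISCOTIMESTAMP:[tmp][device_timestamp]}(?: %{TZ:[tmp][tz]})?: %%{DATA:[tmp][facility]}-%{INT:[tmp][severity]}-%{DATA:[tmp][mnemonic]}: %{GREEDYDATA:[tmp][description]}$"
      }
    }
    ```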

    opened by anubisg1 0
  • GeoLitePrivate2-City.mmdb doesn't exist

    To use the geoip enrichment, you need two files, specifically:

              database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
              database => "/mnt/s3fs_geoip/GeoLitePrivate2-City.mmdb"
    

    Unfortunately, it seems GeoLitePrivate2-City.mmdb doesn't exist anywhere on the internet, and MaxMind only provides:

    • GeoLite2-ASN.mmdb
    • GeoLite2-City.mmdb
    • GeoLite2-Country.mmdb

    I'd expect either that more information on where to find GeoLitePrivate2-City.mmdb is added to the documentation, or that the enrichment pipeline is updated to function without that file.
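    In the meantime, a workaround is to keep only the public database and drop the private lookup, e.g. (the source field here is illustrative; the database path is as in the repo):

    ```
    geoip {
      source   => "[source][ip]"
      database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
    }
    # GeoLitePrivate2-City.mmdb appears to be a custom-built database for
    # internal address space; without an equivalent, remove that geoip block
    # or build your own .mmdb (e.g. with MaxMind's mmdb writer tooling).
    ```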

    documentation question 
    opened by anubisg1 3
  • Validate ECS fields

    User Story - details

    There should be an enrichment checking that only permitted values are stored in ECS fields that have a predefined set of values, so those fields can be compliant with ECS. See https://www.elastic.co/guide/en/ecs/1.9/ecs-event.html for more info. I believe event.xyz are the only fields that have their values defined. If that's true the sample code below should take care of doing this validation.

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    The sample configuration below takes the event.type value that came from the processors and populates ecs_status with either valid or event.type-invalid_field_value. Therefore, if ecs_status is not valid, a tag containing event.type-invalid_field_value will be added. For example, if event.type is "process", since "process" is not among the allowed values for event.type, a tag event.type-invalid_field_value: process will be added.

     translate {
                field => "event.type"
                dictionary => [
                "access", "valid", 
                "admin", "valid", 
                "allowed", "valid", 
                "change", "valid", 
                "connection", "valid", 
                "creation", "valid", 
                "deletion", "valid", 
                "denied", "valid", 
                "end", "valid", 
                "error", "valid", 
                "group", "valid", 
                "info", "valid", 
                "installation", "valid", 
                "protocol", "valid", 
                "start", "valid", 
                "user", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.type-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.type}" ]
                remove_field => [ "ecs_status", "event.type"]
            }
        }
    
        #EVENT.CATEGORY
        translate {
                field => "event.category"
                dictionary => [
                "authentication", "valid", 
                "configuration", "valid", 
                "driver", "valid", 
                "database", "valid", 
                "file", "valid", 
                "host", "valid", 
                "iam", "valid", 
                "intrusion_detection", "valid", 
                "malware", "valid", 
                "network", "valid", 
                "package", "valid", 
                "process", "valid", 
                "web", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.category-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.category}" ]
                remove_field => [ "ecs_status", "event.category"]
    
            }
        }
    
        # event.kind
         translate {
                field => "event.kind"
                dictionary => [
                "alert", "valid", 
                "event", "valid", 
                "metric", "valid", 
                "state", "valid", 
                "pipeline_error", "valid", 
                "signal", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.kind-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.kind}" ]
                remove_field => [ "ecs_status", "event.kind"]
    
            }
        }
    
    
        # event.outcome
         translate {
                field => "event.outcome"
                dictionary => [
                "failure", "valid", 
                "success", "valid", 
                "unknown", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.outcome-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.outcome}" ]
                remove_field => [ "ecs_status", "event.outcome"]
    
            }
        }
    
    opened by nnovaes 0
Releases(v0.1-beta)
  • v0.1-beta(May 19, 2021)

    This release lacks elaborate usage documentation, so we are marking it as beta. Users can still work with it by going through the Python script. Documentation will be added soon.

    Source code(tar.gz)
    Source code(zip)
Owner
Working to nourish the world. Committed to helping the world thrive