Automating Graylog Pipelines

Written by Whitney Champion | Jun 18, 2019 8:02:00 PM

Part of our job at Recon involves fine-tuning the threat signatures that make up the bulk of the pipeline rules in our Graylog environment.

Because of this, they are constantly changing, growing, being tuned, and ultimately becoming more effective over time at detecting anomalous and malicious activity.

The rules change frequently, sometimes daily, and keeping track of who changed what and when is important. They are plain text, which makes them perfect candidates for GitHub.

We also don't want the rules that already live in Graylog to drift out of sync with the rules sitting in our GitHub repository.

WE NEEDED TO CI OUR RULES.

CI, or continuous integration, takes each commit to our GitHub repository and kicks off a job that we define. This maintains accountability for who approves and makes changes, gives us revision history, and ensures that what we put in GitHub is what ends up in Graylog.

It also eliminates the manual process of copy/pasting a rule from GitHub into Graylog, and potentially altering it, or even assigning it to the wrong pipelines.

THE SOLUTION.

We decided to harness the power of Python and Ansible to build Graylog Ansible modules (wrappers for various pieces of the Graylog API), which are then used to deploy all of our pipeline rule changes to Graylog via CircleCI.

The modules can be found in our GitHub, and there are currently six of them:[1]

  • graylog_pipelines
  • graylog_streams
  • graylog_users
  • graylog_roles
  • graylog_index_sets
  • graylog_collector_configurations

The one we are focused on here is graylog_pipelines. This module allows us to create and modify pipelines and pipeline rules, and we use it whenever we change the rules in our repository.

LET'S CHECK OUT AN EXAMPLE.

Say I have a pipeline called threats, and a rule called check_src_ip in stage 3 of that pipeline that checks source IPs against known bad IP addresses from various threat data feeds.

In this case, the path to this particular rule might be my_repo/pipelines/threats/stage_3/check_src_ip.
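
For illustration, the source of a rule like check_src_ip might look something like this in Graylog's rule language. This is just a sketch; the lookup table name threat_feed_ips is a hypothetical stand-in for whatever threat data feed you maintain.

rule "check_src_ip"
when
  // only evaluate messages that actually carry a source IP
  has_field("src_ip") &&
  // flag the message if the IP appears in the (hypothetical) threat intel lookup table
  is_not_null(lookup_value("threat_feed_ips", to_string($message.src_ip)))
then
  set_field("threat_detected", true);
end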

Within CircleCI (or your CI tool of choice), I can then kick off a job that runs a script (Bash, Python, you name it) that checks which rules were just added, modified, or deleted in GitHub and loops through that list of rule paths, grabbing the pipeline, stage, and rule name from each.
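
A minimal sketch of such a script, which we might call deploy_rules.sh, is below. The script name, repo layout, and playbook path are assumptions based on the example above, and it only handles added or modified rules:

#!/bin/bash
set -euo pipefail

COMMIT=$(git rev-parse HEAD)

# Rule files added (A) or modified (M) in the latest commit;
# deletions would need their own handling.
git diff --diff-filter=AM --name-only HEAD~1 HEAD -- pipelines/ | while read -r rule_path; do
  # Paths look like: pipelines/<pipeline>/stage_<N>/<rule_name>
  pipeline=$(echo "$rule_path" | cut -d/ -f2)
  stage=$(echo "$rule_path" | cut -d/ -f3 | sed 's/^stage_//')
  rule_name=$(basename "$rule_path")

  ansible-playbook ./.circleci/playbook.yml -e \
    "rule_path=$rule_path rule_pipeline=$pipeline rule_stage=$stage rule_name=$rule_name commit=$COMMIT"
done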

The playbook below, using the graylog_pipelines module, will validate the rule sitting at my_repo/pipelines/threats/stage_3/check_src_ip, create it, and add it to the right stage of the pipeline.

ansible-playbook ~/my_repo/.circleci/playbook.yml -e \
"rule_path=pipelines/threats/stage_3/check_src_ip \
rule_pipeline=threats \
rule_stage=3 \
rule_name=check_src_ip \
commit=c26cb8afd03c423679908392f8a6617a7e7922a2"

playbook.yml:

- name: Get rule info
  set_fact:
    pipeline_name: "{{ rule_pipeline }}"
    stage: "{{ rule_stage }}"
    rule_path: "{{ rule_path }}"
    rule_name: "{{ rule_name }}"
    commit: "{{ commit }}"
    username: "{{ lookup('env', 'GRAYLOG_USERNAME') }}"
    password: "{{ lookup('env', 'GRAYLOG_PASSWORD') }}"
    endpoint: "graylog.mydomain.com"
  no_log: true

- name: Get rule source from file
  set_fact:
    rule_source: "{{ lookup('file', rule_path) }}"

- name: Validate pipeline rule
  graylog_pipelines:
    action: parse_rule
    endpoint: "{{ endpoint }}"
    graylog_user: "{{ username }}"
    graylog_password: "{{ password }}"
    source: "{{ rule_source }}"
  register: validate_rule

- name: Fail if rule does not validate
  fail:
    msg: "Rule does not validate"
  when: validate_rule.status != 200

- name: Create pipeline rule
  graylog_pipelines:
    action: create_rule
    endpoint: "{{ endpoint }}"
    graylog_user: "{{ username }}"
    graylog_password: "{{ password }}"
    title: "{{ rule_name }}"
    description: "Created {{ ansible_date_time.iso8601 }}, Commit ID {{ commit }}"
    source: "{{ rule_source }}"
  register: pipeline_rule

- name: Get pipeline ID from pipeline name
  graylog_pipelines:
    action: query_pipelines
    endpoint: "{{ endpoint }}"
    graylog_user: "{{ username }}"
    graylog_password: "{{ password }}"
    pipeline_name: "{{ pipeline_name }}"
  register: pipeline

- name: Get pipeline
  graylog_pipelines:
    action: list
    endpoint: "{{ endpoint }}"
    graylog_user: "{{ username }}"
    graylog_password: "{{ password }}"
    pipeline_id: "{{ pipeline.json.pipeline_id }}"
  register: pipeline

- name: Create file with pipeline source
  copy:
    content: "{{ pipeline.json.source }}"
    dest: ./pipeline

- name: Make sure stage is in pipeline source
  lineinfile:
    path: ./pipeline
    regexp: '^stage {{ stage }}'
    line: 'stage {{ stage }} match either'
    insertafter: '^pipeline'

- name: Add rule to pipeline source
  lineinfile:
    path: ./pipeline
    regexp: '^rule "{{ rule_name }}"'
    line: 'rule "{{ rule_name }}"'
    insertafter: '^stage {{ stage }}'

- name: Get modified pipeline content
  set_fact:
    pipeline_source: "{{ lookup('file', './pipeline') }}"

- name: Update pipeline with newly added rule
  graylog_pipelines:
    action: update
    endpoint: "{{ endpoint }}"
    graylog_user: "{{ username }}"
    graylog_password: "{{ password }}"
    pipeline_id: "{{ pipeline.json.id }}"
    source: "{{ pipeline_source }}"

TAKEAWAYS.

This has added a lot of value for our team and streamlines an important part of our stack. What was once an unmanaged and completely manual process is now tracked, validated, and repeatable.

However, these rules can't just be thrown into a pipeline on a whim, especially in a production environment with hundreds/thousands/millions of events coming in.

We still have to do our due diligence to make sure the rules we commit do, in fact, behave the way we intend. Pull requests are our way of ensuring that rules get multiple sets of eyeballs on them before they are accepted and merged.

Adding validation steps to our CircleCI jobs and Ansible playbooks is another layer of protection against pushing breaking changes to Graylog.
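
For reference, the CircleCI side can stay small. A config.yml along these lines would run the hypothetical deploy_rules.sh script from earlier on every merge to master; the image, paths, and branch filter are all assumptions, and the Graylog modules would need to be on Ansible's module path:

version: 2
jobs:
  deploy_rules:
    docker:
      - image: circleci/python:3.6
    steps:
      - checkout
      # install Ansible in the build container
      - run: sudo pip install ansible
      # run the script that finds changed rules and deploys each one
      - run: ./.circleci/deploy_rules.sh
workflows:
  version: 2
  deploy:
    jobs:
      - deploy_rules:
          filters:
            branches:
              only: master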

BUT WAIT! THERE'S MORE.

If you want to learn more about Graylog pipelines and rules, their documentation is fantastic.

Additionally, we are continuously building on these modules and other tools, and we thrive on giving back to a community that has provided us with so much of what we use and rely on. Everything we've open sourced can be found in our GitHub, @ReconInfoSec.

WANT HANDS ON EXPERIENCE?

Check us out at Black Hat USA 2019, or join us for OpenSOC in the Blue Team Village at DEF CON 27 this year!

1. These modules are currently awaiting merge into the official Ansible project.