Eyeing the Onion – BSides Augusta

Chris Rimondi and I recently had the opportunity to co-present at BSides Augusta, talking about Security Onion (SO) data sources, learning with SO for Splunk, and how you can transfer that knowledge and understanding to ELSA. As part of our talk we announced several new log parsers and dashboards we’ve made available to Doug Burks and Scott Runnels for inclusion in SO and Martin Holste for inclusion in ELSA: https://github.com/ChrisRimondi/Bro_ELSA_Parsers

In case you missed the talk: Martin Holste has introduced some major changes to ELSA’s architecture this year that will soon be coming to SO, most notably the ELSA web/API and the forwarding capabilities. We also demonstrated how you can leverage contextual data from non-SO data sources to supplement your hunting. Video below and slides available here: BSides_Augusta

Since we didn’t have much time to demo the dashboards, I wanted to get some screenshots up to give SO and ELSA users a glimpse of what’s to come. In addition to the dashboards below, the previously released Web Monitor dashboard is included in the release: https://github.com/brad-shoop/elsa_dashboards.

(Screenshots: Overview, FTP, SSH, SSL, SMTP, Network Hunting and Host Hunting dashboards)

Splunk, Bro

If you’re running a dedicated Bro IDS sensor and want to get the Bro events into Splunk, you can do so very easily using the Security Onion for Splunk app along with the Security Onion for Splunk Server/Sensor addon. How easy?

  1. Install Security Onion for Splunk on your Splunk server.
  2. Install Splunk’s universal forwarder on your Bro sensor.
  3. Install the Server/Sensor addon on your Bro sensor. Follow the installation instructions in the link; when you edit the inputs.conf file, disable all the non-Bro sourcetypes and change the monitor paths to your Bro log path.
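Step 3’s inputs.conf edit might end up looking roughly like this. The stanza paths and sourcetype names below are assumptions based on a typical Security Onion layout; keep whatever names the addon’s default inputs.conf actually ships with:

```ini
# Disable a non-Bro input shipped with the addon (path is an assumption)
[monitor:///var/log/nsm/securityonion/sguild.log]
disabled = true

# Point a Bro input at your sensor's Bro log path (adjust to your install)
[monitor:///nsm/bro/logs/current/conn.log]
sourcetype = bro_conn
disabled = false
```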

You’ll miss out on a few views here and there, but the Security Onion for Splunk app is built around Bro IDS logs, so I promise you can get a lot of mileage out of the app with a Bro-only deployment. Just watch that Splunk licensing volume!

Now go get splunky, Bro.

 

Mandiant’s APT1 Domain/MD5 Intel and Security Onion with ELSA

Last night’s post on using domain and md5 lists from Mandiant’s Intel Report APT1: Exposing One of China’s Cyber Espionage Units focused on Security Onion for Splunk. If you’re using ELSA with your Security Onion deployment you can use the bulk_query.pl tool to run ELSA queries against a text file.

Download the Digital Appendix and Indicators zip file (md5 hashes are provided by Mandiant) and extract the files. Copy “Appendix D (Digital) – FQDNs.txt” and “Appendix E (Digital) – MD5s.txt” to your Security Onion Server (ELSA web node) and rename them to apt1_fqdn.txt and apt1_md5.txt.

sudo perl /opt/elsa/contrib/bulk_query.pl -f /path/to/apt1_fqdn.txt -t > fqdn_hits.log

sudo perl /opt/elsa/contrib/bulk_query.pl -f /path/to/apt1_md5.txt -t > md5_hits.log

The commands above will generate a log file containing each query term searched and any matching results that were returned.

Mandiant’s APT1 Domain/MD5 Intel and Security Onion for Splunk

Mandiant’s Intel Report APT1: Exposing One of China’s Cyber Espionage Units is loaded with indicators that organizations can use to identify whether they are or have been a victim of APT1. If you’re running the newest version of Security Onion, then you’ll definitely have data available to comb through for network, domain and md5 hash indicators. ELSA can help you with this process as can Security Onion for Splunk.

If you want a quick way to leverage the domain and MD5 data in Security Onion for Splunk, download the Digital Appendix and Indicators zip file (MD5 hashes are provided by Mandiant) and extract the files. Make a copy of “Appendix D (Digital) – FQDNs.txt” and “Appendix E (Digital) – MD5s.txt” and rename them apt1_fqdn.csv and apt1_md5.csv. Then edit the files so the first row contains the field header: for apt1_fqdn.csv add “domain” to the first row, and for apt1_md5.csv add “md5”. They should look like this:
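If you’d rather do the header edit from a shell, something like this works. The sample entries below are placeholders, not real indicators; run the sed lines against your actual appendix copies:

```shell
# Demo copies of the renamed files (placeholders stand in for the real lists)
printf 'apt1-example-domain.com\n' > /tmp/apt1_fqdn.csv
printf 'd41d8cd98f00b204e9800998ecf8427e\n' > /tmp/apt1_md5.csv

# Prepend the header row Splunk expects (GNU sed's "1i" inserts before line 1)
sed -i '1i domain' /tmp/apt1_fqdn.csv
sed -i '1i md5' /tmp/apt1_md5.csv

head -1 /tmp/apt1_fqdn.csv   # domain
```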

(Screenshots: apt1_fqdn.csv and apt1_md5.csv examples)

Next, we need to upload the .csv files we edited so head to Security Onion for Splunk and click Manager > Lookups > Lookup Table files > New. The Destination app will be securityonion, then choose the files and specify the Destination filename to be the same as what we named the files (apt1_fqdn.csv and apt1_md5.csv).

Add lookup table

When you’ve uploaded both files, set the permissions to Read for “This app only (securityonion)” if you want other users to be able to run the queries. Now head back to the Security Onion app, click the Search menu and run the following searches over all the Bro historical data you’ve got (dns.log and http.log specifically).

sourcetype=bro_dns [|inputlookup apt1_fqdn.csv | fields + domain] | fields dest_ip src_ip domain

sourcetype=bro_http [|inputlookup apt1_md5.csv | fields + md5] | fields dest_ip src_ip md5

If you get no matching events back, breathe a sigh of relief. If you do get results, start digging deeper!

 

A Dress for ELSA – Web Activity Dashboard

The most impressive new addition to Security Onion 12.04 is Enterprise Log Search & Archive (ELSA). ELSA’s creator, Martin Holste (Twitter @mcholste), liked Splunk but had concerns about speed, scalability and cost, so he set out to develop his own log collection, indexing and searching platform and succeeded. Thanks to the efforts of Scott Runnels (Twitter @srunnels) and Doug Burks (Twitter @dougburks), ELSA can be enabled with the click of a button when deploying Security Onion.

ELSA makes it pretty easy to build and share dashboards using Google Visualizations. For details on building dashboards in ELSA see Martin’s post at his Open-Source Security Tools blog. If you want one to play with, I put together an overview of HTTP activity that demonstrates some of the chart types available.

ELSA Web Overview Dashboard

If you want to check it out in your Security Onion ELSA, click the ELSA menu, then Dashboards, then “Create/import new dashboard.” Give it a title and an alias (“web_monitor” for example), specify who has access, then paste the following in the “Paste here for import” box:

{
   "charts" : [
      {
         "y" : "1",
         "options" : {
            "width" : 500,
            "displayMode" : "markers",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "logScale" : false,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "logScale" : false,
                  "useFormatFromData" : true
               }
            ],
            "backgroundColor" : "#ffffff",
            "booleanRole" : "certainty",
            "colors" : [
               "#DC3912",
               "#EFE6DC",
               "#109618"
            ]
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.dstip | geoip",
               "label" : "GeoIP Map"
            }
         ],
         "x" : "0",
         "type" : "GeoChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : true,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : 0,
            "title" : "Source IPs"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:srcip",
               "label" : "Sources"
            }
         ],
         "x" : "0",
         "type" : "PieChart"
      },
      {
         "y" : "3",
         "options" : {
            "title" : null
         },
         "queries" : [
            {
               "query" : "get post put head groupby:minute",
               "label" : "get post put head"
            }
         ],
         "x" : "0",
         "type" : "ColumnChart"
      },
      {
         "y" : "4",
         "options" : {
            "width" : 500,
            "sortColumn" : null,
            "page" : "enable",
            "legend" : "right",
            "hAxis" : {
               "minValue" : null,
               "viewWindowMode" : "pretty",
               "maxValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "minValue" : null,
                  "viewWindowMode" : "pretty",
                  "maxValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "title" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "pageSize" : 20,
            "booleanRole" : "certainty",
            "showRowNumber" : false,
            "alternatingRowStyle" : true
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.site",
               "label" : "Top Sites"
            }
         ],
         "x" : "0",
         "type" : "Table"
      },
      {
         "y" : "1",
         "options" : {
            "width" : 500,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : null,
               "minValue" : null,
               "viewWindow" : null,
               "maxValue" : null,
               "useFormatFromData" : true,
               "title" : "Destination Ports"
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "isStacked" : false,
            "title" : "Activity by Destination Port",
            "backgroundColor" : {
               "fill" : "#ffffff"
            },
            "animation" : {
               "duration" : 500
            }
         },
         "queries" : [
            {
               "query" : "get post put head groupby:dstport\n",
               "label" : "class=BRO_HTTP"
            }
         ],
         "x" : "1",
         "type" : "ColumnChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : true,
            "legend" : "right",
            "hAxis" : {
               "minValue" : null,
               "viewWindowMode" : "pretty",
               "maxValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "minValue" : null,
                  "viewWindowMode" : "pretty",
                  "maxValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "title" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : 0,
            "title" : "Destination IPs"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.dstip",
               "label" : "Destinations"
            }
         ],
         "x" : "1",
         "type" : "PieChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : false,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : "0.5",
            "title" : "Method"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:method",
               "label" : "class=BRO_HTTP"
            }
         ],
         "x" : "2",
         "type" : "PieChart"
      }
   ],
   "auth_required" : "1",
   "title" : "Web Monitor",
   "alias" : "webmonitor"
}

Security Onion for Splunk 2.0 Released

On New Year’s Day I released Security Onion for Splunk 2.0 and Security Onion Server/Sensor Add On 0.7 to support the new release of Security Onion 12.04. The new release includes updated Overview, IR Search and SOstat dashboards, and introduces a new dashboard for Bro IDS logs I’ve dubbed Bro(wser).

The new release requires Sideview Utils (available freely from Splunkbase). I also recommend performing a clean install if you are upgrading Security Onion for Splunk from 1.1.7 to 2.0. There were some dashboard files configured in the /local path that should’ve been in /default and might not be overwritten properly when you upgrade. To uninstall the app run:

sudo /opt/splunk/bin/splunk remove app securityonion

then install the app from Splunkbase.

So what’s new? Besides all the awesomesauce that is Security Onion 12.04 itself, I hope you find the upgrades in the Splunk app suitably useful and worthy.

Overview – The pie charts at the top provide summaries of Sguil, Bro Notice and OSSEC events. You can drill down on them, but they’re mainly there to tell you everything is working.
2.0 Overview

The tabs beneath the pie charts are where things get interesting, and you’ll find a lot of data at your fingertips. The Sguil tab lets you view and drill down on Sguil events by Name:

2.0 Overview Sguil by Name

or by Classification:

2.0 Overview Sguil by Classification

Sorry, no capME integration yet but it’s on the list!

The Connection Byte Counts tab lets you see Bro connection bytes by service, IP, port, country, protocol or connection description, and you can toggle by originator, responder or both. The idea with a lot of these tabs is to give you a little visibility into trending normal on your network.

2.0 Overview - Connections

HTTP Files provides a summary of all the filenames detected in Bro’s http logs by extension and lets you drill down to view the filenames and further details.

2.0 Overview - HTTP Files

SSL:

2.0 Overview - SSL

The Software tab provides some more great visibility for trending your network and looking for the unexpected.

2.0 Overview - Software

Top Level Domains:

2.0 Overview - TLD

The Anomalous Domains tab borrows from Johannes Ullrich’s “A Poor Man’s DNS Anomaly Detection Script.” Every night at 1 a.m. a search runs that exports the top 9,999 domains Bro saw queried the previous day to a .csv file. The Anomalous Domains tab looks domains up against that .csv file, and if there’s a match the domain is ignored. The only domains that should show up are domains that hadn’t been visited the previous day. For fun, you can also sort Anomalous Domain hits by originator/source IP if you want to keep an eye on more erratic and unpredictable hosts. When you first install the app you’ll get a “.csv not found” error on this tab until the first export runs.
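Conceptually, the nightly export and the tab’s lookup boil down to a pair of searches along these lines (the lookup file name and saved-search details are my assumptions, not the app’s exact internals):

```
sourcetype=bro_dns | top limit=9999 domain | fields domain | outputlookup domains_yesterday.csv

sourcetype=bro_dns NOT [|inputlookup domains_yesterday.csv | fields + domain] | stats count by domain, src_ip
```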

The CIF tab performs external .csv lookups against CIF results similar to previous releases.

Bro(wser) is the newest addition and it is all Bro. With tabs for Notices, Connections, DNS, HTTP, SSL, SMTP, FTP, IRC, SSH and Software, you have comprehensive Bro visibility in a digestible structure from one dashboard.

2.0 - Bro(wser)

Drill down on a selection and you’ll get four pie charts summarizing event data and a list of IPs involved.

2.0 - Bro(wser)

Drilling down on the IP list will query all Bro log sources for uids matching the selected event(s), and you can drill again to see the raw events:

2.0 - Bro(wser)

In the example above, you can see how the selected uid was a connection (bro_conn sourcetype) via ssh (bro_ssh) that also triggered a bro_notice alert (or two), followed by the raw Bro events.

The new IR Search looks very similar to the Overview tabs, and I’ve built a mini Bro(wser) summarizing all Bro activity for the searched IP.

2.0 - IR Search

Drilling on the Bro dest_ip list gets you the grouped uid results, similar to Bro(wser):

2.0 - IR Search

From there you have access to the workflow menu. Following the Bro data, you get a breakdown of all activity detected for the specified IP during the designated time range, with the events grouped in buckets, and you can adjust the bucket size.

2.0 - IR Search

SOstat got a makeover as well, but I think I’ve passed my screenshot quota.

I mentioned the workflow menu briefly as there are several additions here. I’ve added VirusTotal MD5 and DShield.org lookups, in addition to the IR Search workflow.

I’m sure there’s more I’ve left off but that’s the bulk of the changes for 2.0. It’s a work in progress and I’ll be continuing to tune and tweak as it gets more production use and as always, feedback is welcome!

Special thanks to all of the Security Onion team for their efforts!

Enjoy the release and have a secure and happy new year!

 

Security Onion for Splunk version 1.1.7 – README

1.1.7

  • Tweaked Sguil indexing to prevent Bro URL data from being duplicated into Splunk via sguild.log.
  • Monitors dashboard field name drop down selections added to all panels.
  • General Mining dashboard added panels for Bro SSH logs and Bro HTTP TLDs (top level domains). Also added drop down options for Bro FTP and IRC panels.
  • Sguil Mining has been updated and improved.
  • Syslog Mining dashboard added for Bro Syslog.
  • An Event Workflow was added for searching Splunk for events by src_ip.
  • …and last but not least:
  • CIF Dashboards!

For the CIF integration to work you need access to a CIF (Collective Intelligence Framework – http://code.google.com/p/collective-intelligence-framework/) server in order to export query results to a .csv file for the external lookups.

The CIF integration leverages three files created by exporting CIF query results to .csv format. If you don’t have access to a CIF server but want a sense of how the app works, I’ve provided three sample files in the lookups folder. Visit http://testmyids.com via a browser and you should see results in the CIF dashboards.

IMPORTANT – By including these three sample files (which contain NO actual CIF data; just fields and a test/sample entry) we prevent Splunk from throwing lookup table errors for those who won’t use the CIF capability. The drawback is that updating the Security Onion app will overwrite existing .csv files, so remember to back up the lookups/*.csv files prior to future updates or plan to recreate new CIF .csv files.

The three CIF queries used for development and testing were:

cif -q infrastructure -s medium -c 85 -p csv -O infrastructure.csv
cif -q url -s medium -c 85 -p csv -O url.csv
cif -q domain -s medium -c 65 -p csv -O domain.csv

Once you’ve created these files you have to do two things:

  1. Edit each file removing the “# ” (hash followed by space) from the first row of each file.
  2. Copy the files to /opt/splunk/etc/apps/securityonion_cif/lookups/

I highly recommend scripting the CIF export, file manipulation and copy/move process.
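To sketch that automation: the export itself is the three cif commands above, and the edit-and-copy half could look like the following. The lookups path, file names and sample header are assumptions; adjust them to your app:

```shell
# Stage CIF exports as Splunk lookup files: strip the "# " comment marker
# from the header row and copy each file into the app's lookups directory.
# LOOKUPS defaults to a demo path here; the real one is something like
# /opt/splunk/etc/apps/securityonion_cif/lookups
LOOKUPS=${LOOKUPS:-/tmp/splunk_lookups}
mkdir -p "$LOOKUPS"
for f in infrastructure url domain; do
    # in the real job these come from the `cif ... -O /tmp/$f.csv` exports;
    # create a placeholder here so the sketch is runnable end to end
    [ -f "/tmp/$f.csv" ] || printf '# address,portlist,impact\n' > "/tmp/$f.csv"
    sed '1s/^# //' "/tmp/$f.csv" > "$LOOKUPS/$f.csv"
done
```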

CIF Lookups:

In the first release there are two dashboards:

  • IP Correlation checks dest_ip addresses from the Bro Connection logs indexed against the CIF infrastructure.csv export.
  • Domain/MD5 Malware Correlation checks Bro HTTP domains and the MD5 hashes of downloaded executable files against CIF.

Both dashboards work the same in that you’ll first see a list of IP addresses with an event count. Clicking on an IP will provide CIF related details around the match as well as a list of all activity to the dest_ip address. From there you can use the Event workflow menu to splunk deeper.

=============================================

I’ll try to post some screenshots and more details soon!

Update:

Here are a couple screenshots of the CIF dashboards. When they load, the IP list in the top right shows all the IP matches in the CIF infrastructure.csv file that appeared in the Bro connection logs. A click will load some CIF details (impact, severity, confidence, etc.) with the matching events below. From the events view, you can then use the Events Workflow menu (blue square with white arrow) to pivot to a search for activity from the source IP, full CIF queries, Robtex and DShield lookups. OSINT at its finest!

The Domain/Malware MD5 dashboard works the same way. The only real difference is that it matches Bro HTTP logs against both domain.csv and url.csv. I wanted to match full URLs from Bro against url.csv, but I’ve run into a few technical difficulties. Bro logs the domain and URI as separate fields. I can make Splunk index them as a single field for a direct match lookup against url.csv, but that would mean more indexed data. So until I can figure out a better way to do that, we’re matching against the CIF malware_md5 field. Bro hashes executable files downloaded via HTTP by default, and this allows us to limit the query to only bro_http events that actually have an MD5 value.

The key distinction with the Domain/Malware MD5 dashboard is the presence of the cif_malware_md5 field in the CIF Details panel. If that field is populated with a hash value, you have a hit on a known malicious executable download. If there is no hash value, you have a domain name hit.

May the force be with you.

More (Advanced) Querying CIF Data With Splunk

My last post on querying Collective Intelligence Framework (CIF) data with Splunk showed you how to enable CIF as part of Splunk workflows for ad-hoc lookups, but what if you want to take it a step further and actually cross reference events against CIF? The easiest method I’ve found has been to perform periodic CIF queries, saving the results to a comma-delimited (.csv) file, then using external file lookups in Splunk to cross reference events. It’s not very difficult to set up, and you’ll quickly see the value. All you’ll need outside of Splunk is access to a CIF server to create the .csv files.

First we need to export the query results from CIF:

cif -q infrastructure -s medium -c 85 -p csv -O infrastructure.csv

This command queries CIF for any “infrastructure” results with a severity rating of medium or higher and a confidence rating of 85 or higher. Infrastructure results are IP based threats related to botnet command and control (CnC) and infection sources. You’ll then have an infrastructure.csv file that looks something like this:

We need to remove the “#” and following space from the first row fields so it looks like this:

If you don’t remove the “#<space>” from the first row, Splunk will not parse the field names. (In other words, it won’t work.)
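A one-liner handles that edit (shown here against a placeholder file with a made-up sample row; point it at your real export):

```shell
# Sample export row: CIF prefixes the header with "# "
printf '# address,portlist,impact\n1.2.3.4,80,botnet\n' > /tmp/infrastructure.csv

# Strip the leading "# " from the first row in place (GNU sed -i)
sed -i '1s/^# //' /tmp/infrastructure.csv

head -1 /tmp/infrastructure.csv   # address,portlist,impact
```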

Fire up Splunk and head to Manager > Lookups > Lookup table files and create a new lookup file. Specify which app you would like the lookup table associated with then browse to the file and specify what you want the file to be named when it gets copied up to the Splunk server. The Search app is probably the best default option. Once saved, you can change the permissions to make the lookup file available in other apps.

At this point you can already run some very handy queries, for example:

sourcetype=bro_conn [|inputlookup infrastructure.csv | rename address as dest_ip | fields + dest_ip]

This query searches our Bro IDS connection log but uses the “inputlookup” command against our infrastructure.csv file. The “rename address as dest_ip” tells Splunk to rename the address field as dest_ip and the “fields + dest_ip” specifies which fields to lookup.

If you want to just gauge what the number of connections from your network to IPs in CIF look like, add the stats command:

sourcetype=bro_conn [|inputlookup infrastructure.csv | rename address as dest_ip | fields + dest_ip] | stats count by dest_ip

This will show you all the dest_ip addresses in bro_conn that match CIF and provide a count of the number of those events for each IP. You can repeat this process for CIF domains and urls quite easily to perform the same types of queries against DNS or http logs (like Bro’s dns.log and http.log). We can, however, take it a step further with automated lookups.

If you revisit Splunk Manager > Lookups, we can add a new Lookup definition. Specify the app, give it a name, make sure the “Type” is set to “File-based” and then specify infrastructure.csv in the “Lookup” file drop down. Save it and adjust the permissions accordingly. Then go back to Lookups, but this time add a new “Automatic lookup.”

Once again we choose the app, give it a name and choose the lookup table. We then specify which sourcetype we want to check against CIF. The “Lookup input fields” tell Splunk to check the address field in the infrastructure.csv file against the dest_ip for our bro_conn sourcetype. We then have to specify what fields we want to output from the lookup if there is a match. Here you can define every field in the infrastructure.csv file or just the ones you want to be able to reference in the events.

All we are doing is telling Splunk that when it finds the “address” field in the .csv file, it should display it in Splunk as cif_address. (I recommend adding a prefix to avoid potential field name conflicts.) The fields defined in the above screenshot should give you plenty to get started. Save it and you’re ready to query.

Doing a simple search against the bro_conn sourcetype won’t provide any CIF results on its own. In order to get the automatic lookup to kick in, we have to tell Splunk that we want one of our CIF output fields as a result.

sourcetype=bro_conn cif_impact=*

Running that query with the automatic lookup properly configured will now get you the CIF matches for any dest_ip addresses plus you’ll now see the output fields we defined in the field picker.

So now you can run queries like this:

sourcetype=bro_conn cif_impact=* | table src_ip dest_ip cif_confidence cif_severity cif_impact cif_portlist cif_cc

To get results like this:

If you want to turn it into a dashboard, you might end up with something like this:

You can pretty easily automate the CIF query export to .csv, edit the results to remove the “#<space>” from the first row and copy the file to the Splunk lookup folder. Once you’ve created the lookup file in Splunk, you can update the file at any time without having to let Splunk know. The path to the file on your Splunk server would be something like /opt/splunk/etc/apps/<app name>/lookups/infrastructure.csv.
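The cleanup step is easy to script. A minimal sketch of the header fix (the sample data and destination path are illustrative — substitute your actual CIF export and app name):

```shell
# Simulate a CIF CSV export: the header row is prefixed with "# ".
printf '# address,impact,severity,confidence\n1.2.3.4,botnet,high,85\n' > /tmp/infrastructure.csv

# Strip the leading "# " so Splunk reads the first row as column headers.
sed -i '1s/^# //' /tmp/infrastructure.csv

# Then copy into the app's lookup folder, e.g.:
# cp /tmp/infrastructure.csv /opt/splunk/etc/apps/<app name>/lookups/infrastructure.csv
```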

I plan to get some CIF dashboards built into the Security Onion for Splunk app over the next few months. (If you can’t tell, I’ve already been working on them.) They’ll only benefit users who have access to a CIF server to perform the export queries for the file-based lookups, but it’s not that difficult to stand up your own CIF server, and being able to correlate your network activity with current community-based intelligence is more than worth the effort.

IDS Rule Reference for Splunk 1.0

I created a standalone version of IDS Rule Reference for Splunk for Snort/PulledPork users who are not running Security Onion. I’ve added a few dashboard views to provide a little more flexibility for searching or researching rule documents.

The initial IDS Rules view is what is included in the Security Onion for Splunk app. It can be used for researching rules and activity by filtering on enabled rules or by category, classtype or source file.

The first of the new additions is a more flexible search dashboard allowing for wildcard searches by rule name or sid. Simply enter keywords or some (or all) of a sid and you’re off and running.

Lastly, the IDS Rule Tome allows you to browse only the rules that have a matching rule reference document. Undocumented rules will not appear in these results.

Known issues: I’ve not sorted out how best to handle the small subset of rule documents that share a sid, so rule documents named genid-sid.txt will provide inconsistent results.

If you’re using IDS Rule Reference in Security Onion for Splunk and want the additional views, simply download this app and you’ll be good to go.

Setup is pretty simple if you’ve got Splunk rolling already.

Install:
Download the Snort Rule Documentation (opensource.gz) from http://www.snort.org/snort-rules then extract opensource.gz to the monitored path:

tar zxvf opensource.gz -C /opt/splunk/etc/apps/ids_ref/local/rules

Copy your Snort PulledPork *.rules files to the monitored path:

cp *.rules /opt/splunk/etc/apps/ids_ref/local/rules/

Restart Splunk.

Event Workflows:
You can modify the Event Workflows from Splunk Manager > Fields > Workflow actions. Edit the IDS Rule Reference “Apply only to the following fields” setting to apply the workflow link to your Snort sig_id field in Splunk. You’ll also want to edit the variable field name in the Search String ($sig_id$ is the default).
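Those GUI settings land in the app’s workflow_actions.conf. A sketch of the sort of stanza involved (stanza name and search string are illustrative — check the app’s own copy for the exact values):

```
# workflow_actions.conf -- run a search for the rule reference doc
# when the action is selected on an event carrying a sig_id field
[ids_rule_reference]
label = IDS Rule Reference
fields = sig_id
type = search
search.search_string = sourcetype=ids_rules_doc $sig_id$
display_location = both
```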

As always feedback and suggestions are welcome for improvements!

Security Onion for Splunk 1.1.3 – IDS Rule Reference

I’ve been working a lot lately on tuning Security Onion alerts, specifically Snort alerts via en/disablesid.conf, threshold.conf and Sguil’s autocat.conf. (If you use Security Onion and don’t know what those are, go here now.)  JJ Cummings’ PulledPork is an incredibly effective tool for updating Snort or Emerging Threat IDS rules and provides some very straightforward methods of controlling what rules are enabled or disabled in Snort. It does that job incredibly well, which is why it’s the tool of choice in Security Onion.

Where I ran into issues was in keeping track of it all. I had 6 terminal windows open: enablesid.conf, disablesid.conf, threshold.conf, autocat.conf, and windows to grep downloaded.rules and lookup the rule reference files. I also had Sguil and Snorby up as I like to keep Sguil focused on incidents and clean of false positives and let Snorby and Splunk provide the deeper visibility into less certain events, where I can get full Bro IDS context. Keeping track of what I had enabled in Snort, but not in Sguil and what was enabled versus disabled was maddening and my desktop was a mess.

There had to be an easier way…so I maddened myself the Splunk way during off hours to reduce the maddening during work hours. Hopefully the end result will help you maintain your sanity during the tuning process.

Just to be clear, what I’m about to describe in no way replaces what PulledPork does. It provides Splunk visibility to rule files created by PulledPork.

Version 1.1.3 introduces IDS Rules to the SOstat menu. By indexing PulledPork *.rules files and the Snort Rule Documentation – opensource.gz (Oinkcode required), this dashboard allows you to search Snort rules by Classtype, Category, Rule Source and/or Rule Status (enabled/disabled). You can quickly check, for example, what rules in the trojan-activity classtype are disabled. Drill down on a rule and you can view the rule, the Snort reference file (if the rule has one) and a timechart with a time picker to view the selected rule’s activity over time.

Needless to say, it’s made sorting through rules and rule data much more manageable.

Before I get to the eye candy, here’s some mind candy. You have to enable the Data Inputs in the Splunk GUI, and I’m leaving it up to you in terms of how you want to index the data. If volume is a big concern for you, enable the Data Inputs and manually copy the files (extract the Rule Documentation) to the defined “monitor” folders. If you’ve got audit and tracking concerns, I provide a couple very simplistic scripts and script inputs to give you an idea how you can script the process to provide a running history of a rule and its status. If you’re hardcore tuning, you might even want to set up /etc/nsm/rules as the Data Input for real-time monitoring. It’s really up to you. Just be mindful that it can chew up some indexing volume if you get carried away.
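For the audit-and-tracking approach, the idea is a scheduled snapshot of the rules so each dated copy gets indexed, building that running history. A minimal sketch (paths and naming are illustrative, not the scripts shipped with the app):

```shell
# snapshot_rules SRC DEST: copy each *.rules file into DEST with a
# date-stamped name so Splunk indexes a history of each rule's status.
snapshot_rules() {
  src=$1
  dest=$2
  mkdir -p "$dest"
  for f in "$src"/*.rules; do
    [ -e "$f" ] || continue   # skip if the glob matched nothing
    cp "$f" "$dest/$(date +%Y%m%d)-$(basename "$f")"
  done
}

# Example, run daily from cron:
# snapshot_rules /etc/nsm/rules /opt/splunk/etc/apps/securityonion/local/rules
```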

It’s going to need a little tuning of the dashboard view to handle real-time or daily monitoring and I’m working on filtering by sensor if you have various policies applied across a multi-sensor deployment. But in the words of @DougBurks, “release early and release often.”

Now the eye candy.

When you first load the dashboard you’ll see something like this:

IDS Rules

The drop-downs let you refine your searches:

IDS Rules - Classtypes

IDS Rules - Categories

You can also specify the rule file you’re targeting, as well as whether you want to view all, enabled or disabled rules. Once you’ve made a drop-down selection, the rules window will populate:

IDS Rules - Rules Panel

Drilling down on a rule will display the rule, the reference file (if available for that sid) and a timechart with a time picker so you can quickly check on rule activity.

IDS Rules - Drilldown

Zooming in on the Rule Reference panel:

IDS Rules - Rule Reference Panel

The event workflow for Snort Sguil events now looks like this:

IDS Rules - Sguil Workflow

Selecting *VRT – 498 will open a new Splunk search window with the rule reference file result. (I’m going to make this cleaner in future releases.)

IDS Rules - Workflow Result

Quick Setup (for ad-hoc indexing)

Install/Upgrade the app. Enable the Data Inputs for ids_rules and ids_rules_doc sourcetypes in Splunk Manager.

If you use CLI, copy /opt/splunk/etc/apps/securityonion/default/inputs.conf to /opt/splunk/etc/apps/securityonion/local/inputs.conf and edit the /local copy. Making changes to the /local files will not be overwritten by app updates, whereas /default will.

Scroll to the bottom, look for the monitor stanza for the ids_rules sourcetype and change disabled = 1 to disabled = 0.

[monitor:///opt/splunk/etc/apps/securityonion/local/*.rules]
sourcetype = ids_rules
followTail = 0
disabled = 0

If you want to pull in the Suricata rules as well, you might need to add the following after the “disabled = 0” line:

crcSalt = <SOURCE>

That will force Splunk to index everything in the path. Scroll down a little further and enable the monitor for ids_rules_doc sourcetype:

[monitor:///opt/splunk/etc/apps/securityonion/local/doc/signatures/*.txt]
sourcetype = ids_rules_doc
followTail = 0
disabled = 0

Exit and save the file, then copy the rules files to the Splunk folder.

cp /etc/nsm/rules/*.rules /opt/splunk/etc/apps/securityonion/local/rules/

Download the Snort Rule Reference files via browser or curl them:

curl -L http://www.snort.org/reg-rules/opensource.gz -o opensource.gz

Then extract the files to the monitored path:

tar zxvf opensource.gz -C /opt/splunk/etc/apps/securityonion/local/

Restart Splunk and give it a few minutes to percolate; indexing all the rule files takes a while, so be patient. Once indexing is complete, edit /opt/splunk/etc/apps/securityonion/local/inputs.conf and disable the monitors, then head on over to SOstat > IDS Rules and give it a spin. If you see a blue bar error at the top that reads “No matching fields exist,” the files haven’t been indexed yet. You can do a search for “sourcetype=ids_rules” and/or “sourcetype=ids_rules_doc” to check on the indexing process, or open the Search app from the Splunk menu and check the source types panel in the bottom left: find the ids_rules* sourcetypes, and when the count stops climbing for more than a few minutes, it’s done. Restart Splunk to reload the inputs.conf file and disable subsequent indexing.

I’m seriously debating releasing this as a PulledPork for Splunk app as well, so I would love feedback from either Security Onion users or Snort PulledPork users as to whether there is interest or need outside my own desire for order. But then again, I’ll have some time on the flights to and from Vegas next week; it might be fitting for a PulledPork Splunk app to come into its own on an airplane, eh JJ?

Other notables in this release:

  • You may need to re-enter your CIF API key in the workflow config if you’re using CIF.
  • The SOstat Security Onion and *nix dashboards now allow you to view details by all SO servers/sensors or by the individual hosts in a distributed deployment.
  • VRT lookup added to workflow for Sguil events with a sig_id. Not all sig_ids will have a Snort rule reference file (especially Emerging Threat rules), so mileage will vary.

I’m hopeful making the Snort rule reference files accessible will help move towards the ultimate goal of this app. All along I’ve had two end users in mind: a large scale deployment and the home user. Both can install Security Onion with little knowledge thanks to Doug’s efforts, but neither is assured to be able to take it to the next step without help or a lot of effort if the expertise isn’t there. Providing easy access to context, whether it’s Snort rule reference files or CIF queries, can make a huge difference. To that end, more updates will be coming with a Sguil mining dashboard that will provide correlated context around events (think IR search result type data as drill down results as you review Sguil events) and more Mining views for network based indicators.

I’ll be at BlackHat and DefCon next week so if any Security Onion or SO for Splunk users want to meet up, hit me up via email or the twitter (@bradshoop).

Happy splunking!