Eyeing the Onion – BSides Augusta

Chris Rimondi and I had the opportunity to co-present at BSides Augusta recently talking about Security Onion (SO) data sources, learning with SO for Splunk and how you can transfer that knowledge and understanding to ELSA. As part of our talk we announced several new log parsers and dashboards we’ve made available to Doug Burks and Scott Runnels for inclusion in SO and Martin Holste for inclusion in ELSA: https://github.com/ChrisRimondi/Bro_ELSA_Parsers

If you missed the talk, Martin Holste has introduced some major changes to ELSA’s architecture this year that will soon be coming to SO, most notably the ELSA web/API and the forwarding capabilities. We also demonstrated how you can leverage conditional data from non-SO data sources to supplement your hunting. Video below and slides available here: BSides_Augusta

Since we didn’t have much time to demo the dashboards, I wanted to get some screenshots up to give SO and ELSA users a glimpse of what’s to come. In addition to the dashboards below, the previously released Web Monitor dashboard is included in the release: https://github.com/brad-shoop/elsa_dashboards.

Screenshots: Overview, FTP, SSH, SSL, SMTP, Net Hunting and Host Hunting dashboards

Mandiant’s APT1 Domain/MD5 Intel and Security Onion with ELSA

Last night’s post on using domain and md5 lists from Mandiant’s Intel Report APT1: Exposing One of China’s Cyber Espionage Units focused on Security Onion for Splunk. If you’re using ELSA with your Security Onion deployment you can use the bulk_query.pl tool to run ELSA queries against a text file.

Download the Digital Appendix and Indicators zip file (md5 hashes are provided by Mandiant) and extract the files. Copy “Appendix D (Digital) – FQDNs.txt” and “Appendix E (Digital) – MD5s.txt” to your Security Onion Server (ELSA web node) and rename them to apt1_fqdn.txt and apt1_md5.txt.
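If you’re working from a shell, the copy and rename can be done in one step. A quick sketch (quote the filenames since they contain spaces, and adjust the names and paths to match what actually extracted on your system):

cp "Appendix D (Digital) – FQDNs.txt" /path/to/apt1_fqdn.txt
cp "Appendix E (Digital) – MD5s.txt" /path/to/apt1_md5.txt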

sudo perl /opt/elsa/contrib/bulk_query.pl -f /path/to/apt1_fqdn.txt -t > fqdn_hits.log

sudo perl /opt/elsa/contrib/bulk_query.pl -f /path/to/apt1_md5.txt -t > md5_hits.log

The commands above will generate a log file containing each query term searched and any matching results that were returned.

Mandiant’s APT1 Domain/MD5 Intel and Security Onion for Splunk

Mandiant’s Intel Report APT1: Exposing One of China’s Cyber Espionage Units is loaded with indicators that organizations can use to identify whether they are or have been a victim of APT1. If you’re running the newest version of Security Onion, then you’ll definitely have data available to comb through for network, domain and md5 hash indicators. ELSA can help you with this process as can Security Onion for Splunk.

If you want a quick way to leverage the domain and MD5 data in Security Onion for Splunk, download the Digital Appendix and Indicators zip file (MD5 hashes are provided by Mandiant) and extract the files. Make a copy of the files “Appendix D (Digital) – FQDNs.txt” and “Appendix E (Digital) – MD5s.txt” and rename them to apt1_fqdn.csv and apt1_md5.csv. Then you’ll need to edit the files so the first row contains the field header. For apt1_fqdn.csv add “domain” to the first row; for apt1_md5.csv add “md5”, so they look like this:

apt1_fqdn.csv and apt1_md5.csv examples
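For illustration, the first few rows of each edited file would look something like this (the entries shown are placeholders, not actual APT1 indicators; the real values come from the Mandiant appendices):

apt1_fqdn.csv:
domain
bad-domain-from-appendix-d.com
another-bad-domain.net

apt1_md5.csv:
md5
0123456789abcdef0123456789abcdef
fedcba9876543210fedcba9876543210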

Next, we need to upload the .csv files we edited, so head to Security Onion for Splunk and click Manager > Lookups > Lookup Table files > New. The Destination app will be securityonion, then choose the files and specify the Destination filename to be the same as what we named the files (apt1_fqdn.csv and apt1_md5.csv).

Add lookup table

When you’ve uploaded both files, you’ll need to change the permissions if you want other users to be able to run the queries: set the permissions to Read for “This app only (securityonion).” Now head back to the Security Onion app, click the Search menu and run the following searches over all the Bro historical data you’ve got (dns.log and http.log specifically).

sourcetype=bro_dns [|inputlookup apt1_fqdn.csv | fields + domain] | fields dest_ip src_ip domain

sourcetype=bro_http [|inputlookup apt1_md5.csv | fields + md5] | fields dest_ip src_ip md5

If you get no matching events back, breathe a sigh of relief. If you do get results, start digging deeper!

 

A Dress for ELSA – Web Activity Dashboard

The most impressive new addition to Security Onion 12.04 is Enterprise Log Search & Archive (ELSA). ELSA’s creator, Martin Holste (Twitter @mcholste), liked Splunk but had concerns about speed, scalability and cost, so he set out to develop his own log collection, indexing and searching platform and succeeded. Thanks to the efforts of Scott Runnels (Twitter @srunnels) and Doug Burks (Twitter @dougburks), ELSA can be enabled with the click of a button when deploying Security Onion.

ELSA makes it pretty easy to build and share dashboards using Google Visualizations. For details on building dashboards in ELSA see Martin’s post at his Open-Source Security Tools blog. If you want one to play with, I put together an overview of HTTP activity that demonstrates some of the chart types available.

ELSA Web Overview Dashboard

If you want to check it out in your Security Onion ELSA, click the ELSA menu, then Dashboards, then “Create/import new dashboard.” Give it a title, an alias (“web_monitor” for example), specify who has access, then paste the following in the “Paste here for import” box:

{
   "charts" : [
      {
         "y" : "1",
         "options" : {
            "width" : 500,
            "displayMode" : "markers",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "logScale" : false,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "logScale" : false,
                  "useFormatFromData" : true
               }
            ],
            "backgroundColor" : "#ffffff",
            "booleanRole" : "certainty",
            "colors" : [
               "#DC3912",
               "#EFE6DC",
               "#109618"
            ]
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.dstip | geoip",
               "label" : "GeoIP Map"
            }
         ],
         "x" : "0",
         "type" : "GeoChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : true,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : 0,
            "title" : "Source IPs"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:srcip",
               "label" : "Sources"
            }
         ],
         "x" : "0",
         "type" : "PieChart"
      },
      {
         "y" : "3",
         "options" : {
            "title" : null
         },
         "queries" : [
            {
               "query" : "get post put head groupby:minute",
               "label" : "get post put head"
            }
         ],
         "x" : "0",
         "type" : "ColumnChart"
      },
      {
         "y" : "4",
         "options" : {
            "width" : 500,
            "sortColumn" : null,
            "page" : "enable",
            "legend" : "right",
            "hAxis" : {
               "minValue" : null,
               "viewWindowMode" : "pretty",
               "maxValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "minValue" : null,
                  "viewWindowMode" : "pretty",
                  "maxValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "title" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "pageSize" : 20,
            "booleanRole" : "certainty",
            "showRowNumber" : false,
            "alternatingRowStyle" : true
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.site",
               "label" : "Top Sites"
            }
         ],
         "x" : "0",
         "type" : "Table"
      },
      {
         "y" : "1",
         "options" : {
            "width" : 500,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : null,
               "minValue" : null,
               "viewWindow" : null,
               "maxValue" : null,
               "useFormatFromData" : true,
               "title" : "Destination Ports"
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "isStacked" : false,
            "title" : "Activity by Destination Port",
            "backgroundColor" : {
               "fill" : "#ffffff"
            },
            "animation" : {
               "duration" : 500
            }
         },
         "queries" : [
            {
               "query" : "get post put head groupby:dstport\n",
               "label" : "class=BRO_HTTP"
            }
         ],
         "x" : "1",
         "type" : "ColumnChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : true,
            "legend" : "right",
            "hAxis" : {
               "minValue" : null,
               "viewWindowMode" : "pretty",
               "maxValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "minValue" : null,
                  "viewWindowMode" : "pretty",
                  "maxValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "title" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : 0,
            "title" : "Destination IPs"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:BRO_HTTP.dstip",
               "label" : "Destinations"
            }
         ],
         "x" : "1",
         "type" : "PieChart"
      },
      {
         "y" : "2",
         "options" : {
            "width" : "333.333333333333",
            "is3D" : false,
            "legend" : "right",
            "hAxis" : {
               "viewWindowMode" : "pretty",
               "minValue" : null,
               "viewWindow" : {
                  "min" : null,
                  "max" : null
               },
               "maxValue" : null,
               "useFormatFromData" : true
            },
            "vAxes" : [
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               },
               {
                  "viewWindowMode" : "pretty",
                  "minValue" : null,
                  "viewWindow" : {
                     "min" : null,
                     "max" : null
                  },
                  "maxValue" : null,
                  "useFormatFromData" : true
               }
            ],
            "booleanRole" : "certainty",
            "colors" : [
               "#3366CC",
               "#DC3912",
               "#FF9900",
               "#109618",
               "#990099",
               "#0099C6",
               "#DD4477",
               "#66AA00",
               "#B82E2E",
               "#316395",
               "#994499",
               "#22AA99",
               "#AAAA11",
               "#6633CC",
               "#E67300",
               "#8B0707",
               "#651067",
               "#329262",
               "#5574A6",
               "#3B3EAC",
               "#B77322",
               "#16D620",
               "#B91383",
               "#F4359E",
               "#9C5935",
               "#A9C413",
               "#2A778D",
               "#668D1C",
               "#BEA413",
               "#0C5922",
               "#743411"
            ],
            "pieHole" : "0.5",
            "title" : "Method"
         },
         "queries" : [
            {
               "query" : "get post put head groupby:method",
               "label" : "class=BRO_HTTP"
            }
         ],
         "x" : "2",
         "type" : "PieChart"
      }
   ],
   "auth_required" : "1",
   "title" : "Web Monitor",
   "alias" : "webmonitor"
}

Security Onion for Splunk 2.0 Released

On New Year’s Day I released Security Onion for Splunk 2.0 and Security Onion Server/Sensor Add On 0.7 to support the new release of Security Onion 12.04. The new release includes updated Overview, IR Search and SOstat dashboards, and introduces a new dashboard for Bro IDS logs I’ve dubbed Bro(wser).

The new release requires Sideview Utils (available freely from Splunkbase). I also recommend performing a clean install if you are upgrading Security Onion for Splunk from 1.1.7 to 2.0. There were some dashboard files configured in the /local path that should’ve been in /default and might not be overwritten properly when you upgrade. To uninstall the app run:

sudo /opt/splunk/bin/splunk remove app securityonion

then install the app from Splunkbase.

So what’s new? Besides all the awesomesauce that is Security Onion 12.04 itself, I hope you find the upgrades in the Splunk app suitably useful and worthy.

Overview – The pie charts at the top provide summaries of Sguil, Bro Notice and OSSEC events. You can drill down on them, but they’re mainly there to tell you everything is working.
2.0 Overview

The tabs beneath the pie charts are where things get interesting, and you’ll find a lot of data at your fingertips. The Sguil tab lets you view and drill down on Sguil events by Name:

2.0 Overview Sguil by Name

or Classification:

2.0 Overview Sguil by Classification

Sorry, no capME integration yet, but it’s on the list!

The Connection Byte Counts tab lets you see Bro connection bytes by service, IP, port, country, protocol or connection description, and you can toggle by originator, responder or both. The idea with a lot of these tabs is to give you a little visibility into trending normal on your network.

2.0 Overview - Connections

HTTP Files provides a summary of all the filenames detected in Bro’s http logs by extension and lets you drill down to view the filenames and further details.

2.0 Overview - HTTP Files

SSL:

2.0 Overview - SSL

The Software tab provides some more great visibility for trending your network and looking for the unexpected.

2.0 Overview - Software

Top Level Domains:

2.0 Overview - TLD

The Anomalous Domains tab borrows from Johannes Ullrich’s “A Poor Man’s DNS Anomaly Detection Script.” Every night at 1 a.m. a search runs that exports the top 9,999 domains Bro saw queried the previous day to a .csv file. The Anomalous Domains tab looks domains up against that .csv file, and if there’s a match the domain is ignored. The only domains that should show up are domains that hadn’t been visited the previous day. For fun, you can also sort Anomalous Domain hits by originator/source IP if you want to keep an eye on more erratic and unpredictable hosts. When you first install the app, you’ll get a .csv not found error when you view this tab until the first csv export runs.
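For the curious, that nightly export is essentially a scheduled search of this shape (a sketch only; the field name, limit and lookup filename below are assumptions, and the app’s actual saved search may differ):

sourcetype=bro_dns earliest=-1d@d latest=@d | top limit=9999 domain | fields domain | outputlookup domains_seen.csv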

The CIF tab performs external .csv lookups against CIF results similar to previous releases.

Bro(wser) is the newest addition and it is all Bro. With tabs for Notices, Connections, DNS, HTTP, SSL, SMTP, FTP, IRC, SSH and Software, you have comprehensive Bro visibility in a digestible structure from one dashboard.

2.0 - Bro(wser)

Drill down on a selection and you’ll get 4 pie charts summarizing event data and a list of IPs involved.

2.0 - Bro(wser)

Drilling down on the IP list will query all Bro log sources for uids matching the selected event(s), and you can drill again to see the raw events:

2.0 - Bro(wser)

In the example above, you can see how the selected uid was a connection (bro_conn sourcetype) via ssh (bro_ssh) that also triggered a bro_notice alert (or two), followed by the raw Bro events.
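If you ever want to replicate that drilldown by hand, the equivalent is just a search for the uid string across all the Bro sourcetypes, something like the following (the uid shown is made up):

sourcetype=bro_* "CXWv6p3arKYeMETxOg" | sort _time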

You’ll find the new IR Search looks very similar to the Overview tabs, and I’ve built a mini Bro(wser) summarizing all Bro activity for the searched IP.

2.0 - IR Search

Drilling on the Bro dest_ip list gets you the grouped uid results similar to Bro(wser):

2.0 - IR Search

From there you have access to the workflow menu. Following the Bro data, you get a breakdown of all activity detected for the specified IP during the time range designated, with the events grouped in buckets, allowing you to adjust the bucket size.

2.0 - IR Search

SOstat got a makeover as well, but I think I’ve passed my screenshot quota.

I mentioned the workflow menu briefly as there are several additions here. I’ve added VirusTotal MD5 and DShield.org lookups, in addition to the IR Search workflow.

I’m sure there’s more I’ve left off but that’s the bulk of the changes for 2.0. It’s a work in progress and I’ll be continuing to tune and tweak as it gets more production use and as always, feedback is welcome!

Special thanks to all of the Security Onion team for their efforts!

Enjoy the release and have a secure and happy new year!

 

Security Onion for Splunk version 1.1.7 – README

1.1.7

  • Tweaked Sguil indexing to prevent Bro URL data from being duplicated into Splunk via sguild.log.
  • Monitors dashboard field name drop down selections added to all panels.
  • General Mining dashboard added panels for Bro SSH logs and Bro HTTP TLDs (top level domains). Also added drop down options for Bro FTP and IRC panels.
  • Sguil Mining has been updated and improved.
  • Syslog Mining dashboard added for Bro Syslog.
  • An Event Workflow was added for searching Splunk for events by src_ip.
  • …and last but not least:
  • CIF Dashboards!

For the CIF integration to work you need access to a CIF (Collective Intelligence Framework – http://code.google.com/p/collective-intelligence-framework/) server in order to export query results to a .csv file for the external lookups.

The CIF integration leverages three files created by exporting CIF query results to .csv format. If you don’t have access to a CIF server but want a sense of how the app works, I’ve provided three sample files in the lookups folder. Visit http://testmyids.com via a browser and you should see results in the CIF dashboards.

IMPORTANT – By including these 3 sample files (that contain NO actual CIF data; just fields and a test/sample entry) we prevent Splunk from throwing lookup table errors for those who won’t use the CIF capability. The drawback is updating the Security Onion app will overwrite existing .csv files, so remember to backup the lookups/*.csv files prior to future updates or plan to recreate new CIF .csv files.
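A simple pre-update backup might look like this (assuming the app lookup path used later in this post; pick whatever backup location suits you):

cp /opt/splunk/etc/apps/securityonion_cif/lookups/*.csv /path/to/backup/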

The three CIF queries used for development and testing were:

cif -q infrastructure -s medium -c 85 -p csv -O infrastructure.csv
cif -q url -s medium -c 85 -p csv -O url.csv
cif -q domain -s medium -c 65 -p csv -O domain.csv

Once you’ve created these files you have to do two things:

  1. Edit each file, removing the “# ” (hash followed by space) from the first row.
  2. Copy the files to /opt/splunk/etc/apps/securityonion_cif/lookups/

I highly recommend scripting the CIF export, file manipulation and copy/move process.
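As a starting point, a wrapper script along these lines would do it (a rough sketch built from the three queries above; the paths and the sed edit that strips the leading “# ” are assumptions to adjust for your environment):

#!/bin/bash
# Export CIF results, strip the "# " prefix from the header row, and drop the files into the Splunk lookups folder
cd /tmp || exit 1
cif -q infrastructure -s medium -c 85 -p csv -O infrastructure.csv
cif -q url -s medium -c 85 -p csv -O url.csv
cif -q domain -s medium -c 65 -p csv -O domain.csv
for f in infrastructure.csv url.csv domain.csv; do
  sed -i '1s/^# //' "$f"   # remove the hash and space from the first row
  cp "$f" /opt/splunk/etc/apps/securityonion_cif/lookups/
done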

CIF Lookups:

In the first release there are two dashboards:

  • IP Correlation checks dest_ip addresses from the Bro Connection logs indexed against the CIF infrastructure.csv export.
  • Domain/MD5 Malware Correlation checks Bro HTTP domains and executable files downloaded MD5 hashes against CIF.

Both dashboards work the same in that you’ll first see a list of IP addresses with an event count. Clicking on an IP will provide CIF related details around the match as well as a list of all activity to the dest_ip address. From there you can use the Event workflow menu to splunk deeper.

=============================================

I’ll try to post some screenshots and more details soon!

Update:

Here are a couple screenshots of the CIF dashboards. When they load, the IP list in the top right shows all the IP matches in the CIF infrastructure.csv file that appeared in the Bro connection logs. A click will load some CIF details (impact, severity, confidence, etc.) with the matching events below. From the events view, you can then use the Events Workflow menu (blue square with white arrow) to pivot to a search for activity from the source IP, full CIF queries, Robtex and DShield lookups. OSINT at its finest!

The Domain/Malware MD5 dashboard works the same way. The only real difference is that it matches Bro HTTP logs against both domain.csv and url.csv. I wanted to match full URLs from Bro against url.csv, but I’ve run into a few technical difficulties. Bro logs domain and uri as separate fields. I can make Splunk index them as a single field for a direct match lookup against url.csv, but that would equal more indexed data. So until I can figure out a better way to do that, we’re matching against the CIF malware_md5 field. Bro hashes executable files downloaded via HTTP by default, and this allows us to limit the query to only bro_http events that actually have an MD5 value.

The key distinction with the Domain/Malware MD5 dashboard is the presence of the cif_malware_md5 field in the CIF Details panel. If that field is populated with a hash value, then you have a positive hit on a malicious executable download. If there is no hash value, then you have a domain name hit.

May the force be with you.

Helping the Seekers – how to place “security onion”

Monitoring activity on this site gives me a glimpse into what search terms are drawing people here and I’ve always found the results interesting. Search terms are hardly a cry for help, but they usually are a whisper in the dark and can highlight issues people are dealing with and particularly where they might be struggling to find answers.

So I bring you “Helping the Seekers.” As search terms bubble up the list in frequency, I’ll periodically spotlight a few and try to have something to help out the next person who comes along. Since the release of the Security Onion Splunk app, “how to place ‘security onion'” has bubbled towards the top. The fun part is going to be trying to cover all the possible intentions behind that search term.

The answer to the “how to place ‘security onion'” question, regardless of the size of your network, is first at your entry/exit points to the Internet (aka gateway or egress points), just inside your firewall or Internet router. Security Onion can then monitor all traffic coming into or out of your network. If you can get eyes on that traffic, you’ll quickly be able to assess the state of your network and be in position to respond if an incident does (or did) occur. Fronting critical servers (authentication servers, DNS, SQL, Microsoft SCCM/SMS) and anywhere else critical data rests or traverses is also highly recommended.

Why wouldn’t you want to put it outside of your firewall? Well, you could, but you lose visibility of your internal hosts as the sensor will not see any of your private IP addresses, only the public IPs from the gateway router/firewall. The Internet is a very noisy place and by placing the sensor just inside the firewall, the sensor will only monitor traffic that is specific to and allowed by your network and firewall.

(This is where I get on my hobby horse about running Security Onion. Whether you know what you’re doing or not, it’s pretty easy and affordable to set up and maintain. Just pet Sguil and Snorby every now and again and most smaller deployments will hum once tuned a bit. If you EVER need to call in professional assistance responding to an incident, you will thank me for the bill because it will save you a fortune in money and time.)

That answers the “how” question in terms of location (or maybe that would’ve been a “where” question?). Regardless, now that you know where to put it, what do you do? For this example we’ll use a home or small office environment, but the difference between small and large is typically capacity and the ability to handle the load. The same concepts of getting traffic to the Security Onion host apply.

A basic home or small office setup will look something like this:

Your Internet connection enters through a cable or DSL type modem and typically a gateway router device, like a Netgear or Cisco/Linksys, which we’ll call the perimeter.  The modem and router can potentially be the same device and if there’s a network firewall at play it will be here too. In the setup above, the gateway router is typically the DHCP server. We need to see those private addresses so we can identify which hosts are generating events. So we need a way to get between the endpoints and that router.

One easy and affordable way to accomplish this is with a Mikrotik Routerboard 250GS. They can be had for around $40 and will give you a fully manageable 5 port gigabit switch  (example 1).

In this case, we’d want to drop the Mikrotik between the gateway router that is issuing IP addresses and the internal devices. The only connections to our gateway router will be the WAN connection going to your modem and a single LAN connection to the Mikrotik. The gateway router will continue to do what it’s been doing, with the only difference being that all network activity to or from it will pass through the Mikrotik first. If you want to add wireless to the above setup, you’d want to use a separate wireless access point device plugged into the Mikrotik.

Your Security Onion host will ideally have two network interfaces if you plan on managing/monitoring it from another host on the network. One interface is your management interface which gets an IP address and can be connected to remotely. The other interface is your sensor interface where you want to see mirrored traffic; this is the data that Security Onion will monitor and analyze.

We used to have hubs, which were dumb but great at the same time. They basically would pass all traffic across all ports connected to the hub. Just plug in your Security Onion sensor and it would automagically see the data. Hubs have given way to switched networking now, which is why we need one we can manage (like the Mikrotik). Most retail store type switches don’t allow for things like configuring port mirroring.

So this is what your Mikrotik switch setup might look like:

Plug your gateway router LAN port into Port 1 and using RouterOS configure it to span all traffic to Port 2, where you’ll plug your Security Onion sensor interface. Ports 3-5 are free for endpoints. You can connect another switch or a WiFi AP to extend access to additional devices if you need more port capacity. The key is that all traffic into and out of your network at this point will be going through Port 1 on the Mikrotik which will be mirrored to Port 2 where Security Onion is listening.
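Once the mirror is in place, a quick sanity check from the Security Onion host is to sniff the sensor interface directly and confirm you’re seeing traffic that isn’t yours (eth1 below is an assumption; substitute your sniffing interface):

sudo tcpdump -nn -i eth1 -c 20

You should see packets to and from your other endpoints, not just traffic involving the Security Onion box itself.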

I’ll save the Mikrotik step-by-step guide to configuring its RouterOS for another day.

Mad props to @Diagramly (http://www.diagram.ly/) for their awesome diagramming tool!

IDS Rule Reference for Splunk 1.0

I created a standalone version of IDS Rule Reference for Splunk for Snort/PulledPork users who are not running Security Onion. I’ve added a few dashboard views to provide a little more flexibility for searching or researching rule documents.

The initial IDS Rules view is what is included in the Security Onion for Splunk app. It can be used for researching rules and activity by filtering on enabled rules or by category, classtype or source file.

The first of the new additions is a more flexible search dashboard allowing for wildcard searches by rule name or sid. Simply enter keywords or some (or all) of a sid and you’re off and running.

Lastly, the IDS Rule Tome allows you to browse only the rules that have a matching rule reference document. Undocumented rules will not appear in these results.

Known issues: I’ve not sorted out how best to handle the small subset of rule documents that share a sid, so rule documents named genid-sid.txt will provide inconsistent results.

If you’re using IDS Rule Reference in Security Onion for Splunk and want the additional views, simply download this app and you’ll be good to go.

Setup is pretty simple if you’ve got Splunk rolling already.

Install:
Download the Snort Rule Documentation (opensource.gz) from http://www.snort.org/snort-rules then extract opensource.gz to the monitored path:

tar zxvf opensource.gz -C /opt/splunk/etc/apps/ids_ref/local/rules

Copy your Snort PulledPork *.rules files to the monitored path:

cp *.rules /opt/splunk/etc/apps/ids_ref/local/rules/

Restart Splunk.

Event Workflows:
You can modify the Event Workflows from Splunk Manager > Fields > Workflow actions. Edit the IDS Rule Reference “Apply only to the following fields” setting to apply the workflow link to your Snort sig_id field in Splunk. You’ll also want to edit the Search String variable field name ($sig_id$ is the default).

As always feedback and suggestions are welcome for improvements!

Security Onion for Splunk 1.1.3 – IDS Rule Reference

I’ve been working a lot lately on tuning Security Onion alerts, specifically Snort alerts via en/disablesid.conf, threshold.conf and Sguil’s autocat.conf. (If you use Security Onion and don’t know what those are, go here now.)  JJ Cummings’ PulledPork is an incredibly effective tool for updating Snort or Emerging Threat IDS rules and provides some very straightforward methods of controlling what rules are enabled or disabled in Snort. It does that job incredibly well, which is why it’s the tool of choice in Security Onion.

Where I ran into issues was in keeping track of it all. I had 6 terminal windows open: enablesid.conf, disablesid.conf, threshold.conf, autocat.conf, and windows to grep downloaded.rules and look up the rule reference files. I also had Sguil and Snorby up, as I like to keep Sguil focused on incidents and clean of false positives and let Snorby and Splunk provide the deeper visibility into less certain events, where I can get full Bro IDS context. Keeping track of what I had enabled in Snort but not in Sguil, and what was enabled versus disabled, was maddening, and my desktop was a mess.

There had to be an easier way…so I maddened myself the Splunk way during off hours to reduce the maddening during work hours. Hopefully the end result will help you maintain your sanity during the tuning process.

Just to be clear, what I’m about to describe in no way replaces what PulledPork does. It provides Splunk visibility to rule files created by PulledPork.

Version 1.1.3 introduces IDS Rules to the SOstat menu. By indexing PulledPork *.rules files and the Snort Rule Documentation – opensource.gz (Oinkcode required), this dashboard allows you to search Snort rules by Classtype, Category, Rule Source and/or Rule Status (enabled/disabled). You can quickly check, for example, what rules in the trojan-activity classtype are disabled. Drill down on a rule and you can view the rule, the Snort reference file (if the rule has one) and a timechart with a time picker to view the selected rule’s activity over time.

Needless to say, it’s made sorting through rules and rule data much more manageable.

Before I get to the eye candy, here’s some mind candy. You have to enable the Data Inputs in the Splunk GUI, and I’m leaving it up to you in terms of how you want to index the data. If volume is a big concern for you, enable the Data Inputs and manually copy the files (extract the Rule Documentation) to the defined “monitor” folders. If you’ve got audit and tracking concerns, I provide a couple very simplistic scripts and script inputs to give you an idea how you can script the process to provide a running history of a rule and its status. If you’re hardcore tuning, you might even want to set up /etc/nsm/rules as the Data Input for real time monitoring. It’s really up to you. Just be mindful that it can chew up some indexing volume if you get carried away.

It’s going to need a little tuning of the dashboard view to handle real-time or daily monitoring and I’m working on filtering by sensor if you have various policies applied across a multi-sensor deployment. But in the words of @DougBurks, “release early and release often.”

Now the eye candy.

When you first load the dashboard you’ll see something like this:

IDS Rules

The drop-downs let you refine your searches:

IDS Rules - Classtypes

IDS Rules - Categories

You can also specify the rule file you’re targeting as well as whether you want to view all, enabled or disabled rules. Once you’ve made a drop-down selection, the rules window will populate:

IDS Rules - Rules Panel

Drilling down on a rule will display the rule, the reference file (if available for that sid) and a timechart with a time picker so you can quickly check on rule activity.

IDS Rules - Drilldown

Zooming in on the Rule Reference panel:

IDS Rules - Rule Reference Panel

The event workflow for Snort Sguil events now looks like this:

IDS Rules - Sguil Workflow

Selecting *VRT – 498 will open a new Splunk search window to the rule reference file result. (I’m going to make this cleaner in future releases).

IDS Rules - Workflow Result

Quick Setup (for ad-hoc indexing)

Install/Upgrade the app. Enable the Data Inputs for ids_rules and ids_rules_doc sourcetypes in Splunk Manager.

If you use the CLI, copy /opt/splunk/etc/apps/securityonion/default/inputs.conf to /opt/splunk/etc/apps/securityonion/local/inputs.conf and edit the /local copy. Changes to the /local files will not be overwritten by app updates, whereas changes in /default will.
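In other words:

cp /opt/splunk/etc/apps/securityonion/default/inputs.conf /opt/splunk/etc/apps/securityonion/local/inputs.conf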

Scroll to the bottom and look for the monitor stanza for sourcetype ids_rules and change disabled = 1 to disabled = 0.

[monitor:///opt/splunk/etc/apps/securityonion/local/*.rules]
sourcetype = ids_rules
followTail = 0
disabled = 0

If you want to pull in the Suricata rules as well, you might need to add the following after the “disabled = 0” line:

crcSalt = <SOURCE>

That will force Splunk to index everything in the path. Scroll down a little further and enable the monitor for ids_rules_doc sourcetype:

[monitor:///opt/splunk/etc/apps/securityonion/local/doc/signatures/*.txt]
sourcetype = ids_rules_doc
followTail = 0
disabled = 0

Exit and save the file, then copy the rules files to the Splunk folder.

cp /etc/nsm/rules/*.rules /opt/splunk/etc/apps/securityonion/local/rules/

Download the Snort Rule Reference files via browser or curl them:

curl -L http://www.snort.org/reg-rules/opensource.gz/ -o opensource.gz

Then extract the files to the monitored path:

tar zxvf opensource.gz -C /opt/splunk/etc/apps/securityonion/local/

Restart Splunk and give it a few to percolate. It will take a few minutes to index all the rule files so be patient. Edit /opt/splunk/etc/apps/securityonion/default/inputs.conf and disable the monitors. Then head on over to SOstat > IDS Rules and give it a spin. If you see a blue bar error at the top that reads “No matching fields exist” the files haven’t been indexed yet. You can do a search for “sourcetype=ids_rules” and/or “sourcetype=ids_rules_doc” to check on the indexing process or open the Search app from the Splunk menu and check the source types panel in the bottom left. Find the ids_rules* sourcetypes and when the count stops going up after more than a few minutes it’s done. Restart Splunk to reload the inputs.conf file and disable subsequent indexing.

I’m seriously debating releasing this as a PulledPork for Splunk app as well, so I would love feedback from either Security Onion users or Snort PulledPork users as to whether there is interest or need outside my own desire for order. But then again I’ll have some time on the flights to and from Vegas next week; it might be fitting for a PulledPork Splunk app to come into its own on an airplane, eh JJ?

Other notables in this release:

  • You may need to re-enter your CIF API key in the workflow config if you’re using CIF.
  • The SOstat Security Onion and *nix dashboards now allow you to view details by all SO servers/sensors or by the individual hosts in a distributed deployment.
  • VRT lookup added to workflow for Sguil events with a sig_id. Not all sig_ids will have a Snort rule reference file (especially Emerging Threat rules), so mileage will vary.

I’m hopeful making the Snort rule reference files accessible will help move towards the ultimate goal of this app. All along I’ve had two end users in mind: a large scale deployment and the home user. Both can install Security Onion with little knowledge thanks to Doug’s efforts, but neither is assured to be able to take it to the next step without help or a lot of effort if the expertise isn’t there. Providing easy access to context, whether it’s Snort rule reference files or CIF queries, can make a huge difference. To that end, more updates will be coming with a Sguil mining dashboard that will provide correlated context around events (think IR search result type data as drill down results as you review Sguil events) and more Mining views for network based indicators.

I’ll be at BlackHat and DefCon next week so if any Security Onion or SO for Splunk users want to meet up, hit me up via email or the twitter (@bradshoop).

Happy splunking!

Securing Splunk Free Version When Installed On Security Onion Server (or anywhere else)

This stroke of genius comes directly from the man behind Security Onion, @dougburks, and solves two problems, one serious and the other functional. Splunk’s free version allows you to index up to 500 MB/day, but it does limit some (even basic) capabilities, the most important of which is that authentication is disabled. If you’re running Splunk free version on your Security Onion server and access the server remotely (from another workstation), I highly suggest you make this your standard access process. The instructions below work on Ubuntu distributions, and if you followed Doug’s advice about using a Security Onion VM as your client, this should work perfectly as long as you haven’t configured the VM as a server.

The method can be used on a Windows or Linux client. The instructions below focus on Linux, but googling “windows ssh tunnel how to” should get you a good start. In the example below, port 81 is the Splunk port. If you installed Splunk on a different port, just replace 81 with it.

The approach uses an SSH tunnel and is really easy to set up. On your Security Onion/Splunk server you’ll want to make sure SSH is enabled in Uncomplicated Firewall (ufw).

sudo ufw status

You should see 22/tcp ALLOW in the results. If it says DENY, then enable it:

sudo ufw allow 22/tcp

Next configure ufw to block (yes I said block) Snorby, Squert and Splunk ports:

sudo ufw deny 81/tcp
sudo ufw deny 443/tcp
sudo ufw deny 3000/tcp

From a remote Linux host with openssh-client installed:

sudo ssh username@securityonion.example.com -L 81:localhost:81 -L 443:localhost:443 -L 3000:localhost:3000

Replace username with the Security Onion/Splunk server user and securityonion.example.com with the hostname or IP address of your Security Onion/Splunk server. This command essentially tells your client to pass anything destined to localhost ports 81, 443 or 3000 to your SO server on its localhost port 81, 443 or 3000 via the SSH tunnel. The command requires sudo due to accessing privileged ports, so you’ll be prompted for your local password, then again for the remote SO server user’s password. After authentication, you’ll have an active SSH terminal session to the server.

Launch a web browser and browse to any of the following:

http://localhost:81 – Splunk
https://localhost:443 – SO Home/Squert
https://localhost:3000 – Snorby

It’s that simple.

If you recall, I mentioned a “functional” advantage of using this approach. In the Security Onion for Splunk app, I provide links to Snorby and Squert, but unfortunately, the user must configure the URLs to fit their environment if they access the tools remotely. The default config uses “localhost” as the server, so if you use the above method to access Splunk securely, the Snorby and Squert links work out of the box. =)

Thanks and hat tip to Doug for this little gem! I had to bite my lip whenever I recommended someone install the free version of Splunk due to the authentication limit, but now I don’t have to.