Cisco ASA alerts and Kibana

Today we will be sending alerts from my Cisco ASA firewall to Kibana. While looking into how to configure this, I found some examples, but none of them really worked, so I started "hobbying" myself and created something that works really well.

The easiest way to send ASA alerts to Kibana is to forward the syslog messages as-is, and that’s it. But this will only get the message there, and nothing else. It will be hard to create filters, because everything ends up in a single field, and GeoIP won’t work either, because there is no separate field for the IP address.

So we have to create a filter in Logstash that separates the fields in the syslog message. The hardest part is the large number of different message types the ASA can send. Thankfully, Logstash nowadays ships with built-in grok patterns for these messages.
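To illustrate what these grok patterns do, here is a minimal sketch in Python of the kind of field extraction they perform. The sample message and the regular expression are simplified illustrations of a 106023 deny message, not the actual Logstash patterns:

```python
import re

# Simplified sample of an ASA 106023 "Deny" syslog message
sample = ('%ASA-4-106023: Deny tcp src inside:10.0.0.5/51234 '
          'dst outside:203.0.113.10/443 by access-group "inside_in"')

# A much-reduced stand-in for the CISCOFW106023 grok pattern:
# it pulls the protocol, interfaces, IPs and ports into named fields.
pattern = re.compile(
    r"%ASA-(?P<severity>\d)-(?P<msg_id>\d+): Deny (?P<protocol>\w+) "
    r"src (?P<src_interface>\w+):(?P<src_ip>[\d.]+)/(?P<src_port>\d+) "
    r"dst (?P<dst_interface>\w+):(?P<dst_ip>[\d.]+)/(?P<dst_port>\d+)"
)

m = pattern.search(sample)
print(m.group("src_ip"), m.group("dst_ip"))  # 10.0.0.5 203.0.113.10
```

Once the message is split into fields like src_ip and dst_ip, filtering and GeoIP lookups in Kibana become possible.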

After a little messing around, this is what my input file looks like:

input {
        udp {
                port => 10514
                type => "cisco-fw"
        }
}

filter {

        # Extract fields from each of the detailed message types.
        # The patterns used below are included in the core of Logstash 1.4.2.
        grok {
                match => [
                        "message", "%{CISCOFW106001}",
                        "message", "%{CISCOFW106006_106007_106010}",
                        "message", "%{CISCOFW106014}",
                        "message", "%{CISCOFW106015}",
                        "message", "%{CISCOFW106021}",
                        "message", "%{CISCOFW106023}",
                        "message", "%{CISCOFW106100}",
                        "message", "%{CISCOFW110002}",
                        "message", "%{CISCOFW302010}",
                        "message", "%{CISCOFW302013_302014_302015_302016}",
                        "message", "%{CISCOFW302020_302021}",
                        "message", "%{CISCOFW305011}",
                        "message", "%{CISCOFW313001_313004_313008}",
                        "message", "%{CISCOFW313005}",
                        "message", "%{CISCOFW402117}",
                        "message", "%{CISCOFW402119}",
                        "message", "%{CISCOFW419001}",
                        "message", "%{CISCOFW419002}",
                        "message", "%{CISCOFW500004}",
                        "message", "%{CISCOFW602303_602304}",
                        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
                        "message", "%{CISCOFW713172}",
                        "message", "%{CISCOFW733100}"
                ]
        }

        # Parse the syslog severity and facility
        syslog_pri { }

        # Do a DNS lookup for the sending host,
        # otherwise the host field will contain an
        # IP address instead of a hostname
        dns {
                reverse => [ "host" ]
                action => "replace"
        }

        # Do a GeoIP lookup for the source IP address
        geoip {
                source => "src_ip"
                target => "geoip"
                database => "/etc/logstash/GeoLiteCity.dat"
                add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
                add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
        }
        mutate {
                convert => [ "[geoip][coordinates]", "float" ]
        }

        # Do a GeoIP lookup for the ASN/ISP information
        geoip {
                database => "/etc/logstash/GeoIPASNum.dat"
                source => "src_ip"
        }
}

output {
        elasticsearch { host => "localhost" }
}

Note: When using Logstash 2.0, the output section should look like this:

output {
        elasticsearch { hosts => ["localhost"] }
}

This will let Logstash listen on port 10514 for syslog messages. Make sure the filename of this input starts with a number higher than 30, so it will end up after the 30-lumberjack-output.conf file (I named it 40-cisco_asa.conf).
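Before pointing the ASA at it, you can verify that the UDP input is working by sending it a sample ASA-style syslog line yourself. The following is a small sketch; the host, timestamp and the sample message are placeholders you would adjust for your setup:

```python
import socket

# Send one ASA-style syslog message to the Logstash UDP input.
LOGSTASH_HOST = "127.0.0.1"  # replace with your Logstash server's address
LOGSTASH_PORT = 10514        # the port configured in the udp input above

# <173> = facility 21 (local5) * 8 + severity 5 (notifications);
# 111008 is the ASA "User executed the command" message.
message = ("<173>Jan 05 2016 12:00:00 ciscoasa : %ASA-5-111008: "
           "User 'enable_15' executed the 'write memory' command.")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sent = sock.sendto(message.encode("ascii"), (LOGSTASH_HOST, LOGSTASH_PORT))
sock.close()
```

If everything is wired up, the message should appear in Kibana a few seconds later with the syslog severity and facility parsed out.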

Now let’s configure the ASA. This is what it will look like:

ciscoasa# sh run | i logging
logging enable
logging timestamp
logging console notifications
logging trap notifications
logging asdm notifications
logging facility 21
logging device-id hostname
logging host Inside 17/10514
logging permit-hostdown

The most important line is logging host Inside 17/10514. This configures the logs to be sent to your server (replace this with the IP address of your Logstash/Kibana server) via the Inside interface (VLAN) using UDP (protocol 17) on port 10514, which we just configured in Logstash.
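The facility and trap level also determine the <PRI> number at the start of every syslog message, which the syslog_pri filter in the Logstash config decodes back into severity and facility fields. As a quick worked example (values taken from the configuration above):

```python
# Syslog priority (PRI) as the ASA will emit it with this configuration:
# PRI = facility * 8 + severity.
# "logging facility 21" maps to local5, and "logging trap notifications"
# means messages of severity 5 (notifications) and more urgent.
facility = 21  # local5
severity = 5   # notifications

pri = facility * 8 + severity
print(pri)  # 173
```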

Write this to NVRAM on the ASA (write) and perform a service logstash restart on the Kibana server, and your logs should start coming in.

Now, we started all this so we could visualize what the ASA is doing. I created a dashboard myself, but I’m still moving visualizations around, as I still don’t know exactly what I want to see. There is too much to choose from! 🙂


This dashboard shows all blocked traffic by my 3 Cisco ASAs.

And this is what it looks like using the dark theme in Kibana 4.2:


4 thoughts on “Cisco ASA alerts and Kibana”

  1. Hi, thanks for this tutorial. Can I reach you by e-mail? I’m working on an ELK server and I’m now discovering that there are so many possibilities that I can’t see the forest for the trees 🙁

    Regards, Alex

  2. Can you share the template used for Elasticsearch here? Such as how you lower-cased the protocol field (TCP vs tcp), as well as the TCP flags as raw data?

    1. I didn’t create any template manually; I just let Elasticsearch create it automatically. Nor do I use raw fields. These can be used with a “logstash-” index, or indeed by manually creating a template for your own index.
