Visualising Cisco ASA logs with Kibana

I’ve used Cisco firewalls in their various forms for the last 10 years or so, from baby PIX 501s through to the new ASA 5500-X series. While I’ve always been a big fan of the platform, one area that has always been deficient is logging and reporting. There really is no comparison when you line up other vendors such as Palo Alto and Check Point alongside Cisco in this area.

I’ve recently started playing around with what people call the ELK stack, and have found it to be excellent at visualising Cisco ASA logs. The ELK stack consists of three applications: Elasticsearch, Logstash, and Kibana.

ElasticSearch:

Elasticsearch is a search server based on Lucene. It provides a distributed, multitenant-capable full-text search engine with a RESTful web interface and schema-free JSON documents.

Logstash:

Logstash is a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use (like, for searching).

Kibana:

Kibana is a great tool for real time data analytics.

The end result is a system which is able to turn simple syslog messages into a dashboard like my example below. Graphs break down the top protocols, actions (i.e. accept, deny), destination ports, origin countries, and source and destination IPs. Within each of these views the content is dynamic: if you’re only interested in dropped traffic, for example, you can click it in the graph and the whole filter will change to show only that data.

[Screenshot: Kibana dashboard visualising ASA logs]

Clicking “Deny” as the action and “TCP” as the protocol shows the top sources of this traffic. It immediately becomes easy to see the usual volumes of this type of traffic over any period of time you specify, and anomalies stand out and can be drilled into for a closer look.
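If you prefer typing to clicking, the same filter can be expressed as a Lucene query string in Kibana’s search bar. A sketch, assuming the field names produced by Logstash’s built-in CISCOFW grok patterns (note the lowercase “tcp” — the mutate filter in the config below lowercases the protocol field):

```conf
tags:firewall AND action:"Deny" AND protocol:"tcp"
```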

While setting up an ELK stack is outside what I’d planned on covering here, DigitalOcean have a good guide on setting this all up. The pertinent Logstash config I have used for the above is as follows:

input {
  udp {
    port => 10514
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      patterns_dir => "./patterns"
      match => { "message" => "%{CISCOFW106014}" }
      match => { "message" => "%{CISCOFW106001}" }
      match => { "message" => "%{CISCOFW106023}" }
      match => { "message" => "%{CISCOFW313005}" }
      match => { "message" => "%{CISCOFW302020_302021}" }
      match => { "message" => "%{CISCOFW302013_302014_302015_302016}" }
      match => { "message" => "%{CISCOFW106015}" }
      match => { "message" => "%{CISCOFW106006_106007_106010}" }
      match => { "message" => "%{CISCOFW106100}" }
      add_tag => [ "firewall" ]
    }

    geoip {
      source => "src_ip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }

    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
      lowercase => [ "protocol" ]
    }
  }
}

output {
  redis {
    host => "127.0.0.1"
    data_type => "list"
    key => "logstash"
  }
}
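For completeness, the ASA itself needs to be told to export syslog to the Logstash host. A minimal sketch — the interface name, host IP, and trap level here are placeholders for your own environment:

```conf
logging enable
logging timestamp
logging trap informational
! ship syslog over UDP to the Logstash listener on port 10514
logging host inside 192.0.2.50 udp/10514
```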


The above system listens for syslog messages on UDP 10514, runs them through grok to extract the pertinent parts of each message, adds geo-IP data, and pushes the events onto a local Redis list. Another system (the ELK receiver/indexer) pulls the events from this Redis list, indexes them, and finally displays them to the user through the Kibana interface.
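The receiver/indexer side isn’t shown above, but a minimal sketch might look like the following (the shipper IP is an assumption, and this uses the older Logstash 1.x `host =>` form of the elasticsearch output — newer releases use `hosts => [...]`):

```conf
input {
  redis {
    host      => "192.0.2.50"  # the shipper box running the config above
    data_type => "list"
    key       => "logstash"    # must match the key used in the redis output
  }
}

output {
  elasticsearch {
    host => "127.0.0.1"        # local Elasticsearch node
  }
}
```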

5 comments

  1. Hello,
    very nice tutorial. Can I have your Cisco ASA patterns to sort the syslog messages into fields like dst_ip, dst_port and so on?

    Best Regards

    Daniel

    1. Hi Daniel,

      These grok patterns are actually built into recent versions of Logstash. On Debian and Ubuntu, these are found under /opt/logstash/patterns and can be referenced in your Logstash config file with the following:

      patterns_dir => "./patterns"

      Once you have made this reference, all of these Cisco syslog messages may be matched with those CISCOFWXXXXXX patterns.
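      For example, a 106023 deny message looks roughly like the line below, and the built-in CISCOFW106023 pattern pulls out fields such as action, protocol, src_ip, dst_ip, and dst_port (the IPs and ACL name here are made up for illustration):

```conf
%ASA-4-106023: Deny tcp src outside:203.0.113.7/59320 dst inside:192.0.2.10/443 by access-group "outside_access_in"
```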

  2. Hello.
    Nice one. Can you share the dashboard “.json” files for Kibana 5 as well please? That would be very helpful.

    Thank you.
    Umair

    1. Apologies but I no longer have access to the dashboard (I lost some indices on an ES upgrade!) and I haven’t got around to recreating it in the newer versions of Kibana. Sorry!
