For the last week or so, I've been working on ingesting Corelight logs into our Elastic Security instance so we can generate alerts from our network data. Unfortunately for me, there isn't much documentation out there about how to do this specifically.

Both Corelight's and Elastic's preferred method of ingesting Corelight data is the integration they developed together. This would have been awesome except for one thing - we have security enabled on our Elastic cluster, and Corelight doesn't provide a way to specify a root cert to use for the connection. So, the connection just gets dropped by Elastic. All good - we should just be able to point at Logstash and fire away, right? Not quite. So let's take a look at how to do it.


Objective: Ingest Corelight logs into Elasticsearch through Logstash and have them parsed correctly by the Corelight ingest pipelines


Before we begin, you should ensure that you have the following all straightened out:

  1. A running Elasticsearch cluster (mine has security enabled)
  2. A Logstash server that can reach both the Corelight sensor and Elasticsearch
  3. Admin access to the Corelight sensor


Step 1: Upload the Elasticsearch ingest pipeline for Corelight

This process is well documented on Corelight's GitHub page. I used the automatic install with the Python script. I ran it from one of my Elasticsearch servers, but you could theoretically run it from anywhere as long as it can connect to Elasticsearch on whatever port you have it listening on.
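
If you want to confirm the pipelines actually landed, you can ask Elasticsearch directly. A minimal sketch, assuming the installed pipelines have names starting with corelight (the host, credentials, and CA path are placeholders - swap in your own):

# List the ingest pipelines whose names start with "corelight"
curl -u elastic:changeme --cacert /path/to/ca.crt \
  "https://es01:9200/_ingest/pipeline/corelight*?pretty"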

Once complete, edit the index templates to match the index pattern you want (I used corelight-*). Do not do this for the temp or parsing failure index templates.
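
If you're not sure which index templates the script created, you can list them first. A quick sketch, assuming the template names also start with corelight:

# List installed templates matching "corelight*" so you know which ones to edit
curl -u elastic:changeme --cacert /path/to/ca.crt \
  "https://es01:9200/_cat/templates/corelight*?v"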

Create an index pattern for the new index name.


Step 2: Prepare the Logstash Server

For this part, there are some required items, and some optional. I'm not an expert on Logstash, so if you're reading this and you know better - do it your way.

  1. Pick a port that you want to use - I don't recommend reusing one that other log sources are already on. Corelight data is going to be heavy and fast.
  2. Open that port on the server's firewall (see the sketch after this list).
  3. Create a new file that will be used as our ingest config (the standard location is /etc/logstash/conf.d/).
  4. Create a pipeline for Corelight in /etc/logstash/pipelines.yml (also shown in the sketch below).
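
Here's a rough sketch of items 2 through 4, assuming a firewalld-based host, port 5045, and a config file named corelight.conf (those are my placeholders - use whatever port and filename you picked):

# 2. Open the port on the firewall (firewalld shown; use ufw/iptables as appropriate)
sudo firewall-cmd --permanent --add-port=5045/tcp
sudo firewall-cmd --reload

# 3. Create the ingest config file (we'll fill it in below)
sudo touch /etc/logstash/conf.d/corelight.conf

# 4. Add a dedicated pipeline entry to /etc/logstash/pipelines.yml
cat <<'EOF' | sudo tee -a /etc/logstash/pipelines.yml
- pipeline.id: corelight
  path.config: "/etc/logstash/conf.d/corelight.conf"
EOF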

Step 3: Configure the Input

The issue I encountered here was that all of the Corelight logs were failing to parse when they came in through Logstash, but parsed perfectly when I ran them through the Elastic pipeline debugger. After what I consider a lot of troubleshooting, I determined that the message was arriving correctly, but Logstash was shipping the whole log as a string inside the message field rather than as top-level JSON fields - which is what the ingest pipelines expect.

So, what we need to do is parse the JSON out of the original message field into the event Logstash sends to Elastic. Below is a simplified version of the config I used to do this:

input {
  tcp {
    # The port you opened for Corelight earlier
    port => 5045
  }
}
filter {
  # Parse the JSON string in "message" into top-level fields
  json {
    source => "message"
  }
  mutate {
    # This is optional, but now that we used the json filter above,
    # "message" is literally just a duplicate of your log.
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    index => "corelight-%{+YYYY.MM.dd}"
    # All of your typical Elasticsearch config (hosts, credentials, cacert, etc.)
  }
}

That's it on the Logstash and Elasticsearch end - fire up Logstash and make sure it's listening on the port you specified.
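
One way to sanity-check the listener before touching the sensor - restart Logstash, confirm the port is open, and throw a fake event at it. A sketch, with 5045 and the minimal test event being my placeholders:

# Restart Logstash so it picks up the new pipeline
sudo systemctl restart logstash

# Confirm Logstash is listening on the port from the config
sudo ss -tlnp | grep 5045

# Send a minimal fake JSON event, then check for a corelight-* index in Elasticsearch
echo '{"_path":"conn","ts":"2021-05-22T00:00:00.000000Z"}' | nc localhost 5045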


Step 4: Configure the JSON Export in Corelight

From the sensor menu, head down to the JSON output. Enable it, plug in the Logstash server's IP and port, then save the change.


Conclusion

That's it! You should now have Corelight data feeding into your Elasticsearch and being parsed correctly by the Corelight ingest pipelines.


Optional Step: Ensure that Elastic Security sees your data

In order for Elastic Security to see your newly ingested Corelight data, you have to tell it where to look: in Kibana, open Stack Management > Advanced Settings, find securitySolution:defaultIndex, and add your index pattern (corelight-* in my case) to the list.
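
If you'd rather script it, Kibana's advanced settings endpoint can make the same change. A sketch, assuming Kibana on port 5601 and the stock default index list (check your existing list first so you don't clobber anything):

# Add corelight-* to the indices Elastic Security reads
# (hypothetical host/credentials - adjust for your setup)
curl -u elastic:changeme -X POST "https://kibana:5601/api/kibana/settings" \
  -H "kbn-xsrf: true" -H "Content-Type: application/json" \
  -d '{"changes":{"securitySolution:defaultIndex":["auditbeat-*","filebeat-*","packetbeat-*","winlogbeat-*","corelight-*"]}}'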


Updated 22 May 2021