While the Elastic Stack (ELK) is typically used for live log monitoring, Winlogbeat can be modified to manually send cold logs, or old, inactive Windows Event Log (EVTX) files, to ELK for analysis. This allows an analyst to take EVTX files from images, or from data collected off potentially relevant systems, and leverage ELK for their investigations, bypassing the typical reliance on live data monitoring.
This opens the door to using ELK as a post-incident investigative tool for DFIR analysts, rather than just for security operations center (SOC) purposes like live analysis. This post will examine one method to leverage ELK for your event log analysis.
WHAT YOU NEED
- A functioning ELK cluster
  - Single-node and multi-node clusters are both acceptable.
  - Logstash is not required.
- Ubuntu or CentOS as your ELK node OS
- A Windows host machine to send EVTX log files to ELK
- The Winlogbeat data shipper
The following guide will be based on a single-node ELK instance (Ubuntu 19) that does not incorporate Logstash.
DISCLAIMER: The files described below were created and modified for the purpose of this post. As a result, a “plug-and-play” approach of simply copying these files and executing the process may not fully work as intended within your own environment. Hopefully this information can guide you through this or a similar process, making it easier to implement for your own purposes.
HOW THIS WORKS
Winlogbeat is the data shipper created by Elastic; it is used to send “hot,” or live, Windows event log data to ELK as it is generated. This allows for the live monitoring of systems based on events recorded in real time. For digital forensic investigators, this data is often “cold” (e.g., collected from a live system where log aggregation was not pre-configured, or extracted from system images). The typical live-instance use of ELK is of no help to an investigator dealing with these cold, extracted logs.
Winlogbeat reads all of its settings from the default configuration file, winlogbeat.yml. The program also typically runs as a service on the host machine, operating in the background as it captures event data upon creation. The setup we will use to grab our cold, extracted logs involves pointing Winlogbeat at a modified configuration file in place of winlogbeat.yml, as well as using a short Windows PowerShell script to invoke Winlogbeat manually rather than as a service.
DOWNLOAD WINLOGBEAT
If you have already set up Winlogbeat on your host system, please proceed to the next step. If not, download the latest version of Winlogbeat here: https://www.elastic.co/downloads/beats/winlogbeat.
NOTE: Winlogbeat version 7.3.2 was used for preparing this guide.
Once downloaded, extract the contents and place the resulting Winlogbeat folder wherever you see fit on your system. Make note of this parent directory.
SETUP WINLOGBEAT FOR ELK
Within the extracted Winlogbeat folder, you should see a file labeled “winlogbeat.yml.” If you have not previously used Winlogbeat within your ELK cluster, perform the following steps. If you have already used Winlogbeat within your ELK cluster, leave “winlogbeat.yml” alone and skip these steps.
Open winlogbeat.yml with your favorite text editor. Scroll down to the “Outputs” section and modify the “Hosts” option to the IP address of your Elasticsearch instance. For single-node clusters, Elasticsearch resides on the same node as the rest of your ELK processes.

Example of winlogbeat.yml with <ELK-IP> pointing to Elasticsearch
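For reference, the relevant excerpt of the Outputs section is short; <ELK-IP> below is a placeholder for your Elasticsearch address, and 9200 is the default Elasticsearch port:

```yaml
# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["<ELK-IP>:9200"]
```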
Once completed, save the file and then open a PowerShell window as Administrator on your host machine. Winlogbeat parses information based on index templates: set guidelines that tell ELK how to interpret the event data it is given and format it for easy viewing. Winlogbeat ships with a default template, which can be uploaded to your ELK instance from the Winlogbeat root folder with the following command: .\winlogbeat.exe setup -e.
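For example, from that elevated PowerShell window in the Winlogbeat root folder:

```powershell
# Uploads the default Winlogbeat index template (and related setup assets) to ELK;
# -e writes logging output to the console so you can watch for errors
.\winlogbeat.exe setup -e
```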
You should now be ready to modify Winlogbeat.
CUSTOM WINLOGBEAT YML FILE
Ignoring the default winlogbeat.yml file, create a new file within that same directory (at the root of the Winlogbeat folder) and name it something that makes sense to you. I will name mine “winlogbeat-evtx.yml” for the purpose of this post.
Once created, open it with your favorite text editor and add the contents of the attached file below:
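In case the attachment does not come through, the following is a minimal sketch of what such a file can look like, reconstructed from the breakdown below. The drop_fields list and the template/ILM options are assumptions on my part, so adjust them to your environment:

```yaml
# winlogbeat-evtx.yml -- minimal sketch, not the original attachment.
# The ${...} variables are injected at runtime by the PowerShell script
# via Winlogbeat's -E command-line arguments.

winlogbeat.event_logs:
  # Read the archived EVTX file passed in by the script
  - name: ${EVTX_FILE}
    # Stop once the end of the file is reached instead of waiting for new events
    no_more_events: stop

# Exit cleanly after all events have been published
winlogbeat.shutdown_timeout: 30s

# Separate registry file so we never touch the one used during service operation
winlogbeat.registry_file: evtx-registry.yml

# The default template was already uploaded with "setup -e"; these settings
# (assumed here) allow a custom index name without ILM taking over
setup.ilm.enabled: false
setup.template.name: "winlogbeat"
setup.template.pattern: "winlogbeat-*"

output.elasticsearch:
  hosts: ["<ELK-IP>:9200"]
  index: "${CASE}-${ELK_CLIENT}-evtx"

processors:
  # Label every event with case metadata for filtering in Kibana;
  # comment this block out if it does not apply to you
  - add_fields:
      target: ''
      fields:
        client: ${ELK_CLIENT}
        case: ${CASE}
  # Trim shipper noise; this drop list is illustrative
  - drop_fields:
      fields: ["agent.ephemeral_id", "agent.id"]
```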
Here is a quick breakdown of what is what:
- You may notice PowerShell variables in this file, such as $EVTX_FILE. These are required and are populated by the PowerShell script that will be covered shortly.
- The Winlogbeat registry file (evtx-registry.yml) lets Winlogbeat keep track, by path, of which logs within each EVTX file have already been uploaded. This prevents duplicate uploads and allows an interrupted upload to easily resume later. This configuration creates its own registry file to keep it separate from the official one Winlogbeat uses during typical service operation, simply as a form of best practice.
- You may notice custom fields that are set by PowerShell variables; these let investigators label uploads by client, case number, etc. If this does not apply to you, comment out (#) the entire “add_fields” entry, leaving “drop_fields” un-commented.
- The index creation process generates a new index within ELK, using the Winlogbeat template uploaded earlier, named by case and client (${CASE}-${ELK_CLIENT}-evtx). For this version of the custom “winlogbeat-evtx.yml” file, a name (client or otherwise) and some form of identifier (case number, etc.) are required to create the index and to add fields that can be used to separate data by case when viewing. These fields exist only to make manual filtering easier, so they are not strictly necessary; feel free to remove them from the base code if they do not apply to your investigative needs.
All settings you see in the custom “winlogbeat-evtx.yml” file are legitimate settings that can be used within “winlogbeat.yml;” however, they are stripped down to the bare minimum needed for this implementation. Elastic’s documentation has a page dedicated to what each of these settings does.
EVTX UPLOAD POWERSHELL SCRIPT
Now that we have created the instructions for feeding Winlogbeat cold EVTX log data, we need a vehicle to make that happen. The easiest way is a Windows PowerShell script (.ps1). Open Windows PowerShell ISE on your host machine as Administrator and copy the code from the attachment below into the editor pane of the program:
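If the attachment is unavailable, here is a minimal sketch of what such a script can look like, based on the behavior described in the points below. The prompt wording, log-file naming, and folder layout are assumptions rather than the original code:

```powershell
# evtx-upload.ps1 -- minimal sketch of the upload script (not the original attachment).
# Run from the Winlogbeat folder's parent directory (e.g., ELK-Tools) as Administrator.

# Gather case metadata; these values are injected into winlogbeat-evtx.yml via -E
$CLIENT = Read-Host "Client name"
$CASE   = Read-Host "Case number"
$TARGET = Read-Host "Path to folder of EVTX files (or of host-named subfolders)"

# Delete the registry file from any prior upload so a stale registry cannot
# block logs from the same directory from being uploaded again
$Registry = ".\winlogbeat\evtx-registry.yml"
if (Test-Path $Registry) { Remove-Item $Registry -Force }

# Winlogbeat's console output is also written to a dated file under .\elk-logging
$LogDir = ".\elk-logging"
if (-not (Test-Path $LogDir)) { New-Item -ItemType Directory -Path $LogDir | Out-Null }
$LogFile = Join-Path $LogDir ("upload-{0}.txt" -f (Get-Date -Format "yyyy-MM-dd_HH-mm-ss"))

# Collect every EVTX file; -Recurse covers the one allowed level of host subfolders
$EvtxFiles = Get-ChildItem -Path $TARGET -Filter *.evtx -Recurse -File

foreach ($File in $EvtxFiles) {
    Write-Host "Uploading $($File.FullName)"

    # Each -E argument defines a setting the ${...} placeholders in the YML resolve to
    .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml `
        -E "EVTX_FILE=$($File.FullName)" `
        -E "ELK_CLIENT=$CLIENT" `
        -E "CASE=$CASE" 2>&1 | Tee-Object -FilePath $LogFile -Append
}

Write-Host "Upload complete. Review $LogFile if anything looks wrong."
```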
This attached script may look daunting, but the entire file is commented to provide context as to what is going on. Here are a few important points:
- By design, the script as-is must be run from the Winlogbeat folder’s parent directory. For my purposes, I put the Winlogbeat folder inside a folder called “ELK-Tools” to stay organized and prevent a random Winlogbeat folder from floating around on my host machine. This also helps identify that this copy of Winlogbeat has been modified for manual uploads.
- The script automatically finds the “evtx-registry.yml” file and deletes it if one exists from a prior upload. This ensures a stale registry does not prevent logs from the same directory from being uploaded again when needed.
- When asked for a target directory, the script accepts either a single folder containing EVTX logs or a single folder containing sub-folders labeled by system hostname. The latter method was designed to allow an investigator to upload multiple systems at once, essentially running the script in a loop.
NOTE: As-is, this script allows only one level of subfolders. You should have, at most, one folder containing system-labeled folders housing the EVTX files for their corresponding hosts.
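For example, a valid target layout might look like this; the folder and host names are hypothetical:

```
EVTX-Uploads\
├── WORKSTATION-01\
│   ├── Security.evtx
│   └── System.evtx
└── SERVER-02\
    └── Security.evtx
```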
- This script asks for case number, identifier, etc. for investigators to use. In my version of this script, only client name and case number are necessary—you can comment out the lines responsible for variables that you don’t need. If it makes sense, rename $CLIENT to something else; however, make sure it stays consistent with the “winlogbeat-evtx.yml” file you created earlier.
- Near the end of the script, you can see winlogbeat.exe invoked using the “winlogbeat-evtx.yml” file created earlier—this is possible because Winlogbeat natively accepts command-line arguments. All arguments following “-E” within the command are injected into the “winlogbeat-evtx.yml” file, hence the PowerShell variables (see the standalone example after this list).
NOTE: You will need to modify the code executing winlogbeat.exe (.\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml…) to match that of your own folder structure.
- The script also captures Winlogbeat’s console output and writes it to a text file. These text files are labeled by date within the folder “elk-logging” in the Winlogbeat folder’s parent directory. This allows for easy troubleshooting if something goes wrong, and is also why I created the root ELK-Tools folder to house all of these files. Feel free to change this to match your needs.
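To make the “-E” injection concrete, a single manual invocation outside the script might look like this; the paths and values are illustrative:

```powershell
# Each -E defines a setting that the ${...} placeholders in winlogbeat-evtx.yml resolve to
.\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml `
    -E "EVTX_FILE=C:\Cases\ACME\WORKSTATION-01\Security.evtx" `
    -E "ELK_CLIENT=acme" `
    -E "CASE=2019-001"
```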

Once saved, you should be able to execute this script, follow the on-screen prompts, and upload to your ELK cluster. Once uploaded, you will need to go through the typical process of adding the created index to your Kibana Discover page, within Settings under “Index Patterns.”
WHY DOES THIS MATTER?
If successful, you should now have a modified version of Winlogbeat on your host machine primed and ready to upload EVTX logs without the need for live system collection. While your final product may not exactly match mine above, this process should at least help guide your own implementation. The ability to utilize ELK on systems post-collection, rather than through live data monitoring, allows digital forensic investigators to quickly and thoroughly analyze EVTX data in the same manner SOC analysts can in real time. This can not only save valuable examination time, but also open the door to analysts applying creativity to their investigations in ways not typically possible before.
Analyst-crafted dashboards built around indicators of compromise can be run against new log uploads immediately, drastically saving time and potentially pointing an analyst to particular variants of ransomware or specific types of malware. Kibana’s Saved Searches feature lets an analyst quickly bring up saved queries pertaining to Remote Desktop Protocol connections, service installations, PowerShell executions, and their frequencies within the environment, allowing for data frequency and pattern analysis. These are only a few examples, with many more possibilities waiting to be thought of.