We are trying to ship a CSV file from Filebeat > Logstash > Elasticsearch. This CSV file has a column named "CreateOn" which contains dates like the ones below:
Thursday, March 10, 2016 3:00:23 PM UTC
Thursday, May 4, 2017 11:29:02 AM UTC
Sunday, March 9, 2014 2:43:16 PM UTC
Tuesday, August 26, 2014 11:42:33 AM UTC
Wednesday, January 24, 2024 4:32:20 PM UTC
We are getting these fields with the CSV filter in Logstash, but the values come through as type text, and we want them as dates.
We tried the date filter below in the Logstash conf file to convert them, but nothing is pushed after the change:
date {
  match => ["CreateOn", "EEE, MMMM d, yyyy h:mm:ss a z"]
  target => "CreateOn"
}
We know it's something related to the input time format; can someone help us fix this?
Our Logstash conf looks like below:
csv {
  skip_header => "true"
  separator => ","
  columns => ["Display Label", "CreateOn", "CommonName", "IsSelfSigned", "Organization", "OrganizationUnit", "Issuer", "IssuerName", "SerialNumber", "ValidTo", "Version", "SignatureAlgorithm", "Subject", "Subject Alternative Names"]
}
date {
  match => ["CreateOn", "EEE, MMMM d, yyyy h:mm:ss a z"]
  target => "CreateOn"
}
asked Feb 5 at 9:07 by Roopesh
1 Answer
TL;DR:
Have you checked what actually ends up in the CreateOn column? The CSV filter splits on commas, yet your date values themselves contain commas. That likely means the CreateOn field holds only "Thursday" and nothing else, which would make the subsequent date filter fail.
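To make the splitting problem concrete, here is a minimal sketch using Python's standard csv module (not part of the pipeline, just an illustration of how an unquoted vs. a quoted date row is split):

```python
import csv
import io

# Unquoted row: the commas inside the date are treated as field separators,
# so the first column only receives the weekday name.
unquoted = "Thursday, March 10, 2016 3:00:23 PM UTC"
row = next(csv.reader(io.StringIO(unquoted)))
print(row[0])  # -> Thursday

# Quoted row: the whole date survives as a single column value.
quoted = '"Thursday, March 10, 2016 3:00:23 PM UTC"'
row = next(csv.reader(io.StringIO(quoted)))
print(row[0])  # -> Thursday, March 10, 2016 3:00:23 PM UTC
```

The same applies to the Logstash csv filter: unless the date column is quoted in the source file, the date filter only ever sees the weekday fragment.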
To reproduce, I created the following file, data.csv:
"Thursday, March 10, 2016 3:00:23 PM UTC"
"Thursday, May 4, 2017 11:29:02 AM UTC"
"Sunday, March 9, 2014 2:43:16 PM UTC"
"Tuesday, August 26, 2014 11:42:33 AM UTC"
"Wednesday, January 24, 2024 4:32:20 PM UTC"
And the following pipeline config:
input {
  file {
    id => "my_plugin_id"
    path => "/tmp/data.csv"
    start_position => "beginning" # Ensure Logstash reads from the beginning of the file
  }
}

filter {
  csv {
    separator => ","
    columns => ["CreateOn"] # List all columns here if there are more
  }
  date {
    match => ["CreateOn", "EEE, MMMM d, yyyy h:mm:ss a z"] # Ensure this pattern matches your date format
    target => "CreateOn"
    locale => "en" # Specify locale if needed
  }
}

output {
  stdout { codec => "rubydebug" }
}
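As a side check of the time format itself, the same pattern can be sanity-tested outside Logstash. Below is a sketch in Python; note the pattern letters differ between the Joda-style pattern Logstash uses and strptime:

```python
from datetime import datetime

# strptime equivalent of the Joda-style "EEE, MMMM d, yyyy h:mm:ss a z":
# %A full weekday, %B full month, %d day, %Y year,
# %I 12-hour clock, %p AM/PM, %Z time zone name
sample = "Thursday, March 10, 2016 3:00:23 PM UTC"
parsed = datetime.strptime(sample, "%A, %B %d, %Y %I:%M:%S %p %Z")
print(parsed)  # -> 2016-03-10 15:00:23
```

If the string parses cleanly here, the format itself is fine and the failure is in how the CSV row was split, not in the date pattern.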
I am getting this output:
{
      "CreateOn" => 2014-08-26T11:42:33.000Z,
    "@timestamp" => 2025-02-11T09:35:46.398916017Z,
       "message" => "\"Tuesday, August 26, 2014 11:42:33 AM UTC\"",
         "event" => {
        "original" => "\"Tuesday, August 26, 2014 11:42:33 AM UTC\""
    },
           "log" => {
        "file" => {
            "path" => "/tmp/data.csv"
        }
    },
          "host" => {
        "name" => "041d52b6d04f"
    },
      "@version" => "1"
}