-
Also looking for a solution to this. We have a 31-node Elastic cluster in production for SIEM and business analytics, and I would like to send all T-Pot data to that production cluster. This would allow us to pivot and correlate with other live network/security data.
-
Yup, there is a wiki page just for this, and it's not hard. If you just have single honeypots, go by the wiki. If you are using a hive install (multiple honeypots -> one hive server), you modify the wiki steps a little: instead of modifying logstash.conf, you modify http_input.conf. I keep the elasticsearch output pointed at the honeypot or hive and add an additional output. That way I can still use the built-in visualizations while also sending the data to another ELK cluster for other kinds of analysis. If you just want the data to go to your new cluster, replace the output strings with wherever you want it to go.
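For reference, here is a minimal sketch of what that dual-output section might look like. The hostnames, credentials, and index names are placeholders, not T-Pot's actual values, so check them against the wiki and your own setup:

```
output {
  # Keep the stock output so T-Pot's built-in dashboards keep working.
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  # Additional output fanning the same events out to the external cluster.
  # Host, user, and index below are placeholders for your environment.
  elasticsearch {
    hosts => ["https://external-elk.example.com:9200"]
    user => "tpot_writer"
    password => "${EXTERNAL_ES_PW}"
    index => "tpot-%{+YYYY.MM.dd}"
    ssl => true
  }
}
```

Logstash sends every event to every plugin in the output section (unless you add conditionals), so the second elasticsearch block duplicates the stream to the external cluster without disturbing the local one.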
-
I am still in the planning phase of our honeypots, so I haven't looked into the actual configs or data yet. Does the data produced by T-Pot conform to the Elastic Common Schema (ECS)? EDIT: Just looked, and it is absolutely not ECS-compliant. Mapping it for production would take a ton of work.
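For anyone weighing that effort, the work would roughly amount to renaming T-Pot's fields into ECS names in the Logstash pipeline, honeypot by honeypot. A hedged sketch of the pattern with a mutate filter; the source field names here are illustrative, not T-Pot's actual schema:

```
filter {
  # Hypothetical ECS mapping: rename source fields (illustrative names)
  # to their ECS equivalents using Logstash field-reference syntax.
  mutate {
    rename => {
      "src_ip"    => "[source][ip]"
      "dest_port" => "[destination][port]"
      "proto"     => "[network][transport]"
    }
  }
}
```

Multiply that by every honeypot's distinct log format, and the "ton of work" estimate above looks about right.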
-
Hello,
For a project I am undertaking, I need to separate the built-in ELK stack in T-Pot and send its data (logs) to an external ELK stack. For context, T-Pot is running on Debian 11, and the external ELK stack is installed on an Ubuntu machine, per the project's requirements. Both machines will be deployed with a cloud hosting provider.
So, my question is: how can I send data to the external ELK stack?
I'm looking forward to your comments. Thank you in advance.