# ELK (Elasticsearch, Logstash, and Kibana) Integration for Chef Automate HA
## Introduction to ELK Stack

ELK Stack is open-source software that allows the search and visualization of logs generated by systems. ELK Stack has three primary components.
1. **Elasticsearch**: A search engine that stores all collected logs.

1. **Logstash**: A data processing component that sends incoming logs to Elasticsearch.

1. **Kibana**: A web interface for visualization and searching of logs within Elasticsearch.

1. **Filebeat**: Filebeat is used to push logs from Chef Automate HA nodes to Logstash.

The ELK stack can be installed in various ways depending on organizational needs and requirements. This document focuses on one of those installation methods.
The [Elastic site](https://elastic.co) should be referenced for details on the sizing and configuration of the ELK Stack.

## Install Prerequisites
1. Dependency: ELK requires Java. If it is not already installed, install it with the following command:

   ```sh
   sudo apt-get install openjdk-8-jdk
   ```
1. Download and install the public signing key:

   ```sh
   wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
   ```
1. Install the `apt-transport-https` package so that packages can be fetched from the APT repository:

   ```sh
   sudo apt-get install apt-transport-https
   ```
1. Save the repository definition to `/etc/apt/sources.list.d/elastic-8.x.list`:

   ```sh
   echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
   ```
## ELK Installation

Follow the steps below to install the Elastic Stack.
### Elasticsearch Installation and Configuration
1. Install the Elasticsearch Debian package with:

   ```sh
   sudo apt-get update && sudo apt-get install elasticsearch
   ```
1. Open the Elasticsearch configuration file:

   ```sh
   sudo nano /etc/elasticsearch/elasticsearch.yml
   ```
   * Uncomment `http.port` and set the port number (for example, 9200).
   * Uncomment `network.host` and set the correct IP address.
   * If required, add cluster settings or `discovery.type: single-node` and node settings.

   *(Screenshot: Elasticsearch configuration, showing the Elasticsearch network settings)*
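For reference, a minimal sketch of the edited settings is shown below. The values are illustrative only; use the bind address, port, and discovery mode appropriate for your environment.

```yaml
# elasticsearch.yml — illustrative values only
network.host: 0.0.0.0        # bind address; restrict to a specific IP in production
http.port: 9200
discovery.type: single-node  # only when running a single Elasticsearch node
```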
1. Start the Elasticsearch service:

   ```sh
   sudo systemctl daemon-reload
   sudo systemctl start elasticsearch.service
   sudo systemctl enable elasticsearch.service
   ```
For other ways to install Elasticsearch, see the [Elasticsearch installation](https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html) page.
### Kibana Installation and Configuration
* To install Kibana, run the following:

  ```sh
  sudo apt install kibana
  sudo nano /etc/kibana/kibana.yml   # configure Kibana
  sudo systemctl start kibana.service
  sudo systemctl enable kibana.service
  ```
* Uncomment `server.port` and set the port number for Kibana (for example, 5601).

* Uncomment `server.host` and set the correct host for Kibana.

* Uncomment `elasticsearch.hosts` and set the correct Elasticsearch address.

*(Screenshot: Kibana configuration)*

For other ways to install Kibana, see the [Kibana installation](https://www.elastic.co/guide/en/kibana/current/install.html) page.
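A minimal sketch of the corresponding `kibana.yml` settings follows; the values are illustrative and must be adapted to your environment.

```yaml
# kibana.yml — illustrative values only
server.port: 5601
server.host: "0.0.0.0"                          # bind address for the Kibana web UI
elasticsearch.hosts: ["http://localhost:9200"]  # must match your Elasticsearch address
```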
### Logstash Installation and Configuration

* To install Logstash, run the following:

  ```sh
  sudo apt install logstash
  sudo systemctl start logstash.service
  sudo systemctl enable logstash.service
  ```
### Configuration of Logstash

1. Create a configuration file to allow Filebeat to communicate with Logstash:

   ```sh
   sudo nano /etc/logstash/conf.d/chef-beats-input.conf
   ```
1. Enter the following in the `chef-beats-input.conf` file to allow Filebeat to send logs to Logstash over TCP port 5044:

   ```
   # Read input from Filebeat on the Chef Automate HA nodes by listening on port 5044, to which Filebeat sends its data
   input {
     beats {
       port => "5044"
     }
   }
   filter {
     # If a log line contains 'hab', tag that entry as Chef Automate HA
     if [message] =~ "hab" {
       grok {
         match => ["message", "^(hab)"]
         add_tag => ["Chef Automate HA"]
       }
     }
   }
   output {
     stdout {
       codec => rubydebug
     }
     # Send parsed log events to Elasticsearch
     elasticsearch {
       hosts => ["localhost:9200"]
     }
   }
   ```
1. Restart the Logstash service:

   ```sh
   sudo systemctl restart logstash.service
   ```

For other ways to install Logstash, see the [Logstash installation](https://www.elastic.co/guide/en/logstash/current/installing-logstash.html) page.
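Filebeat runs on each Chef Automate HA node and ships logs to the Logstash `beats` input configured above. A minimal `filebeat.yml` sketch is shown below; the log paths and the Logstash host are illustrative assumptions and must be adjusted for your nodes.

```yaml
# filebeat.yml on each Chef Automate HA node — illustrative sketch only
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /hab/svc/*/logs/*.log        # assumed location of Habitat service logs; adjust per node

output.logstash:
  hosts: ["<logstash-host>:5044"]    # must match the port in chef-beats-input.conf
```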