Tags: elk, logs, s3, logstash
By default, Kublr does not store logs in S3 bucket storage. If you need to do that, we suggest using the Logstash S3 output plugin.
TABLE OF CONTENTS
- Step 1: Create AWS S3 bucket with access according to your needs.
- Step 2: Create IAM Policy for access to S3 bucket.
- Step 3: Modify Logstash settings.
- Step 4: Validate and update cluster specification.
Step 1: Create AWS S3 bucket with access according to your needs.
- You can create an S3 bucket in AWS via the CLI or the web console. Please replace the placeholders appropriately:
```shell
aws s3api create-bucket \
  --bucket $Bucket_Name \
  --region $Region
```
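A hedged sketch of a few extra steps you may want around bucket creation: outside `us-east-1`, `create-bucket` also requires a `LocationConstraint`, and since the bucket will hold cluster logs you may want to block public access. The bucket and region values are placeholders.

```shell
# For regions other than us-east-1, create-bucket needs an explicit location constraint
aws s3api create-bucket \
  --bucket "$Bucket_Name" \
  --region "$Region" \
  --create-bucket-configuration LocationConstraint="$Region"

# Verify the bucket exists and is reachable (exit code 0 on success)
aws s3api head-bucket --bucket "$Bucket_Name"

# Optionally block all public access to the bucket
aws s3api put-public-access-block \
  --bucket "$Bucket_Name" \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```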
Step 2: Create IAM Policy for access to S3 bucket.
- You need to create an IAM policy through the cluster specification so the cluster can access the S3 bucket:
```yaml
locations:
  - aws:
      ...
      cloudFormationExtras:
        iamRoleNode:
          Properties:
            Policies:
              - PolicyDocument:
                  Statement:
                    - Action:
                        - s3:GetObject
                        - s3:PutObject
                      Effect: Allow
                      Resource: arn:aws:s3:::<Bucket_Name>/*
                    - Action:
                        - s3:ListBucket
                        - s3:GetObject
                        - s3:PutObject
                      Effect: Allow
                      Resource: arn:aws:s3:::<Bucket_Name>
                  Version: '2012-10-17'
                PolicyName: <Policy_Name>
      ...
    name: aws1
```
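After the specification is applied, you can sanity-check the node role from a worker node before touching Logstash. This is a sketch assuming the AWS CLI is available on the node; the bucket name is a placeholder, and `access-check.txt` is just an illustrative test key.

```shell
# Confirm which IAM role the node has assumed
aws sts get-caller-identity

# Confirm the policy allows listing the bucket
aws s3 ls "s3://<Bucket_Name>"

# Confirm the policy allows writing (streams "test" from stdin into an object)
echo test | aws s3 cp - "s3://<Bucket_Name>/logs/access-check.txt"
```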
Step 3: Modify Logstash settings.
- You need to add the S3 output plugin settings to the "logging" section of the cluster specification:
```yaml
logging:
  chart:
    ...
    values:
      logstash:
        additionalConfig: |
          output {
            s3 {
              bucket => "<Bucket_Name>"
              prefix => "logs"    # optional: defines a folder in the S3 bucket
              codec => "plain"    # for other options, see https://www.elastic.co/guide/en/logstash/7.10/plugins-outputs-s3.html#plugins-outputs-s3-options
            }
          }
```
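The plugin buffers events on local disk and uploads files to S3 on rotation, so in practice you usually also tune the rotation settings. A hedged sketch below uses real plugin options (`rotation_strategy`, `size_file`, `time_file`, the `json_lines` codec), but the specific values are illustrative, not recommendations:

```yaml
logging:
  chart:
    values:
      logstash:
        additionalConfig: |
          output {
            s3 {
              bucket => "<Bucket_Name>"
              prefix => "logs"
              codec => "json_lines"               # one JSON document per line
              rotation_strategy => "size_and_time"
              size_file => 10485760               # rotate after ~10 MiB...
              time_file => 15                     # ...or after 15 minutes, whichever comes first
            }
          }
```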
Step 4: Validate and update cluster specification.
Wait until everything has finished and the cluster reaches the "green" state.
Check S3 bucket status and logs:
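A quick sketch of checking the results from the CLI; `<object_key>` is a placeholder for one of the uploaded objects.

```shell
# List the uploaded log objects under the configured prefix
aws s3 ls "s3://<Bucket_Name>/logs/" --recursive

# Download one object to stdout and inspect the first lines
aws s3 cp "s3://<Bucket_Name>/logs/<object_key>" - | head
```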