

How to stream Pega Cloud logs into an Amazon S3 bucket, step by step
Streaming Pega logs is essential for any organization aiming to maintain strong application performance and effective monitoring. Pega Cloud streams the logs in near real time, typically within 60 seconds, to folders in the AWS bucket. For Pega users, integrating log streaming with Amazon S3 offers scalable storage and straightforward retrieval of log data. In this guide, we examine the step-by-step process for streaming Pega logs into an Amazon S3 bucket.
Understanding Pega Logs
Pega logs are vital for tracking the health and performance of applications on the Pega platform. These logs record various events, such as errors, performance metrics, and transaction details. Streaming these logs to a centralized storage solution like Amazon S3 can enhance your logging strategy in several ways:
Data Durability: Amazon S3 boasts a 99.999999999% durability rate, meaning your log data is secure and unlikely to be lost.
Easy Access for Analysis: Using S3 allows teams to quickly access logs for performance monitoring and troubleshooting.
Reduced Risk of Data Loss: Storing logs on S3 minimizes the chance of losing critical data due to local storage failures.
By utilizing Amazon S3 for Pega logs, organizations can harness S3's durability and scalability, which is beneficial when dealing with large quantities of log files over time.
Streaming logs to your AWS S3 bucket gives you immediate access to your log files without relying on third-party integrations or Pega-provided services. Log streaming to an external Amazon S3 bucket is available only within AWS; Pega Cloud does not currently support streaming logs to an S3 bucket outside of the cloud region where your Pega Cloud environments live.
Prerequisites
Before beginning the log streaming setup, ensure you have the following:
An active Amazon Web Services (AWS) account.
A configured Amazon S3 bucket to store the logs.
Familiarity with AWS IAM for managing permissions.
Step 1: Create and Configure an Amazon S3 Bucket
Creating an S3 bucket is the first step if you haven’t already done so. Follow these steps:
Log in to your AWS Management Console.
Navigate to the S3 service.
Click on Create Bucket.
Provide a unique bucket name and select a region near your Pega application for lower latency.
Adjust bucket settings to meet your needs; enabling logging, versioning, and encryption is recommended for enhanced security.
Click Create to finalize your bucket setup.
For bucket encryption, use server-side encryption with encryption type = AWS Key Management Service keys (SSE-KMS) and key type = AWS KMS key ARN.
Provide Pega Support Engineering with the ARNs of the following artifacts from your AWS account:
Your Amazon S3 customer master key (CMK) ARN
For example, arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Your Amazon S3 bucket name ARN
For example, arn:aws:s3:::bucket-name
Keep note of the bucket name and KMS key for future steps.
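The two ARN formats above can also be assembled programmatically, which helps avoid typos when filling in the support ticket. A minimal sketch; the region, account ID, key ID, and bucket name below are illustrative placeholders, not values from your account:

```python
# Build the ARNs that Pega Support Engineering asks for, from their parts.
# All concrete values in the example calls are illustrative placeholders.

def kms_key_arn(region: str, account_id: str, key_id: str) -> str:
    # KMS key ARNs include region and account:
    # arn:aws:kms:<region>:<account>:key/<key-id>
    return f"arn:aws:kms:{region}:{account_id}:key/{key_id}"

def s3_bucket_arn(bucket_name: str) -> str:
    # S3 bucket ARNs leave the region and account fields empty:
    # arn:aws:s3:::<bucket-name>
    return f"arn:aws:s3:::{bucket_name}"

print(kms_key_arn("us-west-2", "111122223333",
                  "1234abcd-12ab-34cd-56ef-1234567890ab"))
print(s3_bucket_arn("bucket-name"))
```

Note the difference between the two: KMS ARNs are region- and account-qualified, while S3 bucket ARNs are globally unique by bucket name alone.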
Step 2: Pega support ticket for role creation and policy statements
To stream your Pega logs directly to your S3 bucket, perform the following task:
Create a Pega Cloud change ticket for Pega log streaming to your Amazon S3 bucket, and provide the following details:
Amazon S3 bucket name
CMK (KMS key) ARN
The name of the environment from which you want to stream your logs to an S3 bucket
Pega configures the log streaming and provides the KMS and bucket policy statements to be configured in the client's AWS account.
Step 3: Set up IAM Policy and Role
To enable Pega to access your S3 bucket, Pega Cloud creates the IAM role necessary to allow streaming and sends you its ARN, which has the following name format:
<client>-delivery-stream-role ARN
To grant this role access to your bucket:
Sign in to your Amazon S3 console.
Select the bucket to which you want to add the Amazon S3 log streaming service.
Click Permissions, and add the following statement to your bucket policy, substituting the actual ARN that you received from the Pega Cloud team for <<client>-delivery-stream-role ARN> and your bucket's ARN for <<client>-bucket ARN>:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PegaS3Writes",
      "Effect": "Allow",
      "Principal": {
        "AWS": "<<client>-delivery-stream-role ARN>"
      },
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "<<client>-bucket ARN>",
        "<<client>-bucket ARN>/*"
      ]
    }
  ]
}
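Rather than editing the policy JSON by hand, the statement above can be generated with the placeholders filled in and then pasted into the bucket policy editor. A sketch; the role and bucket ARNs in the example call are hypothetical:

```python
import json

def build_bucket_policy(delivery_stream_role_arn: str, bucket_arn: str) -> str:
    # Mirrors the bucket policy statement above: allow the Pega
    # delivery-stream role to write objects (and set their ACLs).
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PegaS3Writes",
                "Effect": "Allow",
                "Principal": {"AWS": delivery_stream_role_arn},
                "Action": ["s3:PutObject", "s3:PutObjectAcl"],
                # Both the bucket itself and every object key under it.
                "Resource": [bucket_arn, f"{bucket_arn}/*"],
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Example with hypothetical placeholder ARNs:
print(build_bucket_policy(
    "arn:aws:iam::111122223333:role/acme-delivery-stream-role",
    "arn:aws:s3:::acme-pega-logs",
))
```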
Streaming logs from multiple environments to a single S3 bucket
If you stream logs from multiple environments, the Resource element of your bucket policy must reflect each environment name from which you stream your logs, and the role also needs s3:ListBucket:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PegaS3Writes",
      "Effect": "Allow",
      "Principal": {
        "AWS": "<<client>-delivery-stream-role ARN>"
      },
      "Action": [
        "s3:ListBucket",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "<clientS3bucket>",
        "<clientS3bucket>/*"
      ]
    }
  ]
}
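One way to read "the Resource must reflect each environment name" is to list a per-environment object prefix alongside the bucket-level entry, matching the folder layout described in Step 4. A sketch under that assumption; the bucket ARN and environment names are illustrative:

```python
def multi_env_resources(bucket_arn: str, environments: list) -> list:
    # One bucket-level entry (needed for s3:ListBucket) plus one
    # object-prefix entry per environment folder.
    return [bucket_arn] + [f"{bucket_arn}/{env}/*" for env in environments]

print(multi_env_resources("arn:aws:s3:::acme-pega-logs", ["prod1", "staging1"]))
```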
Configure KMS policy statements
Log in to your AWS KMS console.
In the navigation pane, click Customer managed keys.
Select the S3 CMK ARN.
Select the Key policy tab, and in the key policy editor, add the statements shown below, substituting the actual ARNs that you received from the Pega Cloud team for <<client>-delivery-stream-service-role> (and, if provided, <PEGA_CFN_ROLE_ARN>). Also specify your actual CMK ARN for <<client>-managed-key ARN>.
{
  "Version": "2012-10-17",
  "Id": "key-default-1",
  "Statement": [
    {
      "Sid": "Enable IAM User Permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<CLIENT_AWS_ACCOUNT>:root"
      },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Sid": "Enable Pega S3 KMS Access",
      "Effect": "Allow",
      "Principal": {
        "AWS": "<<client>-delivery-stream-service-role>"
      },
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:ReEncryptFrom",
        "kms:ReEncryptTo",
        "kms:DescribeKey",
        "kms:CreateGrant",
        "kms:GenerateDataKey*"
      ],
      "Resource": "<<client>-managed-key ARN>"
    }
  ]
}
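Because the key policy editor replaces the whole document, it is easy to clobber existing statements when pasting. A small sketch of a safer approach: load the current key policy, append the Pega statement, and paste the merged result back. The role and key ARNs in the example are hypothetical:

```python
import json

def add_pega_kms_statement(key_policy_json: str, role_arn: str,
                           key_arn: str) -> str:
    # Append the "Enable Pega S3 KMS Access" statement shown above to an
    # existing key policy, keeping any statements already present.
    policy = json.loads(key_policy_json)
    policy.setdefault("Statement", []).append({
        "Sid": "Enable Pega S3 KMS Access",
        "Effect": "Allow",
        "Principal": {"AWS": role_arn},
        "Action": [
            "kms:Encrypt", "kms:Decrypt", "kms:ReEncryptFrom",
            "kms:ReEncryptTo", "kms:DescribeKey", "kms:CreateGrant",
            "kms:GenerateDataKey*",
        ],
        "Resource": key_arn,
    })
    return json.dumps(policy, indent=2)

# Example: merge into a minimal existing policy (hypothetical ARNs).
existing = json.dumps({"Version": "2012-10-17", "Statement": []})
print(add_pega_kms_statement(
    existing,
    "arn:aws:iam::111122223333:role/acme-delivery-stream-service-role",
    "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
))
```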
Step 4: Configure Pega to Stream Logs to a Folder on the S3 Bucket
Now that the S3 bucket and IAM permissions are set up, it’s time to configure Pega:
Request that the Pega Cloud team set up log streaming using the following folder format, which enables streaming of logs to specific folders:
/{S3-Bucket-Name}/{Pega-Environment-Name}/{Client-App-Environment-Name}/{Log-Type}/{Date}/{Auto-Generated-Filename}
If a duplicate folder name appears during validation for {Pega-Environment-Name} or {Client-App-Environment-Name}, request that only one of them be configured, for example: /{S3-Bucket-Name}/{Pega-Environment-Name}/{Log-Type}/{Date}/{Auto-Generated-Filename}
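Downstream tooling often needs to recover the environment, log type, and date from an object key that follows this layout. A minimal parsing sketch, assuming the full five-segment format above (the example key and its segment values are hypothetical):

```python
def parse_log_key(key: str) -> dict:
    # Split an S3 object key of the form
    # {Pega-Environment-Name}/{Client-App-Environment-Name}/{Log-Type}/{Date}/{Filename}
    # (the bucket name is not part of the object key itself).
    env, app_env, log_type, date, filename = key.split("/", 4)
    return {
        "pega_environment": env,
        "app_environment": app_env,
        "log_type": log_type,
        "date": date,
        "filename": filename,
    }

# Hypothetical example key:
print(parse_log_key("prod1/prod/PegaRULES/2024-06-01/part-0001.log"))
```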
Step 5: Validate the Configuration
Once Pega is configured, it's essential to confirm that the setup works correctly:
Generate log entries in Pega by triggering an event.
Wait a few minutes and log in to your AWS console.
Check if new log files appear in your specified S3 bucket directory.
If you see the logs, you've successfully set up Pega log streaming to your Amazon S3 bucket!
Step 6: Set Up Regular Log Cleaning
To control storage costs, think about implementing a lifecycle policy for your S3 bucket:
Navigate to your S3 bucket in the AWS Management Console.
Click on Management, then Lifecycle rules.
Create a rule specifying how long to keep logs before they are automatically deleted or archived.
This strategy will help optimize your S3 usage without needing manual log management.
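The lifecycle rule from the console steps above can also be expressed as a configuration document and applied via the AWS CLI or SDK. A sketch assuming a simple 90-day expiration; the rule ID and retention period are illustrative choices, not requirements:

```python
import json

# A minimal lifecycle configuration: expire log objects 90 days after
# creation. A document like this could be applied with
# `aws s3api put-bucket-lifecycle-configuration`.
lifecycle = {
    "Rules": [
        {
            "ID": "expire-pega-logs",    # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},    # apply to every object in the bucket
            "Expiration": {"Days": 90},  # delete logs after 90 days
        }
    ]
}
print(json.dumps(lifecycle, indent=2))
```

Scoping the Filter Prefix to a specific environment folder would let different environments keep different retention periods.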
Streaming Pega logs to an Amazon S3 bucket significantly enhances your ability to monitor and analyze applications, and makes the logs available organization-wide for consumption by tools such as Dynatrace or for security audits.
#pegalogstreaming #pegalogamazons3bucket #pegalogstoamazons3bucket #pegalogstreamingtoamazons3bucket #pegasecurity #pegaintegration #pegas3bucketintegration
