Stream logs to Elasticsearch using the AWS CLI

Posted 2020-03-30 02:05

Question:

I would like to enable the Stream to Amazon Elasticsearch Service option, i.e. stream logs from CloudWatch to Elasticsearch.

I'm familiar with how to do that manually; I'm looking for a way to achieve it by running AWS CLI commands.

Assuming Elasticsearch is already configured, is there any way to automate the process?

Answer 1:

Update:

If you are using CloudFormation, take a look at my answer here.


Many thanks to @Adiii, who pointed me in the right direction. Below is the end-to-end solution for this issue.

The solution includes the following parts:

  • create-lambda-role
  • grant-permissions-to-lambda-role
  • create-lambda
  • grant-cloudwatch-permission-to-execute-lambda
  • add-subscription-to-cloudwatch-log-group

I assume that the Lambda function is already packaged and accessible. You can find the Lambda function here.

Update var endpoint = ${Elasticsearch_Endpoint}; in index.js with your Elasticsearch URL, e.g. search-xxx-yyyy.eu-west-1.es.amazonaws.com.
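
If you still need to package it, here is a minimal sketch, assuming the function source is index.js with its dependencies in node_modules (adjust the file names so they match the --handler value used in step 3):

    # Sanity check that the endpoint was updated:
    grep "var endpoint" index.js
    # Package the function so it can be passed to --zip-file in step 3:
    zip -qr ${LAMBDA_NAME}.zip index.js node_modules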

The manual steps are described here.
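
The steps below assume a handful of shell variables are already set. A minimal sketch, using placeholder values that you should replace with your own account, region, VPC and domain details:

    # Placeholder values (assumptions) - adjust to your environment.
    PROFILE=default
    AWS_REGION=eu-west-1
    AWS_ACCOUNT_ID=123456789012
    ES_DOMAIN=my-es-domain
    ROLE_NAME=lambda-to-es-role
    LAMBDA_NAME=LogsToElasticsearch
    EKS_CLUSTER=my-eks-cluster
    SUBNET_IDS=subnet-aaaa1111,subnet-bbbb2222
    SECURITY_GROUP_IDS=sg-cccc3333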

1. create-lambda-role

First, we need to create the role that the Lambda function will use; later, we will attach the relevant policies to that role.

    cat > lambda-policy.json << EOF
{
  "Version": "2012-10-17",
  "Statement": {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
  }
} 
EOF

    aws iam create-role \
    --role-name ${ROLE_NAME} \
    --assume-role-policy-document file://lambda-policy.json \
    --profile ${PROFILE} \
    >/dev/null
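
Optionally, confirm the role exists before moving on:

    aws iam get-role \
    --role-name ${ROLE_NAME} \
    --profile ${PROFILE} \
    --query 'Role.Arn' \
    --output text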

2. grant-permissions-to-lambda-role

Attach the relevant policies to the Lambda role.

    cat > lambda-to-es-via-vpc-policy.json << EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1569580358341",
      "Action": "es:*",
      "Effect": "Allow",
      "Resource": "arn:aws:es:${AWS_REGION}:${AWS_ACCOUNT_ID}:domain/${ES_DOMAIN}/*"
    },
    {
      "Sid": "Stmt1569580707924",
      "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "ec2:CreateNetworkInterface",
                "ec2:DescribeNetworkInterfaces",
                "ec2:DeleteNetworkInterface"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}
EOF

    aws iam put-role-policy \
    --role-name ${ROLE_NAME} \
    --policy-name lambda-to-es-via-vpc-policy \
    --policy-document file://lambda-to-es-via-vpc-policy.json \
    --profile ${PROFILE} \
    >/dev/null
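
You can verify the inline policy was attached with:

    aws iam get-role-policy \
    --role-name ${ROLE_NAME} \
    --policy-name lambda-to-es-via-vpc-policy \
    --profile ${PROFILE}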

3. create-lambda

    aws lambda create-function \
    --function-name ${LAMBDA_NAME} \
    --runtime nodejs8.10 \
    --role arn:aws:iam::${AWS_ACCOUNT_ID}:role/${ROLE_NAME} \
    --handler ${LAMBDA_NAME}.handler \
    --zip-file fileb://${LAMBDA_NAME}.zip \
    --timeout 30 \
    --vpc-config SubnetIds=${SUBNET_IDS},SecurityGroupIds=${SECURITY_GROUP_IDS} \
    --profile ${PROFILE} \
    >/dev/null
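
To check that the function was created:

    aws lambda get-function \
    --function-name ${LAMBDA_NAME} \
    --profile ${PROFILE} \
    --query 'Configuration.FunctionArn' \
    --output text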

4. grant-cloudwatch-permission-to-execute-lambda

This grants CloudWatch Logs permission to invoke the Lambda function.

    aws lambda add-permission \
    --function-name "${LAMBDA_NAME}" \
    --statement-id "${LAMBDA_NAME}" \
    --principal "logs.${AWS_REGION}.amazonaws.com" \
    --action "lambda:InvokeFunction" \
    --source-arn "arn:aws:logs:${AWS_REGION}:${AWS_ACCOUNT_ID}:log-group:/aws/eks/${EKS_CLUSTER}/cluster:*" \
    --source-account ${AWS_ACCOUNT_ID} \
    --profile ${PROFILE} \
    >/dev/null
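
The resulting resource policy can be inspected with:

    aws lambda get-policy \
    --function-name ${LAMBDA_NAME} \
    --profile ${PROFILE}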

5. add-subscription-to-cloudwatch-log-group

Creates or updates a subscription filter and associates it with the specified log group. Subscription filters allow you to subscribe to a real-time stream of log events, and have them delivered to a specific destination.

    aws logs put-subscription-filter \
    --log-group-name "/aws/eks/${EKS_CLUSTER}/cluster" \
    --filter-name "Common Log Format" \
    --filter-pattern "[host, ident, authuser, date, request, status, bytes]" \
    --destination-arn arn:aws:lambda:${AWS_REGION}:${AWS_ACCOUNT_ID}:function:${LAMBDA_NAME} \
    --profile ${PROFILE} \
    >/dev/null
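
Finally, you can confirm the subscription filter is in place with:

    aws logs describe-subscription-filters \
    --log-group-name "/aws/eks/${EKS_CLUSTER}/cluster" \
    --profile ${PROFILE}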


Answer 2:

Behind the scenes, Stream to Amazon Elasticsearch Service creates a new Lambda function; CloudWatch then pushes the logs to that Lambda function, which forwards them to Elasticsearch (ELK).

--destination-arn

The Amazon Resource Name (ARN) of the Kinesis stream, Kinesis Data Firehose stream, or Lambda function you want to use as the destination of the subscription feed.

So here is the way to push the logs to a Lambda function; the Lambda function will then automatically push the stream to ELK.

aws logs put-subscription-filter --log-group-name log_group_name --filter-pattern "" --filter-name filter_name_demo   --destination-arn arn:aws:lambda:us-west-2:***********:function:your_lambda_name

AmazonCloudWatch-logs-Subscriptions