f4f8527955
* fix OPTIONS requests on non-existing API GW integrations
* add cloudformation models for API Gateway deployments
* bump version
* add backdoor to return CloudWatch metrics
* Updating implementation coverage
* Updating implementation coverage
* add cloudformation models for API Gateway deployments
* Updating implementation coverage
* Updating implementation coverage
* Implemented get-caller-identity returning real data depending on the access key used.
* bump version
* minor fixes
* fix Number data_type for SQS message attribute
* fix handling of encoding errors
* bump version
* make CF stack queryable before starting to initialize its resources
* bump version
* fix integration_method for API GW method integrations
* fix undefined status in CF FakeStack
* Fix apigateway issues with terraform v0.12.21
* resource_methods -> add handle for "DELETE" method
* integrations -> fix issue that "httpMethod" wasn't included in body request (this value was set as the value from refer method resource)
* bump version
* Fix setting http method for API gateway integrations (#6)
* bump version
* remove duplicate methods
* add storage class to S3 Key when completing multipart upload (#7)
* fix SQS performance issues; bump version
* add pagination to SecretsManager list-secrets (#9)
* fix default parameter groups in RDS
* fix adding S3 metadata headers with names containing dots (#13)
* Updating implementation coverage
* Updating implementation coverage
* add cloudformation models for API Gateway deployments
* Updating implementation coverage
* Updating implementation coverage
* Implemented get-caller-identity returning real data depending on the access key used.
* make CF stack queryable before starting to initialize its resources
* bump version
* remove duplicate methods
* fix adding S3 metadata headers with names containing dots (#13)
* Update amis.json to support EKS AMI mocks (#15)
* fix PascalCase for boolean value in ListMultipartUploads response (#17); fix _get_multi_param to parse nested list/dict query params
* determine non-zero container exit code in Batch API
* support filtering by dimensions in CW get_metric_statistics
* fix storing attributes for ELBv2 Route entities; API GW refactorings for TF tests
* add missing fields for API GW resources
* fix error messages for Route53 (TF-compat)
* various fixes for IAM resources (tf-compat)
* minor fixes for API GW models (tf-compat)
* minor fixes for API GW responses (tf-compat)
* add s3 exception for bucket notification filter rule validation
* change the way RESTErrors generate the response body and content-type header
* fix lint errors and disable "black" syntax enforcement
* remove return type hint in RESTError.get_body
* add RESTError XML template for IAM exceptions
* add support for API GW minimumCompressionSize
* fix casing getting PrivateDnsEnabled API GW attribute
* minor fixes for error responses
* fix escaping special chars for IAM role descriptions (tf-compat)
* minor fixes and tagging support for API GW and ELB v2 (tf-compat)
* Merge branch 'master' into localstack
* add "AlarmRule" attribute to enable support for composite CloudWatch metrics
* fix recursive parsing of complex/nested query params
* bump version
* add API to delete S3 website configurations (#18)
* use dict copy to allow parallelism and avoid concurrent modification exceptions in S3
* fix precondition check for etags in S3 (#19)
* minor fix for user filtering in Cognito
* fix API Gateway error response; avoid returning empty response templates (tf-compat)
* support tags and tracingEnabled attribute for API GW stages
* fix boolean value in S3 encryption response (#20)
* fix connection arn structure
* fix api destination arn structure
* black format
* release 2.0.3.37
* fix s3 exception tests see botocore/parsers.py:1002 where RequestId is removed from parsed
* remove python 2 from build action
* add test failure annotations in build action
* fix events test arn comparisons
* fix s3 encryption response test
* return default value "0" if EC2 availableIpAddressCount is empty
* fix extracting SecurityGroupIds for EC2 VPC endpoints
* support deleting/updating API Gateway DomainNames
* fix(events): Return empty string instead of null when no pattern is specified in EventPattern (tf-compat) (#22)
* fix logic and revert CF changes to get tests running again (#21)
* add support for EC2 customer gateway API (#25)
* add support for EC2 Transit Gateway APIs (#24)
* feat(logs): add `kmsKeyId` into `LogGroup` entity (#23)
* minor change in ELBv2 logic to fix tests
* feat(events): add APIs to describe and delete CloudWatch Events connections (#26)
* add support for EC2 transit gateway route tables (#27)
* pass transit gateway route table ID in Describe API, minor refactoring (#29)
* add support for EC2 Transit Gateway Routes (#28)
* fix region on ACM certificate import (#31)
* add support for EC2 transit gateway attachments (#30)
* add support for EC2 Transit Gateway VPN attachments (#32)
* fix account ID for logs API
* add support for DeleteOrganization API
* feat(events): store raw filter representation for CloudWatch events patterns (tf-compat) (#36)
* feat(events): add support to describe/update/delete CloudWatch API destinations (#35)
* add Cognito UpdateIdentityPool, CW Logs PutResourcePolicy
* feat(events): add support for tags in EventBus API (#38)
* fix parameter validation for Batch compute environments (tf-compat)
* revert merge conflicts in IMPLEMENTATION_COVERAGE.md
* format code using black
* restore original README; re-enable and fix CloudFormation tests
* restore tests and old logic for CF stack parameters from SSM
* parameterize RequestId/RequestID in response messages and revert related test changes
* undo LocalStack-specific adaptations
* minor fix
* Update CodeCov config to reflect removal of Py2
* undo change related to CW metric filtering; add additional test for CW metric statistics with dimensions
* Terraform - Extend whitelist of running tests

Co-authored-by: acsbendi <acsbendi28@gmail.com>
Co-authored-by: Phan Duong <duongpv@outlook.com>
Co-authored-by: Thomas Rausch <thomas@thrau.at>
Co-authored-by: Macwan Nevil <macnev2013@gmail.com>
Co-authored-by: Dominik Schubert <dominik.schubert91@gmail.com>
Co-authored-by: Gonzalo Saad <saad.gonzalo.ale@gmail.com>
Co-authored-by: Mohit Alonja <monty16597@users.noreply.github.com>
Co-authored-by: Miguel Gagliardo <migag9@gmail.com>
Co-authored-by: Bert Blommers <info@bertblommers.nl>
669 lines
22 KiB
Python
import os
import time
from unittest import SkipTest

import boto3
import pytest
import sure  # noqa
from botocore.exceptions import ClientError

from moto import mock_logs, settings

_logs_region = "us-east-1" if settings.TEST_SERVER_MODE else "us-west-2"
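# Note: settings.TEST_SERVER_MODE is True when this suite runs against a standalone
# moto server (TEST_SERVER_MODE=true) instead of the in-process mocks; _logs_region
# simply picks a different target region for that case.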
@mock_logs
@pytest.mark.parametrize(
    "kms_key_id",
    [
        "arn:aws:kms:us-east-1:000000000000:key/51d81fab-b138-4bd2-8a09-07fd6d37224d",
        None,
    ],
)
def test_create_log_group(kms_key_id):
    # Given
    conn = boto3.client("logs", "us-west-2")

    create_logs_params = dict(logGroupName="dummy")
    if kms_key_id:
        create_logs_params["kmsKeyId"] = kms_key_id

    # When
    response = conn.create_log_group(**create_logs_params)
    response = conn.describe_log_groups()

    # Then
    response["logGroups"].should.have.length_of(1)

    log_group = response["logGroups"][0]
    log_group.should_not.have.key("retentionInDays")

    if kms_key_id:
        log_group.should.have.key("kmsKeyId")
        log_group["kmsKeyId"].should.equal(kms_key_id)


@mock_logs
def test_exceptions():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    log_stream_name = "dummy-stream"
    conn.create_log_group(logGroupName=log_group_name)
    with pytest.raises(ClientError):
        conn.create_log_group(logGroupName=log_group_name)

    # describe_log_groups is not implemented yet

    conn.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    with pytest.raises(ClientError):
        conn.create_log_stream(
            logGroupName=log_group_name, logStreamName=log_stream_name
        )

    conn.put_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        logEvents=[{"timestamp": 0, "message": "line"}],
    )

    with pytest.raises(ClientError) as ex:
        conn.put_log_events(
            logGroupName=log_group_name,
            logStreamName="invalid-stream",
            logEvents=[{"timestamp": 0, "message": "line"}],
        )
    error = ex.value.response["Error"]
    error["Code"].should.equal("ResourceNotFoundException")
    error["Message"].should.equal("The specified log stream does not exist.")


@mock_logs
def test_put_logs():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    log_stream_name = "stream"
    conn.create_log_group(logGroupName=log_group_name)
    conn.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    messages = [
        {"timestamp": 0, "message": "hello"},
        {"timestamp": 0, "message": "world"},
    ]
    putRes = conn.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=messages
    )
    res = conn.get_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name
    )
    events = res["events"]
    nextSequenceToken = putRes["nextSequenceToken"]
    assert isinstance(nextSequenceToken, str)
    assert len(nextSequenceToken) == 56
    events.should.have.length_of(2)
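# put_log_events hands back a nextSequenceToken; as asserted above, the mock issues a
# fixed-width 56-character token, mirroring the shape of real CloudWatch Logs
# sequence tokens.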
@mock_logs
def test_filter_logs_interleaved():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    log_stream_name = "stream"
    conn.create_log_group(logGroupName=log_group_name)
    conn.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    messages = [
        {"timestamp": 0, "message": "hello"},
        {"timestamp": 0, "message": "world"},
    ]
    conn.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=messages
    )
    res = conn.filter_log_events(
        logGroupName=log_group_name, logStreamNames=[log_stream_name], interleaved=True
    )
    events = res["events"]
    for original_message, resulting_event in zip(messages, events):
        resulting_event["eventId"].should.equal(str(resulting_event["eventId"]))
        resulting_event["timestamp"].should.equal(original_message["timestamp"])
        resulting_event["message"].should.equal(original_message["message"])


@mock_logs
def test_filter_logs_raises_if_filter_pattern():
    if os.environ.get("TEST_SERVER_MODE", "false").lower() == "true":
        raise SkipTest("Does not work in server mode due to error in Werkzeug")
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    log_stream_name = "stream"
    conn.create_log_group(logGroupName=log_group_name)
    conn.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    messages = [
        {"timestamp": 0, "message": "hello"},
        {"timestamp": 0, "message": "world"},
    ]
    conn.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=messages
    )
    with pytest.raises(NotImplementedError):
        conn.filter_log_events(
            logGroupName=log_group_name,
            logStreamNames=[log_stream_name],
            filterPattern='{$.message = "hello"}',
        )


@mock_logs
def test_filter_logs_paging():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "/aws/dummy"
    log_stream_name = "stream/stage"
    conn.create_log_group(logGroupName=log_group_name)
    conn.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    timestamp = int(time.time())
    messages = []
    for i in range(25):
        messages.append(
            {"message": "Message number {}".format(i), "timestamp": timestamp}
        )
        timestamp += 100

    conn.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=messages
    )
    res = conn.filter_log_events(
        logGroupName=log_group_name, logStreamNames=[log_stream_name], limit=20
    )
    events = res["events"]
    events.should.have.length_of(20)
    res["nextToken"].should.equal("/aws/dummy@stream/stage@" + events[-1]["eventId"])

    res = conn.filter_log_events(
        logGroupName=log_group_name,
        logStreamNames=[log_stream_name],
        limit=20,
        nextToken=res["nextToken"],
    )
    events += res["events"]
    events.should.have.length_of(25)
    res.should_not.have.key("nextToken")

    for original_message, resulting_event in zip(messages, events):
        resulting_event["eventId"].should.equal(str(resulting_event["eventId"]))
        resulting_event["timestamp"].should.equal(original_message["timestamp"])
        resulting_event["message"].should.equal(original_message["message"])

    res = conn.filter_log_events(
        logGroupName=log_group_name,
        logStreamNames=[log_stream_name],
        limit=20,
        nextToken="invalid-token",
    )
    res["events"].should.have.length_of(0)
    res.should_not.have.key("nextToken")

    res = conn.filter_log_events(
        logGroupName=log_group_name,
        logStreamNames=[log_stream_name],
        limit=20,
        nextToken="wrong-group@stream@999",
    )
    res["events"].should.have.length_of(0)
    res.should_not.have.key("nextToken")
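# The assertions above pin the mock's filter_log_events pagination token to the
# "<logGroupName>@<logStreamName>@<eventId>" shape. The helper below is a minimal
# sketch (not used by the tests) of how a caller might drain all pages, assuming the
# usual contract that "nextToken" disappears from the response on the last page.
def _collect_filtered_events(client, group_name, stream_names, page_size=20):
    events, token = [], None
    while True:
        kwargs = dict(
            logGroupName=group_name, logStreamNames=stream_names, limit=page_size
        )
        if token:
            kwargs["nextToken"] = token
        resp = client.filter_log_events(**kwargs)
        events += resp["events"]
        token = resp.get("nextToken")
        if not token:
            return events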
@mock_logs
def test_put_retention_policy():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    response = conn.create_log_group(logGroupName=log_group_name)

    response = conn.put_retention_policy(logGroupName=log_group_name, retentionInDays=7)

    response = conn.describe_log_groups(logGroupNamePrefix=log_group_name)
    assert len(response["logGroups"]) == 1
    assert response["logGroups"][0].get("retentionInDays") == 7

    response = conn.delete_log_group(logGroupName=log_group_name)


@mock_logs
def test_delete_retention_policy():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    response = conn.create_log_group(logGroupName=log_group_name)

    response = conn.put_retention_policy(logGroupName=log_group_name, retentionInDays=7)

    response = conn.describe_log_groups(logGroupNamePrefix=log_group_name)
    assert len(response["logGroups"]) == 1
    assert response["logGroups"][0].get("retentionInDays") == 7

    response = conn.delete_retention_policy(logGroupName=log_group_name)

    response = conn.describe_log_groups(logGroupNamePrefix=log_group_name)
    assert len(response["logGroups"]) == 1
    assert response["logGroups"][0].get("retentionInDays") is None

    response = conn.delete_log_group(logGroupName=log_group_name)


@mock_logs
def test_get_log_events():
    client = boto3.client("logs", "us-west-2")
    log_group_name = "test"
    log_stream_name = "stream"
    client.create_log_group(logGroupName=log_group_name)
    client.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)

    events = [{"timestamp": x, "message": str(x)} for x in range(20)]

    client.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=events
    )

    resp = client.get_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, limit=10
    )

    resp["events"].should.have.length_of(10)
    for i in range(10):
        resp["events"][i]["timestamp"].should.equal(i + 10)
        resp["events"][i]["message"].should.equal(str(i + 10))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000019"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000010"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextBackwardToken"],
        limit=20,
    )

    resp["events"].should.have.length_of(10)
    for i in range(10):
        resp["events"][i]["timestamp"].should.equal(i)
        resp["events"][i]["message"].should.equal(str(i))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000009"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000000"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextBackwardToken"],
        limit=10,
    )

    resp["events"].should.have.length_of(0)
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000000"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000000"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextForwardToken"],
        limit=1,
    )

    resp["events"].should.have.length_of(1)
    resp["events"][0]["timestamp"].should.equal(1)
    resp["events"][0]["message"].should.equal(str(1))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000001"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000001"
    )


@mock_logs
def test_get_log_events_with_start_from_head():
    client = boto3.client("logs", "us-west-2")
    log_group_name = "test"
    log_stream_name = "stream"
    client.create_log_group(logGroupName=log_group_name)
    client.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)

    events = [{"timestamp": x, "message": str(x)} for x in range(20)]

    client.put_log_events(
        logGroupName=log_group_name, logStreamName=log_stream_name, logEvents=events
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        limit=10,
        startFromHead=True,  # this parameter is only relevant without the usage of nextToken
    )

    resp["events"].should.have.length_of(10)
    for i in range(10):
        resp["events"][i]["timestamp"].should.equal(i)
        resp["events"][i]["message"].should.equal(str(i))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000009"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000000"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextForwardToken"],
        limit=20,
    )

    resp["events"].should.have.length_of(10)
    for i in range(10):
        resp["events"][i]["timestamp"].should.equal(i + 10)
        resp["events"][i]["message"].should.equal(str(i + 10))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000019"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000010"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextForwardToken"],
        limit=10,
    )

    resp["events"].should.have.length_of(0)
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000019"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000019"
    )

    resp = client.get_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        nextToken=resp["nextBackwardToken"],
        limit=1,
    )

    resp["events"].should.have.length_of(1)
    resp["events"][0]["timestamp"].should.equal(18)
    resp["events"][0]["message"].should.equal(str(18))
    resp["nextForwardToken"].should.equal(
        "f/00000000000000000000000000000000000000000000000000000018"
    )
    resp["nextBackwardToken"].should.equal(
        "b/00000000000000000000000000000000000000000000000000000018"
    )
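# A minimal pagination sketch (not part of the original suite): walking a stream from
# the head by feeding nextForwardToken back in until the token stops changing, which
# is how the calls above signal that the end of the stream has been reached. The
# group/stream names below are illustrative only.
@mock_logs
def test_get_log_events_forward_pagination_sketch():
    client = boto3.client("logs", "us-west-2")
    client.create_log_group(logGroupName="sketch")
    client.create_log_stream(logGroupName="sketch", logStreamName="stream")
    client.put_log_events(
        logGroupName="sketch",
        logStreamName="stream",
        logEvents=[{"timestamp": x, "message": str(x)} for x in range(20)],
    )

    resp = client.get_log_events(
        logGroupName="sketch", logStreamName="stream", limit=10, startFromHead=True
    )
    collected = list(resp["events"])
    token = resp["nextForwardToken"]
    while True:
        resp = client.get_log_events(
            logGroupName="sketch", logStreamName="stream", nextToken=token, limit=10
        )
        collected += resp["events"]
        if resp["nextForwardToken"] == token:  # unchanged token means end of stream
            break
        token = resp["nextForwardToken"]

    assert [e["message"] for e in collected] == [str(x) for x in range(20)]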
@mock_logs
def test_get_log_events_errors():
    client = boto3.client("logs", "us-west-2")
    log_group_name = "test"
    log_stream_name = "stream"
    client.create_log_group(logGroupName=log_group_name)
    client.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)

    with pytest.raises(ClientError) as e:
        client.get_log_events(
            logGroupName=log_group_name,
            logStreamName=log_stream_name,
            nextToken="n/00000000000000000000000000000000000000000000000000000000",
        )
    ex = e.value
    ex.operation_name.should.equal("GetLogEvents")
    ex.response["ResponseMetadata"]["HTTPStatusCode"].should.equal(400)
    ex.response["Error"]["Code"].should.equal("InvalidParameterException")
    ex.response["Error"]["Message"].should.contain(
        "The specified nextToken is invalid."
    )

    with pytest.raises(ClientError) as e:
        client.get_log_events(
            logGroupName=log_group_name,
            logStreamName=log_stream_name,
            nextToken="not-existing-token",
        )
    ex = e.value
    ex.operation_name.should.equal("GetLogEvents")
    ex.response["ResponseMetadata"]["HTTPStatusCode"].should.equal(400)
    ex.response["Error"]["Code"].should.equal("InvalidParameterException")
    ex.response["Error"]["Message"].should.contain(
        "The specified nextToken is invalid."
    )


@mock_logs
def test_list_tags_log_group():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    tags = {"tag_key_1": "tag_value_1", "tag_key_2": "tag_value_2"}

    response = conn.create_log_group(logGroupName=log_group_name)
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == {}

    response = conn.delete_log_group(logGroupName=log_group_name)
    response = conn.create_log_group(logGroupName=log_group_name, tags=tags)
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == tags

    response = conn.delete_log_group(logGroupName=log_group_name)


@mock_logs
def test_tag_log_group():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    tags = {"tag_key_1": "tag_value_1"}
    response = conn.create_log_group(logGroupName=log_group_name)

    response = conn.tag_log_group(logGroupName=log_group_name, tags=tags)
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == tags

    tags_with_added_value = {"tag_key_1": "tag_value_1", "tag_key_2": "tag_value_2"}
    response = conn.tag_log_group(
        logGroupName=log_group_name, tags={"tag_key_2": "tag_value_2"}
    )
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == tags_with_added_value

    tags_with_updated_value = {"tag_key_1": "tag_value_XX", "tag_key_2": "tag_value_2"}
    response = conn.tag_log_group(
        logGroupName=log_group_name, tags={"tag_key_1": "tag_value_XX"}
    )
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == tags_with_updated_value

    response = conn.delete_log_group(logGroupName=log_group_name)


@mock_logs
def test_untag_log_group():
    conn = boto3.client("logs", "us-west-2")
    log_group_name = "dummy"
    response = conn.create_log_group(logGroupName=log_group_name)

    tags = {"tag_key_1": "tag_value_1", "tag_key_2": "tag_value_2"}
    response = conn.tag_log_group(logGroupName=log_group_name, tags=tags)
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == tags

    tags_to_remove = ["tag_key_1"]
    remaining_tags = {"tag_key_2": "tag_value_2"}
    response = conn.untag_log_group(logGroupName=log_group_name, tags=tags_to_remove)
    response = conn.list_tags_log_group(logGroupName=log_group_name)
    assert response["tags"] == remaining_tags

    response = conn.delete_log_group(logGroupName=log_group_name)


@mock_logs
def test_describe_subscription_filters():
    # given
    client = boto3.client("logs", "us-east-1")
    log_group_name = "/test"
    client.create_log_group(logGroupName=log_group_name)

    # when
    response = client.describe_subscription_filters(logGroupName=log_group_name)

    # then
    response["subscriptionFilters"].should.have.length_of(0)


@mock_logs
def test_describe_subscription_filters_errors():
    # given
    client = boto3.client("logs", "us-east-1")

    # when
    with pytest.raises(ClientError) as e:
        client.describe_subscription_filters(logGroupName="not-existing-log-group")

    # then
    ex = e.value
    ex.operation_name.should.equal("DescribeSubscriptionFilters")
    ex.response["ResponseMetadata"]["HTTPStatusCode"].should.equal(400)
    ex.response["Error"]["Code"].should.contain("ResourceNotFoundException")
    ex.response["Error"]["Message"].should.equal(
        "The specified log group does not exist"
    )


@mock_logs
def test_describe_log_groups_paging():
    client = boto3.client("logs", "us-east-1")

    group_names = [
        "/aws/lambda/lowercase-dev",
        "/aws/lambda/FileMonitoring",
        "/aws/events/GetMetricData",
        "/aws/lambda/fileAvailable",
    ]

    for name in group_names:
        client.create_log_group(logGroupName=name)

    resp = client.describe_log_groups()
    resp["logGroups"].should.have.length_of(4)
    resp.should_not.have.key("nextToken")

    resp = client.describe_log_groups(limit=2)
    resp["logGroups"].should.have.length_of(2)
    resp["nextToken"].should.equal("/aws/lambda/FileMonitoring")

    resp = client.describe_log_groups(nextToken=resp["nextToken"], limit=1)
    resp["logGroups"].should.have.length_of(1)
    resp["nextToken"].should.equal("/aws/lambda/fileAvailable")

    resp = client.describe_log_groups(nextToken=resp["nextToken"])
    resp["logGroups"].should.have.length_of(1)
    resp["logGroups"][0]["logGroupName"].should.equal("/aws/lambda/lowercase-dev")
    resp.should_not.have.key("nextToken")

    resp = client.describe_log_groups(nextToken="invalid-token")
    resp["logGroups"].should.have.length_of(0)
    resp.should_not.have.key("nextToken")
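# The paging assertions above rely on the mock listing groups in case-sensitive name
# order and using the last returned logGroupName as the nextToken, so callers can keep
# requesting pages until "nextToken" is absent, as sketched for filter_log_events
# further up.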
@mock_logs
def test_describe_log_streams_paging():
    client = boto3.client("logs", "us-east-1")

    log_group_name = "/aws/codebuild/lowercase-dev"
    stream_names = [
        "job/214/stage/unit_tests/foo",
        "job/215/stage/unit_tests/spam",
        "job/215/stage/e2e_tests/eggs",
        "job/216/stage/unit_tests/eggs",
    ]

    client.create_log_group(logGroupName=log_group_name)
    for name in stream_names:
        client.create_log_stream(logGroupName=log_group_name, logStreamName=name)

    resp = client.describe_log_streams(logGroupName=log_group_name)
    resp["logStreams"].should.have.length_of(4)
    resp["logStreams"][0]["arn"].should.contain(log_group_name)
    resp.should_not.have.key("nextToken")

    resp = client.describe_log_streams(logGroupName=log_group_name, limit=2)
    resp["logStreams"].should.have.length_of(2)
    resp["logStreams"][0]["arn"].should.contain(log_group_name)
    resp["nextToken"].should.equal(
        u"{}@{}".format(log_group_name, resp["logStreams"][1]["logStreamName"])
    )

    resp = client.describe_log_streams(
        logGroupName=log_group_name, nextToken=resp["nextToken"], limit=1
    )
    resp["logStreams"].should.have.length_of(1)
    resp["logStreams"][0]["arn"].should.contain(log_group_name)
    resp["nextToken"].should.equal(
        u"{}@{}".format(log_group_name, resp["logStreams"][0]["logStreamName"])
    )

    resp = client.describe_log_streams(
        logGroupName=log_group_name, nextToken=resp["nextToken"]
    )
    resp["logStreams"].should.have.length_of(1)
    resp["logStreams"][0]["arn"].should.contain(log_group_name)
    resp.should_not.have.key("nextToken")

    resp = client.describe_log_streams(
        logGroupName=log_group_name, nextToken="invalid-token"
    )
    resp["logStreams"].should.have.length_of(0)
    resp.should_not.have.key("nextToken")

    resp = client.describe_log_streams(
        logGroupName=log_group_name, nextToken="invalid@token"
    )
    resp["logStreams"].should.have.length_of(0)
    resp.should_not.have.key("nextToken")


@mock_logs
def test_start_query():
    client = boto3.client("logs", "us-east-1")

    log_group_name = "/aws/codebuild/lowercase-dev"
    client.create_log_group(logGroupName=log_group_name)

    response = client.start_query(
        logGroupName=log_group_name,
        startTime=int(time.time()),
        endTime=int(time.time()) + 300,
        queryString="test",
    )

    assert "queryId" in response

    with pytest.raises(ClientError) as e:
        client.start_query(
            logGroupName="/aws/codebuild/lowercase-dev-invalid",
            startTime=int(time.time()),
            endTime=int(time.time()) + 300,
            queryString="test",
        )

    # then
    ex = e.value
    ex.response["Error"]["Code"].should.contain("ResourceNotFoundException")
    ex.response["Error"]["Message"].should.equal(
        "The specified log group does not exist"
    )