moto/tests/test_datasync/test_datasync.py

import boto3
import pytest
from botocore.exceptions import ClientError

from moto import mock_datasync


def create_locations(client, create_smb=False, create_s3=False):
    """
    Convenience function for creating locations.
    Locations must exist before tasks can be created.
    """
    smb_arn = None
    s3_arn = None
    if create_smb:
        response = client.create_location_smb(
            ServerHostname="host",
            Subdirectory="somewhere",
            User="",
            Password="",
            AgentArns=["stuff"],
        )
        smb_arn = response["LocationArn"]
    if create_s3:
        response = client.create_location_s3(
            S3BucketArn="arn:aws:s3:::my_bucket",
            Subdirectory="dir",
            S3Config={"BucketAccessRoleArn": "role"},
        )
        s3_arn = response["LocationArn"]
    return {"smb_arn": smb_arn, "s3_arn": s3_arn}
@mock_datasync
def test_create_location_smb():
    client = boto3.client("datasync", region_name="us-east-1")
    response = client.create_location_smb(
        ServerHostname="host",
        Subdirectory="somewhere",
        User="",
        Password="",
        AgentArns=["stuff"],
    )
    assert "LocationArn" in response


@mock_datasync
def test_describe_location_smb():
    client = boto3.client("datasync", region_name="us-east-1")
    agent_arns = ["stuff"]
    user = "user"
    response = client.create_location_smb(
        ServerHostname="host",
        Subdirectory="somewhere",
        User=user,
        Password="",
        AgentArns=agent_arns,
    )
    response = client.describe_location_smb(LocationArn=response["LocationArn"])
    assert "LocationArn" in response
    assert "LocationUri" in response
    assert response["User"] == user
    assert response["AgentArns"] == agent_arns
@mock_datasync
def test_create_location_s3():
    client = boto3.client("datasync", region_name="us-east-1")
    response = client.create_location_s3(
        S3BucketArn="arn:aws:s3:::my_bucket",
        Subdirectory="dir",
        S3Config={"BucketAccessRoleArn": "role"},
    )
    assert "LocationArn" in response


@mock_datasync
def test_describe_location_s3():
    client = boto3.client("datasync", region_name="us-east-1")
    s3_config = {"BucketAccessRoleArn": "role"}
    response = client.create_location_s3(
        S3BucketArn="arn:aws:s3:::my_bucket", Subdirectory="dir", S3Config=s3_config
    )
    response = client.describe_location_s3(LocationArn=response["LocationArn"])
    assert "LocationArn" in response
    assert "LocationUri" in response
    assert response["S3Config"] == s3_config
@mock_datasync
def test_describe_location_wrong():
    client = boto3.client("datasync", region_name="us-east-1")
    agent_arns = ["stuff"]
    user = "user"
    response = client.create_location_smb(
        ServerHostname="host",
        Subdirectory="somewhere",
        User=user,
        Password="",
        AgentArns=agent_arns,
    )
    with pytest.raises(ClientError):
        client.describe_location_s3(LocationArn=response["LocationArn"])
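

# list_locations reflects every location created so far, in creation order.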
@mock_datasync
def test_list_locations():
    client = boto3.client("datasync", region_name="us-east-1")
    response = client.list_locations()
    assert len(response["Locations"]) == 0
    create_locations(client, create_smb=True)
    response = client.list_locations()
    assert len(response["Locations"]) == 1
    assert response["Locations"][0]["LocationUri"] == "smb://host/somewhere"
    create_locations(client, create_s3=True)
    response = client.list_locations()
    assert len(response["Locations"]) == 2
    assert response["Locations"][1]["LocationUri"] == "s3://my_bucket/dir"
    create_locations(client, create_s3=True)
    response = client.list_locations()
    assert len(response["Locations"]) == 3
    assert response["Locations"][2]["LocationUri"] == "s3://my_bucket/dir"


@mock_datasync
def test_delete_location():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_smb=True)
    response = client.list_locations()
    assert len(response["Locations"]) == 1
    location_arn = locations["smb_arn"]
    response = client.delete_location(LocationArn=location_arn)
    response = client.list_locations()
    assert len(response["Locations"]) == 0
    with pytest.raises(ClientError):
        client.delete_location(LocationArn=location_arn)
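

# Tasks can only be created once both source and destination locations exist.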
@mock_datasync
def test_create_task():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_smb=True, create_s3=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
    )
    assert "TaskArn" in response


@mock_datasync
def test_create_task_fail():
    """Test that Locations must exist before a Task can be created."""
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_smb=True, create_s3=True)
    with pytest.raises(ClientError):
        client.create_task(
            SourceLocationArn="1", DestinationLocationArn=locations["s3_arn"]
        )
    with pytest.raises(ClientError):
        client.create_task(
            SourceLocationArn=locations["smb_arn"], DestinationLocationArn="2"
        )
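

# list_tasks returns all tasks; Name is only present when it was set at creation.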
@mock_datasync
def test_list_tasks():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
    )
    response = client.create_task(
        SourceLocationArn=locations["s3_arn"],
        DestinationLocationArn=locations["smb_arn"],
        Name="task_name",
    )
    response = client.list_tasks()
    tasks = response["Tasks"]
    assert len(tasks) == 2
    task = tasks[0]
    assert task["Status"] == "AVAILABLE"
    assert "Name" not in task
    task = tasks[1]
    assert task["Status"] == "AVAILABLE"
    assert task["Name"] == "task_name"


@mock_datasync
def test_describe_task():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    task_arn = response["TaskArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert "TaskArn" in response
    assert "Status" in response
    assert "SourceLocationArn" in response
    assert "DestinationLocationArn" in response
@mock_datasync
def test_describe_task_not_exist():
    client = boto3.client("datasync", region_name="us-east-1")
    with pytest.raises(ClientError):
        client.describe_task(TaskArn="abc")


@mock_datasync
def test_update_task():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    initial_name = "Initial_Name"
    updated_name = "Updated_Name"
    initial_options = {
        "VerifyMode": "NONE",
        "Atime": "BEST_EFFORT",
        "Mtime": "PRESERVE",
    }
    updated_options = {
        "VerifyMode": "POINT_IN_TIME_CONSISTENT",
        "Atime": "BEST_EFFORT",
        "Mtime": "PRESERVE",
    }
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name=initial_name,
        Options=initial_options,
    )
    task_arn = response["TaskArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert response["TaskArn"] == task_arn
    assert response["Name"] == initial_name
    assert response["Options"] == initial_options
    response = client.update_task(
        TaskArn=task_arn, Name=updated_name, Options=updated_options
    )
    response = client.describe_task(TaskArn=task_arn)
    assert response["TaskArn"] == task_arn
    assert response["Name"] == updated_name
    assert response["Options"] == updated_options
    with pytest.raises(ClientError):
        client.update_task(TaskArn="doesnt_exist")
@mock_datasync
def test_delete_task():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    response = client.list_tasks()
    assert len(response["Tasks"]) == 1
    task_arn = response["Tasks"][0]["TaskArn"]
    assert task_arn is not None
    response = client.delete_task(TaskArn=task_arn)
    response = client.list_tasks()
    assert len(response["Tasks"]) == 0
    with pytest.raises(ClientError):
        client.delete_task(TaskArn=task_arn)
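

# Starting a task execution records it on the task as CurrentTaskExecutionArn;
# a second start while one is running should fail.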
@mock_datasync
def test_start_task_execution():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    task_arn = response["TaskArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert "CurrentTaskExecutionArn" not in response
    response = client.start_task_execution(TaskArn=task_arn)
    assert "TaskExecutionArn" in response
    task_execution_arn = response["TaskExecutionArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert response["CurrentTaskExecutionArn"] == task_execution_arn


@mock_datasync
def test_start_task_execution_twice():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    task_arn = response["TaskArn"]
    response = client.start_task_execution(TaskArn=task_arn)
    assert "TaskExecutionArn" in response
    task_execution_arn = response["TaskExecutionArn"]
    with pytest.raises(ClientError):
        client.start_task_execution(TaskArn=task_arn)
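

# Each describe_task_execution call advances the mocked execution through its
# status lifecycle, while the owning task reports RUNNING until it completes.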
@mock_datasync
def test_describe_task_execution():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    task_arn = response["TaskArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "AVAILABLE"
    response = client.start_task_execution(TaskArn=task_arn)
    task_execution_arn = response["TaskExecutionArn"]

    # Each time the task execution is described, its Status advances to the next
    # state. This is a simple way to simulate a task being executed.
    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "INITIALIZING"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "RUNNING"

    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "PREPARING"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "RUNNING"

    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "TRANSFERRING"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "RUNNING"

    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "VERIFYING"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "RUNNING"

    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "SUCCESS"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "AVAILABLE"

    # Once SUCCESS is reached, the status stays there on subsequent describes.
    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["TaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "SUCCESS"
    response = client.describe_task(TaskArn=task_arn)
    assert response["Status"] == "AVAILABLE"
@mock_datasync
def test_describe_task_execution_not_exist():
    client = boto3.client("datasync", region_name="us-east-1")
    with pytest.raises(ClientError):
        client.describe_task_execution(TaskExecutionArn="abc")


@mock_datasync
def test_cancel_task_execution():
    client = boto3.client("datasync", region_name="us-east-1")
    locations = create_locations(client, create_s3=True, create_smb=True)
    response = client.create_task(
        SourceLocationArn=locations["smb_arn"],
        DestinationLocationArn=locations["s3_arn"],
        Name="task_name",
    )
    task_arn = response["TaskArn"]
    response = client.start_task_execution(TaskArn=task_arn)
    task_execution_arn = response["TaskExecutionArn"]
    response = client.describe_task(TaskArn=task_arn)
    assert response["CurrentTaskExecutionArn"] == task_execution_arn
    assert response["Status"] == "RUNNING"
    response = client.cancel_task_execution(TaskExecutionArn=task_execution_arn)
    response = client.describe_task(TaskArn=task_arn)
    assert "CurrentTaskExecutionArn" not in response
    assert response["Status"] == "AVAILABLE"
    response = client.describe_task_execution(TaskExecutionArn=task_execution_arn)
    assert response["Status"] == "ERROR"