Amazon ECS supports a native integration with Amazon EBS volumes for data-intensive workloads

Today we’re announcing that Amazon Elastic Container Service (Amazon ECS) supports an integration with Amazon Elastic Block Store (Amazon EBS), making it easier to run a wider range of data processing workloads. You can provision Amazon EBS storage for your ECS tasks running on AWS Fargate and Amazon Elastic Compute Cloud (Amazon EC2) without needing to manage storage or compute.

Many organizations choose to deploy their applications as containerized packages, and with the introduction of Amazon ECS integration with Amazon EBS, organizations can now run more types of workloads than before.

You can run data workloads requiring storage that supports high transaction volumes and throughput, such as extract, transform, and load (ETL) jobs for big data, which need to fetch existing data, perform processing, and store this processed data for downstream use. Because the storage lifecycle is fully managed by Amazon ECS, you don’t need to build any additional scaffolding to manage infrastructure updates, and as a result, your data processing workloads are now more resilient while simultaneously requiring less effort to manage.

Now you can choose from a variety of storage options for your containerized applications running on Amazon ECS:

  • Your Fargate tasks get 20 GiB of ephemeral storage by default. For applications that need additional storage space to download large container images or for scratch work, you can configure up to 200 GiB of ephemeral storage for your Fargate tasks (a sample task definition snippet follows this list).
  • For applications that span many tasks that need concurrent access to a shared dataset, you can configure Amazon ECS to mount the Amazon Elastic File System (Amazon EFS) file system to your ECS tasks running on both EC2 and Fargate. Common examples of such workloads include web applications such as content management systems, internal DevOps tools, and machine learning (ML) frameworks. Amazon EFS is designed to be available across a Region and can be concurrently attached to many tasks.
  • For applications that need high-performance, low-cost storage that doesn’t need to be shared across tasks, you can configure Amazon ECS to provision and attach Amazon EBS storage to your tasks running on both Amazon EC2 and Fargate. Amazon EBS is designed to provide block storage with low latency and high performance within an Availability Zone.
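
As a reference for the first option above, here is a minimal sketch of a Fargate task definition that requests more than the default ephemeral storage. The family name, sizes, and container image are illustrative assumptions, not values from this post.

{
    "family": "scratch-heavy-app",
    "requiresCompatibilities": ["FARGATE"],
    "networkMode": "awsvpc",
    "cpu": "1024",
    "memory": "4096",
    "ephemeralStorage": {
        "sizeInGiB": 100
    },
    "containerDefinitions": [
        {
            "name": "app",
            "image": "public.ecr.aws/docker/library/nginx:latest",
            "essential": true
        }
    ]
}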

To learn more, see Using data volumes in Amazon ECS tasks and persistent storage best practices in the AWS documentation.

Getting started with EBS volume integration for your ECS tasks
You can configure the volume mount point for your container in the task definition and pass Amazon EBS storage requirements for your Amazon ECS task at runtime. For most use cases, you can get started by simply providing the size of the volume needed for the task. Optionally, you can configure all EBS volume attributes and the file system you want the volume formatted with.

1. Create a task definition
Go to the Amazon ECS console, navigate to Task definitions, and choose Create new task definition.

In the Storage section, choose Configure at deployment to set the EBS volume as a new configuration type. You can provision and attach one volume per task for Linux file systems.

When you choose Configure at task definition creation, you can configure existing storage options such as bind mounts, Docker volumes, EFS volumes, Amazon FSx for Windows File Server volumes, or Fargate ephemeral storage.

Now you can select a container in the task definition and the source EBS volume, and provide a mount path where the volume will be mounted in the task.

You can also use the $ aws ecs register-task-definition --cli-input-json file://example.json command line to register a task definition to add an EBS volume. The following snippet is a sample, and task definitions are saved in JSON format.

{
    "family": "nginx",
    ...
    "containerDefinitions": [
        {
            ...
            "mountPoints": [
                {
                    "containerPath": "/foo",
                    "sourceVolume": "new-ebs-volume"
                }
            ],
            "name": "nginx",
            "image": "nginx"
        }
    ],
    "volumes": [
       {
           "name": "new-ebs-volume",
           "configuredAtRuntime": true
       }
    ]
}

2. Deploy and run your task with EBS volume
Go to your ECS cluster and choose Run new task. Note that you can select the compute options, the launch type, and your task definition.

Note: While this example goes through deploying a standalone task with an attached EBS volume, you can also configure a new or existing ECS service to use EBS volumes with the desired configuration.

You have a new Volume section where you can configure the additional storage. The volume name, type, and mount points are those that you defined in your task definition. Choose your EBS volume types, sizes (GiB), IOPS, and the desired throughput.

You can’t attach an existing EBS volume to an ECS task. But if you want to create a volume from an existing snapshot, you have the option to choose your snapshot ID. If you want to create a new volume, then you can leave this field empty. You can choose the file system type: ext3, ext4, or xfs file systems on Linux.

By default, when a task is terminated, Amazon ECS deletes the attached volume. If you need the data in the EBS volume to be retained after the task exits, uncheck Delete on termination. Also, you need to create an AWS Identity and Access Management (IAM) role for volume management that contains the relevant permissions to allow Amazon ECS to make API calls on your behalf. For more information on this policy, see infrastructure role in the AWS documentation.
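
As a rough sketch, such an infrastructure role could be created with the AWS CLI along the following lines. The role name is an assumption, and the managed policy shown (AmazonECSInfrastructureRolePolicyForVolumes) should be confirmed against the infrastructure role documentation referenced above.

# Create an IAM role that Amazon ECS can assume to manage EBS volumes on your behalf
$ aws iam create-role \
    --role-name ecsInfrastructureRole \
    --assume-role-policy-document '{
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": { "Service": "ecs.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }]
    }'

# Attach the AWS managed policy that grants the required EBS permissions
$ aws iam attach-role-policy \
    --role-name ecsInfrastructureRole \
    --policy-arn arn:aws:iam::aws:policy/service-role/AmazonECSInfrastructureRolePolicyForVolumes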

You can also configure encryption by default on your EBS volumes using either Amazon managed keys or customer managed keys. To learn more about the options, see Amazon EBS encryption in the AWS documentation.

After configuring all task settings, choose Create to start your task.
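
If you prefer the AWS CLI to the console, the same EBS settings are passed as a volume configuration when the task is run. The following is a minimal sketch only: the cluster name, subnet, security group, account ID, and role ARN are placeholders, and the volume name must match the configuredAtRuntime volume from the task definition above.

$ aws ecs run-task \
    --cluster my-cluster \
    --task-definition nginx \
    --launch-type FARGATE \
    --network-configuration 'awsvpcConfiguration={subnets=[subnet-0123456789abcdef0],securityGroups=[sg-0123456789abcdef0],assignPublicIp=ENABLED}' \
    --volume-configurations '[
        {
            "name": "new-ebs-volume",
            "managedEBSVolume": {
                "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole",
                "volumeType": "gp3",
                "sizeInGiB": 100,
                "filesystemType": "ext4",
                "encrypted": true,
                "terminationPolicy": { "deleteOnTermination": true }
            }
        }
    ]'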

3. Review the EBS volume details for your task
Once your task has started, you can see the volume information on the task details page. Choose a task and select the Volumes tab to find your created EBS volume details.
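
You can check the same from the AWS CLI by describing the task. Assuming, as a sketch, that the managed EBS volume surfaces in the task’s attachments list, a query such as the following would show it (the cluster name and task ARN are placeholders).

$ aws ecs describe-tasks \
    --cluster my-cluster \
    --tasks arn:aws:ecs:us-east-1:111122223333:task/my-cluster/0123456789abcdef0 \
    --query 'tasks[0].attachments'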

Your team can organize the development and operations of EBS volumes more efficiently. For example, application developers can configure the path where your application expects storage to be available in the task definition, and DevOps engineers can configure the actual EBS volume attributes at runtime when the application is deployed.

This allows DevOps engineers to deploy the same task definition to different environments with differing EBS volume configurations, for example, gp3 volumes in the development environments and io2 volumes in production.
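
As an illustrative sketch (the sizes, IOPS, and role ARN below are assumptions), only the runtime volume configuration fragment would change between the two environments:

Development (gp3):
    "managedEBSVolume": { "volumeType": "gp3", "sizeInGiB": 50, "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole" }

Production (io2):
    "managedEBSVolume": { "volumeType": "io2", "sizeInGiB": 200, "iops": 16000, "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole" }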

Now available
Amazon ECS integration with Amazon EBS is available in 9 AWS Regions: US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), and Europe (Stockholm). You only pay for what you use, including EBS volumes and snapshots. To learn more, see the Amazon EBS pricing page and Amazon EBS volumes in ECS in the AWS documentation.

Give it a try now and send feedback to our public roadmap, AWS re:Post for Amazon ECS, or through your usual AWS Support contacts.

Channy

P.S. Special thanks to Maish Saidel-Keesing, a senior enterprise developer advocate at AWS, for his contribution in writing this blog post.

A correction was made on January 12, 2024: An earlier version of this post misstated some details. We changed 1) “either ext3 or ext4” to “ext3, ext4, or xfs”, 2) “check Delete on termination” to “uncheck Delete on termination”, 3) “configure encryption” to “configure encryption by default”, and 4) “task definition details page” to “task details page”.


