Easy ways to migrate petabytes of data to Amazon S3 using AWS DataSync

Organizations often face challenges in migrating vast amounts of data efficiently from their on-premises storage environments to AWS. Planning and moving petabytes of HDFS data is a massive undertaking. In this session, learn simple steps to migrate and ingest data at scale to Amazon S3. We share best practices for building the right architecture on AWS so that AWS DataSync can migrate data in a faster, more secure, and cost-effective manner. We also explain how AWS DataSync does the heavy lifting and helps you ingest data into Amazon S3 with ease.
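As a rough illustration of the workflow the session covers, an HDFS-to-S3 migration with DataSync boils down to defining a source location (the Hadoop cluster, reached through DataSync agents), a destination location (the S3 bucket), and a task linking the two. The sketch below builds the parameter payloads you would pass to boto3's DataSync client (`create_location_hdfs`, `create_location_s3`, `create_task`); every hostname, ARN, and name is a hypothetical placeholder, not from the session itself.

```python
# Hedged sketch of the three DataSync API payloads for an HDFS-to-S3 migration.
# In practice each dict is passed to boto3: e.g.
#   datasync = boto3.client("datasync")
#   resp = datasync.create_location_hdfs(**hdfs_location)
# All values below are hypothetical placeholders.

# Source: the on-premises HDFS cluster, reached via DataSync agents
# deployed close to the Hadoop NameNode.
hdfs_location = {
    "NameNodes": [{"Hostname": "namenode.example.internal", "Port": 8020}],
    "AuthenticationType": "SIMPLE",          # or "KERBEROS" for secured clusters
    "SimpleUser": "hdfs",
    "Subdirectory": "/data/warehouse",       # HDFS path to migrate
    "AgentArns": ["arn:aws:datasync:us-east-1:111122223333:agent/agent-EXAMPLE"],
}

# Destination: the S3 bucket, written through an IAM role that grants
# DataSync access to the bucket.
s3_location = {
    "S3BucketArn": "arn:aws:s3:::example-migration-bucket",
    "Subdirectory": "/hdfs-migration",
    "S3Config": {"BucketAccessRoleArn": "arn:aws:iam::111122223333:role/DataSyncS3Role"},
}

# Task: ties source to destination; options trade verification depth
# against speed, and CHANGED transfers only new or modified files on reruns.
task = {
    "Name": "hdfs-to-s3-petabyte-migration",
    "SourceLocationArn": "<LocationArn from create_location_hdfs>",
    "DestinationLocationArn": "<LocationArn from create_location_s3>",
    "Options": {"VerifyMode": "ONLY_FILES_TRANSFERRED", "TransferMode": "CHANGED"},
}
```

Splitting a petabyte-scale migration into several tasks over distinct subdirectories, each served by multiple agents, is one common way to parallelize the transfer; the session discusses architecture choices like this in more depth.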


Speaker: Ameen Khan, Storage Specialist Solutions Architect, AWS
