Source Systems, Data Ingestion, and Pipelines

DeepLearning.AI via Coursera

Go to Course: https://www.coursera.org/learn/source-systems-data-ingestion-and-pipelines

Introduction

Gather stakeholder needs and translate them into system requirements.

Implement a batch and a streaming ingestion process on AWS to ingest data from various source systems.

Integrate aspects of security, data management, DataOps, and orchestration into the data systems you build.

Syllabus

Working with Source Systems

In lesson 1, you will explore source systems data engineers commonly interact with. Then in lesson 2, you will learn how to connect to various source systems and troubleshoot common connectivity issues.
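Troubleshooting connectivity, as covered in lesson 2, often comes down to retrying transient failures with backoff. A minimal, hypothetical sketch of that pattern, using the standard-library sqlite3 module as a stand-in for a real source database (the function name and retry parameters are illustrative, not from the course):

```python
import sqlite3
import time

def connect_with_retry(dsn: str, retries: int = 3, backoff_s: float = 0.1) -> sqlite3.Connection:
    """Open a connection, retrying transient failures with exponential backoff."""
    last_err = None
    for attempt in range(retries):
        try:
            return sqlite3.connect(dsn)
        except sqlite3.OperationalError as err:  # e.g. a locked file or bad path
            last_err = err
            time.sleep(backoff_s * (2 ** attempt))  # wait longer after each failure
    raise ConnectionError(f"could not connect after {retries} attempts") from last_err

# Usage: an in-memory database succeeds on the first attempt.
conn = connect_with_retry(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("INSERT INTO events (payload) VALUES ('hello')")
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])
```

The same shape applies to any driver whose connect call can raise a transient error; only the exception type changes.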

Data Ingestion

This week you will dive deep into the batch and streaming ingestion patterns. You will identify use cases and considerations for each, and then build a batch and a streaming ingestion pipeline. When looking at batch ingestion, you will compare and contrast the ETL and ELT paradigms. You will also explore various AWS services for batch and streaming ingestion.
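The ETL/ELT contrast compared in this module can be sketched in a few lines of plain Python. This is a toy illustration under stated assumptions (the in-memory "warehouse" dicts and the `transform` helper are hypothetical, not any particular AWS service):

```python
# Toy records extracted from a source system.
raw = [{"name": " Ada ", "score": "90"}, {"name": "Grace", "score": "95"}]

def transform(rows):
    # Clean whitespace and type-cast scores.
    return [{"name": r["name"].strip(), "score": int(r["score"])} for r in rows]

# ETL: transform first, then load only the cleaned data into the warehouse.
etl_warehouse = {"clean": transform(raw)}

# ELT: load the raw data as-is, then transform inside the destination.
elt_warehouse = {"raw": list(raw)}
elt_warehouse["clean"] = transform(elt_warehouse["raw"])

# Both paradigms end with the same cleaned table; ELT also keeps the raw copy.
assert etl_warehouse["clean"] == elt_warehouse["clean"]
print(elt_warehouse["clean"][0])  # → {'name': 'Ada', 'score': 90}
```

Keeping the raw copy is the practical draw of ELT: transformations can be re-run later without re-ingesting from the source.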

DataOps

In the first lesson, you will explore DataOps automation practices, including applying CI/CD to both data and code, and using infrastructure as code tools like Terraform to automate the provisioning and management of your resources. Then in lesson 2, you will explore DataOps observability and monitoring practices, including using tools like Great Expectations to monitor data quality, and using Amazon CloudWatch to monitor your infrastructure.
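A data quality check in the spirit of Great Expectations can be sketched in plain Python. Note this hand-rolled helper is a hypothetical illustration of the idea, not the library's actual API:

```python
def expect_column_values_not_null(rows, column):
    """Return a result dict in the style of a data-quality expectation."""
    failed = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"success": not failed, "column": column, "failed_rows": failed}

# A small batch with one bad record.
batch = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]

result = expect_column_values_not_null(batch, "email")
print(result)  # success is False: the row at index 1 has a null email
```

In a real DataOps setup, a failing result like this would halt the pipeline or raise an alert rather than just print.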

Orchestration, Monitoring, and Automating Your Data Pipelines

This week, you will learn all about orchestrating your data pipeline tasks. You'll survey various orchestration tools, with a focus on Airflow, one of the most popular and widely used tools in the field today. You'll explore the core components of Airflow, the Airflow UI, and how to create and manage DAGs using various Airflow features.
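A minimal DAG definition of the kind built in this module might look like the following sketch. It assumes Airflow 2.x with the `PythonOperator`; the DAG id, task names, and schedule are illustrative, not taken from the course labs:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull data from the source system.
    print("extracting")

def load():
    # Write the ingested data to the destination.
    print("loading")

with DAG(
    dag_id="example_ingestion",       # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,                    # skip backfilling past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load starts
```

The `>>` operator sets the dependency edge between tasks; the Airflow scheduler then runs each task only after its upstream tasks succeed.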

Overview

In this course, you will explore various types of source systems, learn how they generate and update data, and troubleshoot common issues you might encounter when trying to connect to these systems in the real world. You’ll dive into the details of common ingestion patterns and implement batch and streaming pipelines. You’ll automate and orchestrate your data pipelines using infrastructure as code and pipelines as code tools. You’ll also explore AWS and open source tools for monitoring your data pipelines.

Reviews

Great learning and exercises. Appreciate the community.

Really valuable, and I got an idea of data-related concepts and infrastructure management.

All concepts related to these topics are explained clearly.

Excellent course, with up to date technology, interesting labs and challenging quizzes. Highly recommended.