Build. Collaborate.
Make Impact.
Join the Ultimate 3-Week Workshop to break into Data Engineering
(Starting Apr 15, 2025)

Why Join the Workshop?
Register fast! We’re accepting only ~10 people so we can give everyone personalized guidance

Hands-On Projects
Work on 3 industry projects that give you hands-on exposure and a portfolio boost

Data Eng Roadmap
Just getting started? You’ll get a clear roadmap of which tools to use and how to begin

Project Support
Build your own project using the walkthroughs from live demonstrations

Mentorship
Get guided by experienced professionals throughout the journey.
Agenda – Portfolio Projects

Week 1: Building Data Pipelines with Airflow
✅ Introduction to Airflow and its components (DAGs, Operators, Tasks, Scheduler)
✅ Setting up Airflow locally using Docker or a virtual environment
✅ Exploring the Airflow Web UI to monitor DAGs and tasks
✅ Creating a basic ETL pipeline (Extract, Transform, Load) – see the sketch after this list
✅ Scheduling tasks with Airflow’s cron-like syntax
✅ Implementing error handling and retries for tasks
✅ Monitoring pipeline execution and logs in Airflow
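
For a preview of what the Week 1 pipeline could look like, here’s a minimal sketch using the Airflow TaskFlow API (the schedule argument requires Airflow 2.4+). The task logic, schedule, and names are illustrative placeholders, not the workshop’s exact project.

```python
# Minimal Airflow ETL DAG sketch (TaskFlow API, Airflow 2.4+).
# All names, paths, and values are illustrative placeholders.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="0 6 * * *",  # cron-like syntax: run daily at 06:00
    start_date=datetime(2025, 4, 15),
    catchup=False,
    default_args={
        "retries": 2,  # retry a failed task twice
        "retry_delay": timedelta(minutes=5),
    },
)
def basic_etl():
    @task
    def extract() -> list[dict]:
        # In the workshop this step would pull from an API or database.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # A simple transformation: derive a new column per row.
        return [{**row, "value_doubled": row["value"] * 2} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for loading into a warehouse, database, or file target.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


basic_etl()
```

Once this file is placed in Airflow’s DAGs folder, the run history, task logs, and retries all become visible in the Web UI covered earlier in the week.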

Week 2: Building Data Pipelines with Azure
✅ Introduction to Azure Data Factory (ADF) and Databricks
✅ Setting up Azure Blob Storage and configuring ADF
✅ Connecting ADF to data sources (APIs, databases)
✅ Building an ETL pipeline using ADF to extract, transform, and load data
✅ Setting up and processing data in Azure Databricks – see the sketch after this list
✅ Storing processed data in Azure SQL Database or Blob Storage
✅ Scheduling and monitoring pipeline execution with ADF
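
To give a flavor of the Databricks step, here is a minimal PySpark sketch that transforms data between Blob Storage containers. The storage account, container, and column names are illustrative placeholders, and the authentication setup (account keys or a mount point) is omitted; this is not the workshop’s exact project.

```python
# PySpark sketch of the Week 2 Databricks transform step.
# The storage account, containers, and columns are illustrative placeholders;
# authentication setup (account keys or a mount point) is omitted here.
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook a `spark` session already exists; this line
# makes the sketch runnable as a standalone script too.
spark = SparkSession.builder.getOrCreate()

# Read the raw CSV that an ADF copy activity landed in Blob Storage.
raw = (
    spark.read.option("header", "true")
    .csv("wasbs://raw@examplestorage.blob.core.windows.net/orders/")
)

# Transform: cast the amount column and keep completed orders only.
processed = raw.withColumn("amount", F.col("amount").cast("double")).filter(
    F.col("status") == "completed"
)

# Write the result back as Parquet for the downstream load step.
processed.write.mode("overwrite").parquet(
    "wasbs://processed@examplestorage.blob.core.windows.net/orders/"
)
```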

Week 3: Building Data Pipelines on AWS
✅ Introduction to AWS Glue, Lambda, and S3
✅ Setting up AWS services (S3, Glue, Lambda) and IAM roles
✅ Creating an S3 bucket for data storage
✅ Building an ETL pipeline with AWS Glue (extract, transform, load)
✅ Processing real-time data using AWS Lambda triggered by S3 events – see the sketch after this list
✅ Storing results in DynamoDB or Redshift
✅ Monitoring pipeline execution using AWS CloudWatch
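
As a preview of the Lambda step, here is a hedged sketch of a handler that fires on an S3 “object created” event and writes a summary item to DynamoDB. The bucket, table, and attribute names are placeholders, not the workshop’s actual resources.

```python
# Sketch of a Week 3 Lambda handler: triggered by an S3 "object created"
# event, it reads the new object and records a summary item in DynamoDB.
# The bucket, table, and attribute names are illustrative placeholders.
import json

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("processed_events")  # assumed table


def lambda_handler(event, context):
    # One S3 event can batch several records; handle each of them.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch the newly created object and parse it as JSON.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        payload = json.loads(body)

        # Store a small summary item keyed by the object path.
        table.put_item(
            Item={
                "object_key": key,
                "bucket": bucket,
                "record_count": len(payload),
            }
        )

    return {"statusCode": 200}
```

Every print and exception from this handler lands in CloudWatch Logs automatically, which is how we’ll monitor the pipeline at the end of the week.
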
The DataDroolers Initiative exists to help aspirants break into data engineering and to promote hands-on learning through real-world projects.
– Sunjana Ramana, Instructor for the Workshop

Frequently Asked Questions
95% of questions are answered here. Can’t find what you’re looking for? Feel free to reach out!
Who is this workshop for?
This workshop is designed for aspiring data engineers, software engineers, data scientists, and anyone looking to gain hands-on experience with real-world data engineering projects. A basic understanding of SQL and Python is helpful.
Do I need prior experience to join?
Not necessarily! This workshop is structured to help both beginners and those with some experience build practical, portfolio-worthy projects from scratch. If you have some background in Python and SQL, you’ll be able to follow along more easily.
What technologies will we use?
The workshop will cover tools commonly used in data engineering, such as SQL, Python, Apache Airflow, dbt, Spark, and cloud platforms like AWS/GCP/Azure. Each project will introduce new tools and concepts relevant to real-world data engineering workflows.
How much time will I need to dedicate each week?
Expect to spend around 4-6 hours per week, including a one-hour live session each week, hands-on coding assignments, and project implementation. The goal is to keep it manageable for professionals and students alike.
Do I need a cloud account?
Yes, you will need a free-tier account on AWS, GCP, or Azure to deploy and test your projects. We’ll provide guidance on setting up the necessary environments.
Will this help me get a data engineering job?
This workshop helps you build a strong portfolio of projects that showcase real-world skills, something employers value highly. It’s a great way to gain experience and stand out in job applications. However, the workshop won’t guarantee a job by itself. Your portfolio is a major part of the job hunt, and this workshop will help you make it stronger.
Don’t Miss Out!!
Give your resume a boost in just 3 weeks with hands-on projects
Copyright © 2025 www.datadrooler.com