Building Modern Data Applications Using Databricks Lakehouse

eBook Details:

  • Paperback: 246 pages
  • Publisher: WOW! eBook (October 31, 2024)
  • Language: English
  • ISBN-10: 1801073236
  • ISBN-13: 978-1801073233

eBook Description:

Building Modern Data Applications Using Databricks Lakehouse: Develop, optimize, and monitor data pipelines on Databricks. Get up to speed with the Databricks Data Intelligence Platform to build and scale modern data applications, leveraging the latest advancements in data engineering.

The sheer number of tools in today’s data engineering stack, combined with operational complexity, often overwhelms data engineers, leaving them spending more time maintaining complex data pipelines than gleaning value from their data. Guided by a lead specialist solutions architect at Databricks with 10+ years of experience in data and AI, Building Modern Data Applications Using Databricks Lakehouse shows you how the Delta Live Tables framework simplifies data pipeline development by letting you focus on defining input data sources, transformation logic, and output table destinations.

This book gives you an overview of the Delta Lake format, the Databricks Data Intelligence Platform, and the Delta Live Tables framework. It teaches you how to apply data transformations by implementing the Databricks medallion architecture and how to continuously monitor the data quality of your pipelines. You’ll learn how to handle incoming data using the Databricks Auto Loader feature, automate real-time data processing using Databricks Workflows, and recover from runtime errors automatically.
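To give a flavor of what this looks like in practice, here is a minimal sketch of a Delta Live Tables pipeline in Python that ingests files with Auto Loader into a bronze table and applies a data quality expectation in a silver table. The dataset name, storage path, and column names are illustrative assumptions, not examples taken from the book:

import dlt
from pyspark.sql.functions import col

# Bronze: ingest raw JSON files incrementally with Auto Loader.
# "spark" is provided automatically by the pipeline runtime.
@dlt.table(comment="Raw orders landed from cloud storage")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/orders/")     # hypothetical landing path
    )

# Silver: enforce a simple data quality expectation and cast types
@dlt.table(comment="Cleaned orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("amount", col("amount").cast("double"))
    )

When attached to a Delta Live Tables pipeline, this code runs continuously or on a schedule, and rows that fail the expectation are dropped and reported in the pipeline’s data quality metrics.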

  • Deploy near-real-time data pipelines in Databricks using Delta Live Tables
  • Orchestrate data pipelines using Databricks workflows
  • Implement data validation policies and monitor/quarantine bad data
  • Apply slowly changing dimension (SCD) Type 1 and Type 2 data to lakehouse tables (see the sketch after this list)
  • Secure data access across different groups and users using Unity Catalog
  • Automate continuous data pipeline deployment by integrating Git with build tools such as Terraform and Databricks Asset Bundles
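The slowly changing dimension item above can be expressed with the Delta Live Tables apply_changes API. The following is a minimal sketch with hypothetical table and column names:

import dlt

# Target streaming table that will hold the full change history
dlt.create_streaming_table("customers_scd2")

dlt.apply_changes(
    target="customers_scd2",
    source="customers_bronze",     # hypothetical upstream change feed
    keys=["customer_id"],          # business key identifying each record
    sequence_by="updated_at",      # column that orders incoming changes
    stored_as_scd_type=2,          # Type 2 keeps history; use 1 to overwrite in place
)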

By the end of this book, you’ll be able to build a real-time data pipeline from scratch using Delta Live Tables, leverage CI/CD tools to deploy data pipeline changes automatically across deployment environments, and monitor, control, and optimize cloud costs.
