Database Data Transformation for Data Engineers
by 💡Mike Shakhomirov · Feb 2024

Advanced techniques for beginners


AI generated image using Kandinsky

In this story, I would like to open a discussion on how we transform data. Whether it’s a database, a data warehouse or a reporting solution, we run data transformations based on data models, but how do we organise them? I would like to talk about the modern data transformation tools we use and touch on some nuances of the modular approach, scheduling and data transformation tests. At the end of this article, I will provide an example application that runs data modelling tasks with data lineage and self-documenting features. I’m very keen to hear what you think about it.

I have witnessed dozens of ways to run data transformations. Over my fifteen-plus-year career in big data and analytics, I have built data pipelines with many different design patterns, and I’m sure there are more out there. That’s why I like the technology world so much: the multitude of possibilities it offers is simply amazing.

Which operating system do you use for your data warehouse?

Modern data transformation tools

Modern data transformation tools, also known as data modelling tools or data warehouse (DWH) operating systems, were designed to simplify the SQL data manipulation tasks that create datasets, views and tables. They often use a SQL-like dialect to run any data definition (DDL) and data manipulation (DML) statements we might need, including data transformation tests and custom dataset creation in development mode.
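To make the idea of data transformation tests a little more concrete, here is a minimal sketch of a singular dbt test. The model name orders and the column names are hypothetical; a singular test is simply a SELECT statement saved in the tests/ folder, and it passes when the query returns no rows.

-- tests/assert_no_negative_order_totals.sql
-- Hypothetical singular dbt test: it fails if the query returns any rows.
select
    order_id,
    order_total
from {{ ref('orders') }}
where order_total < 0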

The abundance of ANSI-SQL data warehouse solutions on the market makes these tools extremely useful. For instance, consider the list of dbt adapters below: all the market leaders are present there.

Creating a new connection using dbt. Image by author.

dbt stands for data build tool, and it is essentially an application that can run locally or on a server to execute data transformation tasks, typically on a schedule. For example, consider the simple model below. It creates a view in our database, and we can materialise it, say, every 5 minutes to preserve the data for analytics. At the top of the file we have…
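As a rough, hypothetical sketch, a model of this kind could look like the following; the model name orders_view, the upstream model raw_orders and the column names are made-up examples for illustration only.

-- models/analytics/orders_view.sql
-- Hypothetical dbt model: the config block at the top tells dbt how to materialise it.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    order_total,
    created_at
from {{ ref('raw_orders') }}

With materialized='view', dbt compiles this file into a CREATE VIEW statement against the target warehouse. Refreshing it on a fixed schedule, for example every 5 minutes, is handled by whatever runs dbt: a cron job, an orchestrator or a dbt Cloud job.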