├── .devcontainer ├── devcontainer.json └── docker-compose.yml ├── .gitignore ├── .python-version ├── .vscode └── settings.json ├── README.md ├── config.yaml ├── external_models └── external_models.yaml ├── img ├── step_dev_environment.png ├── step_inside_codespaces.png ├── step_model_versioning.png ├── step_run_codespaces.png ├── step_sqltool_password.png └── step_sqltool_query.png ├── models ├── full_model.sql ├── inc_model.sql ├── python_model.py ├── seed_model.sql └── view_model.sql ├── pyproject.toml ├── scripts ├── duckdb-init.sql └── insert-rows.sql ├── seeds └── seed_data.csv ├── tests └── test_python_model.yaml └── workshop.session.sql /.devcontainer/devcontainer.json: -------------------------------------------------------------------------------- 1 | // For format details, see https://aka.ms/devcontainer.json. For config options, see the 2 | // README at: https://github.com/devcontainers/templates/tree/main/src/python 3 | { 4 | "name": "Workshop SQLMesh", 5 | "dockerComposeFile": "docker-compose.yml", 6 | "service": "devcontainer", 7 | "workspaceFolder": "/workspaces/${localWorkspaceFolderBasename}", 8 | "features": { 9 | "ghcr.io/jsburckhardt/devcontainer-features/uv:1": {}, 10 | "ghcr.io/eitsupi/devcontainer-features/duckdb-cli:1": {} 11 | }, 12 | "customizations": { 13 | "vscode": { 14 | "extensions": [ 15 | "ms-python.python", 16 | "ms-python.debugpy", 17 | "ms-python.vscode-pylance", 18 | "charliermarsh.ruff", 19 | "mtxr.sqltools", 20 | "mtxr.sqltools-driver-pg", 21 | "RandomFractalsInc.duckdb-sql-tools" 22 | ] 23 | } 24 | }, 25 | "postCreateCommand": "rm -Rf db.db && cat ./scripts/duckdb-init.sql | duckdb db.db" 26 | 27 | // Use 'forwardPorts' to make a list of ports inside the container available locally. 28 | // "forwardPorts": [], 29 | 30 | // Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root. 
31 | // "remoteUser": "root" 32 | } 33 | -------------------------------------------------------------------------------- /.devcontainer/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3.8' 2 | services: 3 | devcontainer: 4 | image: "mcr.microsoft.com/devcontainers/python:3.12" 5 | volumes: 6 | - ../..:/workspaces:cached 7 | network_mode: service:db 8 | command: sleep infinity 9 | 10 | db: 11 | image: postgres:latest 12 | restart: unless-stopped 13 | # volumes: 14 | # - postgres-data:/var/lib/postgresql/data 15 | environment: 16 | POSTGRES_PASSWORD: postgres 17 | POSTGRES_USER: postgres 18 | POSTGRES_DB: workshop 19 | 20 | # volumes: 21 | # postgres-data: -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Python-generated files 2 | __pycache__/ 3 | *.py[oc] 4 | build/ 5 | dist/ 6 | wheels/ 7 | *.egg-info 8 | 9 | # Virtual environments 10 | .venv 11 | 12 | # Git 13 | .gitkeep 14 | 15 | # uv 16 | uv.lock 17 | 18 | # SQLMesh 19 | logs 20 | .cache 21 | 22 | # duckdb 23 | *.db 24 | *.duckdb -------------------------------------------------------------------------------- /.python-version: -------------------------------------------------------------------------------- 1 | 3.12 2 | -------------------------------------------------------------------------------- /.vscode/settings.json: -------------------------------------------------------------------------------- 1 | { 2 | "sqltools.connections": [ 3 | { 4 | "previewLimit": 50, 5 | "server": "localhost", 6 | "port": 5432, 7 | "driver": "PostgreSQL", 8 | "name": "workshop", 9 | "database": "workshop", 10 | "username": "postgres" 11 | } 12 | ] 13 | } -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Building pipelines with SQLMesh - workshop 2 | 3 | > [!NOTE] 4 | > 📖 [Presentation link](https://pitch.com/v/workshop-sqlmesh-txrd72) 5 | 6 | Hello 👋 This is the README for "Building pipelines with SQLMesh" workshop. This document is the main place to follow all the workshop instructions. 7 | 8 | SQLMesh is a relative new python library, and it tries to introduce concepts and good practices adopted from other mainstream tools like Terraform. This is a similar tool to DBT, but with a different approach. DBT is stateless, SQLMesh is not. A main feature SQLMesh offers to developers is to verify (with plan command) what is going to happen before the execution (More or less as Terraform work with Infrastructure as code). 9 | 10 | This repository provides you with all the necessary tools to explore the SQLMesh library and all the features it has. The repository will load a sample records on a duckdb databases named `db.db`. The user will have SQLTool extension ready to access to the Postgres database that will store a lot interesting information about what you will be doing. 11 | 12 | **Let's start!** 🏎️ 13 | 14 | ### Step: Open Github Codespaces 15 | 16 | Let's start opening the environment with Codespaces. It is really straightforward, you only need to click on *`Code`* button, and then *`Create Codespace`* on branch `main`. 17 | 18 | > [!CAUTION] 19 | > I assume you will have credits to run Codespaces. 
20 | > _"GitHub will provide users in the free plan 120 core hours or 60 hours of run time on a 2 core codespace, plus 15 GB of storage each month"_ 21 | > 22 | > If not, people can run this project using DevContainers locally installing the proper [VSCODE extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack). 23 | 24 | 25 | ![create_codespace](./img/step_run_codespaces.png) 26 | 27 | You will see that a visual studio code opens on the browser 28 | 29 | ![first_screen_codespace](./img/step_inside_codespaces.png) 30 | 31 | ### Step: Create SQLMesh project 32 | 33 | Uau! Setup ready, congratulations! This was a really important 👏🏻. 34 | 35 | Now it's time to initialize the SQLMesh project. We will use [*uv*](https://docs.astral.sh/uv/) python package manager for this. 36 | 37 | ``` 38 | uv tool run sqlmesh init -t empty duckdb 39 | ``` 40 | 41 | This command will initialize an empty SQLMesh project with all the proper folders to use during the tutorial. 42 | 43 | ### Step: Define the state connection 44 | 45 | It's time to explore the [**`config.yml`**](config.yaml). This file is generated by `sqlmesh init` command with some default values. By default we will use _DuckDB_ as the main data storage to save our models. You can define multiple gateways, but you must use one as default. 46 | 47 | ```yml 48 | gateways: 49 | local: 50 | connection: 51 | type: duckdb 52 | database: db.db 53 | 54 | default_gateway: local 55 | 56 | model_defaults: 57 | dialect: duckdb 58 | start: 2024-11-01 59 | ``` 60 | 61 | As we mentioned before, SQLMesh use state to control what or not is executed. By default we can use DuckDB, but given concurrency limitations the connection may hang. 62 | 63 | > _"DuckDB does not support concurrent connections, so it may "hang" when used as a state database if the primary connection's concurrent_tasks value is greater than 1."_ 64 | 65 | Given this consideration, we preferred to use a transactional db, a postgreSQL instance. 66 | 67 | ```diff 68 | gateways: 69 | local: 70 | connection: 71 | type: duckdb 72 | database: db.db 73 | + state_connection: 74 | + type: postgres 75 | + host: db 76 | + port: 5432 77 | + user: postgres 78 | + password: postgres 79 | + database: workshop 80 | 81 | default_gateway: local 82 | 83 | model_defaults: 84 | dialect: duckdb 85 | start: 2024-11-01 86 | ``` 87 | 88 | A postgres instance is available as well, and we have access to the SQLTool extension to query the data. Try to check that you have access. CLick on the SQLTool vscode extension (left panel) and connect. the browser will ask for the password. 89 | 90 | ![sqltool_pwd](./img/step_sqltool_password.png) 91 | 92 | Once you are connected, run a simple query to check basic database information. 93 | 94 | ![sqltool_query](./img/step_sqltool_query.png) 95 | 96 | Cool 👌! Everything ready to start with SQLMesh models. 97 | 98 | ### Step: Define an external model 99 | 100 | As any other Data project, we need source data. We added some fake data to an initial duckdb database. Easy to access with command `duckdb db.db`, then run the query: 101 | 102 | ```sql 103 | select * from workshop.events 104 | ``` 105 | 106 | This data source is covered under the `External model` option in SQLMesh. Any model not managed by SQLMesh can be defined via [**`yaml`**](./external_models/external_models.yaml) file. In this file we define the gateway namespace, the table name, and the column types. 
107 | 
108 | ```yaml
109 | - name: '"db"."workshop"."events"'
110 |   gateway: local
111 |   columns:
112 |     id: INT
113 |     amount: FLOAT
114 |     event_date: TIMESTAMP
115 | ```
116 | 
117 | Now SQLMesh is aware of this source table.
118 | 
119 | ### Step: Build the first SQL model
120 | 
121 | Now, create a simple [`FULL` SQL model](./models/full_model.sql) that queries the `EXTERNAL` model.
122 | 
123 | ```sql
124 | MODEL (
125 |   name core.full,
126 |   kind FULL,
127 |   cron '@daily'
128 | );
129 | 
130 | SELECT
131 |   id,
132 |   amount,
133 |   event_date
134 | FROM workshop.events
135 | ```
136 | 
137 | Once this model is defined, we are ready to apply our plan to the prod environment.
138 | 
139 | ### Step: Running the first plan on Prod
140 | 
141 | ```bash
142 | uv run sqlmesh plan
143 | ```
144 | 
145 | Output:
146 | ```
147 | ======================================================================
148 | Successfully Ran 1 tests against duckdb
149 | ----------------------------------------------------------------------
150 | New environment `prod` will be created from `prod`
151 | Summary of differences against `prod`:
152 | Models:
153 | └── Added:
154 |     └── core.full
155 | Models needing backfill (missing dates):
156 | └── core.full: 2024-11-01 - 2024-11-08
157 | Apply - Backfill Tables [y/n]:
158 | ```
159 | 
160 | ### Step: Modify the previous SQL model
161 | 
162 | ```diff
163 | MODEL (
164 |   name core.full,
165 |   kind FULL,
166 |   cron '@daily'
167 | );
168 | 
169 | SELECT
170 |   id,
171 |   amount,
172 | + 1 as active,
173 |   event_date
174 | FROM workshop.events
175 | ```
176 | 
177 | ### Step: Run the dev plan
178 | 
179 | ```bash
180 | uv run sqlmesh plan dev
181 | ```
182 | 
183 | Output:
184 | ```
185 | ======================================================================
186 | Successfully Ran 1 tests against duckdb
187 | ----------------------------------------------------------------------
188 | Summary of differences against `dev`:
189 | Models:
190 | └── Directly Modified:
191 |     └── core__dev.full
192 | ---
193 | +++
194 | @@ -7,6 +7,6 @@
195 | SELECT
196 |   id,
197 |   amount,
198 | 
199 | + 1 AS active,
200 | 
201 |   event_date
202 | FROM workshop.events
203 | Directly Modified: core__dev.full (Breaking)
204 | Models needing backfill (missing dates):
205 | └── core__dev.full: 2024-11-01 - 2024-11-08
206 | Enter the backfill start date (eg. '1 year', '2020-01-01') or blank to backfill from the beginning of history:
207 | ```
208 | Take a minute to check what happened in the state database (the PostgreSQL instance).
209 | 
210 | ```sql
211 | -- Query on Postgres
212 | 
213 | select * from sqlmesh._snapshots;
214 | ```
215 | 
216 | In the following picture we can see that SQLMesh created two rows with different version numbers in the `_intervals` table. That makes complete sense! We modified the `core.full` model and introduced a breaking change, so a new variant was created.
217 | 
218 | ![](./img/step_dev_environment.png)
219 | 
220 | At the same time, we can see the views living in the virtual layer: `core.full` in the prod environment, and `core__dev.full` in the dev environment.
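Both environments are also recorded in the state database. A quick way to confirm this — a minimal sketch, assuming the default `sqlmesh` state schema and its `_environments` table:

```sql
-- Query on Postgres: list the environments SQLMesh is tracking
-- (assumption: default state schema and table names created by SQLMesh)
select * from sqlmesh._environments;
```

You should see a row for `prod` and another one for `dev`.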
Checking the `CREATE` statements on DuckDB, we see:
221 | 
222 | ```
223 | D .schema
224 | ```
225 | Output:
226 | 
227 | ```sql
228 | -- CREATE TABLE FOR EXTERNAL MODEL
229 | CREATE TABLE workshop.events(id INTEGER, amount FLOAT, event_date TIMESTAMP);
230 | 
231 | -- CREATE PHYSICAL TABLES (KNOWN AS SNAPSHOTS) ON PHYSICAL LAYER
232 | CREATE TABLE sqlmesh__core.core__full__1500675415(id INTEGER, amount FLOAT, event_date TIMESTAMP);
233 | CREATE TABLE sqlmesh__core.core__full__3227676363(id INTEGER, amount FLOAT, active INTEGER, event_date TIMESTAMP);
234 | 
235 | -- CREATE VIEWS FOR EACH VIRTUAL LAYER
236 | CREATE VIEW core."full" AS SELECT * FROM db.sqlmesh__core.core__full__3227676363;
237 | CREATE VIEW core__dev."full" AS SELECT * FROM db.sqlmesh__core.core__full__3227676363;
238 | ```
239 | 
240 | ### Step: Validate data
241 | 
242 | ```bash
243 | uv run sqlmesh fetchdf "select * from core__dev.full"
244 | ```
245 | 
246 | Output:
247 | ```
248 | 
249 |      id  amount  active          event_date
250 | 0   101  150.00       1 2024-11-01 00:00:00
251 | 1   102  200.50       1 2024-11-01 00:00:00
252 | 2   101  175.25       1 2024-11-02 00:00:00
253 | 3   103  300.75       1 2024-11-02 00:00:00
254 | 4   102  250.00       1 2024-11-03 00:00:00
255 | 5   103  320.00       1 2024-11-03 00:00:00
256 | 6   101  160.50       1 2024-11-04 00:00:00
257 | 7   102  210.25       1 2024-11-04 00:00:00
258 | 8   101  180.75       1 2024-11-05 00:00:00
259 | 9   103  340.00       1 2024-11-05 00:00:00
260 | 10  101  175.25       1 2024-11-06 00:00:00
261 | 11  103  300.75       1 2024-11-06 00:00:00
262 | 12  103  300.75       1 2024-11-06 15:00:00
263 | ```
264 | 
265 | 
266 | ### Step: Build the first incremental model
267 | 
268 | ```sql
269 | MODEL (
270 |   name core.inc,
271 |   kind INCREMENTAL_BY_TIME_RANGE (
272 |     time_column event_date
273 |   ),
274 |   cron '*/5 * * * *'
275 | );
276 | 
277 | SELECT
278 |   event_date,
279 |   SUM(amount) AS total_amount
280 | FROM workshop.events
281 | WHERE
282 |   event_date BETWEEN @start_dt AND @end_dt
283 | GROUP BY
284 |   event_date
285 | ```
286 | 
287 | ```
288 | uv run sqlmesh plan
289 | ```
290 | 
291 | Output:
292 | ```
293 | New environment `prod` will be created from `prod`
294 | Summary of differences against `prod`:
295 | Models:
296 | └── Added:
297 |     ├── core.inc
298 |     └── workshop.events
299 | Models needing backfill (missing dates):
300 | └── core.inc: 2024-11-01 00:00:00 - 2024-11-09 20:14:59
301 | ```
302 | 
303 | ### Step: Check how incremental interval works
304 | 
305 | Insert new records into the source table `db.workshop.events` representing the next interval to be processed. The last interval processed, according to the `sqlmesh plan` output, ended at **`2024-11-09 20:14:59`**, so we need to insert samples like:
306 | 
307 | ```sql
308 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 20:16:00', 101, 175.25);
309 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 20:17:00', 103, 300.75);
310 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 20:18:00', 103, 300.75);
311 | ```
312 | 
313 | You will have to adjust the `YYYY-mm-dd HH:mm:ss` values depending on when you run this tutorial.
314 | 
315 | ### Step: Execute the run command
316 | 
317 | > _if a model’s cron is daily then sqlmesh run will only execute the model once per day. If you issue sqlmesh run the first time on a day the model will execute; if you issue sqlmesh run again nothing will happen because the model shouldn’t be executed again until tomorrow._
318 | 
319 | Processing incrementals has a hard dependency on the cron definition. This makes a lot of sense because the processing model in SQLMesh is based on time intervals rather than high watermarks. Therefore, testing how the incremental model works requires the rows inserted in the step [defined previously](#step-check-how-incremental-interval-works).
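Before starting the runner, it can help to see exactly how SQLMesh will fill the interval macros. A minimal sketch — assuming the `render` command, which prints a model's compiled SQL with macros resolved:

```bash
# Print the compiled query of the incremental model; the WHERE clause should show
# concrete timestamps in place of the @start_dt / @end_dt macros
# (assumption: `sqlmesh render` is available in the installed version)
uv run sqlmesh render core.inc
```

Now keep `sqlmesh run` executing in a loop and watch the new interval get picked up: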
320 | 
321 | ```sh
322 | watch -n 30 uv run sqlmesh run
323 | ```
324 | 
325 | Output:
326 | ```
327 | Every 30.0s: uv run sqlmesh run
328 | 5ddcb8705ed5: Sat Nov 9 20:25:16 2024
329 | [1/1] core.inc evaluated in 0.02s
330 | Evaluating models ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100.0% • 1/1 • 0:00:00
331 | 
332 | All model batches have been executed successfully
333 | 
334 | ```
335 | 
336 | ### Step: Validate data
337 | 
338 | ```bash
339 | uv run sqlmesh fetchdf "select * from core.inc"
340 | ```
341 | 
342 | Output:
343 | ```diff
344 |             event_date  total_amount
345 |   0 2024-11-01 00:00:00        350.50
346 |   1 2024-11-02 00:00:00        476.00
347 |   2 2024-11-03 00:00:00        570.00
348 |   3 2024-11-04 00:00:00        370.75
349 |   4 2024-11-05 00:00:00        520.75
350 |   5 2024-11-06 00:00:00        476.00
351 |   6 2024-11-06 15:00:00        300.75
352 | + 7 2024-11-09 20:16:00        175.25
353 | + 8 2024-11-09 20:17:00        300.75
354 | + 9 2024-11-09 20:18:00        300.75
355 | ```
--------------------------------------------------------------------------------
/config.yaml:
--------------------------------------------------------------------------------
1 | gateways:
2 |   local:
3 |     connection:
4 |       type: duckdb
5 |       database: db.db
6 |     state_connection:
7 |       type: postgres
8 |       host: db
9 |       port: 5432
10 |       user: postgres
11 |       password: postgres
12 |       database: workshop
13 | 
14 | default_gateway: local
15 | 
16 | model_defaults:
17 |   dialect: duckdb
18 |   start: 2024-11-01
--------------------------------------------------------------------------------
/external_models/external_models.yaml:
--------------------------------------------------------------------------------
1 | - name: '"db"."workshop"."events"'
2 |   gateway: local
3 |   columns:
4 |     id: INT
5 |     amount: FLOAT
6 |     event_date: TIMESTAMP
--------------------------------------------------------------------------------
/img/step_dev_environment.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_dev_environment.png
--------------------------------------------------------------------------------
/img/step_inside_codespaces.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_inside_codespaces.png
--------------------------------------------------------------------------------
/img/step_model_versioning.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_model_versioning.png
--------------------------------------------------------------------------------
/img/step_run_codespaces.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_run_codespaces.png
--------------------------------------------------------------------------------
/img/step_sqltool_password.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_sqltool_password.png -------------------------------------------------------------------------------- /img/step_sqltool_query.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/marcraminv/workshop-sqlmesh/b2d1385f3797a0e7dc6515ed7ee2ab9a72a6d65c/img/step_sqltool_query.png -------------------------------------------------------------------------------- /models/full_model.sql: -------------------------------------------------------------------------------- 1 | MODEL ( 2 | name core.full, 3 | kind FULL, 4 | cron '@daily' 5 | ); 6 | 7 | SELECT 8 | id, 9 | amount, 10 | event_date 11 | FROM workshop.events -------------------------------------------------------------------------------- /models/inc_model.sql: -------------------------------------------------------------------------------- 1 | MODEL ( 2 | name core.inc, 3 | kind INCREMENTAL_BY_TIME_RANGE ( 4 | time_column event_date 5 | ), 6 | cron '*/5 * * * *' 7 | ); 8 | 9 | SELECT 10 | event_date, 11 | SUM(amount) AS total_amount 12 | FROM workshop.events 13 | WHERE 14 | event_date BETWEEN @start_dt AND @end_dt 15 | GROUP BY 16 | event_date -------------------------------------------------------------------------------- /models/python_model.py: -------------------------------------------------------------------------------- 1 | import typing as t 2 | from datetime import datetime 3 | 4 | import pandas as pd 5 | from sqlmesh import ExecutionContext, model 6 | 7 | @model( 8 | "core.basic_python", 9 | cron="@daily", 10 | columns={ 11 | "id": "int", 12 | "name": "text", 13 | } 14 | ) 15 | def execute( 16 | context: ExecutionContext, 17 | start: datetime, 18 | end: datetime, 19 | execution_time: datetime, 20 | **kwargs: t.Any, 21 | ) -> pd.DataFrame: 22 | 23 | return pd.DataFrame([ 24 | {"id": 1, "name": "Marc"} 25 | ]) -------------------------------------------------------------------------------- /models/seed_model.sql: -------------------------------------------------------------------------------- 1 | MODEL ( 2 | name core.seed, 3 | kind SEED ( 4 | path '../seeds/seed_data.csv' 5 | ) 6 | ) -------------------------------------------------------------------------------- /models/view_model.sql: -------------------------------------------------------------------------------- 1 | MODEL ( 2 | name core.view, 3 | kind VIEW, 4 | cron '@daily' 5 | ); 6 | 7 | SELECT 8 | COUNT(id) AS total 9 | FROM core.full -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [project] 2 | name = "workshop-sqlmesh" 3 | version = "0.1.0" 4 | description = "Add your description here" 5 | readme = "README.md" 6 | requires-python = ">=3.12" 7 | dependencies = [ 8 | "pandas>=2.2.3", 9 | "sqlmesh[postgres,web]>=0.130.2", 10 | ] 11 | -------------------------------------------------------------------------------- /scripts/duckdb-init.sql: -------------------------------------------------------------------------------- 1 | create schema db.workshop; 2 | 3 | create or replace table db.workshop.events ( 4 | id integer, 5 | amount float, 6 | event_date timestamp 7 | ); 8 | 9 | insert into workshop.events (event_date, id, amount) values ('2024-11-01 00:00:00', 101, 150.00); 10 | insert into workshop.events (event_date, id, amount) values ('2024-11-01 00:00:00', 102, 200.50); 
11 | insert into workshop.events (event_date, id, amount) values ('2024-11-02 00:00:00', 101, 175.25); 12 | insert into workshop.events (event_date, id, amount) values ('2024-11-02 00:00:00', 103, 300.75); 13 | insert into workshop.events (event_date, id, amount) values ('2024-11-03 00:00:00', 102, 250.00); 14 | insert into workshop.events (event_date, id, amount) values ('2024-11-03 00:00:00', 103, 320.00); 15 | insert into workshop.events (event_date, id, amount) values ('2024-11-04 00:00:00', 101, 160.50); 16 | insert into workshop.events (event_date, id, amount) values ('2024-11-04 00:00:00', 102, 210.25); 17 | insert into workshop.events (event_date, id, amount) values ('2024-11-05 00:00:00', 101, 180.75); 18 | insert into workshop.events (event_date, id, amount) values ('2024-11-05 00:00:00', 103, 340.00); 19 | insert into workshop.events (event_date, id, amount) values ('2024-11-06 00:00:00', 101, 175.25); 20 | insert into workshop.events (event_date, id, amount) values ('2024-11-06 00:00:00', 103, 300.75); 21 | insert into workshop.events (event_date, id, amount) values ('2024-11-06 15:00:00', 103, 300.75); -------------------------------------------------------------------------------- /scripts/insert-rows.sql: -------------------------------------------------------------------------------- 1 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 00:00:00', 101, 175.25); 2 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 00:00:00', 103, 300.75); 3 | insert into workshop.events (event_date, id, amount) values ('2024-11-09 15:00:00', 103, 300.75); -------------------------------------------------------------------------------- /seeds/seed_data.csv: -------------------------------------------------------------------------------- 1 | id,item_id,event_date 2 | 1,2,2024-01-01 3 | 2,1,2024-01-01 4 | 3,3,2024-01-03 5 | 4,1,2024-01-04 6 | 5,1,2024-01-05 -------------------------------------------------------------------------------- /tests/test_python_model.yaml: -------------------------------------------------------------------------------- 1 | 2 | test_python_model_is_working: 3 | model: core.basic_python 4 | inputs: 5 | core.view: 6 | rows: 7 | - total: 1 8 | outputs: 9 | query: 10 | rows: 11 | - id: 1 12 | name: "Marc" -------------------------------------------------------------------------------- /workshop.session.sql: -------------------------------------------------------------------------------- 1 | -- Show the tables to store the SQLMesh state 2 | SELECT * FROM information_schema.tables WHERE table_schema = 'sqlmesh'; 3 | 4 | -- Show information about intervals 5 | select * from workshop.sqlmesh._intervals; --------------------------------------------------------------------------------