21 |
22 | ---
23 |
24 | [Install](#installation) | [Getting Started](#getting-started) | [Documentation](https://noteable-origami.readthedocs.io) | [License](./LICENSE) | [Code of Conduct](./CODE_OF_CONDUCT.md) | [Contributing](./CONTRIBUTING.md)
25 |
26 |
27 |
28 | ## Intro to Origami
29 |
30 | Origami is a 🐍 Python library for talking to [Noteable notebooks](https://noteable.io/). It is the official way to access the full breadth of Noteable API calls and access patterns in async Python, giving you rich programmatic access to notebooks. You can use [Noteable for free](https://app.noteable.io) with a quick signup.
31 |
32 |
33 |
34 |
35 |
36 | ## Requirements
37 |
38 | Python 3.8+
39 |
40 |
41 |
42 |
43 |
44 | ## Installation
45 |
46 | For stable release:
47 |
48 | ```bash
49 | pip install noteable-origami
50 | ```
51 |
52 | ```bash
53 | poetry add noteable-origami
54 | ```
55 |
56 | For alpha pre-release:
57 |
58 | ```bash
59 | pip install noteable-origami --pre
60 | ```
61 |
62 |
63 |
64 | ## Getting Started
65 |
66 | > **Note**
67 | > Developer note: For pre-1.0 release information, see the [pre-1.0 README](https://github.com/noteable-io/origami/blob/release/0.0.35/README.md)
68 |
69 | ### API Tokens
70 |
71 | The Noteable API requires an authentication token. You can manage tokens at the Noteable user settings page.
72 |
73 | 1. Log in to [Noteable](https://app.noteable.io) (sign up is free).
74 | 2. In the User Settings tab, navigate to `API Tokens` and generate a new token.
75 | 
76 | 3. Copy the generated token to your clipboard and save it in a secure location, to be read into your Python environment later.
77 | 
78 |
79 | The token can be passed directly to `APIClient` on initialization, or set as the `NOTEABLE_TOKEN` environment variable.
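The fallback behavior can be sketched as below. `resolve_token` is an illustrative helper, not part of the library, but it mirrors the check `APIClient` performs on initialization:

```python
import os


def resolve_token(explicit_token=None):
    # An explicitly passed token wins; otherwise fall back to NOTEABLE_TOKEN.
    token = explicit_token or os.environ.get("NOTEABLE_TOKEN")
    if not token:
        raise ValueError(
            "Must provide authorization_token or set NOTEABLE_TOKEN environment variable"
        )
    return token


os.environ["NOTEABLE_TOKEN"] = "pat:example-token"
print(resolve_token())           # falls back to the env var
print(resolve_token("pat:xyz"))  # explicit token takes precedence
```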
80 |
81 | ### Usage
82 |
83 |
84 | The example below will guide you through the basics of creating a notebook, adding content, executing code, and seeing the output. For more examples, see our [Use Cases](../usage) section.
85 |
86 | ### Setting up the `APIClient`
87 |
88 | Using the API token you created previously, load it into your notebook environment so it can be passed into the `APIClient` directly. (If you're in [Noteable](https://app.noteable.io), you can create a [Secret](https://docs.noteable.io/product-docs/collaborate/access-and-visibility/secrets-permissions) that can be read in as an environment variable.)
89 |
90 | ```python
91 | import os
92 | from origami.clients.api import APIClient
93 |
94 | # if we have the `NOTEABLE_TOKEN` environment variable set,
95 | # we don't need to pass it in to the APIClient directly
96 | api_client = APIClient()
97 | ```
98 | *The `APIClient` is what we'll use to make HTTP requests to Noteable's REST API.*
99 |
100 |
101 | ### Checking your user information
102 |
103 | ```python
104 | user = await api_client.user_info()
105 | user
106 | ```
107 | ```{.python .no-copy}
108 | User(
109 | id=UUID('f1a2b3c4-5678-4d90-ef01-23456789abcd'),
110 | created_at=datetime.datetime(2023, 1, 1, 0, 0, 0, 0, tzinfo=datetime.timezone.utc),
111 | updated_at=datetime.datetime(2023, 1, 1, 0, 0, 0, 0, tzinfo=datetime.timezone.utc),
112 | deleted_at=None,
113 | handle='ori.gami',
114 | email='origami@noteable.io',
115 | first_name='Ori',
116 | last_name='Gami',
117 | origamist_default_project_id=UUID('a1b2c3d4-e5f6-4a7b-8123-abcdef123456'),
118 | principal_sub='pat:0a1b2c3d4e5f6g7h8i9j10k11l',
119 | auth_type='pat:0a1b2c3d4e5f6g7h8i9j10k11l'
120 | )
121 | ```
122 | (The information returned should match your user account information associated with the previously-generated API token.)
123 |
124 |
125 | ### Creating a new Notebook
126 |
127 | > **Note**
128 | > For this example, we're using the `origamist_default_project_id`, which is the default project designed to be used by the ChatGPT plugin. Feel free to replace it with projects you have access to in [Noteable](https://app.noteable.io/)!
129 |
130 |
131 | Provide a file `path` as well as a `project_id` (UUID) where the Notebook will exist.
132 | ```python
133 | project_id = user.origamist_default_project_id
134 |
135 | file = await api_client.create_notebook(
136 | project_id=project_id,
137 | path="Origami Demo.ipynb"
138 | )
139 | file
140 | ```
141 | ```{.python .no-copy}
142 | File(
143 | id=UUID('bcd12345-6789-4abc-d012-3456abcdef90'),
144 | created_at=datetime.datetime(2023, 2, 2, 0, 0, 0, 0, tzinfo=datetime.timezone.utc),
145 | updated_at=datetime.datetime(2023, 2, 2, 0, 0, 0, 0, tzinfo=datetime.timezone.utc),
146 | deleted_at=None,
147 | filename='Origami Demo.ipynb',
148 | path=PosixPath('Origami Demo.ipynb'),
149 | project_id=UUID('a1b2c3d4-e5f6-4a7b-8123-abcdef123456'),
150 | space_id=UUID('7890ab12-3412-4cde-8901-2345abcdef67'),
151 | size=0,
152 | mimetype=None,
153 | type='notebook',
154 | current_version_id=None,
155 | presigned_download_url=None,
156 | url='https://app.noteable.io/f/abc12312-3412-4abc-8123-abc12312abc1/Origami Demo.ipynb'
157 | )
158 | ```
159 |
160 |
161 | ### Launching a Kernel
162 |
163 | At a minimum, the `file_id` from the Notebook is required. Additionally, you can specify:
164 |
165 | + `kernel_name` (default `python3`, see more about [available kernels](https://docs.noteable.io/product-docs/work-with-notebooks/manage-kernels/noteable-provided-kernels))
166 | + `hardware_size` (default `small`, see more about [hardware options](https://docs.noteable.io/product-docs/work-with-notebooks/manage-hardware)).
167 |
168 | ```python
169 | kernel_session = await api_client.launch_kernel(file_id=file.id)
170 | kernel_session
171 | ```
172 | ```{.python .no-copy}
173 | KernelSession(
174 | id=UUID('e1f2a345-6789-4b01-cdef-1234567890ab'),
175 | kernel=KernelDetails(
176 | name='python3',
177 | last_activity=datetime.datetime(2023, 2, 2, 1, 0, 0, 0, tzinfo=datetime.timezone.utc),
178 | execution_state='idle'
179 | )
180 | )
181 | ```
182 |
183 |
184 | ### Adding Cells
185 |
186 | Content updates and code execution are handled through the Noteable Real-Time Update (RTU) websocket connection.
187 | ```python
188 | realtime_notebook = await api_client.connect_realtime(file)
189 | ```
190 |
191 |
192 | > **Warning**
193 | > You may see messages like `Received un-modeled RTU message msg.channel= ...`. This is expected as we update the Noteable backend services' messaging.
194 |
195 |
196 | Once the RTU client is connected, we can begin adding cells, executing code, and more! First, let's add a code cell with a basic Python `print` statement.
197 | ```python
198 | from origami.models.notebook import CodeCell
199 |
200 | cell = CodeCell(source="print('Hello World')")
201 | await realtime_notebook.add_cell(cell=cell)
202 | ```
203 | (You can also pass code source directly into `.add_cell(source='CODE HERE')` as a shortcut.)
204 |
205 |
206 | ### Running a Code Cell
207 |
208 | `queue_execution` returns a dictionary of `asyncio.Future`s. Awaiting those futures blocks until the cells have completed execution.
209 | Each future resolves to the up-to-date cell. If the cell produced output, an output collection id will be set in the cell metadata.
210 | ```python
211 | import asyncio
212 |
213 | queued_execution = await realtime_notebook.queue_execution(cell.id)
214 | cells = await asyncio.gather(*queued_execution)
215 | cell = cells[0]
216 | cell
217 | ```
218 | ```{.python .no-copy}
219 | CodeCell(
220 | id='2345ab6c-de78-4901-bcde-f1234567890a',
221 | source="print('Hello World')",
222 | metadata={
223 | 'noteable': {'output_collection_id': UUID('d1234e5f-6789-4a0b-c123-4567890abcdef')},
224 | 'ExecuteTime': {
225 | 'start_time': '2023-02-02T01:00:00.000000+00:00',
226 | 'end_time': '2023-02-02T01:00:00.050000+00:00'
227 | }
228 | },
229 | cell_type='code',
230 | execution_count=None,
231 | outputs=[]
232 | )
233 | ```
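The gather-over-a-dict pattern above can be illustrated with plain `asyncio` (a generic sketch, not Noteable-specific): unpacking a dict iterates its keys, so `asyncio.gather(*queued_execution)` awaits the futures themselves.

```python
import asyncio


async def main():
    loop = asyncio.get_running_loop()
    # A dict of futures, analogous in shape to what queue_execution returns
    futures = {loop.create_future(): "cell-1", loop.create_future(): "cell-2"}
    # Pretend the kernel finished executing each cell
    for fut, cell_id in futures.items():
        fut.set_result(f"{cell_id} finished")
    # *futures unpacks the dict's keys (the futures) into gather
    return await asyncio.gather(*futures)


results = asyncio.run(main())
print(results)  # ['cell-1 finished', 'cell-2 finished']
```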
234 |
235 |
236 | ### Getting Cell Output
237 |
238 | We can call the `.output_collection_id` property on cells directly, rather than having to parse the cell metadata.
239 | ```python
240 | output_collection = await api_client.get_output_collection(cell.output_collection_id)
241 | output_collection
242 | ```
243 | ```{.python .no-copy}
244 | KernelOutputCollection(
245 | id=UUID('d1234e5f-6789-4a0b-c123-4567890abcdef'),
246 | created_at=datetime.datetime(2023, 2, 2, 1, 0, 1, 000000, tzinfo=datetime.timezone.utc),
247 | updated_at=datetime.datetime(2023, 2, 2, 1, 0, 1, 000000, tzinfo=datetime.timezone.utc),
248 | deleted_at=None,
249 | cell_id='2345ab6c-de78-4901-bcde-f1234567890a',
250 | widget_model_id=None,
251 | file_id=UUID('bcd12345-6789-4abc-d012-3456abcdef90'),
252 | outputs=[
253 | KernelOutput(
254 | id=UUID('abcdef90-1234-4a56-7890-abcdef123456'),
255 | created_at=datetime.datetime(2023, 2, 2, 1, 0, 1, 000000, tzinfo=datetime.timezone.utc),
256 | updated_at=datetime.datetime(2023, 2, 2, 1, 0, 1, 000000, tzinfo=datetime.timezone.utc),
257 | deleted_at=None,
258 | type='stream',
259 | display_id=None,
260 | available_mimetypes=['text/plain'],
261 | content_metadata=KernelOutputContent(raw='{"name":"stdout"}', url=None, mimetype='application/json'),
262 | content=KernelOutputContent(raw='Hello World\n', url=None, mimetype='text/plain'),
263 | content_for_llm=KernelOutputContent(raw='Hello World\n', url=None, mimetype='text/plain'),
264 | parent_collection_id=UUID('d1234e5f-6789-4a0b-c123-4567890abcdef')
265 | )
266 | ]
267 | )
268 | ```
269 |
270 |
271 | ## CLI
272 |
273 | Origami has a small CLI for fetching the content of a Notebook, and tailing a Notebook to see all RTU messages being emitted on the relevant RTU channels.
274 |
275 | ```bash
276 | pip install "noteable-origami[cli]"
277 | poetry install -E cli
278 | ```
279 |
280 | 1. Fetch the content of a Notebook and write it to a file: `origami fetch <file_id> > notebook.ipynb`
281 | 2. Tail a Notebook, useful when debugging RTU messages: `origami tail <file_id>`
282 |
283 | ## Dev ENV settings
284 |
285 | - Use `NOTEABLE_API_URL` to point to non-production clusters, such as `http://localhost:8001/api` for local Gate development
286 | - E2E tests will use `TEST_SPACE_ID`, `TEST_PROJECT_ID`, and `TEST_USER_ID` env vars when running, useful in CI
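For example, pointing at a local Gate instance might look like this (the UUIDs are illustrative placeholders):

```bash
# Point Origami at a local Gate instead of production
export NOTEABLE_API_URL="http://localhost:8001/api"

# Used by E2E tests when running in CI (example values)
export TEST_SPACE_ID="00000000-0000-0000-0000-000000000000"
export TEST_PROJECT_ID="00000000-0000-0000-0000-000000000001"
export TEST_USER_ID="00000000-0000-0000-0000-000000000002"
```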
287 |
288 | ## Contributing
289 |
290 | See [CONTRIBUTING.md](./CONTRIBUTING.md).
291 |
292 | ---
293 |
294 |
Open sourced with ❤️ by Noteable for the community.
295 |
296 |
297 |
--------------------------------------------------------------------------------
/docs/changelog.md:
--------------------------------------------------------------------------------
1 | --8<-- "CHANGELOG.md"
--------------------------------------------------------------------------------
/docs/client.md:
--------------------------------------------------------------------------------
1 | # Noteable Client
2 |
3 | The NoteableClient class extends `httpx.AsyncClient` with Noteable-specific helpers. The async entrypoint for the class establishes and maintains a websocket for real-time updates to/from Noteable servers. Most API messages use custom formats distinct from Jupyter's, but anything connecting directly to the kernel passes Jupyter messages across the Noteable API layer; e.g. if you observe output messages arriving, you'll see that their content matches the Jupyter ZMQ format.
4 |
5 | ## Authentication
6 |
7 | The client automatically uses the `api_token` argument, or the `NOTEABLE_TOKEN` environment variable in its absence, to generate the `f"Bearer {self.token.access_token}"` header used to establish connections or make REST requests. This token can be an ephemeral token fetched dynamically from the live site with a short lifecycle, or you can create a more permanent token via User Settings in the upper right of the Noteable platform. Either will work as a Bearer token for authentication, and each can be individually revoked as needed.
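Concretely, the Bearer header is built like this (a minimal sketch with a placeholder token; the real client derives the token as described above):

```python
import os

# Placeholder token; in practice this comes from the api_token argument
# or the NOTEABLE_TOKEN environment variable.
token = os.environ.get("NOTEABLE_TOKEN", "pat:example-token")
headers = {"Authorization": f"Bearer {token}"}
print(headers["Authorization"])
```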
8 |
9 | Note: If you have a custom deployment URL for your Noteable service, you'll need to set the `NOTEABLE_DOMAIN` environment variable or the `domain` config key to point to the correct URL. Otherwise it will default to the public multi-tenant environment.
10 |
11 | ## Routes
12 |
13 | Most routes provided help with kernel session or file manipulation. Some direct Jupyter APIs are also present on the server but are not given helpers in the client, as there is often a preferred wrapping API, or the open-source pattern is replaced with Noteable-specific affordances.
14 |
15 | `get_or_launch_ready_kernel_session` is often where one will start to initiate or join a kernel session, handling the launch handshakes and establishing a connection to the Jupyter kernel. See the [API docs page](/reference/client/#client.NoteableClient.get_or_launch_ready_kernel_session) for the specific method signatures.
16 |
17 | You don't need to explicitly call `delete_kernel_session`, but doing so saves on resources that would otherwise be utilized until they time out on the service side. If you know you're wrapping up an interaction, it's polite to clean up the kernel and avoid wasting money / carbon.
18 |
19 | ## Websockets
20 |
21 | Once async-entered, the client will stream all messages back from the real-time update channels. You can use `register_message_callback` to set up your own callbacks based on `message_type`, `transaction_id`, or response schemas. This is the primary way to respond to events in Noteable.
22 |
23 | Similarly, `send_rtu_request` is used to initiate any real-time requests to the system. Common patterns using these calls are wrapped in helpers, but extensions can build on the same mechanisms to customize client behavior.
24 |
25 | See the [API docs page](/reference/client/) for the specific method signatures related to these actions.
--------------------------------------------------------------------------------
/docs/contributing.md:
--------------------------------------------------------------------------------
1 | --8<-- "CONTRIBUTING.md"
--------------------------------------------------------------------------------
/docs/gen_doc_stubs.py:
--------------------------------------------------------------------------------
1 | """Generate the code reference pages."""
2 |
3 | from pathlib import Path
4 |
5 | import mkdocs_gen_files
6 |
7 | nav = mkdocs_gen_files.Nav()
8 |
9 | for path in sorted(Path("origami").rglob("*.py")):
10 | module_path = path.relative_to("origami").with_suffix("")
11 | doc_path = path.relative_to("origami").with_suffix(".md")
12 | full_doc_path = Path("reference", doc_path)
13 | parts = list(module_path.parts)
14 |
15 | if parts[-1].startswith("_"):
16 | continue
17 |
18 | nav[parts] = doc_path.as_posix()
19 |
20 | with mkdocs_gen_files.open(full_doc_path, "w") as fd:
21 | identifier = ".".join(parts)
22 | print("::: " + identifier, file=fd)
23 | print("::: " + identifier)
24 | # break
25 |
26 | # nav["mkdocs_autorefs", "references"] = "autorefs/references.md"
27 | # nav["mkdocs_autorefs", "plugin"] = "autorefs/plugin.md"
28 |
29 | with mkdocs_gen_files.open("reference/SUMMARY.md", "w") as nav_file:
30 | nav_file.writelines(nav.build_literate_nav())
31 |
--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 | # Origami
2 |
3 |
4 |
5 | --8<-- "README.md:intro"
6 |
7 | --8<-- "README.md:requirements"
8 |
9 | --8<-- "README.md:install"
10 |
11 |
12 | ## Contributing
13 |
14 | See [Contributing page](contributing.md).
15 |
--------------------------------------------------------------------------------
/docs/papersnake-dark.svg:
--------------------------------------------------------------------------------
1 |
4 |
--------------------------------------------------------------------------------
/docs/papersnake.svg:
--------------------------------------------------------------------------------
1 |
4 |
--------------------------------------------------------------------------------
/docs/quickstart.md:
--------------------------------------------------------------------------------
1 | # Quick Start
2 | The example below will guide you through the basics of creating a notebook, adding content, executing code, and seeing the output. For more examples, see our [Use Cases](../usage) section.
3 |
4 | !!! note "Developer note: For pre-1.0 release information, see the [pre-1.0 README](https://github.com/noteable-io/origami/blob/release/0.0.35/README.md)"
5 |
6 | --8<-- "README.md:install"
7 |
8 | ## API Tokens
9 | --8<-- "README.md:api-tokens"
10 |
11 | ## Setting up the `APIClient`
12 | --8<-- "README.md:api-client"
13 |
14 | ## Checking your user information
15 | --8<-- "README.md:user-info"
16 |
17 | ## Creating a new Notebook
18 |
19 | !!! note "For this example, we're using the `origamist_default_project_id`, which is the default project designed to be used by the ChatGPT plugin. Feel free to replace it with projects you have access to in [Noteable](https://app.noteable.io/)!"
20 |
21 | --8<-- "README.md:create-notebook"
22 |
23 | ## Launching a Kernel
24 |
25 | --8<-- "README.md:launch-kernel"
26 |
27 | ## Adding Cells
28 |
29 | --8<-- "README.md:connect-rtu"
30 |
31 | !!! warning "You may see messages like `Received un-modeled RTU message msg.channel= ...`. This is expected as we update the Noteable backend services' messaging."
32 |
33 | --8<-- "README.md:add-cells"
34 |
35 | ## Running a Code Cell
36 |
37 | --8<-- "README.md:run-code-cell"
38 |
39 | ## Getting Cell Output
40 |
41 | --8<-- "README.md:get-cell-output"
--------------------------------------------------------------------------------
/docs/reference/clients/api.md:
--------------------------------------------------------------------------------
1 | ::: clients.api
2 |
--------------------------------------------------------------------------------
/docs/reference/clients/cache.md:
--------------------------------------------------------------------------------
1 | ::: clients.cache
2 |
--------------------------------------------------------------------------------
/docs/reference/clients/rtu.md:
--------------------------------------------------------------------------------
1 | ::: clients.rtu
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/base.md:
--------------------------------------------------------------------------------
1 | ::: models.api.base
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/datasources.md:
--------------------------------------------------------------------------------
1 | ::: models.api.datasources
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/files.md:
--------------------------------------------------------------------------------
1 | ::: models.api.files
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/outputs.md:
--------------------------------------------------------------------------------
1 | ::: models.api.outputs
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/projects.md:
--------------------------------------------------------------------------------
1 | ::: models.api.projects
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/spaces.md:
--------------------------------------------------------------------------------
1 | ::: models.api.spaces
2 |
--------------------------------------------------------------------------------
/docs/reference/models/api/users.md:
--------------------------------------------------------------------------------
1 | ::: models.api.users
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/base.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.base
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/cell_contents.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.cell_contents
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/cell_execute.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.cell_execute
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/cell_metadata.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.cell_metadata
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/cell_output_collection.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.cell_output_collection
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/nb_cells.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.nb_cells
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/delta_types/nb_metadata.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.delta_types.nb_metadata
2 |
--------------------------------------------------------------------------------
/docs/reference/models/deltas/discriminators.md:
--------------------------------------------------------------------------------
1 | ::: models.deltas.discriminators
2 |
--------------------------------------------------------------------------------
/docs/reference/models/kernels.md:
--------------------------------------------------------------------------------
1 | ::: models.kernels
2 |
--------------------------------------------------------------------------------
/docs/reference/models/notebook.md:
--------------------------------------------------------------------------------
1 | ::: models.notebook
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/base.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.base
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/channels/files.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.channels.files
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/channels/kernels.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.channels.kernels
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/channels/system.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.channels.system
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/discriminators.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.discriminators
2 |
--------------------------------------------------------------------------------
/docs/reference/models/rtu/errors.md:
--------------------------------------------------------------------------------
1 | ::: models.rtu.errors
2 |
--------------------------------------------------------------------------------
/docs/reference/notebook/builder.md:
--------------------------------------------------------------------------------
1 | ::: notebook.builder
2 |
--------------------------------------------------------------------------------
/docs/requirements.txt:
--------------------------------------------------------------------------------
1 | # Pin packages for RTD builds
2 | # https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html#pinning-dependencies
3 | mkdocs
4 | mkdocs-material
5 | mkdocstrings[python]
6 | mkautodoc
7 | mkdocs-gen-files
8 | mkdocs-literate-nav
9 | mkdocs-section-index
10 | pymdown-extensions
11 |
--------------------------------------------------------------------------------
/docs/screenshots/user_settings__api_tokens.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/docs/screenshots/user_settings__api_tokens.png
--------------------------------------------------------------------------------
/docs/screenshots/user_settings__api_tokens2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/docs/screenshots/user_settings__api_tokens2.png
--------------------------------------------------------------------------------
/docs/usage.md:
--------------------------------------------------------------------------------
1 | Coming soon!
--------------------------------------------------------------------------------
/mkdocs.yml:
--------------------------------------------------------------------------------
1 | site_name: Origami
2 | site_description: A Python SDK for Noteable API interactions.
3 | #site_url: https://???
4 |
5 | theme:
6 | name: "material"
7 | logo: papersnake-dark.svg
8 | palette:
9 | - scheme: "default"
10 | media: "(prefers-color-scheme: light)"
11 | primary: "cyan"
12 | accent: "cyan"
13 | toggle:
14 | icon: "material/lightbulb"
15 | name: "Switch to dark mode"
16 | - scheme: "slate"
17 | media: "(prefers-color-scheme: dark)"
18 | primary: "cyan"
19 | accent: "cyan"
20 | toggle:
21 | icon: "material/lightbulb-outline"
22 | name: "Switch to light mode"
23 | features:
24 | - navigation.sections
25 | - content.code.copy
26 |
27 | repo_name: noteable-io/origami
28 | repo_url: https://github.com/noteable-io/origami/
29 | edit_uri: ""
30 |
31 | nav:
32 | - Introduction: "index.md"
33 | - Contributing: "contributing.md"
34 | - "Quick Start": "quickstart.md"
35 | - "Use Cases": "usage.md"
36 | - Code Reference: reference/
37 | - Changes:
38 | - Log: "changelog.md"
39 |
40 | markdown_extensions:
41 | - admonition
42 | - pymdownx.details
43 | - pymdownx.highlight:
44 | anchor_linenums: true
45 | - pymdownx.snippets
46 | - pymdownx.superfences
47 | - toc:
48 | permalink: "#"
49 |
50 | plugins:
51 | - search
52 | - gen-files:
53 | scripts:
54 | - docs/gen_doc_stubs.py
55 | - mkdocstrings:
56 | default_handler: python
57 | handlers:
58 | python:
59 | paths: [origami]
60 | rendering:
61 | show_source: true
62 | - literate-nav:
63 | nav_file: SUMMARY.md
64 | - section-index
65 |
--------------------------------------------------------------------------------
/origami/__init__.py:
--------------------------------------------------------------------------------
1 | from importlib_metadata import version
2 |
3 | __version__ = version("noteable-origami")
4 |
--------------------------------------------------------------------------------
/origami/cli.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import logging
3 | import os
4 |
5 | import typer
6 |
7 | from origami.clients.api import APIClient
8 | from origami.clients.rtu import RTUClient
9 | from origami.log_utils import setup_logging
10 |
11 | app = typer.Typer(no_args_is_help=True)
12 |
13 |
14 | async def _get_notebook(file_id: str, api_url: str = "https://app.noteable.io/gate/api"):
15 |     if not os.environ.get("NOTEABLE_TOKEN"):
16 | raise RuntimeError("NOTEABLE_TOKEN environment variable not set")
17 | api_client = APIClient(
18 | authorization_token=os.environ["NOTEABLE_TOKEN"],
19 | api_base_url=api_url,
20 | )
21 | rtu_client: RTUClient = await api_client.connect_realtime(file=file_id)
22 | print(rtu_client.builder.nb.json(indent=2))
23 |
24 |
25 | @app.command()
26 | def fetch(file_id: str, api_url: str = "https://app.noteable.io/gate/api"):
27 | asyncio.run(_get_notebook(file_id, api_url))
28 |
29 |
30 | async def _tail_notebook(file_id: str, api_url: str = "https://app.noteable.io/gate/api"):
31 |     if not os.environ.get("NOTEABLE_TOKEN"):
32 | raise RuntimeError("NOTEABLE_TOKEN environment variable not set")
33 | setup_logging()
34 | logging.getLogger("origami.clients.rtu").setLevel(logging.DEBUG)
35 | api_client = APIClient(
36 | authorization_token=os.environ["NOTEABLE_TOKEN"],
37 | api_base_url=api_url,
38 | )
39 | print("RTU Client starting initialization")
40 | await api_client.connect_realtime(file=file_id)
41 | print("RTU Client done initializing")
42 | while True:
43 | await asyncio.sleep(1)
44 |
45 |
46 | @app.command()
47 | def tail(file_id: str, api_url: str = "https://app.noteable.io/gate/api"):
48 | asyncio.run(_tail_notebook(file_id=file_id, api_url=api_url))
49 |
50 |
51 | if __name__ == "__main__":
52 | app()
53 |
--------------------------------------------------------------------------------
/origami/clients/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/clients/__init__.py
--------------------------------------------------------------------------------
/origami/clients/api.py:
--------------------------------------------------------------------------------
1 | import enum
2 | import logging
3 | import os
4 | import uuid
5 | from typing import List, Literal, Optional, Union
6 |
7 | import httpx
8 | import pydantic
9 |
10 | from origami.models.api.datasources import DataSource
11 | from origami.models.api.files import File, FileVersion
12 | from origami.models.api.outputs import KernelOutputCollection
13 | from origami.models.api.projects import Project
14 | from origami.models.api.spaces import Space
15 | from origami.models.api.users import User
16 | from origami.models.kernels import KernelSession
17 | from origami.models.notebook import Notebook
18 |
19 | logger = logging.getLogger(__name__)
20 |
21 |
22 | class AccessLevel(enum.Enum):
23 | owner = "role:owner"
24 | contributor = "role:contributor"
25 | commenter = "role:commenter"
26 | viewer = "role:viewer"
27 | executor = "role:executor"
28 |
29 | @classmethod
30 | def from_str(cls, s: str):
31 | for level in cls:
32 | if level.name == s:
33 | return level
34 | raise ValueError(f"Invalid access level {s}")
35 |
36 |
37 | class Visibility(enum.Enum):
38 | """Visibility levels associated with a specific Resource.
39 |
40 | Private = only invited users can access
41 | Open = any member can access
42 | Public = anyone can access
43 | """
44 |
45 | private = "private"
46 | open = "open"
47 | public = "public"
48 |
49 | @classmethod
50 | def from_str(cls, s: str):
51 | for vis in cls:
52 | if vis.name == s:
53 | return vis
54 | raise ValueError(f"Invalid visibility {s}")
55 |
56 |
57 | class Resource(enum.Enum):
58 | spaces = "spaces"
59 | projects = "projects"
60 | files = "files"
61 |
62 |
63 | class APIClient:
64 | def __init__(
65 | self,
66 | authorization_token: Optional[str] = None,
67 | api_base_url: str = "https://app.noteable.io/gate/api",
68 | headers: Optional[dict] = None,
69 | transport: Optional[httpx.AsyncHTTPTransport] = None,
70 | timeout: httpx.Timeout = httpx.Timeout(5.0),
71 | creator_client_type: str = "origami",
72 | ):
73 | # jwt and api_base_url saved as attributes because they're re-used when creating rtu client
74 | self.jwt = authorization_token or os.environ.get("NOTEABLE_TOKEN")
75 | if not self.jwt:
76 | raise ValueError(
77 | "Must provide authorization_token or set NOTEABLE_TOKEN environment variable"
78 | )
79 | self.api_base_url = os.environ.get("NOTEABLE_API_URL", api_base_url)
80 | self.headers = {"Authorization": f"Bearer {self.jwt}"}
81 | if headers:
82 | self.headers.update(headers)
83 |
84 | self.client = httpx.AsyncClient(
85 | base_url=self.api_base_url,
86 | headers=self.headers,
87 | transport=transport,
88 | timeout=timeout,
89 | )
90 | # creator_client_type helps log what kind of client created Resources like Files/Projects
91 | # or is interacting with Notebooks through RTU / Deltas. If you're not sure what to use
92 | # yourself, go with the default 'origami'
93 | if creator_client_type not in ["origami", "origamist", "planar_ally", "geas"]:
94 | # this list of valid creator client types is sourced from Gate's FrontendType enum
95 | creator_client_type = "unknown"
96 | self.creator_client_type = creator_client_type # Only used when generating an RTUClient
97 |
98 | def add_tags_and_contextvars(self, **tags):
99 | """Hook for Apps to override so they can set structlog contextvars or ddtrace tags etc"""
100 | pass
101 |
102 | async def user_info(self) -> User:
103 | """Get email and other info for User account of this Client's JWT."""
104 | endpoint = "/users/me"
105 | resp = await self.client.get(endpoint)
106 | resp.raise_for_status()
107 | user = User.model_validate(resp.json())
108 | self.add_tags_and_contextvars(user_id=str(user.id))
109 | return user
110 |
111 | async def share_resource(
112 | self, resource: Resource, resource_id: uuid.UUID, email: str, level: Union[str, AccessLevel]
113 | ) -> int:
114 | """
115 | Add another User as a collaborator to a Resource.
116 | """
117 | user_lookup_endpoint = f"/{resource.value}/{resource_id}/shareable-users"
118 | user_lookup_params = {"q": email}
119 | user_lookup_resp = await self.client.get(user_lookup_endpoint, params=user_lookup_params)
120 | user_lookup_resp.raise_for_status()
121 | users = user_lookup_resp.json()["data"]
122 |
123 | if isinstance(level, str):
124 | level = AccessLevel.from_str(level)
125 | share_endpoint = f"/{resource.value}/{resource_id}/users"
126 | for item in users:
127 | user_id = item["id"]
128 | share_body = {"access_level": level.value, "user_id": user_id}
129 | share_resp = await self.client.put(share_endpoint, json=share_body)
130 | share_resp.raise_for_status()
131 | return len(users)
132 |
133 | async def unshare_resource(self, resource: Resource, resource_id: uuid.UUID, email: str) -> int:
134 | """
135 | Remove access to a Resource for a User
136 | """
137 | # Need to look this up still to go from email to user-id
138 | user_lookup_endpoint = f"/{resource.value}/{resource_id}/shareable-users"
139 | user_lookup_params = {"q": email}
140 | user_lookup_resp = await self.client.get(user_lookup_endpoint, params=user_lookup_params)
141 | user_lookup_resp.raise_for_status()
142 | users = user_lookup_resp.json()["data"]
143 |
144 | for item in users:
145 | user_id = item["id"]
146 | unshare_endpoint = f"/{resource.value}/{resource_id}/users/{user_id}"
147 | unshare_resp = await self.client.delete(unshare_endpoint)
148 | unshare_resp.raise_for_status()
149 | return len(users)
150 |
151 | async def change_resource_visibility(
152 | self,
153 | resource: Resource,
154 | resource_id: uuid.UUID,
155 | visibility: Visibility,
156 | visibility_default_access_level: Optional[AccessLevel] = None,
157 | ):
158 | """
159 | Change overall visibility of a Resource.
160 |
161 | visibility_default_access_level is only required when visibility is not private.
162 | """
163 | if isinstance(visibility, str):
164 | visibility = Visibility.from_str(visibility)
165 |
166 | if visibility is not Visibility.private and visibility_default_access_level is None:
167 | raise ValueError(
168 | "visibility_default_access_level must be set when visibility is not private"
169 | )
170 |
171 | patch_body = {"visibility": visibility.value}
172 | if isinstance(visibility_default_access_level, str):
173 | visibility_default_access_level = AccessLevel.from_str(
174 | visibility_default_access_level
175 | ).value
176 |
177 | # always set this as either None or a valid (string) value
178 | patch_body["visibility_default_access_level"] = visibility_default_access_level
179 |
180 | endpoint = f"/{resource.value}/{resource_id}"
181 | resp = await self.client.patch(
182 | endpoint,
183 | json=patch_body,
184 | )
185 | resp.raise_for_status()
186 | return resp.json()
187 |
188 | # Spaces are collections of Projects. Some "scoped" resources such as Secrets and Datasources
189 | # can also be attached to a Space and made available to all users of that Space.
190 | async def create_space(self, name: str, description: Optional[str] = None) -> Space:
191 | endpoint = "/spaces"
192 | resp = await self.client.post(endpoint, json={"name": name, "description": description})
193 | resp.raise_for_status()
194 | space = Space.model_validate(resp.json())
195 | self.add_tags_and_contextvars(space_id=str(space.id))
196 | return space
197 |
198 | async def get_space(self, space_id: uuid.UUID) -> Space:
199 | self.add_tags_and_contextvars(space_id=str(space_id))
200 | endpoint = f"/spaces/{space_id}"
201 | resp = await self.client.get(endpoint)
202 | resp.raise_for_status()
203 | space = Space.model_validate(resp.json())
204 | return space
205 |
206 | async def delete_space(self, space_id: uuid.UUID) -> None:
207 | self.add_tags_and_contextvars(space_id=str(space_id))
208 | endpoint = f"/spaces/{space_id}"
209 | resp = await self.client.delete(endpoint)
210 | resp.raise_for_status()
211 | return None
212 |
213 | async def list_space_projects(self, space_id: uuid.UUID) -> List[Project]:
214 | """List all Projects in a Space."""
215 | self.add_tags_and_contextvars(space_id=str(space_id))
216 | endpoint = f"/spaces/{space_id}/projects"
217 | resp = await self.client.get(endpoint)
218 | resp.raise_for_status()
219 | projects = [Project.model_validate(project) for project in resp.json()]
220 | return projects
221 |
222 | async def share_space(
223 | self, space_id: uuid.UUID, email: str, level: Union[str, AccessLevel]
224 | ) -> int:
225 | """
226 | Add another user as a collaborator to a Space.
227 | """
228 | return await self.share_resource(Resource.spaces, space_id, email, level)
229 |
230 | async def unshare_space(self, space_id: uuid.UUID, email: str) -> int:
231 | """
232 | Remove access to a Space for a User
233 | """
234 | return await self.unshare_resource(Resource.spaces, space_id, email)
235 |
236 | async def change_space_visibility(
237 | self,
238 | space_id: uuid.UUID,
239 | visibility: Visibility,
240 | visibility_default_access_level: Optional[AccessLevel] = None,
241 | ) -> Visibility:
242 | """
243 | Change overall visibility of a Space
244 | """
245 | return await self.change_resource_visibility(
246 | Resource.spaces,
247 | space_id,
248 | visibility,
249 | visibility_default_access_level,
250 | )
251 |
252 | # Projects are collections of Files, including Notebooks. When a Kernel is launched for a
253 | # Notebook, all Files in the Project are volume mounted into the Kernel container at startup.
254 | async def create_project(
255 | self, space_id: uuid.UUID, name: str, description: Optional[str] = None
256 | ) -> Project:
257 | self.add_tags_and_contextvars(space_id=str(space_id))
258 | endpoint = "/projects"
259 | resp = await self.client.post(
260 | endpoint,
261 | json={
262 | "space_id": str(space_id),
263 | "name": name,
264 | "description": description,
265 | "with_empty_notebook": False,
266 | "creator_client_type": self.creator_client_type,
267 | },
268 | )
269 | resp.raise_for_status()
270 | project = Project.model_validate(resp.json())
271 | self.add_tags_and_contextvars(project_id=str(project.id))
272 | return project
273 |
274 | async def get_project(self, project_id: uuid.UUID) -> Project:
275 | self.add_tags_and_contextvars(project_id=str(project_id))
276 | endpoint = f"/projects/{project_id}"
277 | resp = await self.client.get(endpoint)
278 | resp.raise_for_status()
279 | project = Project.model_validate(resp.json())
280 | return project
281 |
282 | async def delete_project(self, project_id: uuid.UUID) -> Project:
283 | self.add_tags_and_contextvars(project_id=str(project_id))
284 | endpoint = f"/projects/{project_id}"
285 | resp = await self.client.delete(endpoint)
286 | resp.raise_for_status()
287 | project = Project.model_validate(resp.json())
288 | return project
289 |
290 | async def share_project(
291 | self, project_id: uuid.UUID, email: str, level: Union[str, AccessLevel]
292 | ) -> int:
293 | """
294 | Add another User as a collaborator to a Project.
295 | """
296 | return await self.share_resource(Resource.projects, project_id, email, level)
297 |
298 | async def unshare_project(self, project_id: uuid.UUID, email: str) -> int:
299 | """
300 | Remove access to a Project for a User
301 | """
302 | return await self.unshare_resource(Resource.projects, project_id, email)
303 |
304 | async def change_project_visibility(
305 | self,
306 | project_id: uuid.UUID,
307 | visibility: Visibility,
308 | visibility_default_access_level: Optional[AccessLevel] = None,
309 | ) -> dict:
310 | """
311 | Change overall visibility of a Project
312 | """
313 | return await self.change_resource_visibility(
314 | Resource.projects,
315 | project_id,
316 | visibility,
317 | visibility_default_access_level,
318 | )
319 |
320 | async def list_project_files(self, project_id: uuid.UUID) -> List[File]:
321 | """List all Files in a Project. Files do not have presigned download urls included here."""
322 | self.add_tags_and_contextvars(project_id=str(project_id))
323 | endpoint = f"/projects/{project_id}/files"
324 | resp = await self.client.get(endpoint)
325 | resp.raise_for_status()
326 | files = [File.model_validate(file) for file in resp.json()]
327 | return files
328 |
329 | # Files are flat files (like text, csv, etc) or Notebooks.
330 | async def _multi_step_file_create(
331 | self,
332 | project_id: uuid.UUID,
333 | path: str,
334 | file_type: Literal["file", "notebook"],
335 | content: bytes,
336 | ) -> File:
337 | # Uploading files using the /v1/files endpoint is a multi-step process.
338 | # 1. POST /v1/files to get a presigned upload url and file id
339 | # 2. PUT the file content to the presigned upload url, save the etag
340 | # 3. POST /v1/files/{file-id}/complete-upload with upload id / key / etag
341 | # file_type is 'file' for all non-Notebook files, and 'notebook' for Notebooks
342 | # (1) Reserve File in db
343 | body = {
344 | "project_id": str(project_id),
345 | "path": path,
346 | "type": file_type,
347 | "file_size_bytes": len(content),
348 | "creator_client_type": self.creator_client_type,
349 | }
350 | resp = await self.client.post("/v1/files", json=body)
351 | resp.raise_for_status()
352 |
353 | # (1.5) parse response
354 | js = resp.json()
355 | upload_url = js["presigned_upload_url_info"]["parts"][0]["upload_url"]
356 | upload_id = js["presigned_upload_url_info"]["upload_id"]
357 | upload_key = js["presigned_upload_url_info"]["key"]
358 | file = File.model_validate(js)
359 |
360 | # (2) Upload to pre-signed url
361 | # TODO: remove this hack if/when we get containers in Skaffold to be able to translate
362 | # localhost urls to the minio pod/container
363 | if "LOCAL_K8S" in os.environ and bool(os.environ["LOCAL_K8S"]):
364 | upload_url = upload_url.replace("localhost", "minio")
365 | async with httpx.AsyncClient() as plain_client:
366 | r = await plain_client.put(upload_url, content=content)
367 | r.raise_for_status()
368 |
369 | # (3) Tell API we finished uploading (returns 204)
370 | etag = r.headers["etag"].strip('"')
371 | body = {
372 | "upload_id": upload_id,
373 | "key": upload_key,
374 | "parts": [{"etag": etag, "part_number": 1}],
375 | }
376 | endpoint = f"/v1/files/{file.id}/complete-upload"
377 | r2 = await self.client.post(endpoint, json=body)
378 | r2.raise_for_status()
379 | return file
380 |
381 | async def create_file(self, project_id: uuid.UUID, path: str, content: bytes) -> File:
382 | """Create a non-Notebook File in a Project"""
383 | self.add_tags_and_contextvars(project_id=str(project_id))
384 | file = await self._multi_step_file_create(project_id, path, "file", content)
385 | self.add_tags_and_contextvars(file_id=str(file.id))
386 | logger.info("Created new file", extra={"file_id": str(file.id)})
387 | return file
388 |
389 | async def create_notebook(
390 | self, project_id: uuid.UUID, path: str, notebook: Optional[Notebook] = None
391 | ) -> File:
392 | """Create a Notebook in a Project"""
393 | self.add_tags_and_contextvars(project_id=str(project_id))
394 | if notebook is None:
395 | notebook = Notebook()
396 | content = notebook.model_dump_json().encode()
397 | file = await self._multi_step_file_create(project_id, path, "notebook", content)
398 | self.add_tags_and_contextvars(file_id=str(file.id))
399 | logger.info("Created new notebook", extra={"file_id": str(file.id)})
400 | return file
401 |
402 | async def get_file(self, file_id: uuid.UUID) -> File:
403 | """Get metadata about a File, not including its content. Includes presigned download url."""
404 | self.add_tags_and_contextvars(file_id=str(file_id))
405 | endpoint = f"/v1/files/{file_id}"
406 | resp = await self.client.get(endpoint)
407 | resp.raise_for_status()
408 | file = File.model_validate(resp.json())
409 | return file
410 |
411 | async def get_file_content(self, file_id: uuid.UUID) -> bytes:
412 | """Get the content of a File, including Notebooks."""
413 | self.add_tags_and_contextvars(file_id=str(file_id))
414 | file = await self.get_file(file_id)
415 | presigned_download_url = file.presigned_download_url
416 | if not presigned_download_url:
417 | raise ValueError(f"File {file.id} does not have a presigned download url")
418 | # TODO: remove this hack if/when we get containers in Skaffold to be able to translate
419 | # localhost urls to the minio pod/container
420 | if "LOCAL_K8S" in os.environ and bool(os.environ["LOCAL_K8S"]):
421 | presigned_download_url = presigned_download_url.replace("localhost", "minio")
422 | async with httpx.AsyncClient() as plain_http_client:
423 | resp = await plain_http_client.get(presigned_download_url)
424 | resp.raise_for_status()
425 | return resp.content
426 |
427 | async def get_file_versions(self, file_id: uuid.UUID) -> List[FileVersion]:
428 | """
429 | List all versions of a File. The response includes presigned urls to download the content
430 | of any previous version. Note that when working with an older version, you do not want to
431 | establish an RTUClient, since it would "catch up" past that version.
432 | """
433 | endpoint = f"/files/{file_id}/versions"
434 | resp = await self.client.get(endpoint)
435 | resp.raise_for_status()
436 | versions = [FileVersion.model_validate(version) for version in resp.json()]
437 | return versions
438 |
439 | async def delete_file(self, file_id: uuid.UUID) -> File:
440 | self.add_tags_and_contextvars(file_id=str(file_id))
441 | endpoint = f"/v1/files/{file_id}"
442 | resp = await self.client.delete(endpoint)
443 | resp.raise_for_status()
444 | file = File.model_validate(resp.json())
445 | return file
446 |
447 | async def share_file(
448 | self, file_id: uuid.UUID, email: str, level: Union[str, AccessLevel]
449 | ) -> int:
450 | """
451 | Add another User as a collaborator to a Notebook or File.
452 | """
453 | return await self.share_resource(Resource.files, file_id, email, level)
454 |
455 | async def unshare_file(self, file_id: uuid.UUID, email: str) -> int:
456 | """
457 | Remove access to a Notebook or File for a User
458 | """
459 | return await self.unshare_resource(Resource.files, file_id, email)
460 |
461 | async def change_file_visibility(
462 | self,
463 | file_id: uuid.UUID,
464 | visibility: Visibility,
465 | visibility_default_access_level: Optional[AccessLevel] = None,
466 | ) -> dict:
467 | """
468 | Change overall visibility of a Notebook or File
469 | """
470 | return await self.change_resource_visibility(
471 | Resource.files,
472 | file_id,
473 | visibility,
474 | visibility_default_access_level,
475 | )
476 |
477 | async def get_datasources_for_notebook(self, file_id: uuid.UUID) -> List[DataSource]:
478 | """Return a list of Datasources that can be used in SQL cells within a Notebook"""
479 | self.add_tags_and_contextvars(file_id=str(file_id))
480 | endpoint = f"/v1/datasources/by_notebook/{file_id}"
481 | resp = await self.client.get(endpoint)
482 | resp.raise_for_status()
483 | datasources = [DataSource.model_validate(ds) for ds in resp.json()]
484 |
485 | return datasources
486 |
487 | async def launch_kernel(
488 | self, file_id: uuid.UUID, kernel_name: str = "python3", hardware_size: str = "small"
489 | ) -> KernelSession:
490 | endpoint = "/v1/sessions"
491 | data = {
492 | "file_id": str(file_id),
493 | "kernel_config": {
494 | "kernel_name": kernel_name,
495 | "hardware_size_identifier": hardware_size,
496 | },
497 | }
498 | resp = await self.client.post(endpoint, json=data)
499 | resp.raise_for_status()
500 | kernel_session = KernelSession.model_validate(resp.json())
501 | self.add_tags_and_contextvars(kernel_session_id=str(kernel_session.id))
502 | logger.info(
503 | "Launched new kernel",
504 | extra={"kernel_session_id": str(kernel_session.id), "file_id": str(file_id)},
505 | )
506 | return kernel_session
507 |
508 | async def shutdown_kernel(self, kernel_session_id: uuid.UUID) -> None:
509 | endpoint = f"/sessions/{kernel_session_id}"
510 | resp = await self.client.delete(endpoint, timeout=60)
511 | resp.raise_for_status()
512 | logger.info("Shut down kernel", extra={"kernel_session_id": str(kernel_session_id)})
513 |
514 | async def get_output_collection(
515 | self, output_collection_id: uuid.UUID
516 | ) -> KernelOutputCollection:
517 | endpoint = f"/outputs/collection/{output_collection_id}"
518 | resp = await self.client.get(endpoint)
519 | resp.raise_for_status()
520 | return KernelOutputCollection.model_validate(resp.json())
521 |
522 | async def connect_realtime(self, file: Union[File, uuid.UUID, str]) -> "RTUClient": # noqa
523 | """
524 | Create an RTUClient for a Notebook by file id. This will perform the following steps:
525 | - Check /v1/files to get the current version information and presigned download url
526 | - Download seed notebook and create a NotebookBuilder from it
527 | - Create an RTUClient, initialize the websocket connection, authenticate, and subscribe
528 | - Apply deltas to the in-memory NotebookBuilder
529 | """
530 | # Import here to avoid circular imports
531 | from origami.clients.rtu import RTUClient
532 |
533 | file_id = None
534 |
535 | if isinstance(file, str):
536 | file_id = uuid.UUID(file)
537 | elif isinstance(file, uuid.UUID):
538 | file_id = file
539 | elif isinstance(file, File):
540 | file_id = file.id
541 | else:
542 | raise ValueError(f"Must provide a `file_id` or a File, not {file}")
543 |
544 | self.add_tags_and_contextvars(file_id=str(file_id))
545 |
546 | logger.info(f"Creating RTUClient for file {file_id}")
547 | rtu_client = RTUClient(api_client=self, file_id=file_id)
548 | # .initialize() downloads the seed notebook, establishes websocket, subscribes to various
549 | # channels, and begins squashing deltas.
550 | await rtu_client.initialize()
551 | # This event is resolved once all deltas from the file_subscribe reply deltas_to_apply
552 | # payload have been applied to the RTUClient NotebookBuilder
553 | await rtu_client.deltas_to_apply_event.wait()
554 | return rtu_client
555 |
--------------------------------------------------------------------------------
/origami/clients/cache.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/clients/cache.py
--------------------------------------------------------------------------------
/origami/log_utils.py:
--------------------------------------------------------------------------------
1 | """
2 | origami emits all of its messages using vanilla logging. The setup_logging() function
3 | below configures origami (and any other loggers) to be streamed out through structlog's
4 | ConsoleRenderer. While this is useful to the origami repo in general, for instance in the
5 | CLI script and in tests, this function primarily serves as a template for how to set up structlog
6 | in your own application and configure it to structure both structlog-emitted messages (your app
7 | logs probably) and vanilla logging (origami logs).
8 | """
9 | import logging
10 | import logging.config
11 |
12 |
13 | # Set up structlog "pretty" console rendering. See tests/conftest.py for source template
14 | def setup_logging(log_level: int = logging.INFO):
15 | # structlog is an optional dependency; try/except here and fall back to plain logging if structlog
16 | # isn't installed. That will not render the "extra" log info like ZMQ content on send/recv
17 | # debug logs.
18 | try:
19 | import structlog
20 |
21 | structlog.configure(
22 | processors=[
23 | structlog.stdlib.PositionalArgumentsFormatter(),
24 | structlog.processors.StackInfoRenderer(),
25 | structlog.processors.format_exc_info,
26 | structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
27 | ],
28 | logger_factory=structlog.stdlib.LoggerFactory(),
29 | wrapper_class=structlog.stdlib.BoundLogger,
30 | cache_logger_on_first_use=True,
31 | )
32 |
33 | # shared processors to be applied to both vanilla and structlog messages
34 | # after each is appropriately pre-processed
35 | processors = [
36 | # log level / logger name, affects coloring in ConsoleRenderer(colors=True)
37 | structlog.stdlib.add_log_level,
38 | structlog.stdlib.add_logger_name,
39 | # timestamp format
40 | structlog.processors.TimeStamper(fmt="iso"),
41 | # To see all CallsiteParameterAdder options:
42 | # https://www.structlog.org/en/stable/api.html#structlog.processors.CallsiteParameterAdder
43 | # more options include module, pathname, process, process_name, thread, thread_name
44 | structlog.processors.CallsiteParameterAdder(
45 | {
46 | structlog.processors.CallsiteParameter.FILENAME,
47 | structlog.processors.CallsiteParameter.FUNC_NAME,
48 | structlog.processors.CallsiteParameter.LINENO,
49 | }
50 | ),
51 | # Any structlog.contextvars.bind_contextvars included in middleware/functions
52 | structlog.contextvars.merge_contextvars,
53 | # strip _record and _from_structlog keys from event dictionary
54 | structlog.stdlib.ProcessorFormatter.remove_processors_meta,
55 | structlog.dev.ConsoleRenderer(colors=True),
56 | # ^^ In prod with any kind of logging service (datadog, grafana, etc), ConsoleRenderer
57 | # would probably be replaced with structlog.processors.JSONRenderer() or similar
58 | ]
59 |
60 | # Configs applied to logs generated by structlog or vanilla logging
61 | logging.config.dictConfig(
62 | {
63 | "version": 1,
64 | "disable_existing_loggers": False,
65 | "formatters": {
66 | "default": {
67 | "()": structlog.stdlib.ProcessorFormatter,
68 | "processors": processors,
69 | "foreign_pre_chain": [structlog.stdlib.ExtraAdder()],
70 | },
71 | },
72 | "handlers": {
73 | "default": {
74 | "class": "logging.StreamHandler",
75 | "formatter": "default",
76 | "stream": "ext://sys.stdout",
77 | },
78 | },
79 | "loggers": {
80 | # "" for applying handler to "root" (all libraries)
81 | # you could set this to "origami" to only see logs from this library
82 | "": {
83 | "handlers": ["default"],
84 | "level": log_level,
85 | "propagate": True,
86 | },
87 | },
88 | }
89 | )
90 | except ImportError:
91 | logger = logging.getLogger()
92 | logger.warning("Structlog not installed, using vanilla logging")
93 | logger.setLevel(log_level)
94 | logger.addHandler(logging.StreamHandler())
95 |
--------------------------------------------------------------------------------
/origami/models/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/__init__.py
--------------------------------------------------------------------------------
/origami/models/api/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/api/__init__.py
--------------------------------------------------------------------------------
/origami/models/api/base.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from datetime import datetime
3 | from typing import Optional
4 |
5 | from pydantic import BaseModel
6 |
7 |
8 | class ResourceBase(BaseModel):
9 | id: uuid.UUID
10 | created_at: datetime
11 | updated_at: datetime
12 | deleted_at: Optional[datetime] = None
13 |
--------------------------------------------------------------------------------
/origami/models/api/datasources.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from datetime import datetime
3 | from typing import Optional
4 |
5 | from pydantic import BaseModel
6 |
7 |
8 | class DataSource(BaseModel):
9 | datasource_id: uuid.UUID
10 | name: str
11 | description: str
12 | type_id: str # e.g. duckdb, postgresql
13 | sql_cell_handle: str # this goes in cell metadata for SQL cells
14 | # One of these three will be not None, and that tells you the scope of the datasource
15 | space_id: Optional[uuid.UUID] = None
16 | project_id: Optional[uuid.UUID] = None
17 | user_id: Optional[uuid.UUID] = None
18 | created_by_id: uuid.UUID
19 | created_at: datetime
20 | updated_at: datetime
21 | is_introspectable: bool
22 | is_legacy: bool
23 | usability: str
24 |
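The comment above notes that exactly one of `space_id` / `project_id` / `user_id` is set, and that id identifies the datasource's scope. A small sketch of that check, using a hypothetical trimmed stand-in rather than the real `DataSource` model:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DS:
    # trimmed stand-in for DataSource, illustration only
    space_id: Optional[str] = None
    project_id: Optional[str] = None
    user_id: Optional[str] = None


def scope(ds: DS) -> str:
    """Return which level the datasource is attached to."""
    if ds.space_id is not None:
        return "space"
    if ds.project_id is not None:
        return "project"
    if ds.user_id is not None:
        return "user"
    raise ValueError("datasource has no scope id set")


assert scope(DS(project_id="p1")) == "project"
```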
--------------------------------------------------------------------------------
/origami/models/api/files.py:
--------------------------------------------------------------------------------
1 | import os
2 | import pathlib
3 | import uuid
4 | from typing import Literal, Optional
5 |
6 | from pydantic import model_validator
7 |
8 | from origami.models.api.base import ResourceBase
9 |
10 |
11 | class File(ResourceBase):
12 | filename: str
13 | path: pathlib.Path
14 | project_id: uuid.UUID
15 | space_id: uuid.UUID
16 | size: Optional[int] = None
17 | mimetype: Optional[str] = None
18 | type: Literal["file", "notebook"]
19 | current_version_id: Optional[uuid.UUID] = None
20 | # presigned_download_url is None when listing Files in a Project, need to hit /api/v1/files/{id}
21 | # to get it. Use presigned download url to get File content including Notebooks
22 | presigned_download_url: Optional[str] = None
23 | url: Optional[str] = None
24 |
25 | @model_validator(mode="after")
26 | def construct_url(self):
27 | noteable_url = os.environ.get("PUBLIC_NOTEABLE_URL", "https://app.noteable.io")
28 | self.url = f"{noteable_url}/f/{self.id}/{self.path}"
29 |
30 | return self
31 |
32 |
33 | class FileVersion(ResourceBase):
34 | created_by_id: Optional[uuid.UUID] = None
35 | number: int
36 | name: Optional[str] = None
37 | description: Optional[str] = None
38 | file_id: uuid.UUID
39 | project_id: uuid.UUID
40 | space_id: uuid.UUID
41 | content_presigned_url: str
42 |
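`File` above (like `Project` and `Space` below) builds its browser `url` in a pydantic v2 `model_validator(mode="after")` hook. A self-contained sketch of the pattern, using a hypothetical `DemoResource` model rather than the library class:

```python
import os
from typing import Optional

from pydantic import BaseModel, model_validator


class DemoResource(BaseModel):
    id: str
    url: Optional[str] = None

    @model_validator(mode="after")
    def construct_url(self):
        # runs after field validation; derives url from the id
        base = os.environ.get("PUBLIC_NOTEABLE_URL", "https://app.noteable.io")
        self.url = f"{base}/f/{self.id}"
        return self


resource = DemoResource(id="abc")
assert resource.url is not None and resource.url.endswith("/f/abc")
```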
--------------------------------------------------------------------------------
/origami/models/api/outputs.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from typing import List, Optional
3 |
4 | from pydantic import BaseModel
5 |
6 | from origami.models.api.base import ResourceBase
7 |
8 |
9 | class KernelOutputContent(BaseModel):
10 | raw: Optional[str] = None
11 | url: Optional[str] = None
12 | mimetype: str
13 |
14 |
15 | class KernelOutput(ResourceBase):
16 | type: str
17 | display_id: Optional[str] = None
18 | available_mimetypes: List[str]
19 | content_metadata: KernelOutputContent
20 | content: Optional[KernelOutputContent] = None
21 | content_for_llm: Optional[KernelOutputContent] = None
22 | parent_collection_id: uuid.UUID
23 |
24 |
25 | class KernelOutputCollection(ResourceBase):
26 | cell_id: Optional[str] = None
27 | widget_model_id: Optional[str] = None
28 | file_id: uuid.UUID
29 | outputs: List[KernelOutput]
30 |
--------------------------------------------------------------------------------
/origami/models/api/projects.py:
--------------------------------------------------------------------------------
1 | import os
2 | import uuid
3 | from typing import Optional
4 |
5 | from pydantic import model_validator
6 |
7 | from origami.models.api.base import ResourceBase
8 |
9 |
10 | class Project(ResourceBase):
11 | name: str
12 | description: Optional[str] = None
13 | space_id: uuid.UUID
14 | url: Optional[str] = None
15 |
16 | @model_validator(mode="after")
17 | def construct_url(self):
18 | noteable_url = os.environ.get("PUBLIC_NOTEABLE_URL", "https://app.noteable.io")
19 | self.url = f"{noteable_url}/p/{self.id}"
20 |
21 | return self
22 |
--------------------------------------------------------------------------------
/origami/models/api/spaces.py:
--------------------------------------------------------------------------------
1 | import os
2 | from typing import Optional
3 |
4 | from pydantic import model_validator
5 |
6 | from origami.models.api.base import ResourceBase
7 |
8 |
9 | class Space(ResourceBase):
10 | name: str
11 | description: Optional[str] = None
12 | url: Optional[str] = None
13 |
14 | @model_validator(mode="after")
15 | def construct_url(self):
16 | noteable_url = os.environ.get("PUBLIC_NOTEABLE_URL", "https://app.noteable.io")
17 | self.url = f"{noteable_url}/s/{self.id}"
18 |
19 | return self
20 |
--------------------------------------------------------------------------------
/origami/models/api/users.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from typing import Optional
3 |
4 | from pydantic import model_validator
5 |
6 | from origami.models.api.base import ResourceBase
7 |
8 |
9 | class User(ResourceBase):
10 | """The user fields sent to/from the server"""
11 |
12 | handle: str
13 | email: Optional[str] = None # not returned if looking up user other than yourself
14 | first_name: str
15 | last_name: str
16 | origamist_default_project_id: Optional[uuid.UUID] = None
17 | principal_sub: Optional[str] = None # from /users/me only, represents auth type
18 | auth_type: Optional[str] = None
19 |
20 | @model_validator(mode="after")
21 | def construct_auth_type(self):
22 | if self.principal_sub:
23 | self.auth_type = self.principal_sub.split("|")[0]
24 |
25 | return self
26 |
--------------------------------------------------------------------------------
/origami/models/deltas/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/deltas/__init__.py
--------------------------------------------------------------------------------
/origami/models/deltas/base.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from datetime import datetime
3 | from typing import Any, Optional
4 |
5 | from pydantic import BaseModel, Field
6 |
7 | NULL_RESOURCE_SENTINEL = "__NULL_RESOURCE__"
8 |
9 |
10 | class FileDeltaBase(BaseModel):
11 | id: uuid.UUID = Field(default_factory=uuid.uuid4)
12 | file_id: uuid.UUID
13 | delta_type: str
14 | delta_action: str
15 | resource_id: str = NULL_RESOURCE_SENTINEL
16 | parent_delta_id: Optional[uuid.UUID] = None
17 | properties: Any = None # override in subclasses
18 | # created_at and created_by_id should not be filled out when creating new Delta requests.
19 | # they are filled out by the server when the Delta is written to the database (with user info
20 | # coming from the initial authenticate on the RTU session)
21 | created_at: Optional[datetime] = None
22 | created_by_id: Optional[uuid.UUID] = None
23 |
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/deltas/delta_types/__init__.py
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/cell_contents.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Literal, Union
2 |
3 | from pydantic import BaseModel, Field
4 |
5 | from origami.models.deltas.base import FileDeltaBase
6 |
7 |
8 | class CellContentsDelta(FileDeltaBase):
9 | delta_type: Literal["cell_contents"] = "cell_contents"
10 |
11 |
12 | class CellContentsUpdateProperties(BaseModel):
13 | patch: str # diff-match-patch
14 |
15 |
16 | class CellContentsUpdate(CellContentsDelta):
17 | # resource_id should be cell id to update
18 | delta_action: Literal["update"] = "update"
19 | properties: CellContentsUpdateProperties
20 |
21 |
22 | class CellContentsReplaceProperties(BaseModel):
23 | source: str # full replace, no diff-match-patch
24 |
25 |
26 | class CellContentsReplace(CellContentsDelta):
27 | # resource_id should be cell id to replace
28 | delta_action: Literal["replace"] = "replace"
29 | properties: CellContentsReplaceProperties
30 |
31 |
32 | CellContentsDeltas = Annotated[
33 | Union[
34 | CellContentsUpdate,
35 | CellContentsReplace,
36 | ],
37 | Field(discriminator="delta_action"),
38 | ]
39 |
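The distinction above is update-via-patch versus full replace. The real wire format uses the diff-match-patch library for `CellContentsUpdateProperties.patch`; as a rough stdlib analogue, `difflib` illustrates the same idea of shipping a delta instead of the whole cell source:

```python
import difflib

old = "SELECT 1\n"
new = "SELECT 1, 2\n"

# An "update" delta ships a compact patch (diff-match-patch on the wire);
# a "replace" delta would just ship `new` wholesale in properties.source.
patch = "".join(difflib.unified_diff(old.splitlines(True), new.splitlines(True)))
print(patch)
```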
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/cell_execute.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Literal, Union
2 |
3 | from pydantic import Field
4 |
5 | from origami.models.deltas.base import FileDeltaBase
6 |
7 |
8 | class CellExecuteDelta(FileDeltaBase):
9 | delta_type: Literal["cell_execute"] = "cell_execute"
10 |
11 |
12 | class CellExecute(CellExecuteDelta):
13 |     # execute single cell
14 | # resource_id should be cell id to run
15 | delta_action: Literal["execute"] = "execute"
16 |
17 |
18 | class CellExecuteAfter(CellExecuteDelta):
19 | # execute specific cell id and all cells after it
20 | # resource_id should be cell id to run
21 | delta_action: Literal["execute_after"] = "execute_after"
22 |
23 |
24 | class CellExecuteBefore(CellExecuteDelta):
25 | # execute all cells up to specific cell, inclusive of that cell id
26 | # resource_id should be cell id to run
27 | delta_action: Literal["execute_before"] = "execute_before"
28 |
29 |
30 | class CellExecuteAll(CellExecuteDelta):
31 | # execute all cells
32 | delta_action: Literal["execute_all"] = "execute_all"
33 |
34 |
35 | CellExecuteDeltas = Annotated[
36 | Union[
37 | CellExecute,
38 | CellExecuteAfter,
39 | CellExecuteBefore,
40 | CellExecuteAll,
41 | ],
42 | Field(discriminator="delta_action"),
43 | ]
44 |
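The four `cell_execute` actions differ only in which cells they target relative to `resource_id`. A sketch of that selection over an ordered cell list (illustrative only; the server performs the actual scheduling):

```python
def cells_to_run(cell_ids, delta_action, resource_id=None):
    """Which cells each cell_execute delta action targets, over an ordered cell list."""
    if delta_action == "execute_all":
        return list(cell_ids)
    i = cell_ids.index(resource_id)
    if delta_action == "execute":
        return [resource_id]
    if delta_action == "execute_before":
        return list(cell_ids[: i + 1])  # inclusive of the target cell
    if delta_action == "execute_after":
        return list(cell_ids[i:])  # the target cell and everything after it
    raise ValueError(delta_action)

cells = ["a", "b", "c", "d"]
print(cells_to_run(cells, "execute_before", "b"))  # ['a', 'b']
```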
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/cell_metadata.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Any, Literal, Optional, Union
2 |
3 | from pydantic import BaseModel, Field
4 |
5 | from origami.models.deltas.base import FileDeltaBase
6 |
7 | NULL_PRIOR_VALUE_SENTINEL = "__NULL_PRIOR_VALUE__"
8 |
9 |
10 | class CellMetadataDelta(FileDeltaBase):
11 | delta_type: Literal["cell_metadata"] = "cell_metadata"
12 |
13 |
14 | # A lot of state is stored in cell metadata, including DEX and execute time
15 | class CellMetadataUpdateProperties(BaseModel):
16 | path: list
17 | value: Any = None
18 | prior_value: Any = NULL_PRIOR_VALUE_SENTINEL
19 |
20 |
21 | class CellMetadataUpdate(CellMetadataDelta):
22 | # resource_id should be cell id to update
23 | delta_action: Literal["update"] = "update"
24 | properties: CellMetadataUpdateProperties
25 |
26 |
27 | # Cell metadata replace is used for changing cell type and language (Python/R/etc)
28 | class CellMetadataReplaceProperties(BaseModel):
29 | type: Optional[str] = None
30 | language: Optional[str] = None
31 |
32 |
33 | class CellMetadataReplace(CellMetadataDelta):
34 | # resource_id should be cell id to replace
35 | delta_action: Literal["replace"] = "replace"
36 | properties: CellMetadataReplaceProperties
37 |
38 |
39 | CellMetadataDeltas = Annotated[
40 | Union[
41 | CellMetadataUpdate,
42 | CellMetadataReplace,
43 | ],
44 | Field(discriminator="delta_action"),
45 | ]
46 |
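`CellMetadataUpdateProperties` carries a `path` into the nested metadata dict plus a new `value`. A stdlib sketch of applying such an update client-side (the server's exact merge semantics may differ; this only shows the path/value shape):

```python
def apply_metadata_update(metadata, path, value):
    """Walk `path` into nested dicts (creating levels as needed) and set the leaf value."""
    node = metadata
    for key in path[:-1]:
        node = node.setdefault(key, {})
    node[path[-1]] = value
    return metadata

m = {}
apply_metadata_update(m, ["noteable", "cell_type"], "sql")
print(m)  # {'noteable': {'cell_type': 'sql'}}
```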
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/cell_output_collection.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from typing import Literal
3 |
4 | from pydantic import BaseModel
5 |
6 | from origami.models.deltas.base import FileDeltaBase
7 |
8 |
9 | class CellOutputCollectionDelta(FileDeltaBase):
10 | delta_type: Literal["cell_output_collection"] = "cell_output_collection"
11 |
12 |
13 | class CellOutputCollectionReplaceData(BaseModel):
14 | output_collection_id: uuid.UUID
15 |
16 |
17 | class CellOutputCollectionReplace(CellOutputCollectionDelta):
18 |     # resource_id should be cell id to replace with new output collection id
19 | delta_action: Literal["replace"] = "replace"
20 | properties: CellOutputCollectionReplaceData
21 |
22 |
23 | # Since there's only one action, we don't have an Annotated Union
24 | CellOutputCollectionDeltas = CellOutputCollectionReplace
25 |
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/nb_cells.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Literal, Optional, Union
2 |
3 | from pydantic import BaseModel, Field
4 |
5 | from origami.models.deltas.base import FileDeltaBase
6 | from origami.models.notebook import NotebookCell
7 |
8 |
9 | class NBCellsDelta(FileDeltaBase):
10 | delta_type: Literal["nb_cells"] = "nb_cells"
11 |
12 |
13 | class NBCellsAddProperties(BaseModel):
14 | id: str # should be same as cell.id
15 | after_id: Optional[str] = None # insert this cell after another cell in the Notebook
16 | cell: NotebookCell
17 |
18 |
19 | class NBCellsAdd(NBCellsDelta):
20 | delta_action: Literal["add"] = "add"
21 | properties: NBCellsAddProperties
22 |
23 |
24 | class NBCellsDeleteProperties(BaseModel):
25 | id: str
26 |
27 |
28 | class NBCellsDelete(NBCellsDelta):
29 | delta_action: Literal["delete"] = "delete"
30 | properties: NBCellsDeleteProperties
31 |
32 |
33 | class NBCellsMoveProperties(BaseModel):
34 | id: str
35 | after_id: Optional[str] = None
36 |
37 |
38 | class NBCellsMove(NBCellsDelta):
39 | delta_action: Literal["move"] = "move"
40 | properties: NBCellsMoveProperties
41 |
42 |
43 | NBCellsDeltas = Annotated[
44 | Union[
45 | NBCellsAdd,
46 | NBCellsDelete,
47 | NBCellsMove,
48 | ],
49 | Field(discriminator="delta_action"),
50 | ]
51 |
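`NBCellsAdd` and `NBCellsMove` both position a cell via `after_id`. A sketch of that placement rule over a plain list (whether a missing `after_id` means "top of notebook" is an assumption here, not something the models guarantee):

```python
def insert_cell(cells, cell, after_id=None):
    """NBCellsAdd-style placement: insert after `after_id`, or at the top when it is None."""
    if after_id is None:
        return [cell] + cells
    idx = next(i for i, c in enumerate(cells) if c["id"] == after_id)
    return cells[: idx + 1] + [cell] + cells[idx + 1 :]

cells = [{"id": "a"}, {"id": "b"}]
print(insert_cell(cells, {"id": "x"}, after_id="a"))
```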
--------------------------------------------------------------------------------
/origami/models/deltas/delta_types/nb_metadata.py:
--------------------------------------------------------------------------------
1 | from typing import Any, Literal, Optional
2 |
3 | from pydantic import BaseModel
4 |
5 | from origami.models.deltas.base import FileDeltaBase
6 |
7 |
8 | class NBMetadataDelta(FileDeltaBase):
9 | delta_type: Literal["nb_metadata"] = "nb_metadata"
10 |
11 |
12 | class NBMetadataProperties(BaseModel):
13 | path: list
14 | value: Any = None
15 | prior_value: Optional[Any] = None
16 |
17 |
18 | class NBMetadataUpdate(NBMetadataDelta):
19 | delta_action: Literal["update"] = "update"
20 | properties: NBMetadataProperties
21 |
22 |
23 | # Since there's only one option here, instead of annotated union we alias this to the single item
24 | NBMetadataDeltas = NBMetadataUpdate
25 |
--------------------------------------------------------------------------------
/origami/models/deltas/discriminators.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Union
2 |
3 | from pydantic import Field
4 |
5 | from origami.models.deltas.delta_types.cell_contents import CellContentsDeltas
6 | from origami.models.deltas.delta_types.cell_execute import CellExecuteDeltas
7 | from origami.models.deltas.delta_types.cell_metadata import CellMetadataDeltas
8 | from origami.models.deltas.delta_types.cell_output_collection import CellOutputCollectionDeltas
9 | from origami.models.deltas.delta_types.nb_cells import NBCellsDeltas
10 | from origami.models.deltas.delta_types.nb_metadata import NBMetadataDeltas
11 |
12 | # Use: pydantic.parse_obj_as(FileDelta, data)
13 | FileDelta = Annotated[
14 | Union[
15 | CellContentsDeltas,
16 | CellExecuteDeltas,
17 | CellMetadataDeltas,
18 | CellOutputCollectionDeltas,
19 | NBCellsDeltas,
20 | NBMetadataDeltas,
21 | ],
22 | Field(discriminator="delta_type"),
23 | ]
24 |
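Conceptually, the nested discriminators above route a raw payload first on `delta_type` (outer union) and then on `delta_action` (inner unions). Pydantic does this for you when validating against `FileDelta`; a hypothetical stdlib registry makes the double dispatch explicit (class names shown as strings, and most pairs elided):

```python
# Hypothetical registry standing in for pydantic's nested discriminated unions.
DELTA_REGISTRY = {
    ("nb_cells", "add"): "NBCellsAdd",
    ("nb_cells", "delete"): "NBCellsDelete",
    ("cell_contents", "update"): "CellContentsUpdate",
    ("cell_execute", "execute_all"): "CellExecuteAll",
    # ...remaining (delta_type, delta_action) pairs omitted
}

def resolve_delta_class(payload):
    """Route on (delta_type, delta_action), as the discriminators do."""
    return DELTA_REGISTRY[(payload["delta_type"], payload["delta_action"])]

print(resolve_delta_class({"delta_type": "cell_execute", "delta_action": "execute_all"}))  # CellExecuteAll
```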
--------------------------------------------------------------------------------
/origami/models/kernels.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from datetime import datetime
3 | from typing import Optional
4 |
5 | from pydantic import BaseModel
6 |
7 |
8 | class KernelDetails(BaseModel):
9 | name: str
10 | last_activity: Optional[datetime] = None
11 | execution_state: str
12 |
13 |
14 | class KernelStatusUpdate(BaseModel):
15 | session_id: uuid.UUID
16 | kernel: KernelDetails
17 |
18 |
19 | class CellState(BaseModel):
20 | cell_id: str
21 | state: str
22 |
23 |
24 | class KernelSession(BaseModel):
25 | id: uuid.UUID
26 | kernel: KernelDetails
27 |
--------------------------------------------------------------------------------
/origami/models/notebook.py:
--------------------------------------------------------------------------------
1 | """
2 | Modeling the Notebook File Format with Pydantic models. It also includes some helper properties
3 | relevant to Noteable format, such as whether a code cell is a SQL cell and retrieving the output
4 | collection id, which is a Noteable-specific cell output context.
5 |
6 | See https://nbformat.readthedocs.io/en/latest/format_description.html# for Notebook model spec.
7 |
8 | Devs: as usual with Pydantic modeling, the top-level model (Notebook) is at the bottom of this file,
9 | read from bottom up for most clarity.
10 | """
11 | import random
12 | import string
13 | import uuid
14 | from typing import Any, Dict, List, Literal, Optional, Union
15 |
16 | from pydantic import BaseModel, ConfigDict, Field, field_validator
17 | from typing_extensions import Annotated # for 3.8 compatibility
18 |
19 |
20 | # Cell outputs modeled with a discriminator pattern where the output_type
21 | # field will determine what kind of output we have
22 | # https://nbformat.readthedocs.io/en/latest/format_description.html#code-cell-outputs
23 | class StreamOutput(BaseModel):
24 | output_type: Literal["stream"] = "stream"
25 | name: str # stdout or stderr
26 | text: str
27 |
28 | @field_validator("text", mode="before")
29 | @classmethod
30 | def multiline_text(cls, v):
31 | """In the event we get a list of strings, combine into one string with newlines."""
32 | if isinstance(v, list):
33 | return "\n".join(v)
34 | return v
35 |
36 |
37 | class DisplayDataOutput(BaseModel):
38 | output_type: Literal["display_data"] = "display_data"
39 | data: Dict[str, Any]
40 | metadata: Dict[str, Any]
41 |
42 |
43 | class ExecuteResultOutput(BaseModel):
44 | output_type: Literal["execute_result"] = "execute_result"
45 | execution_count: Optional[int] = None
46 | data: Dict[str, Any]
47 | metadata: Dict[str, Any]
48 |
49 |
50 | class ErrorOutput(BaseModel):
51 | output_type: Literal["error"] = "error"
52 | ename: str
53 | evalue: str
54 | traceback: List[str]
55 |
56 |
57 | # Use: List[CellOutput] or pydantic.parse_obj_as(CellOutput, dict)
58 | CellOutput = Annotated[
59 | Union[StreamOutput, DisplayDataOutput, ExecuteResultOutput, ErrorOutput],
60 | Field(discriminator="output_type"),
61 | ]
62 |
63 |
64 | # Cell types
65 | class CellBase(BaseModel):
66 | """
67 | All Cell types have id, source and metadata.
68 | The source can be a string or list of strings in nbformat spec,
69 | but we only want to deal with source as a string throughout our
70 | code base so we have a validator here to cast the list of strings
71 | to a single string, both at initial read and during any mutations
72 | (e.g. applying diff-match-patch cell content updates).
73 | """
74 |
75 | id: str = Field(default_factory=lambda: str(uuid.uuid4()))
76 | source: str = ""
77 | metadata: Dict[str, Any] = Field(default_factory=dict)
78 |
79 | @field_validator("source", mode="before")
80 | @classmethod
81 | def multiline_source(cls, v):
82 | if isinstance(v, list):
83 | return "\n".join(v)
84 | return v
85 |
86 |     model_config = ConfigDict(validate_assignment=True)
87 |
88 |
89 | class CodeCell(CellBase):
90 | cell_type: Literal["code"] = "code"
91 | execution_count: Optional[int] = None
92 | outputs: List[CellOutput] = Field(default_factory=list)
93 |
94 | @property
95 | def is_sql_cell(self):
96 | return self.metadata.get("noteable", {}).get("cell_type") == "sql"
97 |
98 | @property
99 | def output_collection_id(self) -> Optional[Union[str, uuid.UUID]]:
100 | return self.metadata.get("noteable", {}).get("output_collection_id")
101 |
102 |
103 | def make_sql_cell(
104 | cell_id: Optional[str] = None,
105 | source: str = "",
106 | db_connection: str = "@noteable",
107 | assign_results_to: Optional[str] = None,
108 | ) -> CodeCell:
109 | cell_id = cell_id or str(uuid.uuid4())
110 | # Remove first line of source if it starts with %%sql. That is the right syntax for regular
111 | # code cells with sql magic support, but Noteable SQL cells should have just the sql source
112 | if source.startswith("%%sql"):
113 | lines = source.splitlines()
114 | source = "\n".join(lines[1:])
115 |
116 | if not assign_results_to:
117 | name_suffix = "".join(random.choices(string.ascii_lowercase, k=4))
118 | assign_results_to = "df_" + name_suffix
119 | metadata = {
120 | "language": "sql",
121 | "type": "code",
122 | "noteable": {
123 | "cell_type": "sql",
124 | "db_connection": db_connection,
125 | "assign_results_to": assign_results_to,
126 | },
127 | }
128 |     return CodeCell(id=cell_id, source=source, metadata=metadata)
129 |
130 |
131 | class MarkdownCell(CellBase):
132 | cell_type: Literal["markdown"] = "markdown"
133 |
134 |
135 | class RawCell(CellBase):
136 | cell_type: Literal["raw"] = "raw"
137 |
138 |
139 | # Use: List[NotebookCell] or pydantic.parse_obj_as(NotebookCell, dict)
140 | NotebookCell = Annotated[
141 | Union[
142 | CodeCell,
143 | MarkdownCell,
144 | RawCell,
145 | ],
146 | Field(discriminator="cell_type"),
147 | ]
148 |
149 |
150 | class Notebook(BaseModel):
151 | nbformat: int = 4
152 | nbformat_minor: int = 5
153 | metadata: Dict[str, Any] = Field(default_factory=dict)
154 | cells: List[NotebookCell] = Field(default_factory=list)
155 |
156 | @property
157 | def language(self) -> Optional[str]:
158 | return self.metadata.get("language_info", {}).get("name")
159 |
160 | @property
161 | def language_version(self) -> Optional[str]:
162 | return self.metadata.get("language_info", {}).get("version")
163 |
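Two small behaviors of `make_sql_cell` are worth calling out: it strips a leading `%%sql` magic line, and it invents a `df_xxxx` results variable when none is given. A stdlib sketch of just those two pieces:

```python
import random
import string

def normalize_sql_source(source):
    """make_sql_cell drops a leading %%sql magic line; Noteable SQL cells hold bare SQL."""
    if source.startswith("%%sql"):
        source = "\n".join(source.splitlines()[1:])
    return source

def default_results_name():
    """Fallback df_xxxx variable name when assign_results_to is not provided."""
    return "df_" + "".join(random.choices(string.ascii_lowercase, k=4))

print(normalize_sql_source("%%sql\nSELECT * FROM users"))  # SELECT * FROM users
```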
--------------------------------------------------------------------------------
/origami/models/rtu/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/rtu/__init__.py
--------------------------------------------------------------------------------
/origami/models/rtu/base.py:
--------------------------------------------------------------------------------
1 | import uuid
2 | from datetime import datetime
3 | from typing import Any, Optional
4 |
5 | from pydantic import BaseModel, Field, model_validator
6 | from typing_extensions import Annotated
7 |
8 |
9 | class BooleanReplyData(BaseModel):
10 | # Gate will reply to most RTU requests with an RTU reply that's just success=True/False
11 | success: bool
12 |
13 |
14 | class BaseRTU(BaseModel):
15 | transaction_id: uuid.UUID = Field(default_factory=uuid.uuid4)
16 | channel: str
17 | channel_prefix: Annotated[
18 | Optional[str], Field(exclude=True)
19 | ] = None # override in Channels base classes to be Literal
20 | event: str # override in Events subclasses to be Literal
21 | data: Any = None # override in subclasses to be a pydantic model
22 |
23 | @model_validator(mode="after")
24 | def set_channel_prefix(self):
25 | self.channel_prefix = self.channel.split("/")[0]
26 | return self
27 |
28 |
29 | class BaseRTURequest(BaseRTU):
30 | pass
31 |
32 |
33 | class BaseRTUResponse(BaseRTU):
34 | processed_timestamp: datetime = Field(default_factory=datetime.utcnow)
35 |
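The `set_channel_prefix` validator is how RTU messages get routed: the prefix is whatever precedes the first `/` in the channel name. A one-function sketch (the UUID below is a made-up example):

```python
def channel_prefix(channel):
    """set_channel_prefix logic: the routing prefix is everything before the first '/'."""
    return channel.split("/")[0]

print(channel_prefix("files/3fa85f64-5717-4562-b3fc-2c963f66afa6"))  # files
```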
--------------------------------------------------------------------------------
/origami/models/rtu/channels/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/models/rtu/channels/__init__.py
--------------------------------------------------------------------------------
/origami/models/rtu/channels/files.py:
--------------------------------------------------------------------------------
1 | """
2 | There are six events on the files/ channel:
3 |
4 | 1. subscribe_request and subscribe_reply
5 | 2. unsubscribe_request and unsubscribe_reply
6 | 3. new_delta_request and new_delta_reply (direct) / new_delta_event (broadcast)
7 | - RTU Errors for invalid_data or permission_denied
8 | 4. update_user_cell_selection_request and
9 | update_user_cell_selection_reply -> update_user_file_subscription_event
10 | 5. input_reply_request and input_reply_reply
11 | 6. transform_view_to_code_request and transform_view_to_code_reply (DEX export to code cell)
12 |    - The follow-on "event" is a new delta event
13 | """
14 | import uuid
15 | from datetime import datetime
16 | from typing import Annotated, Any, List, Literal, Optional, Union
17 |
18 | from pydantic import BaseModel, ConfigDict, Field, model_validator
19 |
20 | from origami.models.api.outputs import KernelOutput
21 | from origami.models.deltas.discriminators import FileDelta
22 | from origami.models.kernels import CellState, KernelStatusUpdate
23 | from origami.models.rtu.base import BaseRTURequest, BaseRTUResponse, BooleanReplyData
24 |
25 |
26 | class FilesRequest(BaseRTURequest):
27 | channel_prefix: Literal["files"] = "files"
28 |
29 |
30 | class FilesResponse(BaseRTUResponse):
31 | channel_prefix: Literal["files"] = "files"
32 |
33 |
34 | # When an RTU Client wants to get document model updates from a Notebook, it subscribes to the files
35 | # channel with that Notebook ID.
36 | class FileSubscribeRequestData(BaseModel):
37 | # One of these two must be set
38 | from_version_id: Optional[uuid.UUID] = None
39 | from_delta_id: Optional[uuid.UUID] = None
40 | model_config = ConfigDict(exclude_none=True)
41 |
42 | @model_validator(mode="after")
43 | def exactly_one_field(self):
44 | # Count how many fields are set (i.e., are not None)
45 | num_set_fields = sum(
46 | value is not None for value in (self.from_version_id, self.from_delta_id)
47 | )
48 |
49 | # If exactly one field is set, return the values as they are
50 | if not num_set_fields == 1:
51 | raise ValueError("Exactly one field must be set")
52 |
53 | return self
54 |
55 |
56 | class FileSubscribeRequest(FilesRequest):
57 | event: Literal["subscribe_request"] = "subscribe_request"
58 | data: FileSubscribeRequestData
59 |
60 |
61 | # File subscribe reply has several pieces of information
62 | # - List of deltas to squash into the NotebookBuilder immediately
63 | class FileSubscribeReplyData(BaseModel):
64 | deltas_to_apply: List[FileDelta]
65 | latest_delta_id: Optional[uuid.UUID] = None
66 | kernel_session: Optional[KernelStatusUpdate] = None # null if no active Kernel for the File
67 | cell_states: List[CellState]
68 | # TODO: user_subscriptions
69 |
70 |
71 | class FileSubscribeReply(FilesResponse):
72 | event: Literal["subscribe_reply"] = "subscribe_reply"
73 | data: FileSubscribeReplyData
74 |
75 |
76 | # Clients typically do not need to unsubscribe; they can just close the websocket connection
77 | class FileUnsubscribeRequest(FilesRequest):
78 | event: Literal["unsubscribe_request"] = "unsubscribe_request"
79 |
80 |
81 | class FileUnsubscribeReply(FilesResponse):
82 | event: Literal["unsubscribe_reply"] = "unsubscribe_reply"
83 | data: BooleanReplyData
84 |
85 |
86 | # Deltas are requests to change a document content or perform cell execution. The API server ensures
87 | # they are applied in a linear order, and will return a delta reply if it has been successfully
88 | # recorded, followed by a new delta event propagated to all connected clients.
89 | class NewDeltaRequestData(BaseModel):
90 | delta: FileDelta
91 | # When is this second field used?
92 | output_collection_id_to_copy: Optional[uuid.UUID] = None
93 |
94 |
95 | class NewDeltaRequest(FilesRequest):
96 | event: Literal["new_delta_request"] = "new_delta_request"
97 | data: NewDeltaRequestData
98 |
99 |
100 | class NewDeltaReply(FilesResponse):
101 | event: Literal["new_delta_reply"] = "new_delta_reply"
102 | data: BooleanReplyData
103 |
104 |
105 | class NewDeltaEvent(FilesResponse):
106 | event: Literal["new_delta_event"] = "new_delta_event"
107 | data: FileDelta
108 |
109 |
110 | # When Cells complete and there's output, a new CellOutputCollectionReplace Delta will come through
111 | # that is a container for multi-part output or a link to a pre-signed download url for large output
112 | # like an image/gif.
113 | class UpdateOutputCollectionEventData(BaseModel):
114 | pass
115 |
116 |
117 | class UpdateOutputCollectionEvent(FilesResponse):
118 | event: Literal["update_output_collection_event"] = "update_output_collection_event"
119 | data: UpdateOutputCollectionEventData
120 |
121 |
122 | # If Cells are streaming multiple outputs, like a pip install or a for loop with prints, then
123 | # we'll get append output events
124 | class AppendOutputEvent(FilesResponse):
125 | event: Literal["append_output_event"] = "append_output_event"
126 | data: KernelOutput
127 |
128 |
129 | # User cell selection is a collaboration feature, shows which cell each user is currently editing
130 | # Like Deltas, it follows a request -> reply -> event pattern
131 | class UpdateUserCellSelectionRequestData(BaseModel):
132 | id: uuid.UUID
133 |
134 |
135 | class UpdateUserCellSelectionRequest(FilesRequest):
136 | event: Literal["update_user_cell_selection_request"] = "update_user_cell_selection_request"
137 | data: UpdateUserCellSelectionRequestData
138 |
139 |
140 | class UpdateUserCellSelectionReply(FilesResponse):
141 | event: Literal["update_user_cell_selection_reply"] = "update_user_cell_selection_reply"
142 | data: BooleanReplyData
143 |
144 |
145 | class UpdateUserFileSubscriptionEventData(BaseModel):
146 | cell_id_selected: Optional[str] = None
147 | file_id: uuid.UUID
148 | last_event_at: datetime
149 | subscribed: bool
150 | user_id: uuid.UUID
151 |
152 |
153 | class UpdateUserFileSubscriptionEvent(FilesResponse):
154 | event: Literal["update_user_file_subscription_event"] = "update_user_file_subscription_event"
155 | data: UpdateUserFileSubscriptionEventData
156 |
157 |
158 | class RemoveUserFileSubscriptionEventData(BaseModel):
159 | user_id: uuid.UUID
160 |
161 |
162 | class RemoveUserFileSubscriptionEvent(FilesResponse):
163 | event: Literal["remove_user_file_subscription_event"] = "remove_user_file_subscription_event"
164 | data: RemoveUserFileSubscriptionEventData
165 |
166 |
167 | # CPU / Memory usage metrics reported via k8s
168 | class UsageMetricsEventData(BaseModel):
169 | cpu_usage_percent: int
170 | memory_usage_percent: int
171 |
172 |
173 | class UsageMetricsEvent(FilesResponse):
174 | event: Literal["usage_metrics_event"] = "usage_metrics_event"
175 | data: UsageMetricsEventData
176 |
177 |
178 | # Transform view to code is a DEX feature: it allows a user to create a new code cell that has
179 | # Python syntax to filter a Dataframe the same way as the current DEX grid view
180 | class TransformViewToCodeRequestData(BaseModel):
181 | # TODO: Shoup review this
182 | cell_id: str
183 | filters: Any = None
184 | ignore_index: bool = True
185 | overrides: dict = Field(default_factory=dict)
186 | target_cell_type: str = "code"
187 | variable_name: str = "df"
188 |
189 |
190 | class TransformViewToCodeRequest(FilesRequest):
191 | event: Literal["transform_view_to_code_request"] = "transform_view_to_code_request"
192 | data: TransformViewToCodeRequestData
193 |
194 |
195 | class TransformViewToCodeReply(FilesResponse):
196 | event: Literal["transform_view_to_code_reply"] = "transform_view_to_code_reply"
197 | data: BooleanReplyData
198 |
199 |
200 | # Widgets, ugh. Not attempting to model the payload, no current plan on doing anything with them
201 | # on the Origami side.
202 | class V0CreateWidgetModelEvent(FilesResponse):
203 | event: Literal["v0_create_widget_model_event"] = "v0_create_widget_model_event"
204 | data: Any = None
205 |
206 |
207 | # When the API squashes Deltas, it will emit a new file versions changed event
208 | class FileVersionsChangedEvent(FilesResponse):
209 | event: Literal["v0_file_versions_changed_event"] = "v0_file_versions_changed_event"
210 | data: Optional[dict] = None
211 |
212 |
213 | FileRequests = Annotated[
214 | Union[
215 | FileSubscribeRequest,
216 | FileUnsubscribeRequest,
217 | NewDeltaRequest,
218 | UpdateUserCellSelectionRequest,
219 | TransformViewToCodeRequest,
220 | ],
221 | Field(discriminator="event"),
222 | ]
223 |
224 | FileResponses = Annotated[
225 | Union[
226 | FileSubscribeReply,
227 | FileUnsubscribeReply,
228 | FileVersionsChangedEvent,
229 | NewDeltaReply,
230 | NewDeltaEvent,
231 | RemoveUserFileSubscriptionEvent,
232 | TransformViewToCodeReply,
233 | V0CreateWidgetModelEvent,
234 | UpdateUserCellSelectionReply,
235 | UpdateUserFileSubscriptionEvent,
236 | UpdateOutputCollectionEvent,
237 | AppendOutputEvent,
238 | UsageMetricsEvent,
239 | ],
240 | Field(discriminator="event"),
241 | ]
242 |
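The `exactly_one_field` validator on `FileSubscribeRequestData` enforces that a subscriber anchors on either a file version or a delta id, never both and never neither. A plain-function sketch of that rule (pydantic runs the equivalent check at model construction):

```python
def validate_file_subscribe(from_version_id=None, from_delta_id=None):
    """FileSubscribeRequestData's rule: exactly one anchor point must be provided."""
    num_set = sum(v is not None for v in (from_version_id, from_delta_id))
    if num_set != 1:
        raise ValueError("Exactly one field must be set")
    return from_version_id or from_delta_id

print(validate_file_subscribe(from_delta_id="d1"))  # d1
```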
--------------------------------------------------------------------------------
/origami/models/rtu/channels/kernels.py:
--------------------------------------------------------------------------------
1 | """
2 | The kernels channel in RTU is primarily used for runtime updates like kernel and cell status,
3 | variable explorer, and outputs, as opposed to document model changes on the files channel
4 | (adding cells, updating content, etc.)
5 | """
6 | import uuid
7 | from typing import Annotated, List, Literal, Optional, Union
8 |
9 | from pydantic import BaseModel, Field
10 |
11 | from origami.models.kernels import CellState, KernelStatusUpdate
12 | from origami.models.rtu.base import BaseRTURequest, BaseRTUResponse, BooleanReplyData
13 |
14 |
15 | class KernelsRequest(BaseRTURequest):
16 | channel_prefix: Literal["kernels"] = "kernels"
17 |
18 |
19 | class KernelsResponse(BaseRTUResponse):
20 | channel_prefix: Literal["kernels"] = "kernels"
21 |
22 |
23 | class KernelSubscribeRequestData(BaseModel):
24 | file_id: uuid.UUID
25 |
26 |
27 | class KernelSubscribeRequest(KernelsRequest):
28 | event: Literal["subscribe_request"] = "subscribe_request"
29 | data: KernelSubscribeRequestData
30 |
31 |
32 | # Kernel status is returned on subscribe and also updated through kernel status updates
33 | class KernelSubscribeReplyData(BaseModel):
34 | success: bool
35 | kernel_session: Optional[KernelStatusUpdate] = None # None if no Kernel is alive for a file
36 |
37 |
38 | class KernelSubscribeReply(KernelsResponse):
39 | event: Literal["subscribe_reply"] = "subscribe_reply"
40 | data: KernelSubscribeReplyData
41 |
42 |
43 | class KernelStatusUpdateResponse(KernelsResponse):
44 | event: Literal["kernel_status_update_event"] = "kernel_status_update_event"
45 | data: KernelStatusUpdate
46 |
47 |
48 | # Cell State
49 | class BulkCellStateUpdateData(BaseModel):
50 | cell_states: List[CellState]
51 |
52 |
53 | class BulkCellStateUpdateResponse(KernelsResponse):
54 | event: Literal["bulk_cell_state_update_event"] = "bulk_cell_state_update_event"
55 | data: BulkCellStateUpdateData
56 |
57 |
58 | # Variable explorer updates return a list of current variables in the kernel
59 | # On connect to a new Kernel, Clients can send a request to trigger an event. Otherwise events occur
60 | # after cell execution automatically.
61 | class VariableExplorerUpdateRequest(KernelsRequest):
62 | event: Literal["variable_explorer_update_request"] = "variable_explorer_update_request"
63 |
64 |
65 | # It is confusing, but variable_explorer_update_request can either be sent from an RTU client to
66 | # the Gate server (RTURequest) or be propagated out by Gate from another client, meaning it comes
67 | # in as a server-to-client message (RTUResponse), so we model it to avoid warnings about unmodeled msgs
68 | class VariableExplorerUpdateRequestPropogated(KernelsResponse):
69 | event: Literal["variable_explorer_update_request"] = "variable_explorer_update_request"
70 | data: dict = Field(default_factory=dict)
71 |
72 |
73 | class VariableExplorerResponse(KernelsResponse):
74 | event: Literal["variable_explorer_event"] = "variable_explorer_event"
75 |
76 |
77 | class IntegratedAIRequestData(BaseModel):
78 | prompt: str
79 |     # this may not be tied to a specific cell; it can be invoked at a generic "document" level,
80 |     # so we don't require a cell_id
81 | cell_id: Optional[str] = None
82 | # if a cell_id is provided and this is True, the result will be added to the cell's output
83 | # instead of just sent back as an RTU reply
84 | output_for_response: bool = False
85 |
86 |
87 | class IntegratedAIRequest(KernelsRequest):
88 | event: Literal["integrated_ai_request"] = "integrated_ai_request"
89 | data: IntegratedAIRequestData
90 |
91 |
92 | class IntegratedAIReply(KernelsResponse):
93 | event: Literal["integrated_ai_reply"] = "integrated_ai_reply"
94 | data: BooleanReplyData
95 |
96 |
97 | class IntegratedAIEvent(KernelsResponse):
98 | event: Literal["integrated_ai_event"] = "integrated_ai_event"
99 | # same data as the IntegratedAIRequest, just echoed back out
100 | data: IntegratedAIRequestData
101 |
102 |
103 | class IntegratedAIResultData(BaseModel):
104 | # the full response from OpenAI; in most cases, sidecar will have either created a new cell
105 | # or an output, so this result should really only be used when the RTU client needs it to exist
106 | # outside of the cell/output structure
107 | result: str
108 |
109 |
110 | # this is sidecar to gate as a result of calling the OpenAIHandler method (OpenAI response,
111 | # error, etc); after that, Gate propagates the data out as an IntegratedAIEvent
112 | class IntegratedAIResult(KernelsRequest):
113 | event: Literal["integrated_ai_result"] = "integrated_ai_result"
114 | data: IntegratedAIResultData
115 |
116 |
117 | class IntegratedAIResultReply(KernelsResponse):
118 | event: Literal["integrated_ai_result_reply"] = "integrated_ai_result_reply"
119 | data: BooleanReplyData
120 |
121 |
122 | class IntegratedAIResultEvent(KernelsResponse):
123 | event: Literal["integrated_ai_result_event"] = "integrated_ai_result_event"
124 | data: IntegratedAIResultData
125 |
126 |
127 | KernelRequests = Annotated[
128 | Union[
129 | KernelSubscribeRequest,
130 | VariableExplorerUpdateRequest,
131 | IntegratedAIRequest,
132 | IntegratedAIResult,
133 | ],
134 | Field(discriminator="event"),
135 | ]
136 |
137 | KernelResponses = Annotated[
138 | Union[
139 | KernelSubscribeReply,
140 | KernelStatusUpdateResponse,
141 | BulkCellStateUpdateResponse,
142 | VariableExplorerUpdateRequestPropogated,
143 | VariableExplorerResponse,
144 | IntegratedAIReply,
145 | IntegratedAIResultReply,
146 | IntegratedAIEvent,
147 | IntegratedAIResultEvent,
148 | ],
149 | Field(discriminator="event"),
150 | ]
151 |
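The request/event echo described in the comments above (Gate re-broadcasting the data of an `integrated_ai_request` to subscribers as an `integrated_ai_event`) can be sketched in plain Python. This is an illustrative stand-in for routing that lives in Gate, not an API of this library:

```python
# Illustrative sketch: Gate receives an integrated_ai_request and echoes
# the same data payload back out as an integrated_ai_event. Only the
# "event" and "data" fields from the models above are represented here.
def echo_ai_request_as_event(request: dict) -> dict:
    if request.get("event") != "integrated_ai_request":
        raise ValueError("expected an integrated_ai_request payload")
    return {"event": "integrated_ai_event", "data": request["data"]}


request = {"event": "integrated_ai_request", "data": {"prompt": "fix this cell"}}
event = echo_ai_request_as_event(request)
```

The same echo pattern applies to `IntegratedAIResult` / `IntegratedAIResultEvent`, which share `IntegratedAIResultData`.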
--------------------------------------------------------------------------------
/origami/models/rtu/channels/system.py:
--------------------------------------------------------------------------------
1 | """
2 | The primary purpose of the system channel is authenticating an RTU session after the websocket
3 | connection has been established. There are a number of debug-related RTU events on this channel
4 | as well.
5 |
6 | 1. authenticate_request - pass in a JWT to authenticate the rest of the RTU session so that events
7 | on channels like files and projects, which require RBAC checks, have a User account to check
8 | 2. ping_request and ping_response - used to test the RTU connection
9 | 3. whoami_request and whoami_response - used to get the User account associated with the RTU session
10 | (the User is also returned as part of the payload on the authenticate_reply event)
11 | """
12 |
13 | from typing import Annotated, Literal, Optional, Union
14 |
15 | from pydantic import BaseModel, Field
16 |
17 | from origami.models.api.users import User
18 | from origami.models.rtu.base import BaseRTURequest, BaseRTUResponse
19 |
20 |
21 | class SystemRequest(BaseRTURequest):
22 | channel: str = "system"
23 | channel_prefix: Literal["system"] = "system"
24 |
25 |
26 | class SystemResponse(BaseRTUResponse):
27 | channel: str = "system"
28 | channel_prefix: Literal["system"] = "system"
29 |
30 |
31 | # The first thing RTU Clients should do after websocket connection is authenticate with a JWT,
32 | # same access token as what is included in Authorization bearer headers for API requests
33 | class AuthenticateRequestData(BaseModel):
34 | token: str
35 | rtu_client_type: str = "origami"
36 |
37 |
38 | class AuthenticateRequest(SystemRequest):
39 | event: Literal["authenticate_request"] = "authenticate_request"
40 | data: AuthenticateRequestData
41 |
42 |
43 | class AuthenticateReplyData(BaseModel):
44 | success: bool
45 | user: User
46 |
47 |
48 | class AuthenticateReply(SystemResponse):
49 | event: Literal["authenticate_reply"] = "authenticate_reply"
50 | data: AuthenticateReplyData
51 |
52 |
53 | # Below is all mainly used for debug, App devs don't need to do anything with these usually
54 | class PingRequest(SystemRequest):
55 | event: Literal["ping_request"] = "ping_request"
56 |
57 |
58 | class PingResponse(SystemResponse):
59 | event: Literal["ping_response"] = "ping_response"
60 |
61 |
62 | class WhoAmIRequest(SystemRequest):
63 | event: Literal["whoami_request"] = "whoami_request"
64 |
65 |
66 | class WhoAmIResponseData(BaseModel):
67 | user: Optional[User] = None # is None if RTU session isn't authenticated
68 |
69 |
70 | class WhoAmIResponse(SystemResponse):
71 | event: Literal["whoami_response"] = "whoami_response"
72 | data: WhoAmIResponseData
73 |
74 |
75 | SystemRequests = Annotated[
76 | Union[
77 | AuthenticateRequest,
78 | PingRequest,
79 | WhoAmIRequest,
80 | ],
81 | Field(discriminator="event"),
82 | ]
83 | SystemResponses = Annotated[
84 | Union[
85 | AuthenticateReply,
86 | PingResponse,
87 | WhoAmIResponse,
88 | ],
89 | Field(discriminator="event"),
90 | ]
91 |
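The handshake in the module docstring can be sketched as the first JSON payload an RTU client sends after the websocket opens. Only the fields visible in the models above (`channel`, `event`, `data`) are included; the real `BaseRTURequest` envelope may carry additional fields:

```python
import json


def build_authenticate_request(token: str) -> str:
    # Same JWT as the Authorization bearer token used for API requests;
    # rtu_client_type defaults to "origami" per AuthenticateRequestData.
    message = {
        "channel": "system",
        "event": "authenticate_request",
        "data": {"token": token, "rtu_client_type": "origami"},
    }
    return json.dumps(message)


payload = json.loads(build_authenticate_request("my-jwt"))
```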
--------------------------------------------------------------------------------
/origami/models/rtu/discriminators.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Union
2 |
3 | from pydantic import Field, TypeAdapter
4 |
5 | from origami.models.rtu.base import BaseRTUResponse
6 | from origami.models.rtu.channels.files import FileRequests, FileResponses
7 | from origami.models.rtu.channels.kernels import KernelRequests, KernelResponses
8 | from origami.models.rtu.channels.system import SystemRequests, SystemResponses
9 | from origami.models.rtu.errors import RTUError
10 |
11 | # Use: TypeAdapter(RTURequest).validate_python(...)
12 | RTURequest = Annotated[
13 | Union[
14 | FileRequests,
15 | KernelRequests,
16 | SystemRequests,
17 | ],
18 | Field(discriminator="channel_prefix"),
19 | ]
20 |
21 | # Use: RTUResponseParser.validate_python(...), defined below
22 | # If the payload isn't a normal response by channel/event, will fall back to trying to parse as an
23 | # RTUError (invalid event, invalid data, permission denied) or error out entirely. If it's not an
24 | # error or known model, parse as base response. RTU Client will log a warning for base responses.
25 | RTUResponse = Union[
26 | Annotated[
27 | Union[
28 | FileResponses,
29 | KernelResponses,
30 | SystemResponses,
31 | ],
32 | Field(discriminator="channel_prefix"),
33 | ],
34 | RTUError,
35 | BaseRTUResponse,
36 | ]
37 |
38 |
39 | RTUResponseParser = TypeAdapter(RTUResponse)
40 |
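What the discriminated unions above buy you is single-field dispatch: pydantic inspects `channel_prefix` / `event` and picks the concrete model, falling back through `RTUError` to `BaseRTUResponse`. The same idea in dependency-free Python, with string stand-ins for the models:

```python
from typing import Callable, Dict

# Parsers keyed by the discriminator field ("event"), standing in for the
# concrete pydantic models. Unknown events fall back to a base response,
# mirroring the fallback order described in the comments above.
PARSERS: Dict[str, Callable[[dict], str]] = {
    "ping_response": lambda raw: "PingResponse",
    "whoami_response": lambda raw: "WhoAmIResponse",
    "permission_denied": lambda raw: "PermissionDenied",
}


def parse_rtu_response(raw: dict) -> str:
    handler = PARSERS.get(raw.get("event", ""))
    return handler(raw) if handler else "BaseRTUResponse"
```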
--------------------------------------------------------------------------------
/origami/models/rtu/errors.py:
--------------------------------------------------------------------------------
1 | from typing import Annotated, Literal, Union
2 |
3 | from pydantic import BaseModel, Field
4 |
5 | from origami.models.rtu.base import BaseRTUResponse
6 |
7 |
8 | class ErrorData(BaseModel):
9 | message: str
10 |
11 |
12 | # Error when we send over a request that doesn't match any handlers
13 | class InvalidEvent(BaseRTUResponse):
14 | event: Literal["invalid_event"] = "invalid_event"
15 | data: ErrorData
16 |
17 |
18 | # Error when the payload of our request has a validation error
19 | class InvalidData(BaseRTUResponse):
20 | event: Literal["invalid_data"] = "invalid_data"
21 | data: ErrorData
22 |
23 |
24 | # Error when RTU session isn't authenticated or the request does not pass RBAC checks
25 | class PermissionDenied(BaseRTUResponse):
26 | event: Literal["permission_denied"] = "permission_denied"
27 | data: ErrorData
28 |
29 |
30 | class InconsistentStateEvent(BaseRTUResponse):
31 | event: Literal["inconsistent_state_event"] = "inconsistent_state_event"
32 | data: ErrorData
33 |
34 |
35 | RTUError = Annotated[
36 | Union[
37 | InvalidEvent,
38 | InvalidData,
39 | PermissionDenied,
40 | InconsistentStateEvent,
41 | ],
42 | Field(discriminator="event"),
43 | ]
44 |
--------------------------------------------------------------------------------
/origami/notebook/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/origami/notebook/__init__.py
--------------------------------------------------------------------------------
/origami/notebook/builder.py:
--------------------------------------------------------------------------------
1 | """
2 | The NotebookBuilder is used in applications that need to keep an in-memory representation of a
3 | Notebook and update it with RTU / Delta formatted messages.
4 | """
5 | import collections
6 | import logging
7 | import uuid
8 | from typing import Callable, Dict, Optional, Tuple, Type, Union
9 |
10 | import diff_match_patch
11 | import nbformat
12 | import orjson
13 |
14 | from origami.models.deltas.delta_types.cell_contents import CellContentsReplace, CellContentsUpdate
15 | from origami.models.deltas.delta_types.cell_execute import (
16 | CellExecute,
17 | CellExecuteAfter,
18 | CellExecuteAll,
19 | CellExecuteBefore,
20 | )
21 | from origami.models.deltas.delta_types.cell_metadata import (
22 | NULL_PRIOR_VALUE_SENTINEL,
23 | CellMetadataReplace,
24 | CellMetadataUpdate,
25 | )
26 | from origami.models.deltas.delta_types.cell_output_collection import CellOutputCollectionReplace
27 | from origami.models.deltas.delta_types.nb_cells import NBCellsAdd, NBCellsDelete, NBCellsMove
28 | from origami.models.deltas.delta_types.nb_metadata import NBMetadataUpdate
29 | from origami.models.deltas.discriminators import FileDelta
30 | from origami.models.notebook import Notebook, NotebookCell
31 |
32 | logger = logging.getLogger(__name__)
33 |
34 |
35 | class CellNotFound(Exception):
36 | def __init__(self, cell_id: str):
37 | self.cell_id = cell_id
38 |
39 | def __str__(self):
40 | return f"Exception: Cell {self.cell_id} not found"
41 |
42 |
43 | class NotebookBuilder:
44 | """
45 | Apply RTU File Deltas to an in-memory representation of a Notebook.
46 | """
47 |
48 | def __init__(self, seed_notebook: Notebook):
49 | if not isinstance(seed_notebook, Notebook):
50 | raise TypeError("seed_notebook must be a Pydantic Notebook model")
51 | self._seed_notebook = seed_notebook
52 | self.nb: Notebook = seed_notebook.model_copy()
53 | self.dmp = diff_match_patch.diff_match_patch()
54 |
55 | cell_id_counts = collections.defaultdict(int)
56 | for cell in self.nb.cells:
57 | cell_id_counts[cell.id] += 1
58 | for cell_id, count in cell_id_counts.items():
59 | if count > 1:
60 | logger.warning(f"Found {count} cells with id {cell_id}")
61 |
62 | # RTUClient uses the builder.last_applied_delta_id to figure out whether to apply incoming
63 | # deltas or queue them in an unapplied_deltas list for replay
64 | self.last_applied_delta_id: Optional[uuid.UUID] = None
65 | # to keep track of deleted cells so we can ignore them in future deltas
66 | self.deleted_cell_ids: set[str] = set()
67 |
68 | @property
69 | def cell_ids(self) -> list[str]:
70 | return [cell.id for cell in self.nb.cells]
71 |
72 | @classmethod
73 | def from_nbformat(cls, nb: nbformat.NotebookNode) -> "NotebookBuilder":
74 | """Instantiate a NotebookBuilder from a nbformat NotebookNode"""
75 | notebook = Notebook.model_validate(nb)
76 | return cls(notebook)
77 |
78 | def get_cell(self, cell_id: str) -> Tuple[int, NotebookCell]:
79 | """
80 | Convenience method to return a cell by cell id.
81 | Raises CellNotFound if cell id is not in the Notebook
82 | """
83 | for index, cell in enumerate(self.nb.cells):
84 | if cell.id == cell_id:
85 | return (index, cell)
86 | raise CellNotFound(cell_id)
87 |
88 | def apply_delta(self, delta: FileDelta) -> None:
89 | """
90 | Apply a FileDelta to the NotebookBuilder.
91 | """
92 | handlers: Dict[Type[FileDelta], Callable] = {
93 | NBCellsAdd: self.add_cell,
94 | NBCellsDelete: self.delete_cell,
95 | NBCellsMove: self.move_cell,
96 | CellContentsUpdate: self.update_cell_contents,
97 | CellContentsReplace: self.replace_cell_contents,
98 | CellMetadataUpdate: self.update_cell_metadata,
99 | CellMetadataReplace: self.replace_cell_metadata,
100 | NBMetadataUpdate: self.update_notebook_metadata,
101 | CellOutputCollectionReplace: self.replace_cell_output_collection,
102 | CellExecute: self.log_execute_delta,
103 | CellExecuteAll: self.log_execute_delta,
104 | CellExecuteBefore: self.log_execute_delta,
105 | CellExecuteAfter: self.log_execute_delta,
106 | }
107 | if type(delta) not in handlers:
108 | raise ValueError(f"No handler for {delta.delta_type=}, {delta.delta_action=}")
109 |
110 | handler = handlers[type(delta)]
111 | try:
112 | handler(delta)
113 | self.last_applied_delta_id = delta.id
114 | except Exception:
115 | logger.exception("Error squashing Delta into NotebookBuilder", extra={"delta": delta})
116 | raise
117 |
118 | def add_cell(self, delta: NBCellsAdd):
119 | """
120 | Add a new cell to the Notebook.
121 | - If after_id is specified, add it after that cell. Otherwise at top of Notebook
122 | - cell_id can be specified at higher level delta.properties and should be copied down into
123 | the cell part of the delta.properties
124 | """
125 | cell_id = delta.properties.id
126 | # Warning if we're adding a duplicate cell id
127 | if cell_id in self.cell_ids:
128 | logger.warning(
129 | f"Received NBCellsAdd delta with cell id {cell_id}, duplicate of existing cell"
130 | )
131 | new_cell = delta.properties.cell
132 | # Push "delta.properties.id" down into cell id ...
133 | new_cell.id = cell_id
134 | if delta.properties.after_id:
135 | index, _ = self.get_cell(delta.properties.after_id)
136 | self.nb.cells.insert(index + 1, new_cell)
137 | else:
138 | self.nb.cells.insert(0, new_cell)
139 |
140 | def delete_cell(self, delta: NBCellsDelete):
141 | """Delete a cell from the Notebook. Raises CellNotFound if the cell id is not present."""
142 | cell_id = delta.properties.id
143 | index, _ = self.get_cell(cell_id)
144 | self.nb.cells.pop(index)
145 | self.deleted_cell_ids.add(cell_id)
146 |
147 | def move_cell(self, delta: NBCellsMove):
148 | """Moves a cell from one position to another in the Notebook"""
149 | cell_id = delta.properties.id
150 | index, _ = self.get_cell(cell_id)
151 | cell_to_move = self.nb.cells.pop(index)
152 | if delta.properties.after_id:
153 | target_index, _ = self.get_cell(delta.properties.after_id)
154 | self.nb.cells.insert(target_index + 1, cell_to_move)
156 | else:
157 | self.nb.cells.insert(0, cell_to_move)
158 |
159 | def update_cell_contents(self, delta: CellContentsUpdate):
160 | """Update cell content using the diff-match-patch algorithm"""
161 | patches = self.dmp.patch_fromText(delta.properties.patch)
162 | _, cell = self.get_cell(delta.resource_id)
163 | merged_text = self.dmp.patch_apply(patches, cell.source)[0]
164 | cell.source = merged_text
165 |
166 | def replace_cell_contents(self, delta: CellContentsReplace):
167 | """Pure replacement of cell source content"""
168 | _, cell = self.get_cell(delta.resource_id)
169 | cell.source = delta.properties.source
170 |
171 | def update_notebook_metadata(self, delta: NBMetadataUpdate):
172 | """Update top-level Notebook metadata using a partial update / nested path technique"""
173 | # Need to traverse the Notebook metadata dictionary by a list of keys.
174 | # If that key isn't there already, create it with value of empty dict
175 | # e.g. path=['foo', 'bar', 'baz'], value='xyz' needs to set
176 | # self.nb.metadata['foo']['bar']['baz'] = 'xyz'
177 | # and add those nested keys into metadata if they don't exist already
178 | dict_path = self.nb.metadata
179 | for leading_key in delta.properties.path[:-1]:
180 | if leading_key not in dict_path:
181 | dict_path[leading_key] = {}
182 | dict_path = dict_path[leading_key]
183 |
184 | last_key = delta.properties.path[-1]
185 | if (
186 | last_key in dict_path
187 | and delta.properties.prior_value
188 | and delta.properties.prior_value != NULL_PRIOR_VALUE_SENTINEL
189 | and dict_path[last_key] != delta.properties.prior_value
190 | ):
191 | logger.warning(
192 | f"Notebook metadata path {delta.properties.path} expected to have prior value {delta.properties.prior_value} but was {dict_path[last_key]}" # noqa: E501
193 | )
194 |
195 | dict_path[last_key] = delta.properties.value
196 |
197 | def update_cell_metadata(self, delta: CellMetadataUpdate):
198 | """Update cell metadata using a partial update / nested path technique"""
199 | if delta.resource_id in self.deleted_cell_ids:
200 | logger.debug(
201 | f"Skipping update_cell_metadata for deleted cell {delta.resource_id}",
202 | extra={"delta_properties_path": delta.properties.path},
203 | )
204 | return
205 |
206 | try:
207 | _, cell = self.get_cell(delta.resource_id)
208 | except CellNotFound:
209 | # Most often happens when a User deletes a cell that's in the process of being executed,
210 | # and we end up emitting a cell execution timing metadata as it gets deleted
211 | logger.warning(
212 | "Got update_cell_metadata for cell that isn't in notebook or deleted_cell_ids", # noqa: E501
213 | extra={"delta_properties_path": delta.properties.path},
214 | )
215 | return
216 |
217 | # see comment in update_notebook_metadata explaining dictionary traversal
218 | dict_path = cell.metadata
219 | for leading_key in delta.properties.path[:-1]:
220 | if leading_key not in dict_path:
221 | dict_path[leading_key] = {}
222 | dict_path = dict_path[leading_key]
223 |
224 | last_key = delta.properties.path[-1]
225 | if (
226 | last_key in dict_path
227 | and delta.properties.prior_value
228 | and delta.properties.prior_value != NULL_PRIOR_VALUE_SENTINEL
229 | and str(dict_path[last_key]) != str(delta.properties.prior_value)
230 | ):
231 | logger.warning(
232 | f"Cell {cell.id} metadata path {delta.properties.path} expected to have prior value {delta.properties.prior_value} but was {dict_path[last_key]}" # noqa: E501
233 | )
234 |
235 | dict_path[last_key] = delta.properties.value
236 |
237 | def replace_cell_metadata(self, delta: CellMetadataReplace):
238 | """Switch a cell type between code / markdown or change cell language (e.g. Python to R)"""
239 | _, cell = self.get_cell(delta.resource_id)
240 |
241 | if delta.properties.type:
242 | cell.cell_type = delta.properties.type
243 | if delta.properties.language:
244 | if "noteable" not in cell.metadata:
245 | cell.metadata["noteable"] = {}
246 | cell.metadata["noteable"]["cell_type"] = delta.properties.language
247 |
248 | def replace_cell_output_collection(self, delta: CellOutputCollectionReplace):
249 | """Update cell metadata to point to an Output Collection container id"""
250 | if delta.resource_id in self.deleted_cell_ids:
251 | logger.warning(
252 | f"Skipping replace_cell_output_collection for deleted cell {delta.resource_id}"
253 | )
254 | return
255 |
256 | try:
257 | _, cell = self.get_cell(delta.resource_id)
258 | except CellNotFound:
259 | logger.warning(
260 | "Got replace_cell_output_collection for cell that isn't in notebook or deleted_cell_ids", # noqa: E501
261 | )
262 | return
263 |
264 | if "noteable" not in cell.metadata:
265 | cell.metadata["noteable"] = {}
266 | cell.metadata["noteable"]["output_collection_id"] = delta.properties.output_collection_id
267 |
268 | def log_execute_delta(
269 | self, delta: Union[CellExecute, CellExecuteBefore, CellExecuteAfter, CellExecuteAll]
270 | ):
271 | """Handles delta_type: execute, delta_action: execute | execute_all"""
272 | logger.debug(
273 | "Squashing execute delta",
274 | extra={"delta_type": delta.delta_type, "delta_action": delta.delta_action},
275 | )
277 |
278 | def dumps(self, indent: bool = True) -> bytes:
279 | """
280 | Serialize the in-memory Notebook to JSON.
281 | """
282 | if indent:
283 | return orjson.dumps(self.nb.model_dump(exclude_unset=True), option=orjson.OPT_INDENT_2)
284 | else:
285 | return orjson.dumps(self.nb.model_dump(exclude_unset=True))
286 |
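The nested-path traversal used by `update_notebook_metadata` and `update_cell_metadata` can be exercised standalone. `set_nested` here is a hypothetical helper extracted for illustration, not a function in the library:

```python
from typing import Any, List


def set_nested(metadata: dict, path: List[str], value: Any) -> None:
    # Walk all but the last key, creating empty dicts for missing keys,
    # then assign the value at the final key - the same traversal the
    # builder methods above perform on notebook/cell metadata.
    node = metadata
    for key in path[:-1]:
        if key not in node:
            node[key] = {}
        node = node[key]
    node[path[-1]] = value


meta: dict = {}
set_nested(meta, ["foo", "bar", "baz"], "xyz")
# meta == {"foo": {"bar": {"baz": "xyz"}}}
```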
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | # NOTE: you have to use single-quoted strings in TOML for regular expressions.
2 | # It's the equivalent of r-strings in Python. Multiline strings are treated as
3 | # verbose regular expressions by Black. Use [ ] to denote a significant space
4 | # character.
5 |
6 | [tool.poetry]
7 | name = "noteable-origami"
8 | version = "2.0.0"
9 | description = "The Noteable API interface"
10 | authors = ["Matt Seal "]
11 | maintainers = ["Matt Seal "]
12 | license = "BSD-3-Clause"
13 | readme = "README.md"
14 | repository = "https://github.com/noteable-io/origami"
15 | # old setup.cfg had a bdist_wheel option.
16 | # To build a wheel, use poetry build -f wheel
17 | keywords = ["notebook", "api", "noteable"]
18 | classifiers=[
19 | 'Intended Audience :: Developers',
20 | 'License :: OSI Approved :: BSD License',
21 | 'Programming Language :: Python',
22 | 'Programming Language :: Python :: 3.8',
23 | 'Programming Language :: Python :: 3.9',
24 | 'Programming Language :: Python :: 3.10',
25 | ]
26 | packages = [
27 | { include = "origami" },
28 | ]
29 |
30 | # Manifest.in is subsumed by poetry here
31 | # https://python-poetry.org/docs/pyproject/#include-and-exclude
32 | include = []
33 |
34 |
35 | [tool.poetry.dependencies]
36 | python = ">=3.8,<4.0"
37 | bitmath = "^1.3.3"
38 | httpx = ">=0.22"
39 | jwt = "^1.3.1"
40 | nbformat = "^5.4.0"
41 | orjson = "^3.8.7"
42 | pydantic = "^2.4.2"
43 | websockets = ">=11.0"
44 | backoff = "^2.1.2"
45 | cryptography = ">=40.0"
46 | diff-match-patch = "^20200713"
47 | sending = "^0.3.0"
48 | importlib-metadata = ">=6.8.0"
49 | structlog = { version = "*", optional = true }
50 | typer = { version = "^0.9.0", optional = true }
51 |
52 | [tool.poetry.extras]
53 | cli = ["structlog", "typer"]
54 |
55 | [tool.poetry.scripts]
56 | origami = "origami.cli:app"
57 |
58 | [tool.poetry.group.dev.dependencies]
59 | pytest = "^7.2.1"
60 | pytest-cov = "^4.0.0"
61 | black = "^23.1.0"
62 | isort = "^5.12.0"
63 | flake8-docstrings = "^1.6.0"
64 | notebook = "^6.4.11"
65 | pytest-asyncio = "^0.19.0"
66 | structlog = "^23.1.0"
67 |
68 | [build-system]
69 | requires = ["poetry-core>=1.0.0"]
70 | build-backend = "poetry.core.masonry.api"
71 |
72 | [tool.black]
73 | line-length = 100
74 | include = '\.pyi?$'
75 | exclude = '''
76 | /(
77 | \.git
78 | | \.hg
79 | | \.mypy_cache
80 | | \.tox
81 | | \.venv
82 | | _build
83 | | buck-out
84 | | build
85 | | dist
86 | | migrations
87 | | src/nbformat
88 |
89 | # The following are specific to Black, you probably don't want those.
90 | | blib2to3
91 | | tests/data
92 | | profiling
93 | )/
94 | '''
95 |
96 | [tool.isort]
97 | line_length = 100
98 | multi_line_output = 3
99 | include_trailing_comma = true
100 | known_third_party = []
101 |
102 | [tool.ruff]
103 | line-length = 100
104 |
105 | [tool.coverage.run]
106 | branch = false
107 | omit = ["origami/_version.py", "*/tests/*"]
108 |
109 | [tool.coverage.report]
110 | exclude_lines = ["if self.debug:",
111 | "pragma: no cover",
112 | "raise AssertionError",
113 | "raise NotImplementedError",
114 | "if __name__ == '__main__':"]
115 |
116 | [tool.pytest.ini_options]
117 | testpaths = [
118 | "origami/tests",
119 | ]
120 | # https://pytest-asyncio.readthedocs.io/en/latest/reference/configuration.html#configuration
121 | asyncio_mode = "auto"
122 |
--------------------------------------------------------------------------------
/setup.cfg:
--------------------------------------------------------------------------------
1 |
2 | [flake8]
3 | # References:
4 | # https://flake8.readthedocs.io/en/latest/user/configuration.html
5 | # https://flake8.readthedocs.io/en/latest/user/error-codes.html
6 |
7 | # Note: there cannot be spaces after commas here
8 | exclude =
9 | __init__.py,
10 | origami/tests
11 | ignore =
12 | # Extra space in brackets
13 | E20,
14 | # Multiple spaces around ","
15 | E231,E241,
16 | # Comments
17 | E26,
18 | # Import formatting
19 | E4,
20 | # Comparing types instead of isinstance
21 | E721,
22 | # Assigning lambda expression
23 | E731,
24 | # Do not use variables named ‘l’, ‘O’, or ‘I’
25 | E741,
26 | # Long descriptions can trigger this one erroneously
27 | D205,
28 | # First line period ends can be ignored
29 | D400,
30 | # Prose of docstrings doesn't need to be this strict
31 | D401,
32 | # line breaks before binary operators
33 | W503
34 | max-line-length = 120
35 | max-complexity = 23
36 |
37 | [bdist_wheel]
38 | universal=0
39 |
40 | [coverage:run]
41 | branch = False
42 | omit =
43 | origami/tests/*
44 | origami/version.py
45 |
46 | [coverage:report]
47 | exclude_lines =
48 | if self\.debug:
49 | pragma: no cover
50 | raise AssertionError
51 | raise NotImplementedError
52 | if __name__ == .__main__.:
53 | ignore_errors = True
54 | omit = origami/tests/*,origami/version.py
55 |
56 | [tool:pytest]
57 | filterwarnings = always
58 |
--------------------------------------------------------------------------------
/tests/e2e/api/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/tests/e2e/api/__init__.py
--------------------------------------------------------------------------------
/tests/e2e/api/test_files.py:
--------------------------------------------------------------------------------
1 | import uuid
2 |
3 | import httpx
4 |
5 | from origami.clients.api import APIClient
6 | from origami.models.api.files import File
7 |
8 |
9 | async def test_file_crud(api_client: APIClient, test_project_id):
10 | name = uuid.uuid4().hex + ".txt"
11 | content = b"foo\nbar\nbaz"
12 | new_file = await api_client.create_file(project_id=test_project_id, path=name, content=content)
13 | assert new_file.filename == name
14 | assert new_file.project_id == test_project_id
15 |
16 | # on create, Gate does not give back a presigned url, need to request it.
17 | f: File = await api_client.get_file(new_file.id)
18 | assert f.presigned_download_url is not None
19 |
20 | async with httpx.AsyncClient() as plain_client:
21 | resp = await plain_client.get(f.presigned_download_url)
22 | assert resp.status_code == 200
23 | assert resp.content == content
24 |
25 | # Delete file
26 | deleted_file = await api_client.delete_file(new_file.id)
27 | assert deleted_file.id == new_file.id
28 | assert deleted_file.deleted_at is not None
29 |
30 |
31 | async def test_get_file_version(api_client: APIClient, notebook_maker):
32 | f: File = await notebook_maker()
33 | versions = await api_client.get_file_versions(f.id)
34 | assert len(versions) == 1
35 | assert versions[0].file_id == f.id
36 | # The three key fields are id (version id), number, and presigned url to download content
37 | assert versions[0].id is not None
38 | assert versions[0].number == 0
39 | assert versions[0].content_presigned_url is not None
40 |
41 | # Trigger a version save -- something needs to change (i.e. make a delta) or save as named
42 | endpoint = f"/v1/files/{f.id}/versions"
43 | resp = await api_client.client.post(endpoint, json={"name": "foo"})
44 | assert resp.status_code == 201
45 |
46 | new_versions = await api_client.get_file_versions(f.id)
47 | assert new_versions[0].number == 1
48 |
49 | assert len(new_versions) == 2
50 |
--------------------------------------------------------------------------------
/tests/e2e/api/test_projects.py:
--------------------------------------------------------------------------------
1 | import uuid
2 |
3 | from origami.clients.api import APIClient
4 | from origami.models.api.files import File
5 | from origami.models.api.projects import Project
6 |
7 |
8 | async def test_project_crud(api_client: APIClient, test_space_id: uuid.UUID):
9 | name = 'test-project-' + str(uuid.uuid4())
10 | project = await api_client.create_project(name=name, space_id=test_space_id)
11 | assert isinstance(project, Project)
12 | assert project.name == name
13 |
14 | existing_project = await api_client.get_project(project.id)
15 | assert existing_project.id == project.id
16 | assert existing_project.name == name
17 |
18 | deleted_project = await api_client.delete_project(project.id)
19 | assert deleted_project.id == project.id
20 | assert deleted_project.deleted_at is not None
21 |
22 |
23 | async def test_list_project_files(
24 | api_client: APIClient, test_project_id: uuid.UUID, file_maker, notebook_maker
25 | ):
26 | salt = str(uuid.uuid4())
27 | flat_file: File = await file_maker(test_project_id, f'flat-file-{salt}.txt', b'flat file')
28 | notebook: File = await notebook_maker(test_project_id, f'nested/notebook-{salt}.ipynb')
29 | file_list = await api_client.list_project_files(test_project_id)
30 | assert len(file_list) > 0
31 | assert isinstance(file_list[0], File)
32 | file_ids = [f.id for f in file_list]
33 | assert flat_file.id in file_ids
34 | assert notebook.id in file_ids
35 |
--------------------------------------------------------------------------------
/tests/e2e/api/test_spaces.py:
--------------------------------------------------------------------------------
1 | import uuid
2 |
3 | from origami.clients.api import APIClient
4 | from origami.models.api.projects import Project
5 | from origami.models.api.spaces import Space
6 |
7 |
8 | async def test_space_crud(api_client: APIClient):
9 | name = "test-space-" + str(uuid.uuid4())
10 | space = await api_client.create_space(name=name)
11 | assert isinstance(space, Space)
12 | assert space.name == name
13 |
14 | existing_space = await api_client.get_space(space.id)
15 | assert existing_space.id == space.id
16 | assert existing_space.name == name
17 |
18 | await api_client.delete_space(space.id)
19 |
20 |
21 | async def test_list_space_projects(
22 | api_client: APIClient,
23 | test_space_id: uuid.UUID,
24 | new_project: Project,
25 | ):
26 | projects = await api_client.list_space_projects(test_space_id)
27 | assert len(projects) > 0
28 | assert isinstance(projects[0], Project)
29 | assert new_project.id in [p.id for p in projects]
30 |
--------------------------------------------------------------------------------
/tests/e2e/api/test_users.py:
--------------------------------------------------------------------------------
1 | from origami.clients.api import APIClient
2 | from origami.models.api.users import User
3 |
4 |
5 | async def test_users_me(api_client: APIClient) -> None:
6 | user: User = await api_client.user_info()
7 | assert isinstance(user, User)
8 | assert user.id is not None
9 |
--------------------------------------------------------------------------------
/tests/e2e/conftest.py:
--------------------------------------------------------------------------------
1 | import logging
2 | import logging.config
3 | import os
4 | import uuid
5 | from typing import Optional
6 |
7 | import httpx
8 | import pytest
9 | import structlog
10 |
11 | from origami.clients.api import APIClient
12 | from origami.models.api.files import File
13 | from origami.models.api.projects import Project
14 | from origami.models.notebook import Notebook
15 |
16 | logger = structlog.get_logger()
17 |
18 |
19 | @pytest.fixture(autouse=True, scope='session')
20 | def setup_logging():
21 | """Configure structlog in tests the same way we do in production apps"""
22 | structlog.configure(
23 | processors=[
24 | structlog.stdlib.PositionalArgumentsFormatter(),
25 | structlog.processors.StackInfoRenderer(),
26 | structlog.processors.format_exc_info,
27 | structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
28 | ],
29 | logger_factory=structlog.stdlib.LoggerFactory(),
30 | wrapper_class=structlog.stdlib.BoundLogger,
31 | cache_logger_on_first_use=True,
32 | )
33 |
34 | # shared processors to be applied to both vanilla and structlog messages
35 | # after each is appropriately pre-processed
36 | processors = [
37 | # log level / logger name, effects coloring in ConsoleRenderer(colors=True)
38 | structlog.stdlib.add_log_level,
39 | structlog.stdlib.add_logger_name,
40 | # timestamp format
41 | structlog.processors.TimeStamper(fmt="iso"),
42 | # To see all CallsiteParameterAdder options:
43 | # https://www.structlog.org/en/stable/api.html#structlog.processors.CallsiteParameterAdder
44 | # more options include module, pathname, process, process_name, thread, thread_name
45 | structlog.processors.CallsiteParameterAdder(
46 | {
47 | structlog.processors.CallsiteParameter.FILENAME,
48 | structlog.processors.CallsiteParameter.FUNC_NAME,
49 | structlog.processors.CallsiteParameter.LINENO,
50 | }
51 | ),
52 | # Any structlog.contextvars.bind_contextvars included in middleware/functions
53 | structlog.contextvars.merge_contextvars,
54 | # strip _record and _from_structlog keys from event dictionary
55 | structlog.stdlib.ProcessorFormatter.remove_processors_meta,
56 | structlog.dev.ConsoleRenderer(colors=True),
57 | # ^^ In prod with any kind of logging service (datadog, grafana, etc), ConsoleRenderer
58 | # would probably be replaced with structlog.processors.JSONRenderer() or similar
59 | ]
60 |
61 | # Configs applied to logs generated by structlog or vanilla logging
62 | logging.config.dictConfig(
63 | {
64 | "version": 1,
65 | "disable_existing_loggers": False,
66 | "formatters": {
67 | "default": {
68 | "()": structlog.stdlib.ProcessorFormatter,
69 | "processors": processors,
70 | "foreign_pre_chain": [structlog.stdlib.ExtraAdder()],
71 | },
72 | },
73 | "handlers": {
74 | "default": {
75 | "class": "logging.StreamHandler",
76 | "formatter": "default",
77 | "stream": "ext://sys.stdout",
78 | },
79 | },
80 | "loggers": {
81 | # "" for applying handler to "root" (all libraries)
82 | # you could set this to "origami" to only see logs from this library
83 | "": {
84 | "handlers": ["default"],
85 | "level": 'INFO',
86 | "propagate": True,
87 | },
88 | },
89 | }
90 | )
91 |
92 |
93 | # hardcoded values below are just what @kafonek was using in local skaffold
94 | @pytest.fixture
95 | def test_space_id() -> uuid.UUID:
96 | return uuid.UUID(os.environ.get('TEST_SPACE_ID', '1ecc737e-0252-49a1-af9b-a0a400db5888'))
97 |
98 |
99 | @pytest.fixture
100 | def test_project_id() -> uuid.UUID:
101 | return uuid.UUID(os.environ.get('TEST_PROJECT_ID', 'a752faf4-bbc7-4fe1-9c5f-be92394e48a2'))
102 |
103 |
104 | @pytest.fixture
105 | def test_user_id() -> uuid.UUID:
106 | return uuid.UUID(os.environ.get('TEST_USER_ID', '9eb39719-4fc1-44de-9155-3edaeb32ce2c'))
107 |
108 |
109 | class LogWarningTransport(httpx.AsyncHTTPTransport):
110 | """
111 |     Automatically log details of any error (4xx/5xx) response.
112 | """
113 |
114 | async def handle_async_request(self, request: httpx.Request) -> httpx.Response:
115 | resp = await super().handle_async_request(request)
116 | if resp.is_error:
117 | response_content = await resp.aread()
118 | logger.warning(f'{request.method} {request.url} {resp.status_code} {response_content}')
119 | return resp
120 |
121 |
122 | @pytest.fixture
123 | def api_client() -> APIClient:
124 | if "NOTEABLE_API_URL" not in os.environ:
125 | logger.warning("Using default (prod) Noteable API, did you mean to set NOTEABLE_API_URL?")
126 | return APIClient(transport=LogWarningTransport())
127 |
128 |
129 | @pytest.fixture
130 | async def new_project(api_client: APIClient, test_space_id: uuid.UUID) -> Project:
131 | """Create and cleanup a new Project"""
132 | name = 'test-project-' + str(uuid.uuid4())
133 | new_project = await api_client.create_project(name=name, space_id=test_space_id)
134 | yield new_project
135 | await api_client.delete_project(new_project.id)
136 |
137 |
138 | @pytest.fixture
139 | async def file_maker(api_client: APIClient, test_project_id: uuid.UUID):
140 | """Create and cleanup non-Notebook files"""
141 | file_ids = []
142 |
143 | async def make_file(
144 | project_id: Optional[uuid.UUID] = None, path: Optional[str] = None, content: bytes = b""
145 | ) -> File:
146 | if not project_id:
147 | project_id = test_project_id
148 | if not path:
149 | salt = str(uuid.uuid4())
150 | path = f'test-file-{salt}.txt'
151 | file = await api_client.create_file(project_id, path, content)
152 | file_ids.append(file.id)
153 | return file
154 |
155 | yield make_file
156 | for file_id in file_ids:
157 | await api_client.delete_file(file_id)
158 |
159 |
160 | @pytest.fixture
161 | async def notebook_maker(api_client: APIClient, test_project_id: uuid.UUID):
162 | """Create and cleanup Notebook files"""
163 | notebook_ids = []
164 |
165 | async def make_notebook(
166 | project_id: Optional[uuid.UUID] = None,
167 | path: Optional[str] = None,
168 | notebook: Optional[Notebook] = None,
169 | ) -> File:
170 | if not project_id:
171 | project_id = test_project_id
172 | if not path:
173 | salt = str(uuid.uuid4())
174 | path = f'test-notebook-{salt}.ipynb'
175 | file = await api_client.create_notebook(project_id, path, notebook)
176 | notebook_ids.append(file.id)
177 | return file
178 |
179 | yield make_notebook
180 | for notebook_id in notebook_ids:
181 | await api_client.delete_file(notebook_id)
182 |
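The dictConfig pattern at the top of this conftest (a single handler on the root `""` logger, catching records from both structlog and vanilla logging) can be exercised with a stdlib-only sketch. structlog's `ProcessorFormatter`/`ConsoleRenderer` are swapped for a plain stdlib formatter here, and the stream is a `StringIO` so the output can be inspected; this is an illustration of the routing, not the library's actual setup:

```python
import io
import logging
import logging.config

# Capture handler output in memory instead of sys.stdout.
stream = io.StringIO()
logging.config.dictConfig(
    {
        "version": 1,
        "disable_existing_loggers": False,
        "formatters": {
            "plain": {"format": "%(levelname)s %(name)s %(message)s"},
        },
        "handlers": {
            "default": {
                "class": "logging.StreamHandler",
                "formatter": "plain",
                # dictConfig passes non-string values (like this StringIO)
                # through to the handler constructor as-is
                "stream": stream,
            },
        },
        "loggers": {
            # "" is the root logger: records from every library propagate here
            "": {"handlers": ["default"], "level": "INFO", "propagate": True},
        },
    }
)

# A log from any named logger flows up to the root handler.
logging.getLogger("kernel_sidecar").info("hello")
assert "INFO kernel_sidecar hello" in stream.getvalue()
```

Narrowing `""` to a specific logger name would restrict the handler to that library's records, as the comment in the conftest notes.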
--------------------------------------------------------------------------------
/tests/e2e/rtu/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/noteable-io/origami/09a007167c304ab6fd119809b130692aa4bf456f/tests/e2e/rtu/__init__.py
--------------------------------------------------------------------------------
/tests/e2e/rtu/test_execution.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 |
3 | import pytest
4 |
5 | from origami.clients.api import APIClient
6 | from origami.clients.rtu import RTUClient
7 | from origami.models.api.files import File
8 | from origami.models.api.outputs import KernelOutputCollection
9 | from origami.models.kernels import KernelSession
10 | from origami.models.notebook import CodeCell, MarkdownCell, Notebook
11 |
12 |
13 | async def test_single_cell(api_client: APIClient, notebook_maker):
14 | notebook = Notebook(cells=[CodeCell(id="cell_1", source='print("hello world"); 2 + 2')])
15 | file: File = await notebook_maker(notebook=notebook)
16 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
17 | await asyncio.sleep(2)
18 |
19 | rtu_client: RTUClient = await api_client.connect_realtime(file)
20 | assert rtu_client.builder.nb.cells == notebook.cells
21 |
22 | kernel_session: KernelSession = await api_client.launch_kernel(file.id, kernel_name="python3")
23 | await rtu_client.wait_for_kernel_idle()
24 |
25 | queued_execution = await rtu_client.queue_execution("cell_1")
26 | # Assert cell_1 output collection has multiple outputs
27 | cell_1_fut = list(queued_execution)[0]
28 | cell: CodeCell = await cell_1_fut # wait for cell_1 to be done
29 | output_collection: KernelOutputCollection = await api_client.get_output_collection(
30 | cell.output_collection_id
31 | )
32 | try:
33 | assert len(output_collection.outputs) == 2
34 | assert output_collection.outputs[0].content.raw == "hello world\n"
35 | assert output_collection.outputs[1].content.raw == "4"
36 | finally:
37 | await rtu_client.shutdown()
38 | await api_client.shutdown_kernel(kernel_session.id)
39 |
40 |
41 | async def test_run_all(api_client: APIClient, notebook_maker):
42 | notebook = Notebook(
43 | cells=[
44 | CodeCell(id="cell_1", source="2 + 2"),
45 | MarkdownCell(source="## a header"),
46 | CodeCell(id="cell_2", source="3 + 3"),
47 | ]
48 | )
49 | file: File = await notebook_maker(notebook=notebook)
50 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
51 | await asyncio.sleep(2)
52 |
53 | rtu_client: RTUClient = await api_client.connect_realtime(file)
54 | assert rtu_client.builder.nb.cells == notebook.cells
55 |
56 | kernel_session: KernelSession = await api_client.launch_kernel(file.id)
57 | await rtu_client.wait_for_kernel_idle()
58 |
59 | queued_execution = await rtu_client.queue_execution(run_all=True)
60 | # should only get two futures back, one for each code cell
61 | assert len(queued_execution) == 2
62 | cells = await asyncio.gather(*queued_execution)
63 | cell1_output = await api_client.get_output_collection(cells[0].output_collection_id)
64 | cell2_output = await api_client.get_output_collection(cells[1].output_collection_id)
65 | try:
66 | assert cell1_output.outputs[0].content.raw == "4"
67 | assert cell2_output.outputs[0].content.raw == "6"
68 | finally:
69 | await rtu_client.shutdown()
70 | await api_client.shutdown_kernel(kernel_session.id)
71 |
72 |
73 | async def test_execution_request_err_if_no_kernel_started(api_client: APIClient, notebook_maker):
74 | notebook = Notebook(cells=[CodeCell(id="cell_1", source="2 + 2")])
75 | file: File = await notebook_maker(notebook=notebook)
76 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
77 | await asyncio.sleep(2)
78 |
79 | rtu_client: RTUClient = await api_client.connect_realtime(file)
80 | assert rtu_client.builder.nb.cells == notebook.cells
81 |
82 | with pytest.raises(RuntimeError):
83 | await rtu_client.queue_execution("cell_1")
84 |
--------------------------------------------------------------------------------
/tests/e2e/rtu/test_notebook.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import uuid
3 |
4 | import httpx
5 | import pytest
6 |
7 | from origami.clients.api import APIClient
8 | from origami.clients.rtu import RTUClient
9 | from origami.models.api.files import File
10 | from origami.notebook.builder import CellNotFound
11 |
12 |
13 | async def test_add_and_remove_cell(api_client: APIClient, notebook_maker):
14 | file: File = await notebook_maker()
15 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
16 | await asyncio.sleep(2)
17 | rtu_client: RTUClient = await api_client.connect_realtime(file)
18 | try:
19 | assert rtu_client.builder.nb.cells == []
20 |
21 | cell = await rtu_client.add_cell(source='print("hello world")')
22 | assert cell.cell_type == "code"
23 | assert cell.id in rtu_client.cell_ids
24 |
25 | await rtu_client.delete_cell(cell.id)
26 | with pytest.raises(CellNotFound):
27 | rtu_client.builder.get_cell(cell.id)
28 | finally:
29 | await rtu_client.shutdown()
30 |
31 |
32 | async def test_change_cell_type(api_client: APIClient, notebook_maker):
33 | file: File = await notebook_maker()
34 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
35 | await asyncio.sleep(2)
36 | rtu_client: RTUClient = await api_client.connect_realtime(file)
37 | try:
38 | assert rtu_client.builder.nb.cells == []
39 |
40 | source_cell = await rtu_client.add_cell(source="1 + 1")
41 | _, cell = rtu_client.builder.get_cell(source_cell.id)
42 | assert cell.cell_type == "code"
43 |
44 | await rtu_client.change_cell_type(cell.id, "markdown")
45 | _, cell = rtu_client.builder.get_cell(source_cell.id)
46 | assert cell.cell_type == "markdown"
47 |
48 | await rtu_client.change_cell_type(cell.id, "sql")
49 | _, cell = rtu_client.builder.get_cell(source_cell.id)
50 | assert cell.cell_type == "code"
51 | assert cell.is_sql_cell
52 | finally:
53 | await rtu_client.shutdown()
54 |
55 |
56 | async def test_update_cell_content(api_client: APIClient, notebook_maker):
57 | file: File = await notebook_maker()
58 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
59 | await asyncio.sleep(2)
60 | rtu_client: RTUClient = await api_client.connect_realtime(file)
61 | try:
62 | assert rtu_client.builder.nb.cells == []
63 |
64 | source_cell = await rtu_client.add_cell(source="1 + 1")
65 | _, cell = rtu_client.builder.get_cell(source_cell.id)
66 |
67 | cell = await rtu_client.update_cell_content(cell.id, "@@ -1,5 +1,5 @@\n-1 + 1\n+2 + 2\n")
68 | assert cell.source == "2 + 2"
69 | finally:
70 | await rtu_client.shutdown()
71 |
72 |
73 | async def test_replace_cell_content(api_client: APIClient, notebook_maker):
74 | file: File = await notebook_maker()
75 | # TODO: remove sleep when Gate stops permission denied on newly created files (db time-travel)
76 | await asyncio.sleep(2)
77 | rtu_client: RTUClient = await api_client.connect_realtime(file)
78 | try:
79 | assert rtu_client.builder.nb.cells == []
80 |
81 | source_cell = await rtu_client.add_cell(source="1 + 1")
82 | _, cell = rtu_client.builder.get_cell(source_cell.id)
83 |
84 | cell = await rtu_client.replace_cell_content(cell.id, "2 + 2")
85 | assert cell.source == "2 + 2"
86 | finally:
87 | await rtu_client.shutdown()
88 |
89 |
90 | async def test_connect_bad_file_id(api_client: APIClient):
91 | with pytest.raises(httpx.HTTPStatusError) as e:
92 | await api_client.connect_realtime(file=uuid.uuid4())
93 | assert e.value.response.status_code == 404
94 |
--------------------------------------------------------------------------------
/tests/unit/models/conftest.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | import pytest
4 |
5 |
6 | @pytest.fixture
7 | def tmp_noteable_url_environ() -> str:
8 | orig_value = os.environ.get("PUBLIC_NOTEABLE_URL", "")
9 |
10 | new_value = "https://localhost/api"
11 | os.environ["PUBLIC_NOTEABLE_URL"] = new_value
12 |
13 | yield new_value
14 |
15 | os.environ["PUBLIC_NOTEABLE_URL"] = orig_value
16 |
--------------------------------------------------------------------------------
/tests/unit/models/test_files.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from uuid import uuid4
3 |
4 | from origami.models.api.files import File
5 |
6 |
7 | class TestFile:
8 | def test_construct_url(self, tmp_noteable_url_environ: str):
9 | file = File(
10 | id=uuid4(),
11 | created_at=datetime.now(),
12 | updated_at=datetime.now(),
13 | filename="foo.txt",
14 | path="/etc/foo.txt",
15 | project_id=uuid4(),
16 | space_id=uuid4(),
17 | size=12,
18 | mimetype="text/plain",
19 | type="file",
20 | current_version_id=uuid4(),
21 | presigned_download_url="https://foo.bar/blat",
22 | )
23 |
24 | assert file.url == f"{tmp_noteable_url_environ}/f/{file.id}/{file.path}"
25 |
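The URL this test pins down is `<PUBLIC_NOTEABLE_URL>/f/<id>/<path>`. A hypothetical standalone helper (the real logic lives on `origami.models.api.files.File.url`; the default base below is an assumption) would look like:

```python
import os
import uuid

def file_url(file_id: uuid.UUID, path: str) -> str:
    """Sketch of File.url construction: '<base>/f/<id>/<path>', where the
    base comes from the PUBLIC_NOTEABLE_URL env var (default is assumed)."""
    base = os.environ.get("PUBLIC_NOTEABLE_URL", "https://app.noteable.io")
    return f"{base}/f/{file_id}/{path}"

os.environ["PUBLIC_NOTEABLE_URL"] = "https://localhost/api"
fid = uuid.uuid4()
assert file_url(fid, "notebooks/foo.txt") == f"https://localhost/api/f/{fid}/notebooks/foo.txt"
```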
--------------------------------------------------------------------------------
/tests/unit/models/test_notebook.py:
--------------------------------------------------------------------------------
1 | from uuid import uuid4
2 |
3 | import pytest
4 |
5 | from origami.models.notebook import CellBase, StreamOutput
6 |
7 |
8 | class TestStreamOutput:
9 | @pytest.mark.parametrize("text_value", [["this", "is", "multiline"], "this\nis\nmultiline"])
10 | def test_multiline_text(self, text_value):
11 | output = StreamOutput(name="output", text=text_value)
12 |
13 | assert output.text == "this\nis\nmultiline"
14 |
15 |
16 | class TestCellBase:
17 | @pytest.mark.parametrize("source_value", [["this", "is", "multiline"], "this\nis\nmultiline"])
18 | def test_multiline_source(self, source_value):
19 | cell = CellBase(id=str(uuid4()), source=source_value)
20 |
21 | assert cell.source == "this\nis\nmultiline"
22 |
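Both tests above assert that notebook text fields accept either a list of lines or a single string and normalize to one newline-joined string. A behavior-equivalent sketch of that coercion (the real models presumably do this in a pydantic validator; the helper name is hypothetical):

```python
from typing import List, Union

def coerce_multiline(value: Union[str, List[str]]) -> str:
    """Join a list of lines with newlines; pass a plain string through
    unchanged (standalone sketch of the multiline coercion under test)."""
    if isinstance(value, list):
        return "\n".join(value)
    return value

# Both parametrized forms normalize to the same string, as the tests assert.
assert coerce_multiline(["this", "is", "multiline"]) == "this\nis\nmultiline"
assert coerce_multiline("this\nis\nmultiline") == "this\nis\nmultiline"
```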
--------------------------------------------------------------------------------
/tests/unit/models/test_project.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from uuid import uuid4
3 |
4 | from origami.models.api.projects import Project
5 |
6 |
7 | class TestProject:
8 | def test_construct_url(self, tmp_noteable_url_environ: str):
9 | project = Project(
10 | id=uuid4(),
11 | space_id=uuid4(),
12 | created_at=datetime.now(),
13 | updated_at=datetime.now(),
14 | name="My Project",
15 | description="Describe",
16 | )
17 |
18 | assert project.url == f"{tmp_noteable_url_environ}/p/{project.id}"
19 |
--------------------------------------------------------------------------------
/tests/unit/models/test_rtu_base.py:
--------------------------------------------------------------------------------
1 | from typing import Type
2 |
3 | import pytest
4 |
5 | from origami.models.rtu.base import BaseRTU, BaseRTURequest, BaseRTUResponse
6 |
7 |
8 | @pytest.mark.parametrize("clazz", [BaseRTU, BaseRTURequest, BaseRTUResponse])
9 | class TestRTUFamily:
10 | def test_set_channel_prefix(self, clazz: Type[BaseRTU]):
11 | """Channel prefix is derived from channel."""
12 | obj = clazz(
13 | channel="foo/12345",
14 | event="foo_event",
15 | )
16 |
17 | assert obj.channel_prefix == "foo"
18 |
19 | def test_channel_prefix_does_not_serialize(self, clazz):
20 | """channel_prefix should not be part of object serialization"""
21 | obj = clazz(
22 | channel="foo/12345",
23 | event="foo_event",
24 | )
25 |
26 | assert "channel_prefix" not in obj.model_dump_json()
27 |
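The derivation these tests pin down is simply the channel segment before the first slash, excluded from serialization. A minimal sketch of the prefix logic (helper name hypothetical, not Origami's API):

```python
def channel_prefix(channel: str) -> str:
    """Return the part of an RTU channel before the first '/' (sketch)."""
    return channel.split("/", 1)[0]

assert channel_prefix("foo/12345") == "foo"
```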
--------------------------------------------------------------------------------
/tests/unit/models/test_space.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | from uuid import uuid4
3 |
4 | from origami.models.api.spaces import Space
5 |
6 |
7 | class TestSpace:
8 | def test_construct_url(self, tmp_noteable_url_environ: str):
9 | space = Space(
10 | id=uuid4(),
11 | created_at=datetime.now(),
12 | updated_at=datetime.now(),
13 | name="MySpace",
14 | description="Where did Tom end up?",
15 | )
16 |
17 | assert space.url == f"{tmp_noteable_url_environ}/s/{space.id}"
18 |
--------------------------------------------------------------------------------
/tests/unit/models/test_user.py:
--------------------------------------------------------------------------------
1 | import os
2 | from datetime import datetime
3 | from uuid import uuid4
4 |
5 | from origami.models.api.users import User
6 |
7 |
8 | class TestUser:
9 | def test_construct_auth_type(self):
10 | user = User(
11 | id=uuid4(),
12 | created_at=datetime.now(),
13 | updated_at=datetime.now(),
14 | handle="joe",
15 | email="joe@sample.com",
16 | first_name="Joe",
17 | last_name="Sample",
18 | origamist_default_project_id=uuid4(),
19 | principal_sub="oauth|456fdghdfdfgj",
20 | )
21 |
22 | assert user.auth_type == "oauth"
23 |
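The `auth_type` behavior under test reads the provider off the front of a `principal_sub` like `"oauth|456fdghdfdfgj"`. A hypothetical standalone sketch of that property:

```python
def auth_type(principal_sub: str) -> str:
    """Derive the auth provider from a principal sub of the form
    '<provider>|<subject>': everything before the first '|' (sketch)."""
    return principal_sub.split("|", 1)[0]

assert auth_type("oauth|456fdghdfdfgj") == "oauth"
```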
--------------------------------------------------------------------------------
/tests/unit/test_sql_cells.py:
--------------------------------------------------------------------------------
1 | from origami.models.notebook import make_sql_cell
2 |
3 |
4 | def test_sql_cell():
5 | cell = make_sql_cell(source="SELECT * FROM table")
6 | assert cell.is_sql_cell
7 | assert cell.source == "SELECT * FROM table"
8 | assert cell.metadata["noteable"]["db_connection"] == "@noteable"
9 |
10 |
11 | def test_strip_sql_magic_prefix():
12 | cell = make_sql_cell(source="%%sql @noteable\nSELECT * FROM table")
13 | assert cell.is_sql_cell
14 | assert cell.source == "SELECT * FROM table"
15 | assert cell.metadata["noteable"]["db_connection"] == "@noteable"
16 |
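These two tests show `make_sql_cell` accepting either bare SQL or SQL prefixed with a `%%sql <connection>` magic line, stripping the magic and recording the connection. A behavior-equivalent sketch under those assumptions (helper name and default are hypothetical; the real logic lives in `origami.models.notebook`):

```python
from typing import Tuple

def strip_sql_magic(source: str, default_connection: str = "@noteable") -> Tuple[str, str]:
    """Split off a leading '%%sql <connection>' magic line, returning
    (bare_sql, connection). Plain SQL passes through with the default
    connection (sketch of the make_sql_cell behavior under test)."""
    if source.startswith("%%sql"):
        header, _, body = source.partition("\n")
        parts = header.split()
        connection = parts[1] if len(parts) > 1 else default_connection
        return body, connection
    return source, default_connection

assert strip_sql_magic("SELECT * FROM table") == ("SELECT * FROM table", "@noteable")
assert strip_sql_magic("%%sql @noteable\nSELECT * FROM table") == ("SELECT * FROM table", "@noteable")
```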
--------------------------------------------------------------------------------
/tests/unit/test_version.py:
--------------------------------------------------------------------------------
1 | import origami
2 |
3 |
4 | def test_version():
5 | assert origami.__version__ is not None
6 |
--------------------------------------------------------------------------------