├── .github └── workflows │ └── poetry-publish.yml ├── .gitignore ├── LICENCE.md ├── README.md ├── poetry.lock ├── pyproject.toml ├── sqla_async_orm_queries ├── __init__.py └── models.py └── tests └── test_model.py /.github/workflows/poetry-publish.yml: -------------------------------------------------------------------------------- 1 | name: Upload sqla async orm queries 2 | 3 | on: 4 | release: 5 | types: [created] 6 | 7 | jobs: 8 | release: 9 | runs-on: ubuntu-latest 10 | steps: 11 | - uses: actions/checkout@v4 12 | - uses: actions/setup-python@v5 13 | with: 14 | python-version: '3.8' 15 | architecture: x64 16 | - name: Install dependencies 17 | run: | 18 | python -m pip install --upgrade pip 19 | pip install poetry 20 | 21 | - name: Build and publish 22 | run: | 23 | poetry version $(git describe --tags --abbrev=0) 24 | poetry build 25 | poetry publish --username ${{ secrets.PYPI_USERNAME }} --password ${{ secrets.PYPI_PASSWORD }} 26 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | venv/ 2 | .venv/ 3 | dist/ 4 | __pycache__/ 5 | -------------------------------------------------------------------------------- /LICENCE.md: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2023 Amrah Baghirov 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial
portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # SQLAlchemy ORM with Async Support 2 | 3 | This project provides an abstract base class `Model` with advanced CRUD operations, audit logging, and support for SQLAlchemy's async ORM functionality. It includes several useful features such as soft delete functionality, Pydantic model integration for validation, and audit logging with event listeners. 4 | 5 | ## Key Features 6 | 7 | - **Async Session Management**: Provides session management using async SQLAlchemy. 8 | - **CRUD Operations**: Includes create, read, update, and delete operations. 9 | - **Soft Delete**: Ability to soft-delete records by marking them as inactive. 10 | - **Audit Logging**: Automatically logs changes to models (insert, update, delete). 11 | - **Pydantic Model Integration**: Supports data validation using Pydantic models. 12 | - **Event Listeners**: Event listeners automatically trigger audit logging on inserts, updates, and deletes. 13 | - **Transaction Management**: Supports transactional operations with rollback on failure. 14 | - **Pagination Support**: Allows for paginated queries and returns results with metadata. 15 | - **Bulk Operations**: Provides bulk create, update, and delete functionality. 
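The audit-logging and event-listener features above follow one pattern: every insert, update, or delete emits an event, and a registered handler appends an entry to the audit log. Below is a dependency-free sketch of that idea — the names (`on`, `emit`, `EVENT_HANDLERS`) are illustrative only, not this library's API:

```python
# Minimal sketch of event-driven audit logging, with illustrative names
# (the real library wires this up through SQLAlchemy event listeners).
from datetime import datetime, timezone

EVENT_HANDLERS = {"insert": [], "update": [], "delete": []}
AUDIT_LOG = []

def on(event_name):
    """Register a handler for a model event (insert/update/delete)."""
    def register(handler):
        EVENT_HANDLERS[event_name].append(handler)
        return handler
    return register

def emit(event_name, table, row):
    """Fire every handler registered for the given event."""
    for handler in EVENT_HANDLERS[event_name]:
        handler(event_name, table, row)

@on("insert")
@on("update")
@on("delete")
def record_audit(event_name, table, row):
    # Each change is appended to an in-memory log here; the library
    # persists the same shape of record to an audit_logs table instead.
    AUDIT_LOG.append({
        "action": event_name,
        "table": table,
        "row": dict(row),
        "at": datetime.now(timezone.utc).isoformat(),
    })

emit("insert", "users", {"id": 1, "name": "John Doe"})
emit("update", "users", {"id": 1, "name": "Jane Doe"})
print([entry["action"] for entry in AUDIT_LOG])  # ['insert', 'update']
```

In the package itself the equivalent hooks are registered through SQLAlchemy's `event` system, so application code never calls them directly.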
16 | 17 | ### Installation 18 | ```sh 19 | pip install sqla-async-orm-queries 20 | ``` 21 | 22 | Alternatively, if you prefer to use Poetry for dependency management: 23 | ```sh 24 | poetry shell 25 | poetry add sqla-async-orm-queries 26 | ``` 27 | 28 | ## Requirements 29 | 30 | Python 3.8 or above is required. 31 | 32 | 33 | 34 | ## How to Use 35 | 36 | ### 1. Define a Model 37 | 38 | To define your own model, inherit from the `Model` class and define your fields using SQLAlchemy's `Column` types. Each model can also define its own Pydantic schema for validation purposes. 39 | 40 | ```python 41 | from typing import Optional 42 | from sqlalchemy import Column, Integer, String 43 | from sqla_async_orm_queries.models import Model, PydanticModelMixin 44 | class User(Model): 45 | __tablename__ = 'users' 46 | id = Column(Integer, primary_key=True) 47 | name = Column(String, nullable=False) 48 | email = Column(String, unique=True, nullable=False) 49 | 50 | class PydanticModel(PydanticModelMixin): 51 | id: Optional[int] = None 52 | name: str 53 | email: str 54 | ``` 55 | 56 | ### 2. Initialize the Session 57 | 58 | You need to initialize the session factory to enable database operations. Make sure to provide an `async_sessionmaker` for managing async sessions. 59 | 60 | ```python 61 | from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker 62 | from sqla_async_orm_queries.models import Model 63 | 64 | DATABASE_URL = "sqlite+aiosqlite:///:memory:" 65 | 66 | engine = create_async_engine(DATABASE_URL) 67 | SessionLocal = async_sessionmaker(bind=engine, expire_on_commit=False) 68 | 69 | Model.init_session(SessionLocal) 70 | ``` 71 | 72 | ### 3. Perform CRUD Operations 73 | 74 | CRUD operations are supported out of the box. You can create, read, update, delete, and soft-delete records using the provided methods.
75 | 76 | ```python 77 | # Create a new user 78 | user_data = {'name': 'John Doe', 'email': 'john@example.com'} 79 | user = await User.create(user_data) 80 | 81 | # Read a user 82 | user = await User.select_one(User.id == 1) 83 | 84 | # Update a user 85 | await User.update({'name': 'Jane Doe'}, User.id == 1) 86 | 87 | # Soft delete a user 88 | await User.soft_delete(User.id == 1) 89 | ``` 90 | 91 | ### 4. Audit Logging 92 | 93 | The `AuditLog` model automatically logs any insert, update, or delete operation. The logs are stored in the `audit_logs` table. 94 | 95 | ```python 96 | # View audit logs 97 | audit_logs = await AuditLog.select_all() 98 | ``` 99 | 100 | ### 5. Transaction Management 101 | 102 | The `transactional` method provides an easy way to run a set of operations inside a transaction. If any error occurs, the transaction will be rolled back. 103 | 104 | ```python 105 | async def my_operations(session): 106 | await User.create({'name': 'New User', 'email': 'new_user@example.com'}, session=session) 107 | 108 | await User.transactional(my_operations) 109 | ``` 110 | 111 | ### 6. Bulk Operations 112 | 113 | You can create, update, and delete multiple records in a single transaction using the `bulk_create`, `bulk_update`, and `bulk_delete` methods. 114 | 115 | ```python 116 | # Bulk create users 117 | users_data = [{'name': 'User 1', 'email': 'user1@example.com'}, {'name': 'User 2', 'email': 'user2@example.com'}] 118 | await User.bulk_create(users_data) 119 | 120 | # Bulk delete users 121 | await User.bulk_delete([User.id == 1, User.id == 2]) 122 | ``` 123 | 124 | ### 7. Pagination 125 | 126 | Paginate through results using the `select_with_pagination` method: 127 | 128 | ```python 129 | pagination = await User.select_with_pagination(page=2, per_page=10) 130 | print(pagination.items) # The users on the second page 131 | ``` 132 | 133 | ## Installing from Source 134 | 135 | 1.
Install the necessary dependencies: 136 | 137 | ```bash 138 | pip install sqlalchemy aiosqlite pydantic 139 | ``` 140 | 141 | 2. Copy or clone this repository and define your models by inheriting from `Model`. 142 | 143 | ## Running Tests 144 | 145 | You can run tests using `pytest` and `pytest-asyncio` for testing asynchronous operations. 146 | 147 | 1. Install the test dependencies: 148 | 149 | ```bash 150 | pip install pytest pytest-asyncio 151 | ``` 152 | 153 | 2. Run the tests: 154 | 155 | ```bash 156 | pytest 157 | ``` 158 | 159 | ## License 160 | 161 | This project is licensed under the MIT License. 162 | -------------------------------------------------------------------------------- /poetry.lock: -------------------------------------------------------------------------------- 1 | # This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand. 2 | 3 | [[package]] 4 | name = "asyncpg" 5 | version = "0.28.0" 6 | description = "An asyncio PostgreSQL driver" 7 | optional = false 8 | python-versions = ">=3.7.0" 9 | files = [ 10 | {file = "asyncpg-0.28.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a6d1b954d2b296292ddff4e0060f494bb4270d87fb3655dd23c5c6096d16d83"}, 11 | {file = "asyncpg-0.28.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0740f836985fd2bd73dca42c50c6074d1d61376e134d7ad3ad7566c4f79f8184"}, 12 | {file = "asyncpg-0.28.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e907cf620a819fab1737f2dd90c0f185e2a796f139ac7de6aa3212a8af96c050"}, 13 | {file = "asyncpg-0.28.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:86b339984d55e8202e0c4b252e9573e26e5afa05617ed02252544f7b3e6de3e9"}, 14 | {file = "asyncpg-0.28.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0c402745185414e4c204a02daca3d22d732b37359db4d2e705172324e2d94e85"}, 15 | {file = "asyncpg-0.28.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = 
"sha256:c88eef5e096296626e9688f00ab627231f709d0e7e3fb84bb4413dff81d996d7"}, 16 | {file = "asyncpg-0.28.0-cp310-cp310-win32.whl", hash = "sha256:90a7bae882a9e65a9e448fdad3e090c2609bb4637d2a9c90bfdcebbfc334bf89"}, 17 | {file = "asyncpg-0.28.0-cp310-cp310-win_amd64.whl", hash = "sha256:76aacdcd5e2e9999e83c8fbcb748208b60925cc714a578925adcb446d709016c"}, 18 | {file = "asyncpg-0.28.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a0e08fe2c9b3618459caaef35979d45f4e4f8d4f79490c9fa3367251366af207"}, 19 | {file = "asyncpg-0.28.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b24e521f6060ff5d35f761a623b0042c84b9c9b9fb82786aadca95a9cb4a893b"}, 20 | {file = "asyncpg-0.28.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99417210461a41891c4ff301490a8713d1ca99b694fef05dabd7139f9d64bd6c"}, 21 | {file = "asyncpg-0.28.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f029c5adf08c47b10bcdc857001bbef551ae51c57b3110964844a9d79ca0f267"}, 22 | {file = "asyncpg-0.28.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ad1d6abf6c2f5152f46fff06b0e74f25800ce8ec6c80967f0bc789974de3c652"}, 23 | {file = "asyncpg-0.28.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d7fa81ada2807bc50fea1dc741b26a4e99258825ba55913b0ddbf199a10d69d8"}, 24 | {file = "asyncpg-0.28.0-cp311-cp311-win32.whl", hash = "sha256:f33c5685e97821533df3ada9384e7784bd1e7865d2b22f153f2e4bd4a083e102"}, 25 | {file = "asyncpg-0.28.0-cp311-cp311-win_amd64.whl", hash = "sha256:5e7337c98fb493079d686a4a6965e8bcb059b8e1b8ec42106322fc6c1c889bb0"}, 26 | {file = "asyncpg-0.28.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:1c56092465e718a9fdcc726cc3d9dcf3a692e4834031c9a9f871d92a75d20d48"}, 27 | {file = "asyncpg-0.28.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4acd6830a7da0eb4426249d71353e8895b350daae2380cb26d11e0d4a01c5472"}, 28 | {file = "asyncpg-0.28.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:63861bb4a540fa033a56db3bb58b0c128c56fad5d24e6d0a8c37cb29b17c1c7d"}, 29 | {file = "asyncpg-0.28.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:a93a94ae777c70772073d0512f21c74ac82a8a49be3a1d982e3f259ab5f27307"}, 30 | {file = "asyncpg-0.28.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d14681110e51a9bc9c065c4e7944e8139076a778e56d6f6a306a26e740ed86d2"}, 31 | {file = "asyncpg-0.28.0-cp37-cp37m-win32.whl", hash = "sha256:8aec08e7310f9ab322925ae5c768532e1d78cfb6440f63c078b8392a38aa636a"}, 32 | {file = "asyncpg-0.28.0-cp37-cp37m-win_amd64.whl", hash = "sha256:319f5fa1ab0432bc91fb39b3960b0d591e6b5c7844dafc92c79e3f1bff96abef"}, 33 | {file = "asyncpg-0.28.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b337ededaabc91c26bf577bfcd19b5508d879c0ad009722be5bb0a9dd30b85a0"}, 34 | {file = "asyncpg-0.28.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4d32b680a9b16d2957a0a3cc6b7fa39068baba8e6b728f2e0a148a67644578f4"}, 35 | {file = "asyncpg-0.28.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f4f62f04cdf38441a70f279505ef3b4eadf64479b17e707c950515846a2df197"}, 36 | {file = "asyncpg-0.28.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f20cac332c2576c79c2e8e6464791c1f1628416d1115935a34ddd7121bfc6a4"}, 37 | {file = "asyncpg-0.28.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:59f9712ce01e146ff71d95d561fb68bd2d588a35a187116ef05028675462d5ed"}, 38 | {file = "asyncpg-0.28.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:fc9e9f9ff1aa0eddcc3247a180ac9e9b51a62311e988809ac6152e8fb8097756"}, 39 | {file = "asyncpg-0.28.0-cp38-cp38-win32.whl", hash = "sha256:9e721dccd3838fcff66da98709ed884df1e30a95f6ba19f595a3706b4bc757e3"}, 40 | {file = "asyncpg-0.28.0-cp38-cp38-win_amd64.whl", hash = "sha256:8ba7d06a0bea539e0487234511d4adf81dc8762249858ed2a580534e1720db00"}, 41 | {file = "asyncpg-0.28.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = 
"sha256:d009b08602b8b18edef3a731f2ce6d3f57d8dac2a0a4140367e194eabd3de457"}, 42 | {file = "asyncpg-0.28.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ec46a58d81446d580fb21b376ec6baecab7288ce5a578943e2fc7ab73bf7eb39"}, 43 | {file = "asyncpg-0.28.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b48ceed606cce9e64fd5480a9b0b9a95cea2b798bb95129687abd8599c8b019"}, 44 | {file = "asyncpg-0.28.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8858f713810f4fe67876728680f42e93b7e7d5c7b61cf2118ef9153ec16b9423"}, 45 | {file = "asyncpg-0.28.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5e18438a0730d1c0c1715016eacda6e9a505fc5aa931b37c97d928d44941b4bf"}, 46 | {file = "asyncpg-0.28.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e9c433f6fcdd61c21a715ee9128a3ca48be8ac16fa07be69262f016bb0f4dbd2"}, 47 | {file = "asyncpg-0.28.0-cp39-cp39-win32.whl", hash = "sha256:41e97248d9076bc8e4849da9e33e051be7ba37cd507cbd51dfe4b2d99c70e3dc"}, 48 | {file = "asyncpg-0.28.0-cp39-cp39-win_amd64.whl", hash = "sha256:3ed77f00c6aacfe9d79e9eff9e21729ce92a4b38e80ea99a58ed382f42ebd55b"}, 49 | {file = "asyncpg-0.28.0.tar.gz", hash = "sha256:7252cdc3acb2f52feaa3664280d3bcd78a46bd6c10bfd681acfffefa1120e278"}, 50 | ] 51 | 52 | [package.extras] 53 | docs = ["Sphinx (>=5.3.0,<5.4.0)", "sphinx-rtd-theme (>=1.2.2)", "sphinxcontrib-asyncio (>=0.3.0,<0.4.0)"] 54 | test = ["flake8 (>=5.0,<6.0)", "uvloop (>=0.15.3)"] 55 | 56 | [[package]] 57 | name = "greenlet" 58 | version = "3.0.3" 59 | description = "Lightweight in-process concurrent programming" 60 | optional = false 61 | python-versions = ">=3.7" 62 | files = [ 63 | {file = "greenlet-3.0.3-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a"}, 64 | {file = "greenlet-3.0.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881"}, 65 | 
{file = "greenlet-3.0.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b"}, 66 | {file = "greenlet-3.0.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a"}, 67 | {file = "greenlet-3.0.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83"}, 68 | {file = "greenlet-3.0.3-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405"}, 69 | {file = "greenlet-3.0.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f"}, 70 | {file = "greenlet-3.0.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb"}, 71 | {file = "greenlet-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9"}, 72 | {file = "greenlet-3.0.3-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61"}, 73 | {file = "greenlet-3.0.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559"}, 74 | {file = "greenlet-3.0.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e"}, 75 | {file = "greenlet-3.0.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"}, 76 | {file = "greenlet-3.0.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379"}, 77 | {file = 
"greenlet-3.0.3-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22"}, 78 | {file = "greenlet-3.0.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3"}, 79 | {file = "greenlet-3.0.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d"}, 80 | {file = "greenlet-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728"}, 81 | {file = "greenlet-3.0.3-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be"}, 82 | {file = "greenlet-3.0.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e"}, 83 | {file = "greenlet-3.0.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676"}, 84 | {file = "greenlet-3.0.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc"}, 85 | {file = "greenlet-3.0.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230"}, 86 | {file = "greenlet-3.0.3-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf"}, 87 | {file = "greenlet-3.0.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305"}, 88 | {file = "greenlet-3.0.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6"}, 89 | {file = "greenlet-3.0.3-cp312-cp312-win_amd64.whl", hash = 
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2"}, 90 | {file = "greenlet-3.0.3-cp37-cp37m-macosx_11_0_universal2.whl", hash = "sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274"}, 91 | {file = "greenlet-3.0.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0"}, 92 | {file = "greenlet-3.0.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f"}, 93 | {file = "greenlet-3.0.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414"}, 94 | {file = "greenlet-3.0.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c"}, 95 | {file = "greenlet-3.0.3-cp37-cp37m-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41"}, 96 | {file = "greenlet-3.0.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7"}, 97 | {file = "greenlet-3.0.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6"}, 98 | {file = "greenlet-3.0.3-cp37-cp37m-win32.whl", hash = "sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d"}, 99 | {file = "greenlet-3.0.3-cp37-cp37m-win_amd64.whl", hash = "sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67"}, 100 | {file = "greenlet-3.0.3-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca"}, 101 | {file = "greenlet-3.0.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04"}, 102 | 
{file = "greenlet-3.0.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc"}, 103 | {file = "greenlet-3.0.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506"}, 104 | {file = "greenlet-3.0.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b"}, 105 | {file = "greenlet-3.0.3-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4"}, 106 | {file = "greenlet-3.0.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5"}, 107 | {file = "greenlet-3.0.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da"}, 108 | {file = "greenlet-3.0.3-cp38-cp38-win32.whl", hash = "sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3"}, 109 | {file = "greenlet-3.0.3-cp38-cp38-win_amd64.whl", hash = "sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf"}, 110 | {file = "greenlet-3.0.3-cp39-cp39-macosx_11_0_universal2.whl", hash = "sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53"}, 111 | {file = "greenlet-3.0.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257"}, 112 | {file = "greenlet-3.0.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac"}, 113 | {file = "greenlet-3.0.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71"}, 114 | {file = 
"greenlet-3.0.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61"}, 115 | {file = "greenlet-3.0.3-cp39-cp39-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b"}, 116 | {file = "greenlet-3.0.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6"}, 117 | {file = "greenlet-3.0.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113"}, 118 | {file = "greenlet-3.0.3-cp39-cp39-win32.whl", hash = "sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e"}, 119 | {file = "greenlet-3.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067"}, 120 | {file = "greenlet-3.0.3.tar.gz", hash = "sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491"}, 121 | ] 122 | 123 | [package.extras] 124 | docs = ["Sphinx", "furo"] 125 | test = ["objgraph", "psutil"] 126 | 127 | [[package]] 128 | name = "sqlalchemy" 129 | version = "2.0.22" 130 | description = "Database Abstraction Library" 131 | optional = false 132 | python-versions = ">=3.7" 133 | files = [ 134 | {file = "SQLAlchemy-2.0.22-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f146c61ae128ab43ea3a0955de1af7e1633942c2b2b4985ac51cc292daf33222"}, 135 | {file = "SQLAlchemy-2.0.22-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:875de9414393e778b655a3d97d60465eb3fae7c919e88b70cc10b40b9f56042d"}, 136 | {file = "SQLAlchemy-2.0.22-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:13790cb42f917c45c9c850b39b9941539ca8ee7917dacf099cc0b569f3d40da7"}, 137 | {file = "SQLAlchemy-2.0.22-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e04ab55cf49daf1aeb8c622c54d23fa4bec91cb051a43cc24351ba97e1dd09f5"}, 138 | 
{file = "SQLAlchemy-2.0.22-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:a42c9fa3abcda0dcfad053e49c4f752eef71ecd8c155221e18b99d4224621176"}, 139 | {file = "SQLAlchemy-2.0.22-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:14cd3bcbb853379fef2cd01e7c64a5d6f1d005406d877ed9509afb7a05ff40a5"}, 140 | {file = "SQLAlchemy-2.0.22-cp310-cp310-win32.whl", hash = "sha256:d143c5a9dada696bcfdb96ba2de4a47d5a89168e71d05a076e88a01386872f97"}, 141 | {file = "SQLAlchemy-2.0.22-cp310-cp310-win_amd64.whl", hash = "sha256:ccd87c25e4c8559e1b918d46b4fa90b37f459c9b4566f1dfbce0eb8122571547"}, 142 | {file = "SQLAlchemy-2.0.22-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4f6ff392b27a743c1ad346d215655503cec64405d3b694228b3454878bf21590"}, 143 | {file = "SQLAlchemy-2.0.22-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f776c2c30f0e5f4db45c3ee11a5f2a8d9de68e81eb73ec4237de1e32e04ae81c"}, 144 | {file = "SQLAlchemy-2.0.22-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8f1792d20d2f4e875ce7a113f43c3561ad12b34ff796b84002a256f37ce9437"}, 145 | {file = "SQLAlchemy-2.0.22-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d80eeb5189d7d4b1af519fc3f148fe7521b9dfce8f4d6a0820e8f5769b005051"}, 146 | {file = "SQLAlchemy-2.0.22-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:69fd9e41cf9368afa034e1c81f3570afb96f30fcd2eb1ef29cb4d9371c6eece2"}, 147 | {file = "SQLAlchemy-2.0.22-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:54bcceaf4eebef07dadfde424f5c26b491e4a64e61761dea9459103ecd6ccc95"}, 148 | {file = "SQLAlchemy-2.0.22-cp311-cp311-win32.whl", hash = "sha256:7ee7ccf47aa503033b6afd57efbac6b9e05180f492aeed9fcf70752556f95624"}, 149 | {file = "SQLAlchemy-2.0.22-cp311-cp311-win_amd64.whl", hash = "sha256:b560f075c151900587ade06706b0c51d04b3277c111151997ea0813455378ae0"}, 150 | {file = "SQLAlchemy-2.0.22-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:2c9bac865ee06d27a1533471405ad240a6f5d83195eca481f9fc4a71d8b87df8"}, 
151 | {file = "SQLAlchemy-2.0.22-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:625b72d77ac8ac23da3b1622e2da88c4aedaee14df47c8432bf8f6495e655de2"}, 152 | {file = "SQLAlchemy-2.0.22-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b39a6e21110204a8c08d40ff56a73ba542ec60bab701c36ce721e7990df49fb9"}, 153 | {file = "SQLAlchemy-2.0.22-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53a766cb0b468223cafdf63e2d37f14a4757476157927b09300c8c5832d88560"}, 154 | {file = "SQLAlchemy-2.0.22-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0e1ce8ebd2e040357dde01a3fb7d30d9b5736b3e54a94002641dfd0aa12ae6ce"}, 155 | {file = "SQLAlchemy-2.0.22-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:505f503763a767556fa4deae5194b2be056b64ecca72ac65224381a0acab7ebe"}, 156 | {file = "SQLAlchemy-2.0.22-cp312-cp312-win32.whl", hash = "sha256:154a32f3c7b00de3d090bc60ec8006a78149e221f1182e3edcf0376016be9396"}, 157 | {file = "SQLAlchemy-2.0.22-cp312-cp312-win_amd64.whl", hash = "sha256:129415f89744b05741c6f0b04a84525f37fbabe5dc3774f7edf100e7458c48cd"}, 158 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3940677d341f2b685a999bffe7078697b5848a40b5f6952794ffcf3af150c301"}, 159 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55914d45a631b81a8a2cb1a54f03eea265cf1783241ac55396ec6d735be14883"}, 160 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2096d6b018d242a2bcc9e451618166f860bb0304f590d205173d317b69986c95"}, 161 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:19c6986cf2fb4bc8e0e846f97f4135a8e753b57d2aaaa87c50f9acbe606bd1db"}, 162 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:6ac28bd6888fe3c81fbe97584eb0b96804bd7032d6100b9701255d9441373ec1"}, 163 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-win32.whl", hash = 
"sha256:cb9a758ad973e795267da334a92dd82bb7555cb36a0960dcabcf724d26299db8"}, 164 | {file = "SQLAlchemy-2.0.22-cp37-cp37m-win_amd64.whl", hash = "sha256:40b1206a0d923e73aa54f0a6bd61419a96b914f1cd19900b6c8226899d9742ad"}, 165 | {file = "SQLAlchemy-2.0.22-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3aa1472bf44f61dd27987cd051f1c893b7d3b17238bff8c23fceaef4f1133868"}, 166 | {file = "SQLAlchemy-2.0.22-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:56a7e2bb639df9263bf6418231bc2a92a773f57886d371ddb7a869a24919face"}, 167 | {file = "SQLAlchemy-2.0.22-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ccca778c0737a773a1ad86b68bda52a71ad5950b25e120b6eb1330f0df54c3d0"}, 168 | {file = "SQLAlchemy-2.0.22-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7c6c3e9350f9fb16de5b5e5fbf17b578811a52d71bb784cc5ff71acb7de2a7f9"}, 169 | {file = "SQLAlchemy-2.0.22-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:564e9f9e4e6466273dbfab0e0a2e5fe819eec480c57b53a2cdee8e4fdae3ad5f"}, 170 | {file = "SQLAlchemy-2.0.22-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:af66001d7b76a3fab0d5e4c1ec9339ac45748bc4a399cbc2baa48c1980d3c1f4"}, 171 | {file = "SQLAlchemy-2.0.22-cp38-cp38-win32.whl", hash = "sha256:9e55dff5ec115316dd7a083cdc1a52de63693695aecf72bc53a8e1468ce429e5"}, 172 | {file = "SQLAlchemy-2.0.22-cp38-cp38-win_amd64.whl", hash = "sha256:4e869a8ff7ee7a833b74868a0887e8462445ec462432d8cbeff5e85f475186da"}, 173 | {file = "SQLAlchemy-2.0.22-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9886a72c8e6371280cb247c5d32c9c8fa141dc560124348762db8a8b236f8692"}, 174 | {file = "SQLAlchemy-2.0.22-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a571bc8ac092a3175a1d994794a8e7a1f2f651e7c744de24a19b4f740fe95034"}, 175 | {file = "SQLAlchemy-2.0.22-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8db5ba8b7da759b727faebc4289a9e6a51edadc7fc32207a30f7c6203a181592"}, 176 | {file = 
"SQLAlchemy-2.0.22-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b0b3f2686c3f162123adba3cb8b626ed7e9b8433ab528e36ed270b4f70d1cdb"}, 177 | {file = "SQLAlchemy-2.0.22-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0c1fea8c0abcb070ffe15311853abfda4e55bf7dc1d4889497b3403629f3bf00"}, 178 | {file = "SQLAlchemy-2.0.22-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:4bb062784f37b2d75fd9b074c8ec360ad5df71f933f927e9e95c50eb8e05323c"}, 179 | {file = "SQLAlchemy-2.0.22-cp39-cp39-win32.whl", hash = "sha256:58a3aba1bfb32ae7af68da3f277ed91d9f57620cf7ce651db96636790a78b736"}, 180 | {file = "SQLAlchemy-2.0.22-cp39-cp39-win_amd64.whl", hash = "sha256:92e512a6af769e4725fa5b25981ba790335d42c5977e94ded07db7d641490a85"}, 181 | {file = "SQLAlchemy-2.0.22-py3-none-any.whl", hash = "sha256:3076740335e4aaadd7deb3fe6dcb96b3015f1613bd190a4e1634e1b99b02ec86"}, 182 | {file = "SQLAlchemy-2.0.22.tar.gz", hash = "sha256:5434cc601aa17570d79e5377f5fd45ff92f9379e2abed0be5e8c2fba8d353d2b"}, 183 | ] 184 | 185 | [package.dependencies] 186 | greenlet = {version = "!=0.4.17", markers = "platform_machine == \"aarch64\" or platform_machine == \"ppc64le\" or platform_machine == \"x86_64\" or platform_machine == \"amd64\" or platform_machine == \"AMD64\" or platform_machine == \"win32\" or platform_machine == \"WIN32\""} 187 | typing-extensions = ">=4.2.0" 188 | 189 | [package.extras] 190 | aiomysql = ["aiomysql (>=0.2.0)", "greenlet (!=0.4.17)"] 191 | aiosqlite = ["aiosqlite", "greenlet (!=0.4.17)", "typing-extensions (!=3.10.0.1)"] 192 | asyncio = ["greenlet (!=0.4.17)"] 193 | asyncmy = ["asyncmy (>=0.2.3,!=0.2.4,!=0.2.6)", "greenlet (!=0.4.17)"] 194 | mariadb-connector = ["mariadb (>=1.0.1,!=1.1.2,!=1.1.5)"] 195 | mssql = ["pyodbc"] 196 | mssql-pymssql = ["pymssql"] 197 | mssql-pyodbc = ["pyodbc"] 198 | mypy = ["mypy (>=0.910)"] 199 | mysql = ["mysqlclient (>=1.4.0)"] 200 | mysql-connector = ["mysql-connector-python"] 201 | oracle = ["cx-oracle (>=7)"] 202 | 
oracle-oracledb = ["oracledb (>=1.0.1)"] 203 | postgresql = ["psycopg2 (>=2.7)"] 204 | postgresql-asyncpg = ["asyncpg", "greenlet (!=0.4.17)"] 205 | postgresql-pg8000 = ["pg8000 (>=1.29.1)"] 206 | postgresql-psycopg = ["psycopg (>=3.0.7)"] 207 | postgresql-psycopg2binary = ["psycopg2-binary"] 208 | postgresql-psycopg2cffi = ["psycopg2cffi"] 209 | postgresql-psycopgbinary = ["psycopg[binary] (>=3.0.7)"] 210 | pymysql = ["pymysql"] 211 | sqlcipher = ["sqlcipher3-binary"] 212 | 213 | [[package]] 214 | name = "typing-extensions" 215 | version = "4.8.0" 216 | description = "Backported and Experimental Type Hints for Python 3.8+" 217 | optional = false 218 | python-versions = ">=3.8" 219 | files = [ 220 | {file = "typing_extensions-4.8.0-py3-none-any.whl", hash = "sha256:8f92fc8806f9a6b641eaa5318da32b44d401efaac0f6678c9bc448ba3605faa0"}, 221 | {file = "typing_extensions-4.8.0.tar.gz", hash = "sha256:df8e4339e9cb77357558cbdbceca33c303714cf861d1eef15e1070055ae8b7ef"}, 222 | ] 223 | 224 | [metadata] 225 | lock-version = "2.0" 226 | python-versions = "^3.8.2" 227 | content-hash = "f12cb788e30a7f5881882d6cdeb0c1d261c3facc18cf39a758ba47cb90b43d4f" 228 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [tool.poetry] 2 | name = "sqla-async-orm-queries" 3 | version = "2.0.0-beta" 4 | description = "Base model for SqlAlchemy async orm queries" 5 | authors = ["Amrah "] 6 | license = "MIT" 7 | readme = "README.md" 8 | homepage = "https://github.com/amrahhh/sqla_async_orm_queries" 9 | repository = "https://github.com/amrahhh/sqla_async_orm_queries" 10 | classifiers = [ 11 | "Intended Audience :: Developers", 12 | "Programming Language :: Python", 13 | "Programming Language :: Python :: 3.7", 14 | "Programming Language :: Python :: 3.8", 15 | "Programming Language :: Python :: 3.9", 16 | "Programming Language :: Python :: 3.10", 17 | "Programming 
Language :: Python :: 3.11",
18 | "Programming Language :: Python :: 3 :: Only",
19 | "License :: OSI Approved :: MIT License",
20 | "Operating System :: OS Independent",
21 | ]
22 | 
23 | [tool.poetry.dependencies]
24 | python = "^3.8.2"
25 | sqlalchemy = ">=2.0.21"
26 | asyncpg = ">=0.28.0"
27 | greenlet = "^3.0.3"
28 | 
--------------------------------------------------------------------------------
/sqla_async_orm_queries/__init__.py:
--------------------------------------------------------------------------------
1 | 
2 | 
--------------------------------------------------------------------------------
/sqla_async_orm_queries/models.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations  # For forward references
2 | from typing import List, TypeVar, Callable, Optional, Any, Dict
3 | from sqlalchemy import (
4 |     delete,
5 |     select,
6 |     update,
7 |     func,
8 |     Column,
9 |     Integer,
10 |     String,
11 |     DateTime,
12 |     JSON,
13 |     event,
14 |     Boolean
15 | )
16 | from sqlalchemy.ext.asyncio import AsyncAttrs, AsyncSession, async_sessionmaker
17 | from sqlalchemy.orm import (
18 |     DeclarativeBase,
19 |     selectinload,
20 |     Session,
21 | )
22 | from contextlib import asynccontextmanager, contextmanager
23 | from datetime import datetime, timezone
24 | from pydantic import BaseModel as PydanticBaseModel, ValidationError
25 | from functools import wraps
26 | import json
27 | 
28 | TModels = TypeVar("TModels", bound="Model")
29 | 
30 | 
31 | class Base(DeclarativeBase, AsyncAttrs):
32 |     pass
33 | 
34 | 
35 | class PydanticModelMixin(PydanticBaseModel):
36 |     class Config:
37 |         from_attributes = True
38 | 
39 | 
40 | class SoftDeleteMixin:
41 |     is_deleted = Column(Boolean, default=True)  # NOTE: inverted convention in this codebase: True means ACTIVE; soft_delete() sets it to False
42 | 
43 |     @classmethod
44 |     async def soft_delete(cls, *args, session: Optional[AsyncSession] = None):
45 |         await cls.update({"is_deleted": False}, *args, session=session)
46 | 
47 | def make_serializable(data):
48 |     if isinstance(data, dict):
49 |         return {key: make_serializable(value) for key, value in data.items()}
50 |     elif isinstance(data, list):
51 |         return [make_serializable(item) for item in data]
52 |     elif isinstance(data, datetime):
53 |         return data.isoformat()
54 |     elif hasattr(data, "to_dict"):
55 |         return data.to_dict()
56 |     return data
57 | 
58 | def transactional(func):
59 |     @wraps(func)
60 |     async def wrapper(*args, **kwargs):
61 |         if kwargs.get('session') is not None:
62 |             return await func(*args, **kwargs)
63 |         # Assumes it decorates a classmethod: args[0] is the model class (the original referenced an undefined `cls` here).
64 |         async with args[0].get_session() as session:
65 |             try:
66 |                 kwargs['session'] = session
67 |                 result = await func(*args, **kwargs)
68 |                 await session.commit()
69 |                 return result
70 |             except Exception as e:
71 |                 await session.rollback()
72 |                 raise e
73 |     return wrapper
74 | 
75 | class Model(Base, SoftDeleteMixin):
76 |     __abstract__ = True
77 |     session_factory: Optional[async_sessionmaker[AsyncSession]] = None
78 | 
79 |     # Define a Pydantic model inside each SQLAlchemy model
80 |     class PydanticModel(PydanticModelMixin):
81 |         pass
82 | 
83 |     @classmethod
84 |     def init_session(cls, session_factory: async_sessionmaker[AsyncSession]):
85 |         if not isinstance(session_factory, async_sessionmaker):
86 |             raise TypeError(
87 |                 "The session_factory must be an instance of async_sessionmaker."
88 |             )
89 |         cls.session_factory = session_factory
90 | 
91 |     @classmethod
92 |     def _ensure_session_factory(cls):
93 |         if cls.session_factory is None:
94 |             raise RuntimeError(
95 |                 "Session factory is not initialized. Call init_session first."
96 |             )
97 | 
98 |     @staticmethod
99 |     def _order_by(query, order_by, cls):
100 |         if order_by:
101 |             columns = []
102 |             for col in order_by:
103 |                 if col.startswith("-"):
104 |                     columns.append(getattr(cls, col[1:]).desc())
105 |                 else:
106 |                     columns.append(getattr(cls, col))
107 |             return query.order_by(*columns)
108 |         return query
109 | 
110 |     @classmethod
111 |     def _build_loader(
112 |         cls, loader_fields: List[str], loader_func: Callable = selectinload
113 |     ):
114 |         loaders = []
115 |         for field in loader_fields:
116 |             if hasattr(cls, field):
117 |                 loaders.append(loader_func(getattr(cls, field)))
118 |             else:
119 |                 raise AttributeError(f"{cls.__name__} has no attribute '{field}'")
120 |         return loaders
121 | 
122 |     @classmethod
123 |     @asynccontextmanager
124 |     async def get_session(cls, session: Optional[AsyncSession] = None):
125 |         if session is not None:
126 |             yield session
127 |         else:
128 |             cls._ensure_session_factory()
129 |             async with cls.session_factory() as session:
130 |                 async with session.begin():
131 |                     yield session
132 | 
133 |     @classmethod
134 |     def validate_data(cls, data: dict):
135 |         try:
136 |             cls.PydanticModel(**data)
137 |         except ValidationError as e:
138 |             raise ValueError(f"Validation error: {e.errors()}")
139 | 
140 |     def to_pydantic(self):
141 |         return self.PydanticModel.model_validate(self)
142 | 
143 |     @classmethod
144 |     def from_pydantic(cls, pydantic_model):
145 |         return cls(**pydantic_model.model_dump())  # Pydantic v2 API; .dict() is deprecated
146 | 
147 |     def to_dict(self, include_relationships=False):
148 |         data = {
149 |             column.name: getattr(self, column.name) for column in self.__table__.columns
150 |         }
151 |         if include_relationships:
152 |             for relation in self.__mapper__.relationships:
153 |                 related = getattr(self, relation.key)
154 |                 if related is not None:
155 |                     if isinstance(related, list):
156 |                         data[relation.key] = [item.to_dict() for item in related]
157 |                     else:
158 |                         data[relation.key] = related.to_dict()
159 |         return data
160 | 
161 |     def to_json(self, include_relationships=False):
162 | return json.dumps( 163 | self.to_dict(include_relationships=include_relationships), 164 | default=make_serializable, 165 | ) 166 | 167 | @classmethod 168 | async def create(cls, data: dict, session: Optional[AsyncSession] = None) -> Model: 169 | async with cls.get_session(session) as session: 170 | instance = cls(**data) 171 | session.add(instance) 172 | return instance 173 | 174 | @classmethod 175 | async def bulk_create( 176 | cls, data_list: List[dict], session: Optional[AsyncSession] = None 177 | ) -> List[Model]: 178 | instances = [] 179 | async with cls.get_session(session) as session: 180 | for data in data_list: 181 | instance = cls(**data) 182 | instances.append(instance) 183 | session.add_all(instances) 184 | return instances 185 | 186 | @classmethod 187 | async def select_one( 188 | cls, 189 | *args: Any, 190 | order_by: List[str] = None, 191 | load_with: List[str] = None, 192 | loader_func: Callable = selectinload, 193 | columns: List[str] = None, 194 | session: Optional[AsyncSession] = None, 195 | include_inactive=False, 196 | ) -> Optional[Model]: 197 | if not include_inactive and hasattr(cls, "is_deleted"): 198 | args = (*args, cls.is_deleted == True) 199 | async with cls.get_session(session) as session: 200 | loaders = cls._build_loader(load_with, loader_func) if load_with else [] 201 | if columns: 202 | selected_columns = [getattr(cls, col) for col in columns] 203 | query = select(*selected_columns).options(*loaders) 204 | else: 205 | query = select(cls).options(*loaders) 206 | query = query.where(*args) 207 | query = cls._order_by(query, order_by, cls) 208 | result = await session.execute(query) 209 | return result.one_or_none() if columns else result.scalar_one_or_none() 210 | 211 | @classmethod 212 | async def select_all( 213 | cls, 214 | *args: Any, 215 | order_by: List[str] = None, 216 | load_with: List[str] = None, 217 | loader_func: Callable = selectinload, 218 | columns: List[str] = None, 219 | session: Optional[AsyncSession] = None, 
220 | include_inactive=False, 221 | limit: int = None, 222 | offset: int = None, 223 | ) -> List[Model]: 224 | if not include_inactive and hasattr(cls, "is_deleted"): 225 | args = (*args, cls.is_deleted == True) 226 | async with cls.get_session(session) as session: 227 | loaders = cls._build_loader(load_with, loader_func) if load_with else [] 228 | if columns: 229 | selected_columns = [getattr(cls, col) for col in columns] 230 | query = select(*selected_columns).options(*loaders) 231 | else: 232 | query = select(cls).options(*loaders) 233 | query = query.where(*args) 234 | query = cls._order_by(query, order_by, cls) 235 | if limit: 236 | query = query.limit(limit) 237 | if offset: 238 | query = query.offset(offset) 239 | result = await session.execute(query) 240 | return result.all() if columns else result.scalars().all() 241 | 242 | @classmethod 243 | async def update( 244 | cls, 245 | data: dict, 246 | *args: Any, 247 | session: Optional[AsyncSession] = None, 248 | ) -> List[Any]: 249 | async with cls.get_session(session) as session: 250 | query = update(cls).where(*args).values(**data).returning(cls.id) 251 | result = await session.execute(query) 252 | return result.scalars().all() 253 | 254 | @classmethod 255 | async def bulk_update( 256 | cls, 257 | data: dict, 258 | *args: Any, 259 | session: Optional[AsyncSession] = None, 260 | ) -> List[Any]: 261 | async with cls.get_session(session) as session: 262 | updated_ids = [] 263 | 264 | query = update(cls).where(*args).values(**data).returning(cls.id) 265 | result = await session.execute(query) 266 | updated_ids.extend(result.scalars().all()) 267 | return updated_ids 268 | 269 | @classmethod 270 | async def delete(cls, *args: Any, session: Optional[AsyncSession] = None) -> int: 271 | async with cls.get_session(session) as session: 272 | query = delete(cls).where(*args) 273 | result = await session.execute(query) 274 | return result.rowcount 275 | 276 | @classmethod 277 | async def bulk_delete( 278 | cls, 
conditions_list: List[Any], session: Optional[AsyncSession] = None 279 | ) -> int: 280 | async with cls.get_session(session) as session: 281 | total_deleted = 0 282 | for conditions in conditions_list: 283 | query = delete(cls).where(conditions) 284 | result = await session.execute(query) 285 | total_deleted += result.rowcount 286 | return total_deleted 287 | 288 | @classmethod 289 | async def get_count(cls, *args: Any, session: Optional[AsyncSession] = None) -> int: 290 | async with cls.get_session(session) as session: 291 | result = await session.execute(select(func.count(cls.id)).where(*args)) 292 | return result.scalar() 293 | 294 | @classmethod 295 | async def select_with_pagination( 296 | cls, 297 | *args: Any, 298 | page: int = 1, 299 | per_page: int = 10, 300 | order_by: List[str] = None, 301 | load_with: List[str] = None, 302 | loader_func: Callable = selectinload, 303 | session: Optional[AsyncSession] = None, 304 | include_inactive=False, 305 | ) -> PaginationResult: 306 | if page < 1 or per_page < 1: 307 | raise ValueError("Page and per_page must be positive integers") 308 | offset = (page - 1) * per_page 309 | total = await cls.get_count(*args, session=session) 310 | items = await cls.select_all( 311 | *args, 312 | order_by=order_by, 313 | load_with=load_with, 314 | loader_func=loader_func, 315 | session=session, 316 | offset=offset, 317 | limit=per_page, 318 | include_inactive=include_inactive, 319 | ) 320 | return PaginationResult(items, total, page, per_page) 321 | 322 | @classmethod 323 | async def transactional( 324 | cls, operations: Callable, session: Optional[AsyncSession] = None 325 | ): 326 | async with cls.get_session(session) as session: 327 | try: 328 | await operations(session) 329 | except Exception as e: 330 | await session.rollback() 331 | raise e 332 | else: 333 | await session.commit() 334 | 335 | # Dynamic Filters 336 | @classmethod 337 | def build_filters(cls, filters: Dict[str, Any]) -> List[Any]: 338 | conditions = [] 339 | for 
field, value in filters.items():
340 |             if hasattr(cls, field):
341 |                 conditions.append(getattr(cls, field) == value)
342 |             else:
343 |                 raise AttributeError(f"{cls.__name__} has no attribute '{field}'")
344 |         return conditions
345 | 
346 |     @classmethod
347 |     def get_query(cls):
348 |         return select(cls)
349 | 
350 |     @staticmethod
351 |     def log_operation(mapper, connection, target, operation):
352 |         sync_session = Session(bind=connection)
353 |         data = make_serializable(target.to_dict())
354 |         audit = AuditLog(
355 |             table_name=target.__tablename__,
356 |             operation=operation,
357 |             timestamp=datetime.now(tz=timezone.utc),
358 |             data=data,
359 |         )
360 |         sync_session.add(audit)
361 |         sync_session.flush()
362 | 
363 |     @classmethod
364 |     def after_insert(cls, mapper, connection, target):
365 |         # NOTE: do not mutate the target here; the INSERT has already run, so a
366 |         # created_at assigned in an after_insert hook is never persisted.
367 |         cls.log_operation(mapper, connection, target, "insert")
368 | 
369 |     @classmethod
370 |     def after_update(cls, mapper, connection, target):
371 |         # Likewise for updated_at: use a before_update listener or a column
372 |         # onupdate= default if the timestamp should actually be written.
373 |         cls.log_operation(mapper, connection, target, "update")
374 | 
375 |     @classmethod
376 |     def after_delete(cls, mapper, connection, target):
377 |         cls.log_operation(mapper, connection, target, "delete")
378 | 
379 |     # Attach Event Listeners
380 |     @classmethod
381 |     def attach_listeners(cls):
382 |         event.listen(cls, "after_insert", cls.after_insert)
383 |         event.listen(cls, "after_update", cls.after_update)
384 |         event.listen(cls, "after_delete", cls.after_delete)
385 | 
386 |     @classmethod
387 |     def detach_listeners(cls):
388 |         event.remove(cls, "after_insert", cls.after_insert)
389 |         event.remove(cls, "after_update", cls.after_update)
390 |         event.remove(cls, "after_delete", cls.after_delete)
391 | 
392 |     @classmethod
393 |     @contextmanager
394 |     def listeners_disabled(cls):
395 |         cls.detach_listeners()
396 |         try:
397 |             yield
398 |         finally:
399 | 
cls.attach_listeners() 400 | 401 | async def save(self, session: Optional[AsyncSession] = None): 402 | async with self.get_session(session) as session: 403 | session.add(self) 404 | 405 | async def apply(self, session: Optional[AsyncSession] = None): 406 | async with self.get_session(session) as session: 407 | session.add(self) 408 | # await session.commit() 409 | 410 | @classmethod 411 | async def save_all( 412 | cls, models: List[TModels], session: Optional[AsyncSession] = None 413 | ): 414 | async with cls.get_session(session) as session: 415 | for model in models: 416 | model.validate_data(model.to_dict()) 417 | session.add_all(models) 418 | 419 | @classmethod 420 | async def execute_query( 421 | cls, 422 | query, 423 | scalar: bool = False, 424 | all: bool = False, 425 | session: AsyncSession = None, 426 | ): 427 | async with cls.get_session(session) as session: 428 | result = await session.execute(query) 429 | if scalar and all: 430 | data = result.scalars().all() 431 | elif not scalar and all: 432 | data = result.all() 433 | elif scalar and not all: 434 | data = result.scalar() 435 | else: 436 | raise NotImplementedError 437 | return data 438 | 439 | 440 | class PaginationResult: 441 | def __init__(self, items, total, page, per_page): 442 | self.items = items 443 | self.total = total 444 | self.page = page 445 | self.per_page = per_page 446 | 447 | @property 448 | def pages(self): 449 | return (self.total - 1) // self.per_page + 1 450 | 451 | @property 452 | def has_next(self): 453 | return self.page < self.pages 454 | 455 | @property 456 | def has_prev(self): 457 | return self.page > 1 458 | 459 | def to_dict(self): 460 | return { 461 | "items": [item.to_dict() for item in self.items], 462 | "total": self.total, 463 | "page": self.page, 464 | "per_page": self.per_page, 465 | "pages": self.pages, 466 | "has_next": self.has_next, 467 | "has_prev": self.has_prev, 468 | } 469 | 470 | 471 | class AuditLog(Model): 472 | __tablename__ = "audit_logs" 473 | id = 
Column(Integer, primary_key=True)
474 |     table_name = Column(String)
475 |     operation = Column(String)
476 |     timestamp = Column(DateTime, default=lambda: datetime.now(tz=timezone.utc))  # callable: evaluated per row, not once at import
477 |     data = Column(JSON)
478 | 
--------------------------------------------------------------------------------
/tests/test_model.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | import sqlalchemy
3 | from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
4 | from sqlalchemy import Column, Integer, String, Boolean, DateTime
5 | from datetime import datetime, timezone
6 | from typing import Optional
7 | 
8 | # Import your Model classes here
9 | # Assuming the codebase is in a file named 'models.py'
10 | from sqla_async_orm_queries.models import Model, PydanticModelMixin, AuditLog
11 | 
12 | # Define a test database URL (in-memory SQLite)
13 | TEST_DATABASE_URL = "sqlite+aiosqlite:///:memory:"
14 | 
15 | # Create the async engine and factory
16 | engine = create_async_engine(TEST_DATABASE_URL, echo=True)
17 | AsyncSessionLocal = async_sessionmaker(bind=engine, expire_on_commit=False)
18 | 
19 | # Initialize the session factory in your Model
20 | Model.init_session(AsyncSessionLocal)
21 | 
22 | # Define test models
23 | class User(Model):
24 |     __tablename__ = 'users'
25 |     id = Column(Integer, primary_key=True, autoincrement=True)
26 |     name = Column(String, nullable=False)
27 |     email = Column(String, unique=True, nullable=False)
28 |     is_active = Column(Boolean, default=True)
29 |     created_at = Column(DateTime, default=lambda: datetime.now(tz=timezone.utc))  # callable, so each row gets a fresh timestamp
30 | 
31 |     class PydanticModel(PydanticModelMixin):
32 |         id: Optional[int] = None
33 |         name: Optional[str] = None
34 |         email: Optional[str] = None
35 |         is_active: Optional[bool] = True
36 |         created_at: Optional[datetime] = None  # avoid an import-time datetime.now() default
37 | 
38 | # Attach event listeners for audit logging
39 | User.attach_listeners()
40 | 
41 | 
42 | @pytest.fixture(scope="session", 
autouse=True)
43 | async def setup_database():
44 |     # Create the tables
45 |     async with engine.begin() as conn:
46 |         await conn.run_sync(Model.metadata.create_all)
47 |     yield
48 |     async with engine.begin() as conn:
49 |         await conn.run_sync(Model.metadata.drop_all)
50 | 
51 | 
52 | # Test cases
53 | @pytest.mark.asyncio
54 | async def test_create_user(setup_database):
55 |     user_data = {'name': 'John Doe', 'email': 'john@example.com'}
56 |     user = await User.create(user_data)
57 |     assert user.id is not None
58 |     assert user.name == 'John Doe'
59 |     assert user.email == 'john@example.com'
60 | 
61 | @pytest.mark.asyncio
62 | async def test_read_user():
63 |     user = await User.select_one(User.email == 'john@example.com')
64 |     assert user is not None
65 |     assert user.name == 'John Doe'
66 | 
67 | @pytest.mark.asyncio
68 | async def test_update_user():
69 |     await User.update({'name': 'Jane Doe'}, User.email == 'john@example.com')
70 |     user = await User.select_one(User.email == 'john@example.com')
71 |     assert user.name == 'Jane Doe'
72 | 
73 | @pytest.mark.asyncio
74 | async def test_soft_delete_user():
75 |     await User.soft_delete(User.email == 'john@example.com')
76 |     user = await User.select_one(User.email == 'john@example.com')
77 |     assert user is None  # Filtered out: soft_delete set is_deleted to False, which this codebase treats as inactive
78 |     user = await User.select_one(User.email == 'john@example.com', include_inactive=True)
79 |     assert user is not None
80 |     assert user.is_deleted is False
81 | 
82 | @pytest.mark.asyncio
83 | async def test_pagination():
84 |     # Create multiple users
85 |     users_data = [
86 |         {'name': f'User {i}', 'email': f'user{i}@example.com'} for i in range(1, 21)
87 |     ]
88 |     await User.bulk_create(users_data)
89 | 
90 |     # Paginate
91 |     pagination = await User.select_with_pagination(page=2, per_page=5)
92 |     assert pagination.page == 2
93 |     assert pagination.per_page == 5
94 |     assert len(pagination.items) == 5
95 |     assert pagination.total == 21  # 20 bulk users + the soft-deleted John: get_count does not apply the soft-delete filter
96 |     assert pagination.pages == 5
97 |     assert pagination.has_next == 
True 98 | assert pagination.has_prev == True 99 | 100 | @pytest.mark.asyncio 101 | async def test_validation(): 102 | # Missing 'email' field 103 | invalid_user_data = {'name': 'Invalid User'} 104 | with pytest.raises(sqlalchemy.exc.IntegrityError) as excinfo: 105 | await User.create(invalid_user_data) 106 | assert 'IntegrityError' in str(excinfo.value) 107 | 108 | @pytest.mark.asyncio 109 | async def test_event_listener(): 110 | # Create a user to trigger the event listener 111 | user_data = {'name': 'Event User', 'email': 'event@example.com'} 112 | await User.create(user_data) 113 | 114 | # Check if an audit log was created 115 | audit_logs = await AuditLog.select_all() 116 | assert len(audit_logs) > 0 117 | audit_log = audit_logs[-1] 118 | assert audit_log.operation == 'insert' 119 | assert audit_log.table_name == 'users' 120 | assert audit_log.data['email'] == 'event@example.com' 121 | 122 | @pytest.mark.asyncio 123 | async def test_detach_listeners(): 124 | # Detach listeners 125 | User.detach_listeners() 126 | 127 | # Create a user without triggering the event listener 128 | user_data = {'name': 'No Audit User', 'email': 'noaudit@example.com'} 129 | await User.create(user_data) 130 | 131 | # Reattach listeners 132 | User.attach_listeners() 133 | 134 | # Check if no new audit log was created 135 | audit_logs = await AuditLog.select_all() 136 | assert all(log.data.get('email') != 'noaudit@example.com' for log in audit_logs) 137 | 138 | @pytest.mark.asyncio 139 | async def test_serialization(): 140 | user = await User.select_one(User.email == 'event@example.com') 141 | user_dict = user.to_dict() 142 | assert isinstance(user_dict, dict) 143 | assert user_dict['email'] == 'event@example.com' 144 | 145 | user_json = user.to_json() 146 | assert isinstance(user_json, str) 147 | assert '"email": "event@example.com"' in user_json 148 | 149 | @pytest.mark.asyncio 150 | async def test_dynamic_filters(): 151 | filters = {'name': 'Event User'} 152 | conditions = 
User.build_filters(filters) 153 | users = await User.select_all(*conditions) 154 | assert len(users) == 1 155 | assert users[0].email == 'event@example.com' 156 | 157 | @pytest.mark.asyncio 158 | async def test_transactional(): 159 | async def operations(session): 160 | await User.create({'name': 'Transact User', 'email': 'transact@example.com'},session=session) 161 | raise Exception("Intentional Error") 162 | 163 | with pytest.raises(Exception) as excinfo: 164 | async with User.get_session() as session: 165 | await User.transactional(operations,session) 166 | assert 'Intentional Error' in str(excinfo.value) 167 | 168 | # Ensure that the user was not created due to rollback 169 | user = await User.select_one(User.email == 'transact@example.com') 170 | assert user is None 171 | 172 | @pytest.mark.asyncio 173 | async def test_bulk_create(): 174 | users_data = [ 175 | {'name': 'Bulk User 1', 'email': 'bulk1@example.com'}, 176 | {'name': 'Bulk User 2', 'email': 'bulk2@example.com'}, 177 | ] 178 | users = await User.bulk_create(users_data) 179 | assert len(users) == 2 180 | emails = [user.email for user in users] 181 | assert 'bulk1@example.com' in emails 182 | assert 'bulk2@example.com' in emails 183 | 184 | @pytest.mark.asyncio 185 | async def test_bulk_update(): 186 | await User.bulk_update( 187 | {'name': 'Updated Bulk User 2'}, 188 | User.email.like('bulk%@example.com'), 189 | 190 | ) 191 | users = await User.select_all(User.email.like('bulk%@example.com')) 192 | names = [user.name for user in users] 193 | assert 'Updated Bulk User 2' in names 194 | 195 | @pytest.mark.asyncio 196 | async def test_bulk_delete(): 197 | await User.bulk_delete([User.email == 'bulk1@example.com', User.email == 'bulk2@example.com']) 198 | users = await User.select_all(User.email.like('bulk%@example.com')) 199 | assert len(users) == 0 200 | 201 | @pytest.mark.asyncio 202 | async def test_soft_delete_with_include_inactive(): 203 | # Create a user 204 | user_data = {'name': 'Inactive User', 
'email': 'inactive@example.com'} 205 | await User.create(user_data) 206 | 207 | # Soft delete the user 208 | await User.soft_delete(User.email == 'inactive@example.com') 209 | 210 | # Select including inactive users 211 | user = await User.select_one(User.email == 'inactive@example.com', include_inactive=True) 212 | assert user is not None 213 | assert user.is_deleted == False 214 | 215 | # Select excluding inactive users 216 | user = await User.select_one(User.email == 'inactive@example.com') 217 | assert user is None 218 | 219 | @pytest.mark.asyncio 220 | async def test_event_listener_with_detach_context(): 221 | async def create_user_without_audit(): 222 | with User.listeners_disabled(): 223 | await User.create({'name': 'No Audit Context', 'email': 'noauditcontext@example.com'}) 224 | 225 | await create_user_without_audit() 226 | 227 | audit_logs = await AuditLog.select_all() 228 | emails = [log.data.get('email') for log in audit_logs] 229 | assert 'noauditcontext@example.com' not in emails --------------------------------------------------------------------------------
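The central pattern in `Model.get_session` (reuse a caller-supplied session so several operations share one transaction, otherwise open a fresh session and commit or roll back around the block) can be sketched with stdlib stand-ins. `Repo` and `FakeSession` below are illustrative names, not part of the package, and the sketch commits explicitly where the real implementation delegates that to `session.begin()`:

```python
import asyncio
from contextlib import asynccontextmanager

class FakeSession:
    """Stand-in for AsyncSession; records whether commit/rollback ran."""
    def __init__(self):
        self.committed = False
        self.rolled_back = False
    async def commit(self):
        self.committed = True
    async def rollback(self):
        self.rolled_back = True

class Repo:
    session_factory = FakeSession  # stand-in for async_sessionmaker

    @classmethod
    @asynccontextmanager
    async def get_session(cls, session=None):
        if session is not None:
            # Caller-supplied session: reuse it and let the caller commit.
            yield session
            return
        # Otherwise open a fresh session and manage the transaction here.
        fresh = cls.session_factory()
        try:
            yield fresh
        except Exception:
            await fresh.rollback()
            raise
        else:
            await fresh.commit()

async def main():
    outer = FakeSession()
    async with Repo.get_session(outer) as s:
        assert s is outer          # reused, not committed by Repo
    assert not outer.committed

    async with Repo.get_session() as s:
        fresh = s
    assert fresh.committed         # fresh session committed on clean exit
    return "ok"

print(asyncio.run(main()))
```

This is why every query method in the library accepts `session=None`: passing an explicit session lets multiple operations commit atomically, while omitting it gives each call its own short-lived transaction.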