├── .github └── workflows │ └── python-package.yml ├── .gitignore ├── .isort.cfg ├── CHANGELOG.rst ├── LICENSE ├── Makefile ├── README.md ├── examples └── __init__.py ├── fastapi_asyncpg ├── __init__.py └── sql.py ├── mypy.ini ├── setup.cfg ├── setup.py ├── tests ├── __init__.py ├── conftest.py ├── data.sql ├── test_con_rollback.py └── test_db.py └── tox.ini /.github/workflows/python-package.yml: -------------------------------------------------------------------------------- 1 | # This workflow will install Python dependencies, run tests and lint with a variety of Python versions 2 | # For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions 3 | 4 | name: Python package 5 | 6 | on: 7 | push: 8 | branches: [ master ] 9 | pull_request: 10 | branches: [ master ] 11 | 12 | jobs: 13 | pre-checks: 14 | runs-on: ubuntu-latest 15 | strategy: 16 | matrix: 17 | python-version: [3.9] 18 | 19 | steps: 20 | - name: Checkout the repository 21 | uses: actions/checkout@v2 22 | 23 | - name: Setup Python 24 | uses: actions/setup-python@v1 25 | with: 26 | python-version: ${{ matrix.python-version }} 27 | 28 | - name: Install package 29 | run: | 30 | pip install mypy 31 | pip install black 32 | pip install isort 33 | - name: Run pre-checks 34 | run: | 35 | mypy fastapi_asyncpg/ 36 | isort -c -rc fastapi_asyncpg/ 37 | black -l 80 --check --verbose fastapi_asyncpg/ 38 | # Job to run tests 39 | tests: 40 | runs-on: ubuntu-latest 41 | strategy: 42 | matrix: 43 | python-version: [3.7, 3.8, 3.9] 44 | # Set environment variables 45 | steps: 46 | - name: Checkout the repository 47 | uses: actions/checkout@v2 48 | 49 | - name: Setup Python 50 | uses: actions/setup-python@v1 51 | with: 52 | python-version: ${{ matrix.python-version }} 53 | 54 | - name: Install the package 55 | run: | 56 | pip install -e .[test] 57 | 58 | - name: Run tests 59 | run: | 60 | pytest -vs tests/ 61 | 62 | - name: Upload coverage to Codecov 63 | uses: codecov/codecov-action@v1 64 | with: 65 | file: ./coverage.xml 66 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | **/__pycache__ 2 | env/ 3 | venv/ 4 | .coverage 5 | .env 6 | .idea 7 | .installed.cfg 8 | .pytest_cache/ 9 | .mypy_cache/ 10 | .tox/ 11 | bin/ 12 | coverage.xml 13 | develop-eggs/ 14 | lib/ 15 | lib64 16 | parts/ 17 | pyvenv.cfg 18 | *.egg-info 19 | *.profraw 20 | *.py? 
21 | *.swp
22 | 
--------------------------------------------------------------------------------
/.isort.cfg:
--------------------------------------------------------------------------------
1 | [settings]
2 | force_single_line=True
3 | sections=FUTURE,THIRDPARTY,FIRSTPARTY,LOCALFOLDER,STDLIB
4 | no_lines_before=LOCALFOLDER,THIRDPARTY,FIRSTPARTY,STDLIB
5 | force_alphabetical_sort=True
--------------------------------------------------------------------------------
/CHANGELOG.rst:
--------------------------------------------------------------------------------
1 | - 1.0.1
2 |   Fix testing pool incorrectly disconnected
3 | 
4 | - 1.0.0
5 |   Initial release
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 | 
3 | Copyright (c) 2020 Jordi collell
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | .PHONY: isort black flake8 mypy
2 | 
3 | lint: isort black flake8 mypy
4 | 
5 | isort:
6 | 	isort fastapi_asyncpg
7 | 
8 | black:
9 | 	black fastapi_asyncpg/ -l 80
10 | 
11 | flake8:
12 | 	flake8 fastapi_asyncpg
13 | 
14 | mypy:
15 | 	mypy fastapi_asyncpg
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # FastAPI AsyncPG
2 | 
3 | FastAPI integration for AsyncPG
4 | 
5 | ## Narrative
6 | 
7 | First of all, apologies for my poor English. I would be really happy
8 | if someone pushed a PR correcting my mistakes. In the meantime,
9 | I will try to do my best.
10 | 
11 | Looking at the FastAPI ecosystem, it seems like everyone is trying to
12 | integrate FastAPI with ORMs, but in my experience I am far more
13 | productive working with raw SQL.
14 | 
15 | If you think about it, your real model layer is the schema in your
16 | db (you can add abstractions on top of it), but what remains in the end
17 | is your data: tables, columns and rows.
18 | 
19 | Also, SQL is one of the best things I have learned,
20 | because it is always there.
21 | 
22 | On the other side, PostgreSQL is robust and rock solid;
23 | thousands of projects depend on it and use it as their storage layer.
24 | AsyncPG is a blazingly fast PostgreSQL driver
25 | written from scratch.
26 | 
27 | FastAPI seems like a clean and developer-productive approach to web
28 | frameworks. It is impressive how well it integrates with OpenAPI,
29 | and how easy it makes things for a developer.
30 | 
31 | ## Integration
32 | 
33 | fastapi_asyncpg tries to integrate fastapi and asyncpg in an idiomatic way.
34 | Once configured, fastapi_asyncpg exposes two injectable providers that
35 | fastapi path functions can use:
36 | 
37 | - `db.connection`: a raw connection picked from the pool that is
38 |   automatically released when the path function ends; this is mostly
39 |   the merit of the DI system around fastapi.
40 | 
41 | - `db.transaction`: the same, but it wraps the path function in a transaction;
42 |   this is more or less the same as the `atomic` decorator from Django.
43 |   It is also aliased as `db.atomic`.
44 | 
45 | ```python
46 | from fastapi import FastAPI
47 | from fastapi import Depends
48 | from fastapi_asyncpg import configure_asyncpg
49 | 
50 | app = FastAPI()
51 | # we need to pass the fastapi app to make use of lifespan asgi events
52 | db = configure_asyncpg(app, "postgresql://postgres:postgres@localhost/db")
53 | 
54 | @db.on_init
55 | async def initialization(conn):
56 |     # you can run your db initialization code here
57 |     await conn.execute("SELECT 1")
58 | 
59 | 
60 | @app.get("/")
61 | async def get_content(db=Depends(db.connection)):
62 |     rows = await db.fetch("SELECT whatever FROM tablexxx")
63 |     return [dict(r) for r in rows]
64 | 
65 | @app.post("/")
66 | async def mutate_something_complex(db=Depends(db.atomic)):
67 |     await db.execute("INSERT INTO tablexxx VALUES ($1)", "value1")
68 |     await db.execute("INSERT INTO tablexxx VALUES ($1)", "value2")
69 |     # if something fails, everything is rolled back: all or nothing
70 | ```
71 | 
72 | There is also an `init_db` callable argument on the main factory function.
73 | It can be used, much like in Flask, to initialize whatever you need in the db.
74 | The initialization is called right after asyncpg establishes a connection,
75 | and before the app fully boots. (Some projects use this as a poor man's
76 | migration runner; not the best practice if you are deploying multiple
77 | instances of the app.)
78 | 
79 | ## Testing
80 | 
81 | For testing we use [pytest-docker-fixtures](https://pypi.org/project/pytest-docker-fixtures/); it requires Docker on the host machine or on whatever CI you use
82 | (it seems to work as expected with GitHub Actions).
83 | 
84 | It works by creating a container for the test session and exposing it as a pytest fixture.
85 | It is good practice to run tests against a real database, and
86 | pytest-docker-fixtures makes that easy. As a bonus, all the fixtures also run on CI.
87 | We use Jenkins with Docker, but Travis and GitHub Actions
88 | seem to work as well.
89 | 
90 | The fixture needs to be registered as a pytest plugin in the `conftest.py` file.
91 | 
92 | In `conftest.py`:
93 | 
94 | ```python
95 | pytest_plugins = [
96 |     "pytest_docker_fixtures",
97 | ]
98 | ```
99 | 
100 | With this in place, we can just use the `pg` fixture:
101 | 
102 | ```python
103 | from pytest_docker_fixtures import images
104 | 
105 | import asyncpg
106 | 
107 | # image params can be configured from here
108 | images.configure(
109 |     "postgresql", "postgres", "11.1", env={"POSTGRES_DB": "test_db"}
110 | )
111 | 
112 | # and then in our test we have a pg container running,
113 | # ready to recreate our db
114 | async def test_pg(pg):
115 |     host, port = pg
116 |     dsn = f"postgresql://postgres@{host}:{port}/test_db"
117 |     await asyncpg.connect(dsn=dsn)
118 |     # let's go
119 | ```
120 | 
121 | With this in place, we can create our own pytest fixture that
122 | _patches_ the app dsn to make it work with our freshly created
123 | container.
124 | 
125 | ```python
126 | from .app import app, db
127 | from async_asgi_testclient import TestClient
128 | 
129 | import pytest
130 | 
131 | pytestmark = pytest.mark.asyncio
132 | 
133 | @pytest.fixture
134 | async def asgi_app(pg):
135 |     host, port = pg
136 |     dsn = f"postgresql://postgres@{host}:{port}/test_db"
137 |     # here we patch the dsn for the db
138 |     # con_opts are also accessible
139 |     db.dsn = dsn
140 |     yield app, db
141 | 
142 | async def test_something(asgi_app):
143 |     app, db = asgi_app
144 |     async with db.pool.acquire() as conn:
145 |         ...  # set up your test state here
146 | 
147 |     # this context manager handles lifespan events
148 |     async with TestClient(app) as client:
149 |         res = await client.get("/")
150 | ```
151 | 
152 | Anyway, if the application grows into multiple subpackages
153 | and apps, we tend to build the main app with a factory
154 | that creates it, something like:
155 | 
156 | ```python
157 | from fastapi import FastAPI
158 | from fastapi_asyncpg import configure_asyncpg
159 | from apppackage import settings
160 | 
161 | import apppackage
162 | import venusian
163 | 
164 | def make_asgi_app(settings):
165 |     app = FastAPI()
166 |     db = configure_asyncpg(app, settings.DSN)
167 | 
168 |     scanner = venusian.Scanner(app=app)
169 |     scanner.scan(apppackage)
170 |     return app
171 | ```
172 | 
173 | Then in the fixture, we just need to build an app with our factory function:
174 | 
175 | ```python
176 | from .factory import make_asgi_app
177 | from async_asgi_testclient import TestClient
178 | 
179 | import pytest
180 | 
181 | pytestmark = pytest.mark.asyncio
182 | 
183 | @pytest.fixture
184 | async def asgi_app(pg):
185 |     host, port = pg
186 |     dsn = f"postgresql://postgres@{host}:{port}/test_db"
187 |     app = make_asgi_app({"dsn": dsn})
188 |     # there's a pointer to the pool in app.state
189 |     yield app
190 | 
191 | async def test_something(asgi_app):
192 |     app = asgi_app
193 |     pool = app.state.pool
194 |     async with pool.acquire() as conn:
195 |         ...  # set up your test state here
196 | 
197 |     # this context manager handles lifespan events
198 |     async with TestClient(app) as client:
199 |         res = await client.get("/")
200 | ```
201 | 
202 | There is also another approach, exposed and used in the
203 | [tests](tests/test_con_rollback.py), that hands a single connection to the
204 | test and rolls back its changes at the end. We use this on a large project
205 | (500 tables per schema and multiple schemas), and it seems to speed up test
206 | setup a bit. It is the same approach that [Databases](https://www.encode.io/databases/)
207 | uses. Feel free to follow the tests to see which one fits better; a rough sketch follows.
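
Here is a condensed, hedged sketch of that rollback-per-test approach, adapted from `tests/test_con_rollback.py`. It assumes the `pg` fixture and image configuration shown above; the `demo` table and the test names are just placeholders for illustration:

```python
from async_asgi_testclient import TestClient
from fastapi import Depends
from fastapi import FastAPI
from fastapi_asyncpg import configure_asyncpg
from fastapi_asyncpg import create_pool_test

import pytest

pytestmark = pytest.mark.asyncio


@pytest.fixture
async def pool(pg):
    host, port = pg
    dsn = f"postgresql://postgres@{host}:{port}/test_db"

    async def initialize(conn):
        # create the schema once; each test then runs on this single
        # connection inside a transaction that is rolled back afterwards
        await conn.execute("CREATE TABLE IF NOT EXISTS demo (key varchar)")

    pool = await create_pool_test(dsn, initialize=initialize)
    yield pool
    await pool.release()  # rolls back whatever the test did


async def test_with_rollback(pool):
    app = FastAPI()
    # passing pool= makes configure_asyncpg reuse our testing pool
    # instead of creating a real one on startup
    db = configure_asyncpg(app, "", pool=pool)

    @app.get("/")
    async def view(conn=Depends(db.connection)):
        rows = await conn.fetch("SELECT * FROM demo")
        return [dict(r) for r in rows]

    # the TestClient context manager triggers the lifespan events
    async with TestClient(app) as client:
        res = await client.get("/")
        assert res.status_code == 200
```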
208 | 
209 | ## Extras
210 | 
211 | There are some utility functions I use daily with asyncpg that help me
212 | speed up common SQL operations; they all live in `sql.py` and are mostly
213 | self-documented. They are used in the tests.
214 | 
215 | ### Authors
216 | 
217 | `fastapi_asyncpg` was written by Jordi collell (jordic@gmail.com).
218 | 
--------------------------------------------------------------------------------
/examples/__init__.py:
--------------------------------------------------------------------------------
1 | from fastapi import FastAPI
2 | from fastapi import Depends
3 | from fastapi_asyncpg import configure_asyncpg
4 | 
5 | import pydantic as pd
6 | 
7 | app = FastAPI()
8 | 
9 | 
10 | db = configure_asyncpg(app, "postgresql://postgres:postgres@localhost/db")
11 | 
12 | 
13 | class Demo(pd.BaseModel):
14 |     key: str
15 |     value: str
16 | 
17 | 
18 | class DemoObj(Demo):
19 |     demo_id: int
20 | 
21 | 
22 | @db.on_init
23 | async def initialize_db(db):
24 |     await db.execute(
25 |         """
26 |         CREATE TABLE IF NOT EXISTS demo (
27 |             demo_id serial primary key,
28 |             key varchar not null,
29 |             value varchar not null,
30 |             UNIQUE(key)
31 |         );
32 |     """
33 |     )
34 | 
35 | 
36 | @app.post("/", response_model=DemoObj)
37 | async def add_resource(data: Demo, db=Depends(db.connection)):
38 |     """
39 |     Add a resource to db:
40 |         curl -X POST -d '{"key": "test", "value": "asdf"}' \
41 |             http://localhost:8000/
42 |     """
43 |     result = await db.fetchrow(
44 |         """
45 |         INSERT into demo values (default, $1, $2) returning *
46 |     """,
47 |         data.key,
48 |         data.value,
49 |     )
50 |     return dict(result)
51 | 
52 | 
53 | @app.get("/{key:str}", response_model=DemoObj)
54 | async def get_resource(key: str, db=Depends(db.connection)):
55 |     result = await db.fetchrow(
56 |         """
57 |         SELECT * from demo where key=$1
58 |     """,
59 |         key,
60 |     )
61 |     return dict(result)
62 | 
--------------------------------------------------------------------------------
/fastapi_asyncpg/__init__.py:
--------------------------------------------------------------------------------
1 | from __future__ import annotations
2 | 
3 | from fastapi import FastAPI
4 | 
5 | import asyncpg
6 | import typing
7 | 
8 | 
9 | async def noop(db: asyncpg.Connection):
10 |     return
11 | 
12 | 
13 | class configure_asyncpg:
14 |     def __init__(
15 |         self,
16 |         app: FastAPI,
17 |         dsn: str,
18 |         *,
19 |         init_db: typing.Callable = None,  # callable for running sql on init
20 |         pool=None,  # usable on testing
21 |         **options,
22 |     ):
23 |         """This is the entry point to configure an asyncpg pool with fastapi.
24 | 
25 |         Arguments
26 |             app: The FastAPI application that we use to store the pool
27 |                 and bind to its initialization events
28 |             dsn: A PostgreSQL dsn like postgresql://user:password@postgresql:5432/db
29 |             init_db: Optional callable that receives a db connection,
30 |                 for doing an initialization of it
31 |             pool: This is used for testing to skip the pool initialization
32 |                 and just use the SingleConnectionTestingPool
33 |             **options: connection options to directly pass to the asyncpg driver
34 |                 see: https://magicstack.github.io/asyncpg/current/api/index.html#connection-pools
35 |         """
36 |         self.app = app
37 |         self.dsn = dsn
38 |         self.init_db = init_db
39 |         self.con_opts = options
40 |         self._pool = pool
41 |         self.app.router.add_event_handler("startup", self.on_connect)
42 |         self.app.router.add_event_handler("shutdown", self.on_disconnect)
43 | 
44 |     async def on_connect(self):
45 |         """handler called during initialization of the asgi app;
46 |         it connects to the db"""
47 |         # if the pool is coming from outside (tests), don't connect it
48 |         if self._pool:
49 |             self.app.state.pool = self._pool
50 |             return
51 |         pool = await asyncpg.create_pool(dsn=self.dsn, **self.con_opts)
52 |         async with pool.acquire() as db:
53 |             await self.init_db(db)
54 |         self.app.state.pool = pool
55 | 
56 |     async def on_disconnect(self):
57 |         # if the pool is coming from outside, don't disconnect it;
58 |         # someone else will do it (usually a pytest fixture)
59 |         if self._pool:
60 |             return
61 |         await self.app.state.pool.close()
62 | 
63 |     def on_init(self, func):
64 |         self.init_db = func
65 |         return func
66 | 
67 |     @property
68 |     def pool(self):
69 |         return self.app.state.pool
70 | 
71 |     async def connection(self):
72 |         """
73 |         A ready-to-use connection dependency, usable in your path functions,
74 |         that gets a connection from the pool.
75 |         Example:
76 |             db = configure_asyncpg(app, "dsn://")
77 |             @app.get("/")
78 |             async def get_content(db=Depends(db.connection)):
79 |                 await db.fetch("SELECT * from pg_schemas")
80 |         """
81 |         async with self.pool.acquire() as db:
82 |             yield db
83 | 
84 |     async def transaction(self):
85 |         """
86 |         A ready-to-use transaction dependency, usable in a path function.
87 |         Example:
88 |             db = configure_asyncpg(app, "dsn://")
89 |             @app.get("/")
90 |             async def get_content(db=Depends(db.transaction)):
91 |                 await db.execute("insert into keys values (1, 2)")
92 |                 await db.execute("insert into keys values (1, 2)")
93 |         The whole path function is executed inside a postgresql transaction.
94 |         """
95 |         async with self.pool.acquire() as db:
96 |             txn = db.transaction()
97 |             await txn.start()
98 |             try:
99 |                 yield db
100 |             except:  # noqa
101 |                 await txn.rollback()
102 |                 raise
103 |             else:
104 |                 await txn.commit()
105 | 
106 |     atomic = transaction
107 | 
108 | 
109 | class SingleConnectionTestingPool:
110 |     """A fake pool that simulates pooling, but runs on
111 |     a single transaction that is rolled back after
112 |     each test.
113 | With some large schemas this seems to be faster than 114 | the other approach 115 | """ 116 | 117 | def __init__( 118 | self, 119 | conn: asyncpg.Connection, 120 | initialize: typing.Callable = None, 121 | add_logger_postgres: bool = False, 122 | ): 123 | self._conn = conn 124 | self.tx = None 125 | self.started = False 126 | self.add_logger_postgres = add_logger_postgres 127 | self.initialize = initialize 128 | 129 | def acquire(self, *, timeout=None): 130 | return ConAcquireContext(self._conn, self) 131 | 132 | async def start(self): 133 | if self.started: 134 | return 135 | 136 | def log_postgresql(con, message): 137 | print(message) 138 | 139 | if self.add_logger_postgres: 140 | self._conn.add_log_listener(log_postgresql) 141 | self.tx = self._conn.transaction() 142 | await self.tx.start() 143 | await self.initialize(self._conn) 144 | self.started = True 145 | 146 | async def release(self): 147 | if self.tx: 148 | await self.tx.rollback() 149 | 150 | def __getattr__(self, key): 151 | return getattr(self._conn, key) 152 | 153 | 154 | async def create_pool_test( 155 | dsn: str, 156 | *, 157 | initialize: typing.Callable = None, 158 | add_logger_postgres: bool = False, 159 | ): 160 | """This part is only used for testing, 161 | we create a fake "pool" that just starts a connecion, 162 | that does a transaction inside it""" 163 | conn = await asyncpg.connect(dsn=dsn) 164 | pool = SingleConnectionTestingPool( 165 | conn, initialize=initialize, add_logger_postgres=add_logger_postgres 166 | ) 167 | return pool 168 | 169 | 170 | class ConAcquireContext: 171 | def __init__(self, conn, manager): 172 | self._conn = conn 173 | self.manager = manager 174 | 175 | async def __aenter__(self): 176 | if not self.manager.tx: 177 | await self.manager.start() 178 | self.tr = self._conn.transaction() 179 | await self.tr.start() 180 | return self._conn 181 | 182 | async def __aexit__(self, exc_type, exc, tb): 183 | if exc_type: 184 | await self.tr.rollback() 185 | else: 186 | await self.tr.commit() 187 | -------------------------------------------------------------------------------- /fastapi_asyncpg/sql.py: -------------------------------------------------------------------------------- 1 | """ 2 | helper function to scope sql to postgresql schema 3 | """ 4 | 5 | 6 | async def get(conn, table, condition="1 = 1", args=None, fields="*"): 7 | args = args or [] 8 | sql = f"select {fields} from {table} where {condition}" 9 | return await conn.fetchrow(sql, *args) 10 | 11 | 12 | async def select(conn, table, condition="1 = 1", args=None, fields="*"): 13 | args = args or [] 14 | sql = f"select {fields} from {table} where {condition}" 15 | return await conn.fetch(sql, *args) 16 | 17 | 18 | async def count(conn, table, where="1=1", args=None): 19 | args = args or [] 20 | return await conn.fetchval( 21 | f"select count(*) from {table} WHERE {where}", *args 22 | ) 23 | 24 | 25 | async def insert(conn, table, values): 26 | qs = "insert into {table} ({columns}) values ({values}) returning *".format( 27 | table=table, 28 | values=",".join([f"${p + 1}" for p in range(len(values.values()))]), 29 | columns=",".join(list(values.keys())), 30 | ) 31 | return await conn.fetchrow(qs, *list(values.values())) 32 | 33 | 34 | async def update(conn, table, conditions: dict, values: dict): 35 | qs = "update {table} set {columns} where {cond} returning *" 36 | counter = 1 37 | params = [] 38 | cond = [] 39 | vals = [] 40 | for column, value in conditions.items(): 41 | cond.append(f"{column}=${counter}") 42 | params.append(value) 
43 | counter += 1 44 | for column, value in values.items(): 45 | vals.append(f"{column}=${counter}") 46 | params.append(value) 47 | counter += 1 48 | sql = qs.format( 49 | table=table, columns=" ,".join(vals), cond=" AND ".join(cond) 50 | ) 51 | return await conn.fetchrow(sql, *params) 52 | 53 | 54 | async def delete(db, table, condition, args=None): 55 | args = args or [] 56 | await db.execute(f"DELETE FROM {table} WHERE {condition}", *args) 57 | 58 | 59 | def query_to_json(query, name): 60 | """This query is useful to fetch a complex join 61 | with some aggregations as a single blob, and later, 62 | just hydrate it without having to iterate over the resultset 63 | 64 | .. Example: 65 | SELECT 66 | u.id::varchar, 67 | to_jsonb(array_agg(scopes)) as scopes, 68 | FROM auth.auth_user u 69 | LEFT join LATERAL ( 70 | SELECT id, scope 71 | FROM auth.auth_user_scope 72 | WHERE user_id=u.id 73 | ) scopes on true 74 | WHERE user_id = ANY($1) 75 | GROUP BY u.user_id; 76 | 77 | This query will fetch a list of users, and aggregate it's 78 | scopes as an array of dicts 79 | """ 80 | 81 | return f""" 82 | select 83 | array_to_json(array_agg(row_to_json(t))) as {name} 84 | from ( 85 | {query} 86 | ) as t 87 | """ 88 | 89 | 90 | async def load_sqlfile(db, file): 91 | fs = file.open("r") 92 | data = fs.read() 93 | fs.close() 94 | await db.execute(data) 95 | -------------------------------------------------------------------------------- /mypy.ini: -------------------------------------------------------------------------------- 1 | [mypy] 2 | namespace_packages=True 3 | ignore_missing_imports = True 4 | follow_imports=skip 5 | -------------------------------------------------------------------------------- /setup.cfg: -------------------------------------------------------------------------------- 1 | [flake8] 2 | max_line_length = 120 3 | no-accept-encodings = True 4 | ignore = 5 | E302 6 | W391 7 | E701 8 | W504 9 | F901 10 | E252 11 | W503 12 | E203 13 | BLK100 14 | exclude = $*,.git,__pycache__,docs/source/conf.py,old,build,dist 15 | 16 | 17 | [mypy] 18 | namespace_packages=True 19 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | from setuptools import find_packages 2 | from setuptools import setup 3 | 4 | 5 | with open("README.md", "r") as f: 6 | readme = f.read() 7 | 8 | 9 | setup( 10 | name="fastapi_asyncpg", 11 | version="1.0.1", 12 | url="https://github.com/jordic/fastapi_asyncpg", 13 | license="MIT", 14 | author="Jordi collell", 15 | author_email="jordic@gmail.com", 16 | description="FastAPI integration for asyncpg", 17 | long_description=readme, 18 | long_description_content_type="text/markdown", 19 | packages=find_packages(exclude=("tests",)), 20 | install_requires=["fastapi", "asyncpg"], 21 | package_data={"fastapi": ["py.typed"]}, 22 | extras_require={ 23 | "dev": [ 24 | "black", 25 | "isort", 26 | "flake8", 27 | "tox", 28 | ], 29 | "docs": ["sphinx", "recommonmark"], 30 | "test": [ 31 | "pytest-docker-fixtures[pg]", 32 | "pytest", 33 | "pytest", 34 | "async_asgi_testclient", 35 | "pytest-asyncio", 36 | ], 37 | "publish": ["twine"], 38 | }, 39 | classifiers=[ 40 | "License :: OSI Approved :: MIT License", 41 | "Programming Language :: Python", 42 | "Programming Language :: Python :: 3", 43 | "Programming Language :: Python :: 3.6", 44 | "Programming Language :: Python :: 3.7", 45 | "Programming Language :: Python :: 3.8", 46 | "Programming Language :: Python :: 
3.9", 47 | ], 48 | ) 49 | -------------------------------------------------------------------------------- /tests/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jordic/fastapi_asyncpg/8ab05b1dbeec039e52c676d2235b5e2e79da560a/tests/__init__.py -------------------------------------------------------------------------------- /tests/conftest.py: -------------------------------------------------------------------------------- 1 | pytest_plugins = [ 2 | "pytest_docker_fixtures", 3 | ] 4 | -------------------------------------------------------------------------------- /tests/data.sql: -------------------------------------------------------------------------------- 1 | 2 | 3 | CREATE TABLE test ( 4 | id serial primary key, 5 | item varchar, 6 | val varchar 7 | ); 8 | -------------------------------------------------------------------------------- /tests/test_con_rollback.py: -------------------------------------------------------------------------------- 1 | from async_asgi_testclient import TestClient 2 | from fastapi import Depends 3 | from fastapi import FastAPI 4 | from fastapi_asyncpg import configure_asyncpg 5 | from fastapi_asyncpg import create_pool_test 6 | from fastapi_asyncpg import sql 7 | from pathlib import Path 8 | from pytest_docker_fixtures import images 9 | 10 | import json 11 | import pydantic as pd 12 | import pytest 13 | import typing 14 | 15 | pytestmark = pytest.mark.asyncio 16 | 17 | dir = Path(__file__).parent 18 | 19 | images.configure( 20 | "postgresql", "postgres", "11.1", env={"POSTGRES_DB": "test_db"} 21 | ) 22 | 23 | 24 | @pytest.fixture 25 | async def pool(pg): 26 | host, port = pg 27 | url = f"postgresql://postgres@{host}:{port}/test_db" 28 | 29 | async def initialize(conn): 30 | await sql.load_sqlfile(conn, dir / "data.sql") 31 | 32 | pool = await create_pool_test(url, initialize=initialize) 33 | yield pool 34 | if pool._conn.is_closed(): 35 | return 36 | await pool.release() 37 | 38 | 39 | async def test_testing_pool_works(pool): 40 | async with pool.acquire() as db: 41 | await sql.insert(db, "test", {"item": "test", "val": "value"}) 42 | assert await sql.count(db, "test") == 1 43 | 44 | 45 | async def test_the_db_is_empty_again(pool): 46 | async with pool.acquire() as db: 47 | assert await sql.count(db, "test") == 0 48 | 49 | 50 | async def test_sql(pool): 51 | """ sql.py contains poor man sql helpers to work with sql and asyncpg """ 52 | async with pool.acquire() as db: 53 | res = await sql.insert(db, "test", {"item": "test", "val": "value"}) 54 | result = await sql.get(db, "test", "id=$1", args=[res["id"]]) 55 | assert dict(res) == dict(result) 56 | elements = await sql.select(db, "test") 57 | assert len(elements) == 1 58 | assert dict(elements[0]) == dict(result) 59 | elements = await sql.select(db, "test", condition="id=$1", args=[1]) 60 | assert dict(elements[0]) == dict(result) 61 | updated = await sql.update(db, "test", {"id": 1}, {"val": "value2"}) 62 | assert dict(updated) != dict(result) 63 | assert updated["val"] == "value2" 64 | 65 | res = await db.fetchrow(sql.query_to_json("SELECT * from test", "row")) 66 | data = json.loads(res["row"]) 67 | assert data[0] == dict(updated) 68 | await sql.delete(db, "test", "id=$1", args=[1]) 69 | assert await sql.count(db, "test") == 0 70 | 71 | 72 | async def test_app_with_fixture(pool): 73 | """ 74 | Things are more interesting when you want to test some 75 | data, and you want to setup the db state 76 | """ 77 | async with 
pool.acquire() as db: 78 | # we setup the db at a desired state 79 | await sql.insert(db, "test", {"item": "test", "val": "value"}) 80 | 81 | app = FastAPI() 82 | bdd = configure_asyncpg(app, "", pool=pool) 83 | 84 | @app.get("/") 85 | async def get(conn=Depends(bdd.connection)): 86 | res = await conn.fetch("SELECT * from test") 87 | return [dict(r) for r in res] 88 | 89 | async with TestClient(app) as client: 90 | res = await client.get("/") 91 | assert res.status_code == 200 92 | data = res.json() 93 | assert data[0]["item"] == "test" 94 | assert data[0]["val"] == "value" 95 | 96 | 97 | # we can go a bit crazy with pydantic 98 | # and simulate an orm wiht it 99 | # this could be a mixin, that you add to your schemas 100 | # also you need some __property__ with your primary key 101 | # and check it 102 | class Schema(pd.BaseModel): 103 | __tablename__ = "test" 104 | 105 | item: str 106 | val: str 107 | id: typing.Optional[int] 108 | 109 | async def save(self, db): 110 | if self.id is None: 111 | result = await sql.insert( 112 | db, self.__tablename__, self.dict(exclude_unset=True) 113 | ) 114 | self.id = result["id"] 115 | else: 116 | result = await sql.update_by( 117 | db, 118 | self.__tablename__, 119 | {"id": self.id}, 120 | self.dict(exclude=["id"]), 121 | ) 122 | for key, val in dict(result): 123 | setattr(self, key, val) 124 | 125 | 126 | async def test_experimental(pool): 127 | item = Schema(item="asdf", val="xxxx") 128 | async with pool.acquire() as db: 129 | await item.save(db) 130 | assert await sql.count(db, "test") == 1 131 | -------------------------------------------------------------------------------- /tests/test_db.py: -------------------------------------------------------------------------------- 1 | from async_asgi_testclient import TestClient 2 | from fastapi import Depends 3 | from fastapi import FastAPI 4 | from fastapi.exceptions import HTTPException 5 | from fastapi_asyncpg import configure_asyncpg 6 | from fastapi_asyncpg import sql 7 | from pytest_docker_fixtures import images 8 | from typing import Optional 9 | 10 | import pydantic as pd 11 | import pytest 12 | import asyncio 13 | 14 | images.configure( 15 | "postgresql", "postgres", "11.1", env={"POSTGRES_DB": "test_db"} 16 | ) 17 | 18 | pytestmark = pytest.mark.asyncio 19 | 20 | 21 | class KeyVal(pd.BaseModel): 22 | key: str 23 | value: str 24 | 25 | 26 | SCHEMA = """ 27 | DROP TABLE IF EXISTS keyval; 28 | CREATE TABLE keyval ( 29 | key varchar, 30 | value varchar, 31 | UNIQUE(key) 32 | ); 33 | """ 34 | 35 | 36 | @pytest.fixture(scope="function") 37 | async def asgiapp(pg): 38 | host, port = pg 39 | url = f"postgresql://postgres@{host}:{port}/test_db" 40 | app = FastAPI() 41 | bdd = configure_asyncpg(app, url, min_size=1, max_size=2) 42 | 43 | @bdd.on_init 44 | async def on_init(conn): 45 | await conn.execute(SCHEMA) 46 | 47 | @app.post("/", response_model=KeyVal) 48 | async def add_resource(data: KeyVal, db=Depends(bdd.connection)): 49 | result = await db.fetchrow( 50 | """ 51 | INSERT into keyval values ($1, $2) returning * 52 | """, 53 | data.key, 54 | data.value, 55 | ) 56 | return dict(result) 57 | 58 | @app.get("/transaction") 59 | async def with_transaction( 60 | q: Optional[int] = 0, db=Depends(bdd.transaction) 61 | ): 62 | for i in range(10): 63 | await db.execute( 64 | "INSERT INTO keyval values ($1, $2)", f"t{i}", f"t{i}" 65 | ) 66 | if q == 1: 67 | raise HTTPException(412) 68 | return dict(result="ok") 69 | 70 | @app.get("/{key:str}", response_model=KeyVal) 71 | async def get_resouce(key: str, 
db=Depends(bdd.connection)): 72 | result = await db.fetchrow( 73 | """ 74 | SELECT * from keyval where key=$1 75 | """, 76 | key, 77 | ) 78 | if result: 79 | return dict(result) 80 | 81 | yield app, bdd 82 | 83 | 84 | async def test_dependency(asgiapp): 85 | app, db = asgiapp 86 | async with TestClient(app) as client: 87 | res = await client.post("/", json={"key": "test", "value": "val1"}) 88 | assert res.status_code == 200 89 | res = await client.get("/test") 90 | assert res.status_code == 200 91 | assert res.json()["key"] == "test" 92 | assert res.json()["value"] == "val1" 93 | 94 | 95 | async def test_transaction(asgiapp): 96 | app, _ = asgiapp 97 | async with TestClient(app) as client: 98 | res = await client.get("/transaction") 99 | assert res.status_code == 200 100 | async with app.state.pool.acquire() as db: 101 | await sql.count(db, "keyval") == 10 102 | 103 | 104 | async def test_transaction_fails(asgiapp): 105 | app, _ = asgiapp 106 | async with TestClient(app) as client: 107 | res = await client.get("/transaction?q=1") 108 | assert res.status_code == 412 109 | async with app.state.pool.acquire() as db: 110 | await sql.count(db, "keyval") == 0 111 | 112 | 113 | async def test_pool_releases_connections(asgiapp): 114 | app, db = asgiapp 115 | async with TestClient(app) as client: 116 | res = await client.post("/", json={"key": "test", "value": "val1"}) 117 | assert res.status_code == 200 118 | tasks = [] 119 | for i in range(20): 120 | tasks.append(client.get("/test")) 121 | 122 | await asyncio.gather(*tasks) 123 | async with app.state.pool.acquire() as db: 124 | result = await db.fetchval( 125 | "SELECT sum(numbackends) FROM pg_stat_database;" 126 | ) 127 | assert result == 2 128 | -------------------------------------------------------------------------------- /tox.ini: -------------------------------------------------------------------------------- 1 | [tox] 2 | envlist = py37,py38,py39 3 | 4 | [testenv] 5 | commands = py.test fastapi_resource 6 | deps = pytest 7 | --------------------------------------------------------------------------------