├── .github
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   └── pull_request_template.md
├── .gitignore
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── Dockerfile
├── Jenkinsfile
├── Licence
├── Readme.md
├── alembic.ini
├── alembic
│   ├── README
│   ├── env.py
│   ├── script.py.mako
│   └── versions
│       ├── 0a2609ed2b6b_second_migration.py
│       └── f380633efe61_first_migration.py
├── docker-compose.yml
├── envs
│   ├── .env
│   ├── .env.dev
│   └── .env.test
├── load_env.py
├── main.py
├── main_dev.py
├── populate.py
├── requirements.txt
├── src
│   ├── app.py
│   └── graphql
│       ├── core
│       │   └── config.py
│       ├── db
│       │   └── session.py
│       ├── fragments
│       │   ├── stickynotes_fragments.py
│       │   └── user_fragments.py
│       ├── helpers
│       │   └── helper.py
│       ├── models
│       │   ├── __init__.py
│       │   ├── stickynotes_model.py
│       │   └── user_model.py
│       ├── resolvers
│       │   ├── stickynote_resolver.py
│       │   └── user_resolver.py
│       ├── scalars
│       │   ├── stickynotes_scalar.py
│       │   └── user_scalar.py
│       └── schemas
│           ├── mutation_schema.py
│           └── query_schema.py
├── test.py
└── tests
    ├── conftest.py
    ├── graphql
    │   ├── db.py
    │   ├── mutations.py
    │   ├── queries.py
    │   └── test_user_stickynotes.py
    └── load_test_env.py
/.github/ISSUE_TEMPLATE/bug_report.md:
--------------------------------------------------------------------------------
1 | ---
2 | name: Bug report
3 | about: Create a report to help us improve
4 | title: ''
5 | labels: ''
6 | assignees: ''
7 |
8 | ---
9 |
10 | **Describe the bug**
11 | A clear and concise description of what the bug is.
12 |
13 | **To Reproduce**
14 | Steps to reproduce the behavior:
15 | 1. Go to '...'
16 | 2. Click on '....'
17 | 3. Scroll down to '....'
18 | 4. See error
19 |
20 | **Expected behavior**
21 | A clear and concise description of what you expected to happen.
22 |
23 | **Screenshots**
24 | If applicable, add screenshots to help explain your problem.
25 |
26 | **Desktop (please complete the following information):**
27 | - OS: [e.g. iOS]
28 | - Browser [e.g. chrome, safari]
29 | - Version [e.g. 22]
30 |
31 | **Smartphone (please complete the following information):**
32 | - Device: [e.g. iPhone6]
33 | - OS: [e.g. iOS8.1]
34 | - Browser [e.g. stock browser, safari]
35 | - Version [e.g. 22]
36 |
37 | **Additional context**
38 | Add any other context about the problem here.
39 |
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/feature_request.md:
--------------------------------------------------------------------------------
1 | ---
2 | name: Feature request
3 | about: Suggest an idea for this project
4 | title: ''
5 | labels: ''
6 | assignees: ''
7 |
8 | ---
9 |
10 | **Is your feature request related to a problem? Please describe.**
11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
12 |
13 | **Describe the solution you'd like**
14 | A clear and concise description of what you want to happen.
15 |
16 | **Describe alternatives you've considered**
17 | A clear and concise description of any alternative solutions or features you've considered.
18 |
19 | **Additional context**
20 | Add any other context or screenshots about the feature request here.
21 |
--------------------------------------------------------------------------------
/.github/pull_request_template.md:
--------------------------------------------------------------------------------
1 | # Pull Request Template
2 |
3 | ## Description
4 |
5 | Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
6 |
7 | Fixes # (issue)
8 |
9 | ## Type of change
10 |
11 | Please delete options that are not relevant.
12 |
13 | - [ ] Bug fix (non-breaking change which fixes an issue)
14 | - [ ] New feature (non-breaking change which adds functionality)
15 | - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
16 | - [ ] This change requires a documentation update
17 |
18 | ## How Has This Been Tested?
19 |
20 | Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration
21 |
22 | - [ ] Test A
23 | - [ ] Test B
24 |
25 | **Test Configuration**:
26 | * Firmware version:
27 | * Hardware:
28 | * Toolchain:
29 | * SDK:
30 |
31 | ## Checklist:
32 |
33 | - [ ] My code follows the style guidelines of this project
34 | - [ ] I have performed a self-review of my own code
35 | - [ ] I have commented my code, particularly in hard-to-understand areas
36 | - [ ] I have made corresponding changes to the documentation
37 | - [ ] My changes generate no new warnings
38 | - [ ] I have added tests that prove my fix is effective or that my feature works
39 | - [ ] New and existing unit tests pass locally with my changes
40 | - [ ] Any dependent changes have been merged and published in downstream modules
41 | - [ ] I have checked my code and corrected any misspellings
42 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | env/
2 | __pycache__/
3 | *.db
4 | .coverage
5 |
--------------------------------------------------------------------------------
/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
1 | # Contributor Covenant Code of Conduct
2 |
3 | ## Our Pledge
4 |
5 | We as members, contributors, and leaders pledge to make participation in our
6 | community a harassment-free experience for everyone, regardless of age, body
7 | size, visible or invisible disability, ethnicity, sex characteristics, gender
8 | identity and expression, level of experience, education, socio-economic status,
9 | nationality, personal appearance, race, religion, or sexual identity
10 | and orientation.
11 |
12 | We pledge to act and interact in ways that contribute to an open, welcoming,
13 | diverse, inclusive, and healthy community.
14 |
15 | ## Our Standards
16 |
17 | Examples of behavior that contributes to a positive environment for our
18 | community include:
19 |
20 | * Demonstrating empathy and kindness toward other people
21 | * Being respectful of differing opinions, viewpoints, and experiences
22 | * Giving and gracefully accepting constructive feedback
23 | * Accepting responsibility and apologizing to those affected by our mistakes,
24 | and learning from the experience
25 | * Focusing on what is best not just for us as individuals, but for the
26 | overall community
27 |
28 | Examples of unacceptable behavior include:
29 |
30 | * The use of sexualized language or imagery, and sexual attention or
31 | advances of any kind
32 | * Trolling, insulting or derogatory comments, and personal or political attacks
33 | * Public or private harassment
34 | * Publishing others' private information, such as a physical or email
35 | address, without their explicit permission
36 | * Other conduct which could reasonably be considered inappropriate in a
37 | professional setting
38 |
39 | ## Enforcement Responsibilities
40 |
41 | Community leaders are responsible for clarifying and enforcing our standards of
42 | acceptable behavior and will take appropriate and fair corrective action in
43 | response to any behavior that they deem inappropriate, threatening, offensive,
44 | or harmful.
45 |
46 | Community leaders have the right and responsibility to remove, edit, or reject
47 | comments, commits, code, wiki edits, issues, and other contributions that are
48 | not aligned to this Code of Conduct, and will communicate reasons for moderation
49 | decisions when appropriate.
50 |
51 | ## Scope
52 |
53 | This Code of Conduct applies within all community spaces, and also applies when
54 | an individual is officially representing the community in public spaces.
55 | Examples of representing our community include using an official e-mail address,
56 | posting via an official social media account, or acting as an appointed
57 | representative at an online or offline event.
58 |
59 | ## Enforcement
60 |
61 | Instances of abusive, harassing, or otherwise unacceptable behavior may be
62 | reported to the community leaders responsible for enforcement at
63 | syedfaisalsaleem.100@gmail.com.
64 | All complaints will be reviewed and investigated promptly and fairly.
65 |
66 | All community leaders are obligated to respect the privacy and security of the
67 | reporter of any incident.
68 |
69 | ## Enforcement Guidelines
70 |
71 | Community leaders will follow these Community Impact Guidelines in determining
72 | the consequences for any action they deem in violation of this Code of Conduct:
73 |
74 | ### 1. Correction
75 |
76 | **Community Impact**: Use of inappropriate language or other behavior deemed
77 | unprofessional or unwelcome in the community.
78 |
79 | **Consequence**: A private, written warning from community leaders, providing
80 | clarity around the nature of the violation and an explanation of why the
81 | behavior was inappropriate. A public apology may be requested.
82 |
83 | ### 2. Warning
84 |
85 | **Community Impact**: A violation through a single incident or series
86 | of actions.
87 |
88 | **Consequence**: A warning with consequences for continued behavior. No
89 | interaction with the people involved, including unsolicited interaction with
90 | those enforcing the Code of Conduct, for a specified period of time. This
91 | includes avoiding interactions in community spaces as well as external channels
92 | like social media. Violating these terms may lead to a temporary or
93 | permanent ban.
94 |
95 | ### 3. Temporary Ban
96 |
97 | **Community Impact**: A serious violation of community standards, including
98 | sustained inappropriate behavior.
99 |
100 | **Consequence**: A temporary ban from any sort of interaction or public
101 | communication with the community for a specified period of time. No public or
102 | private interaction with the people involved, including unsolicited interaction
103 | with those enforcing the Code of Conduct, is allowed during this period.
104 | Violating these terms may lead to a permanent ban.
105 |
106 | ### 4. Permanent Ban
107 |
108 | **Community Impact**: Demonstrating a pattern of violation of community
109 | standards, including sustained inappropriate behavior, harassment of an
110 | individual, or aggression toward or disparagement of classes of individuals.
111 |
112 | **Consequence**: A permanent ban from any sort of public interaction within
113 | the community.
114 |
115 | ## Attribution
116 |
117 | This Code of Conduct is adapted from the [Contributor Covenant][homepage],
118 | version 2.0, available at
119 | https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
120 |
121 | Community Impact Guidelines were inspired by [Mozilla's code of conduct
122 | enforcement ladder](https://github.com/mozilla/diversity).
123 |
124 | [homepage]: https://www.contributor-covenant.org
125 |
126 | For answers to common questions about this code of conduct, see the FAQ at
127 | https://www.contributor-covenant.org/faq. Translations are available at
128 | https://www.contributor-covenant.org/translations.
129 |
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing
2 |
3 | When contributing to this repository, please first discuss the change you wish to make via issue,
4 | email, or any other method with the owners of this repository before making a change.
5 |
6 | Please note we have a code of conduct, please follow it in all your interactions with the project.
7 |
8 | ## Pull Request Process
9 |
10 | 1. Ensure any install or build dependencies are removed before the end of the layer when doing a
11 | build.
12 | 2. Update the README.md with details of changes to the interface, this includes new environment
13 | variables, exposed ports, useful file locations and container parameters.
14 | 3. Increase the version numbers in any examples files and the README.md to the new version that this
15 | Pull Request would represent. The versioning scheme we use is [SemVer](http://semver.org/).
16 | 4. You may merge the Pull Request in once you have the sign-off of two other developers, or if you
17 | do not have permission to do that, you may request the second reviewer to merge it for you.
18 |
19 | ## Code of Conduct
20 |
21 | ### Our Pledge
22 |
23 | In the interest of fostering an open and welcoming environment, we as
24 | contributors and maintainers pledge to making participation in our project and
25 | our community a harassment-free experience for everyone, regardless of age, body
26 | size, disability, ethnicity, gender identity and expression, level of experience,
27 | nationality, personal appearance, race, religion, or sexual identity and
28 | orientation.
29 |
30 | ### Our Standards
31 |
32 | Examples of behavior that contributes to creating a positive environment
33 | include:
34 |
35 | * Using welcoming and inclusive language
36 | * Being respectful of differing viewpoints and experiences
37 | * Gracefully accepting constructive criticism
38 | * Focusing on what is best for the community
39 | * Showing empathy towards other community members
40 |
41 | Examples of unacceptable behavior by participants include:
42 |
43 | * The use of sexualized language or imagery and unwelcome sexual attention or
44 | advances
45 | * Trolling, insulting/derogatory comments, and personal or political attacks
46 | * Public or private harassment
47 | * Publishing others' private information, such as a physical or electronic
48 | address, without explicit permission
49 | * Other conduct which could reasonably be considered inappropriate in a
50 | professional setting
51 |
52 | ### Our Responsibilities
53 |
54 | Project maintainers are responsible for clarifying the standards of acceptable
55 | behavior and are expected to take appropriate and fair corrective action in
56 | response to any instances of unacceptable behavior.
57 |
58 | Project maintainers have the right and responsibility to remove, edit, or
59 | reject comments, commits, code, wiki edits, issues, and other contributions
60 | that are not aligned to this Code of Conduct, or to ban temporarily or
61 | permanently any contributor for other behaviors that they deem inappropriate,
62 | threatening, offensive, or harmful.
63 |
64 | ### Scope
65 |
66 | This Code of Conduct applies both within project spaces and in public spaces
67 | when an individual is representing the project or its community. Examples of
68 | representing a project or community include using an official project e-mail
69 | address, posting via an official social media account, or acting as an appointed
70 | representative at an online or offline event. Representation of a project may be
71 | further defined and clarified by project maintainers.
72 |
73 | ### Enforcement
74 |
75 | Instances of abusive, harassing, or otherwise unacceptable behavior may be
76 | reported by contacting the project team at `syedfaisalsaleem.100@gmail.com`. All
77 | complaints will be reviewed and investigated and will result in a response that
78 | is deemed necessary and appropriate to the circumstances. The project team is
79 | obligated to maintain confidentiality with regard to the reporter of an incident.
80 | Further details of specific enforcement policies may be posted separately.
81 |
82 | Project maintainers who do not follow or enforce the Code of Conduct in good
83 | faith may face temporary or permanent repercussions as determined by other
84 | members of the project's leadership.
85 |
86 | ### Attribution
87 |
88 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
89 | available at [http://contributor-covenant.org/version/1/4][version]
90 |
91 | [homepage]: http://contributor-covenant.org
92 | [version]: http://contributor-covenant.org/version/1/4/
93 |
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM python:3.8-slim-buster
2 | # Add a work directory
3 | WORKDIR /GraphQL_FastAPI_server
4 | # Cache and Install dependencies
5 | COPY requirements.txt requirements.txt
6 | RUN apt update
7 | RUN apt-get install -y build-essential
8 | RUN apt-get install -y python3-psycopg2
9 | # RUN apt-get install -y python3-pip
10 | RUN apt install -y postgresql postgresql-contrib
11 | RUN apt install -y python3-dev libpq-dev
12 | RUN apt install -y git
13 | RUN pip3 install -r requirements.txt
14 | COPY . .
--------------------------------------------------------------------------------
/Jenkinsfile:
--------------------------------------------------------------------------------
1 | pipeline {
2 | agent {label 'linux'}
3 | stages {
4 | stage('verify tooling') {
5 | steps {
6 | sh '''
7 | docker info
8 | docker version
9 | docker-compose version
10 | curl --version
11 | '''
12 | }
13 | }
14 | stage('Start container'){
15 | steps {
16 | sh 'docker-compose down'
17 | sh 'docker-compose up -d --build'
18 | sh 'docker ps'
19 | }
20 | }
21 | stage('Run tests') {
22 | steps {
23 | sh 'curl http://localhost:80/'
24 | }
25 | }
26 | }
27 |
28 | }
--------------------------------------------------------------------------------
/Licence:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2022 Syed Faisal Saleem
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/Readme.md:
--------------------------------------------------------------------------------
1 | # Fast Api Strawberry GraphQL Async SQL Alchemy Boiler Plate
2 |
3 |
4 | ## Description
5 |
6 | This code is a boilerplate for implementing GraphQL with FastAPI, using the Strawberry library.
7 | Strawberry is used as the GraphQL server.
8 |
9 | ## Features
10 |
11 | - Production-ready Python web server using Uvicorn and Gunicorn.
12 |
13 | - Python **FastAPI** backend.
14 |
15 | - Async SQLAlchemy connection to a PostgreSQL database.
16 |
17 | - GraphQL CRUD operations using the Strawberry library.
18 |
19 | - Async unit tests written with pytest covering the GraphQL queries and mutations.
20 |
21 | - Boilerplate directory structure for a GraphQL Python project.
22 |
23 | - SQLAlchemy fetches only the columns that the GraphQL query specifies (see the sketch after this list).
24 |
25 | - Deployment as a Docker container via a Docker Compose file.
26 |
27 | - GraphQL UI served at a dedicated endpoint for testing queries by hand.
28 |
29 | - Alembic migrations.
30 |
31 | - Jenkins (continuous integration).
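
As a rough illustration of the column-limited loading mentioned above, here is a minimal sketch (not the repository's actual resolver code, which lives in `src/graphql/resolvers/`); it assumes the `User` model from `src.graphql.models` and an open async `session`, and uses Strawberry's `info.selected_fields` together with SQLAlchemy's `load_only`:

```python
from sqlalchemy import select
from sqlalchemy.orm import load_only

from src.graphql.models import User

async def get_users_sketch(info, session):
    # Fields the client actually asked for; in practice the repo's
    # convert_camel_case helper (src/graphql/helpers/helper.py) would map
    # e.g. "createdDatetime" to "created_datetime" before this lookup.
    requested = [field.name for field in info.selected_fields[0].selections]
    columns = [getattr(User, name) for name in requested if hasattr(User, name)]
    # load_only restricts the generated SELECT to just those columns.
    query = select(User).options(load_only(*columns))
    result = await session.execute(query)
    return result.scalars().all()
```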
32 | ## Installation
33 |
34 | To run the project in your local environment:
35 |
36 | 1. Clone the repository:
37 | ```
38 | $ git clone https://github.com/syedfaisalsaleeem/FastApi-Strawberry-GraphQL-SqlAlchemy-BoilerPlate.git
39 | $ cd FastApi-Strawberry-GraphQL-SqlAlchemy-BoilerPlate
40 | ```
41 | 2. Create and activate a virtual environment:
42 | ```
43 | $ virtualenv env -p python3
44 | $ source env/bin/activate
45 | ```
46 | 3. Install requirements:
47 | ```
48 | $ pip install -r requirements.txt
49 | ```
50 | 4. Run the application:
51 | ```
52 | $ python main_dev.py
53 | ```
54 | To run the project using a Docker container:
55 |
56 | 1. Clone the repository:
57 | ```
58 | $ git clone https://github.com/syedfaisalsaleeem/FastApi-Strawberry-GraphQL-SqlAlchemy-BoilerPlate.git
59 | $ cd FastApi-Strawberry-GraphQL-SqlAlchemy-BoilerPlate
60 | ```
61 | 2. Run this command in a terminal:
62 | ```
63 | $ docker-compose up -d --build
64 | ```
65 | ## Usage Examples
66 |
67 | Launch the FastAPI server on the configured port (default 5000) and open the UI at http://localhost:5000/graphql:
68 |
69 | $ python main_dev.py
70 |
71 | Launch using Docker:
72 |
73 | $ docker-compose up -d --build
74 |
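Once the server is running, you can exercise the API either from the GraphiQL UI at http://localhost:5000/graphql or over plain HTTP. For example, assuming the default port, a quick smoke test with curl (the query mirrors the ones in `tests/graphql/queries.py`):

```
$ curl -X POST http://localhost:5000/graphql \
    -H "Content-Type: application/json" \
    -d '{"query": "query { users { id name } }"}'
```
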
75 | ## Tests
76 |
77 | Tests are run with *pytest*. If you are not familiar with this package, see the [pytest documentation](https://docs.pytest.org/).
78 |
79 | To run the tests, from the project directory, simply:
80 |
81 | ```
82 | $ pip install -r requirements.txt
83 | $ python test.py
84 | ```
85 |
86 | You should see output similar to:
87 | ```
88 | ----------- coverage: platform win32, python 3.8.8-final-0 -----------
89 | Name Stmts Miss Cover
90 | -------------------------------------------------------
91 | tests\conftest.py 18 4 78%
92 | tests\graphql\mutations.py 3 0 100%
93 | tests\graphql\queries.py 2 0 100%
94 | tests\graphql\test_stickynotes.py 0 0 100%
95 | tests\graphql\test_user.py 43 0 100%
96 | tests\load_test_env.py 4 4 0%
97 | -------------------------------------------------------
98 | TOTAL 70 8 89%
99 |
100 |
101 | =================== 8 passed in 0.59s =================
102 | ```
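If you prefer to invoke pytest directly instead of going through `test.py` (note that `test.py` also recreates the database tables before running), the equivalent command is:

```
$ pytest --cov=tests tests/graphql/test_user_stickynotes.py -s --disable-warnings
```
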
103 | ## Migrations
104 |
105 | To generate and apply migrations in your local environment:
106 |
107 | ```
108 | $ alembic revision --autogenerate -m "migration string"
109 | $ alembic upgrade head
110 | ```
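To roll back the most recent revision, the standard Alembic command applies:

```
$ alembic downgrade -1
```
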
111 | ## License
112 |
113 | This project is licensed under the terms of the MIT license. If you have any questions about this boilerplate, do not hesitate to contact me [@SyedFaisal](https://www.linkedin.com/in/syedfaisalsaleem/) on LinkedIn or open an issue on GitHub.
114 |
--------------------------------------------------------------------------------
/alembic.ini:
--------------------------------------------------------------------------------
1 | # A generic, single database configuration.
2 |
3 | [alembic]
4 | # path to migration scripts
5 | script_location = alembic
6 |
7 | # template used to generate migration files
8 | # file_template = %%(rev)s_%%(slug)s
9 |
10 | # sys.path path, will be prepended to sys.path if present.
11 | # defaults to the current working directory.
12 | prepend_sys_path = .
13 |
14 | # timezone to use when rendering the date within the migration file
15 | # as well as the filename.
16 | # If specified, requires the python-dateutil library that can be
17 | # installed by adding `alembic[tz]` to the pip requirements
18 | # string value is passed to dateutil.tz.gettz()
19 | # leave blank for localtime
20 | # timezone =
21 |
22 | # max length of characters to apply to the
23 | # "slug" field
24 | # truncate_slug_length = 40
25 |
26 | # set to 'true' to run the environment during
27 | # the 'revision' command, regardless of autogenerate
28 | # revision_environment = false
29 |
30 | # set to 'true' to allow .pyc and .pyo files without
31 | # a source .py file to be detected as revisions in the
32 | # versions/ directory
33 | # sourceless = false
34 |
35 | # version location specification; This defaults
36 | # to alembic/versions. When using multiple version
37 | # directories, initial revisions must be specified with --version-path.
38 | # The path separator used here should be the separator specified by "version_path_separator" below.
39 | # version_locations = %(here)s/bar:%(here)s/bat:alembic/versions
40 |
41 | # version path separator; As mentioned above, this is the character used to split
42 | # version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
43 | # If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
44 | # Valid values for version_path_separator are:
45 | #
46 | # version_path_separator = :
47 | # version_path_separator = ;
48 | # version_path_separator = space
49 | version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
50 |
51 | # the output encoding used when revision files
52 | # are written from script.py.mako
53 | # output_encoding = utf-8
54 |
55 | ; sqlalchemy.url = postgresql://postgres:faisal@localhost:5432/graphql_db
56 |
57 |
58 | [post_write_hooks]
59 | # post_write_hooks defines scripts or Python functions that are run
60 | # on newly generated revision scripts. See the documentation for further
61 | # detail and examples
62 |
63 | # format using "black" - use the console_scripts runner, against the "black" entrypoint
64 | # hooks = black
65 | # black.type = console_scripts
66 | # black.entrypoint = black
67 | # black.options = -l 79 REVISION_SCRIPT_FILENAME
68 |
69 | # Logging configuration
70 | [loggers]
71 | keys = root,sqlalchemy,alembic
72 |
73 | [handlers]
74 | keys = console
75 |
76 | [formatters]
77 | keys = generic
78 |
79 | [logger_root]
80 | level = WARN
81 | handlers = console
82 | qualname =
83 |
84 | [logger_sqlalchemy]
85 | level = WARN
86 | handlers =
87 | qualname = sqlalchemy.engine
88 |
89 | [logger_alembic]
90 | level = INFO
91 | handlers =
92 | qualname = alembic
93 |
94 | [handler_console]
95 | class = StreamHandler
96 | args = (sys.stderr,)
97 | level = NOTSET
98 | formatter = generic
99 |
100 | [formatter_generic]
101 | format = %(levelname)-5.5s [%(name)s] %(message)s
102 | datefmt = %H:%M:%S
103 |
--------------------------------------------------------------------------------
/alembic/README:
--------------------------------------------------------------------------------
1 | Generic single-database configuration.
--------------------------------------------------------------------------------
/alembic/env.py:
--------------------------------------------------------------------------------
1 | from __future__ import with_statement
2 | from logging.config import fileConfig
3 | import load_env
4 | from sqlalchemy import engine_from_config
5 | from sqlalchemy import pool
6 |
7 | from alembic import context
8 |
9 | # this is the Alembic Config object, which provides
10 | # access to the values within the .ini file in use.
11 | config = context.config
12 |
13 | # Interpret the config file for Python logging.
14 | # This line sets up loggers basically.
15 | # if config.config_file_name is not None:
16 | fileConfig(config.config_file_name)
17 |
18 |
19 | # add your model's MetaData object here
20 | # for 'autogenerate' support
21 | # from myapp import mymodel
22 | # target_metadata = mymodel.Base.metadata
23 | target_metadata = None
24 |
25 | # other values from the config, defined by the needs of env.py,
26 | # can be acquired:
27 | # my_important_option = config.get_main_option("my_important_option")
28 | # ... etc.
29 | from src.graphql.models import Base
30 | from src.graphql.models import StickyNotes, User
31 | from src.graphql.core.config import settings
32 | target_metadata = [Base.metadata]
33 |
34 | database_url = f"postgresql://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_SERVER}:{settings.POSTGRES_PORT}/{settings.POSTGRES_DB}"
35 | config.set_main_option(
36 | "sqlalchemy.url", database_url)
37 | # other values from the config, defined by the needs of env.py,
38 | # can be acquired:
39 | # my_important_option = config.get_main_option("my_important_option")
40 | # ... etc.
41 |
42 | def get_url():
43 | """
44 | This function is used to get the database url from the environment variable
45 | """
46 | return database_url
47 |
48 | def run_migrations_offline():
49 | """Run migrations in 'offline' mode.
50 |
51 | This configures the context with just a URL
52 | and not an Engine, though an Engine is acceptable
53 | here as well. By skipping the Engine creation
54 | we don't even need a DBAPI to be available.
55 |
56 | Calls to context.execute() here emit the given string to the
57 | script output.
58 |
59 | """
60 | url = get_url()
61 | context.configure(
62 | url=url,
63 | target_metadata=target_metadata,
64 | literal_binds=True,
65 | dialect_opts={"paramstyle": "named"},
66 | )
67 |
68 | with context.begin_transaction():
69 | context.run_migrations()
70 |
71 |
72 | def run_migrations_online():
73 | """Run migrations in 'online' mode.
74 |
75 | In this scenario we need to create an Engine
76 | and associate a connection with the context.
77 |
78 | """
79 | # pass
80 | connectable = engine_from_config(
81 | config.get_section(config.config_ini_section),
82 | prefix="sqlalchemy.",
83 | poolclass=pool.NullPool,
84 | )
85 |
86 | with connectable.connect() as connection:
87 | context.configure(
88 | connection=connection, target_metadata=target_metadata
89 | )
90 |
91 | with context.begin_transaction():
92 | context.run_migrations()
93 |
94 |
95 |
96 | if context.is_offline_mode():
97 | run_migrations_offline()
98 | else:
99 | run_migrations_online()
100 |
--------------------------------------------------------------------------------
/alembic/script.py.mako:
--------------------------------------------------------------------------------
1 | """${message}
2 |
3 | Revision ID: ${up_revision}
4 | Revises: ${down_revision | comma,n}
5 | Create Date: ${create_date}
6 |
7 | """
8 | from alembic import op
9 | import sqlalchemy as sa
10 | ${imports if imports else ""}
11 |
12 | # revision identifiers, used by Alembic.
13 | revision = ${repr(up_revision)}
14 | down_revision = ${repr(down_revision)}
15 | branch_labels = ${repr(branch_labels)}
16 | depends_on = ${repr(depends_on)}
17 |
18 |
19 | def upgrade():
20 | ${upgrades if upgrades else "pass"}
21 |
22 |
23 | def downgrade():
24 | ${downgrades if downgrades else "pass"}
25 |
--------------------------------------------------------------------------------
/alembic/versions/0a2609ed2b6b_second_migration.py:
--------------------------------------------------------------------------------
1 | """second migration
2 |
3 | Revision ID: 0a2609ed2b6b
4 | Revises: f380633efe61
5 | Create Date: 2022-07-12 00:58:13.171644
6 |
7 | """
8 | from alembic import op
9 | import sqlalchemy as sa
10 |
11 |
12 | # revision identifiers, used by Alembic.
13 | revision = '0a2609ed2b6b'
14 | down_revision = 'f380633efe61'
15 | branch_labels = None
16 | depends_on = None
17 |
18 |
19 | def upgrade():
20 | # ### commands auto generated by Alembic - please adjust! ###
21 | pass
22 | # ### end Alembic commands ###
23 |
24 |
25 | def downgrade():
26 | # ### commands auto generated by Alembic - please adjust! ###
27 | pass
28 | # ### end Alembic commands ###
29 |
--------------------------------------------------------------------------------
/alembic/versions/f380633efe61_first_migration.py:
--------------------------------------------------------------------------------
1 | """first migration
2 |
3 | Revision ID: f380633efe61
4 | Revises:
5 | Create Date: 2022-07-12 00:25:10.920066
6 |
7 | """
8 | from alembic import op
9 | import sqlalchemy as sa
10 |
11 |
12 | # revision identifiers, used by Alembic.
13 | revision = 'f380633efe61'
14 | down_revision = None
15 | branch_labels = None
16 | depends_on = None
17 |
18 |
19 | def upgrade():
20 | pass
21 |
22 |
23 | def downgrade():
24 | pass
25 |
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: "3.8"
2 |
3 | services:
4 | db_graphql:
5 | image: postgres:13-alpine
6 | env_file:
7 | - envs/.env
8 | volumes:
9 | - postgres_data:/var/lib/postgresql/data/
10 | ports:
11 | - 5432:5432
12 |
13 | app:
14 | build:
15 | context: .
16 | ports:
17 | - 5000:5000
18 | env_file:
19 | - envs/.env
20 | volumes:
21 | - .:/app:rw
22 | command:
23 | sh -c "python3 main.py"
24 | depends_on:
25 | - db_graphql
26 |
27 | volumes:
28 | postgres_data:
--------------------------------------------------------------------------------
/envs/.env:
--------------------------------------------------------------------------------
1 | HOST_URL=0.0.0.0
2 | HOST_PORT=5000
3 | POSTGRES_USER=postgres
4 | POSTGRES_PASSWORD=faisal
5 | POSTGRES_SERVER=db_graphql
6 | POSTGRES_PORT=5432
7 | POSTGRES_DB=graphql_db
--------------------------------------------------------------------------------
/envs/.env.dev:
--------------------------------------------------------------------------------
1 | HOST_URL=localhost
2 | HOST_PORT=5000
3 | POSTGRES_USER=postgres
4 | POSTGRES_PASSWORD=faisal
5 | POSTGRES_SERVER=localhost
6 | POSTGRES_PORT=5432
7 | POSTGRES_DB=graphql_db
--------------------------------------------------------------------------------
/envs/.env.test:
--------------------------------------------------------------------------------
1 | HOST_URL=localhost
2 | HOST_PORT=5000
3 | POSTGRES_USER=postgres
4 | POSTGRES_PASSWORD=faisal
5 | POSTGRES_SERVER=localhost
6 | POSTGRES_PORT=5432
7 | POSTGRES_DB=graphql_test_db
--------------------------------------------------------------------------------
/load_env.py:
--------------------------------------------------------------------------------
1 | from pathlib import Path
2 | from dotenv import load_dotenv
3 |
4 | env_path = Path(".") / "envs/.env.dev"
5 | load_dotenv(dotenv_path=env_path)
6 |
--------------------------------------------------------------------------------
/main.py:
--------------------------------------------------------------------------------
1 | from src.graphql.core.config import settings
2 | import asyncio
3 | import uvicorn
4 | from populate import create_tables
5 | from src.graphql.db.session import engine
6 | from src.app import create_app
7 |
8 | application = create_app()
9 |
10 | if __name__ == "__main__":
11 | print("Populating database...")
12 | asyncio.run(create_tables(engine))
13 | print("Database populated.")
14 |
15 | print("Starting server...")
16 | uvicorn.run("main:application", host=settings.HOST_URL, port=settings.HOST_PORT, reload=True)
--------------------------------------------------------------------------------
/main_dev.py:
--------------------------------------------------------------------------------
1 | import load_env
2 | from src.graphql.core.config import settings
3 | import asyncio
4 | import uvicorn
5 | from populate import create_tables
6 | from src.graphql.db.session import engine
7 | from src.app import create_app
8 |
9 | application = create_app()
10 |
11 | if __name__ == "__main__":
12 | print("Populating database...")
13 | asyncio.run(create_tables(engine))
14 | print("Database populated.")
15 |
16 | print("Starting server...")
17 | uvicorn.run("main:application", host=settings.HOST_URL, port=settings.HOST_PORT, reload=True)
--------------------------------------------------------------------------------
/populate.py:
--------------------------------------------------------------------------------
1 | from src.graphql.models import Base
2 |
3 | async def create_tables(engine):
4 | async with engine.begin() as conn:
5 | await conn.run_sync(Base.metadata.drop_all)
6 | await conn.run_sync(Base.metadata.create_all)
7 | await engine.dispose()
8 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/syedfaisalsaleeem/FastApi-Strawberry-GraphQL-SqlAlchemy-BoilerPlate/b724a6f14607e4a5858ff70174d0e993725d27cd/requirements.txt
--------------------------------------------------------------------------------
/src/app.py:
--------------------------------------------------------------------------------
1 | from fastapi import FastAPI
2 | import strawberry
3 | from strawberry.fastapi import GraphQLRouter
4 | from strawberry.schema.config import StrawberryConfig
5 | from src.graphql.schemas.mutation_schema import Mutation
6 | from src.graphql.schemas.query_schema import Query
7 |
8 | schema = strawberry.Schema(query=Query,mutation=Mutation,config=StrawberryConfig(auto_camel_case=True))
9 |
10 | def create_app():
11 |
12 | app = FastAPI()
13 | graphql_app = GraphQLRouter(schema)
14 | app.include_router(graphql_app, prefix="/graphql")
15 |
16 | return app
--------------------------------------------------------------------------------
/src/graphql/core/config.py:
--------------------------------------------------------------------------------
1 | import os
2 | from pydantic import BaseSettings
3 |
4 | class Settings(BaseSettings):
5 | PROJECT_TITLE: str = "Fast Api GraphQL Strawberry"
6 | PROJECT_VERSION: str = "0.0.1"
7 | HOST_HTTP: str = os.environ.get("HOST_HTTP","http://")
8 | HOST_URL: str = os.environ.get("HOST_URL")
9 | HOST_PORT: int = int(os.environ.get("HOST_PORT"))
10 | BASE_URL: str = HOST_HTTP+HOST_URL+":"+str(HOST_PORT)
11 | POSTGRES_USER: str = os.environ.get("POSTGRES_USER",)
12 | POSTGRES_PASSWORD: str = os.environ.get("POSTGRES_PASSWORD")
13 | POSTGRES_SERVER: str = os.environ.get("POSTGRES_SERVER")
14 | POSTGRES_PORT: int = int(os.environ.get("POSTGRES_PORT", 5432))
15 | POSTGRES_DB: str = os.environ.get("POSTGRES_DB")
16 | DATABASE_URL = f"postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}@{POSTGRES_SERVER}:{POSTGRES_PORT}/{POSTGRES_DB}"
17 |
18 | settings = Settings()
--------------------------------------------------------------------------------
/src/graphql/db/session.py:
--------------------------------------------------------------------------------
1 | from contextlib import asynccontextmanager
2 | from typing import AsyncGenerator
3 | from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
4 | from sqlalchemy.orm import sessionmaker
5 |
6 | from src.graphql.core.config import settings
7 |
8 | engine = create_async_engine(
9 | f"postgresql+asyncpg://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_SERVER}:{settings.POSTGRES_PORT}/{settings.POSTGRES_DB}"
10 | )
11 | async_session = sessionmaker(
12 | bind=engine,
13 | class_=AsyncSession,
14 | expire_on_commit=False,
15 | autocommit=False,
16 | autoflush=False,
17 | )
18 |
19 | @asynccontextmanager
20 | async def get_session() -> AsyncGenerator[AsyncSession, None]:
21 | async with async_session() as session:
22 | async with session.begin():
23 | try:
24 | yield session
25 | finally:
26 | await session.close()
--------------------------------------------------------------------------------
/src/graphql/fragments/stickynotes_fragments.py:
--------------------------------------------------------------------------------
1 | import strawberry
2 |
3 | from src.graphql.scalars.stickynotes_scalar import StickyNotes, StickyNotesDeleted, StickyNotesNotFound
4 | from src.graphql.scalars.user_scalar import UserNameMissing, UserNotFound
5 |
6 |
7 | AddStickyNotesResponse = strawberry.union("AddStickyNotesResponse", (StickyNotes, UserNotFound, UserNameMissing))
8 | UpdateStickyNotesResponse = strawberry.union("UpdateStickyNotesResponse", (StickyNotes, StickyNotesNotFound))
9 | DeleteStickyNotesResponse = strawberry.union("DeleteStickyNotesResponse", (StickyNotesDeleted, StickyNotesNotFound))
--------------------------------------------------------------------------------
/src/graphql/fragments/user_fragments.py:
--------------------------------------------------------------------------------
1 | import strawberry
2 |
3 | from src.graphql.scalars.user_scalar import AddUser, UserDeleted, UserExists, UserIdMissing, UserNotFound
4 |
5 |
6 | AddUserResponse = strawberry.union("AddUserResponse", (AddUser, UserExists))
7 | DeleteUserResponse = strawberry.union("DeleteUserResponse", (UserDeleted,UserNotFound, UserIdMissing))
--------------------------------------------------------------------------------
/src/graphql/helpers/helper.py:
--------------------------------------------------------------------------------
1 | from sqlalchemy.inspection import inspect
2 | import re
3 |
4 | def convert_camel_case(name):
5 | pattern = re.compile(r'(?
--------------------------------------------------------------------------------
/src/graphql/schemas/mutation_schema.py:
--------------------------------------------------------------------------------
13 | async def add_stickynotes(self, text: str, user_id: int) -> AddStickyNotesResponse:
14 | """ Add sticky note """
15 | add_stickynotes_resp = await add_stickynotes(text, user_id)
16 | return add_stickynotes_resp
17 |
18 | @strawberry.mutation
19 | async def add_user(self, name: str) -> AddUserResponse:
20 | """ Add user """
21 | add_user_resp = await add_user(name)
22 | return add_user_resp
23 |
24 | @strawberry.mutation
25 | async def delete_user(self, user_id: int) -> DeleteUserResponse:
26 | """ Delete user """
27 | delete_user_resp = await delete_user(user_id)
28 | return delete_user_resp
29 |
30 | @strawberry.mutation
31 | async def delete_stickynote(self, stickynote_id: int) -> DeleteStickyNotesResponse:
32 | """ Delete Sticky Notes """
33 | delete_stickynote_resp = await delete_stickynotes(stickynote_id)
34 | return delete_stickynote_resp
35 |
36 | @strawberry.mutation
37 | async def update_stickynote(self, stickynote_id: int, text: str) -> UpdateStickyNotesResponse:
38 | """ Update Sticky Notes """
39 | update_stickynote_resp = await update_stickynotes(stickynote_id, text)
40 | return update_stickynote_resp
--------------------------------------------------------------------------------
/src/graphql/schemas/query_schema.py:
--------------------------------------------------------------------------------
1 | import strawberry
2 | from pydantic import typing
3 | from strawberry.types import Info
4 | from src.graphql.scalars.stickynotes_scalar import StickyNotes
5 |
6 | from src.graphql.resolvers.stickynote_resolver import get_stickynote, get_stickynotes
7 | from src.graphql.resolvers.user_resolver import get_user, get_users
8 | from src.graphql.scalars.user_scalar import User
9 |
10 | @strawberry.type
11 | class Query:
12 |
13 | @strawberry.field
14 | async def users(self, info:Info) -> typing.List[User]:
15 | """ Get all users """
16 | users_data_list = await get_users(info)
17 | return users_data_list
18 |
19 | @strawberry.field
20 | async def user(self, info:Info, user_id: int) -> User:
21 | """ Get user by id """
22 | user_dict = await get_user(user_id, info)
23 | return user_dict
24 |
25 | @strawberry.field
26 | async def stickynotes(self, info:Info) -> typing.List[StickyNotes]:
27 | """ Get all stickynotes """
28 | stickynotes_data_list = await get_stickynotes(info)
29 | return stickynotes_data_list
30 |
31 | @strawberry.field
32 | async def stickynote(self, info:Info, stickynote_id: int) -> StickyNotes:
33 | """ Get stickynote by id """
34 | stickynote_dict = await get_stickynote(stickynote_id, info)
35 | return stickynote_dict
--------------------------------------------------------------------------------
/test.py:
--------------------------------------------------------------------------------
1 | ## test graphql
2 |
3 | import asyncio
4 | from tests import load_test_env
5 | import pytest
6 | from populate import create_tables
7 | from tests.graphql.db import engine
8 |
9 | if __name__ == "__main__":
10 | asyncio.run(create_tables(engine))
11 | pytest.main(args=[ "--cov=tests","tests/graphql/test_user_stickynotes.py","-s","--disable-warnings"])
--------------------------------------------------------------------------------
/tests/conftest.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import unittest
3 | import pytest
4 | from src.graphql.db.session import get_session
5 | from tests.graphql.db import overide_get_session
6 | from main import application as app
7 | app.dependency_overrides[get_session] = overide_get_session
8 |
9 | @pytest.mark.usefixtures("event_loop_instance")
10 | class TestAsynchronously(unittest.TestCase):
11 |
12 | def get_async_result(self, coro):
13 | """ Run a coroutine synchronously. """
14 | return self.event_loop.run_until_complete(coro)
15 |
16 | @pytest.fixture(scope="class")
17 | def event_loop_instance(request):
18 | """ Add the event_loop as an attribute to the unittest style test class. """
19 | request.cls.event_loop = asyncio.get_event_loop_policy().new_event_loop()
20 | yield
21 | request.cls.event_loop.close()
22 |
--------------------------------------------------------------------------------
/tests/graphql/db.py:
--------------------------------------------------------------------------------
1 | from contextlib import asynccontextmanager
2 | from typing import AsyncGenerator
3 | from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
4 | from sqlalchemy.orm import sessionmaker
5 |
6 | from src.graphql.core.config import settings
7 |
8 | SQLALCHEMY_DATABASE_URL = f"postgresql+asyncpg://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_SERVER}:{settings.POSTGRES_PORT}/{settings.POSTGRES_DB}"
9 |
10 | engine = create_async_engine(SQLALCHEMY_DATABASE_URL)
11 |
12 | testing_async_session = sessionmaker(
13 | bind=engine,
14 | class_=AsyncSession,
15 | expire_on_commit=False,
16 | autocommit=False,
17 | autoflush=False,
18 | )
19 |
20 | @asynccontextmanager
21 | async def overide_get_session() -> AsyncGenerator[AsyncSession, None]:
22 | async with testing_async_session() as session:
23 | async with session.begin():
24 | try:
25 | yield session
26 | finally:
27 | await session.close()
--------------------------------------------------------------------------------
/tests/graphql/mutations.py:
--------------------------------------------------------------------------------
1 | create_user = """
2 | mutation MyMutation {
3 | addUser(name: "test") {
4 | ... on AddUser {
5 | id
6 | name
7 | }
8 | ... on UserExists {
9 | message
10 | }
11 | }
12 | }
13 | """
14 |
15 | create_sticky_notes = """
16 | mutation MyMutation {
17 | addStickynotes(text: "demo text", userId: 1) {
18 | ... on StickyNotes {
19 | id
20 | createdDatetime
21 | text
22 | userId
23 | }
24 | ... on UserNotFound {
25 | message
26 | }
27 | ... on UserNameMissing {
28 | message
29 | }
30 | }
31 | }
32 | """
33 |
34 | delete_specific_user = """
35 | mutation MyMutation {
36 | deleteUser(userId: 1) {
37 | ... on UserDeleted {
38 | message
39 | }
40 | ... on UserNotFound {
41 | message
42 | }
43 | ... on UserIdMissing {
44 | message
45 | }
46 | }
47 | }
48 | """
49 |
50 | update_specific_sticky_note = """
51 | mutation MyMutation3 {
52 | updateStickynote(stickynoteId: 1, text: "new text") {
53 | ... on StickyNotes {
54 | id
55 | createdDatetime
56 | text
57 | userId
58 | }
59 | ... on StickyNotesNotFound {
60 | message
61 | }
62 | }
63 | }"""
64 |
65 | delete_specific_sticky_note = """
66 | mutation MyMutation4 {
67 | deleteStickynote(stickynoteId: 1) {
68 | ... on StickyNotesDeleted {
69 | message
70 | }
71 | ... on StickyNotesNotFound {
72 | message
73 | }
74 | }
75 | }
76 | """
--------------------------------------------------------------------------------
/tests/graphql/queries.py:
--------------------------------------------------------------------------------
1 | get_users_query = """
2 | query MyQuery {
3 | users {
4 | id
5 | name
6 | stickynotes {
7 | createdDatetime
8 | id
9 | text
10 | userId
11 | }
12 | }
13 | }
14 | """
15 |
16 | get_specific_user_query = """
17 | query MyQuery2 {
18 | user(userId: 1) {
19 | id
20 | name
21 | stickynotes {
22 | id
23 | text
24 | createdDatetime
25 | userId
26 | }
27 | }
28 | }
29 | """
30 |
31 | get_sticky_notes_query = """
32 | query MyQuery3 {
33 | stickynotes {
34 | createdDatetime
35 | id
36 | text
37 | userId
38 | }
39 | }
40 | """
41 |
42 | get_specific_sticky_note_query = """
43 | query MyQuery4 {
44 | stickynote(stickynoteId: 1) {
45 | createdDatetime
46 | id
47 | text
48 | userId
49 | }
50 | }"""
--------------------------------------------------------------------------------
/tests/graphql/test_user_stickynotes.py:
--------------------------------------------------------------------------------
1 | from tests.conftest import TestAsynchronously
2 | from tests.graphql.queries import get_users_query, get_specific_user_query, get_sticky_notes_query, get_specific_sticky_note_query
3 | from tests.graphql.mutations import create_user, create_sticky_notes, delete_specific_user, update_specific_sticky_note, delete_specific_sticky_note
4 | from src.app import schema
5 |
6 | class TestUsers(TestAsynchronously):
7 |
8 | def test_01_an_async_get_all_users(self):
9 | resp = self.get_async_result(schema.execute(
10 | get_users_query,
11 | ))
12 | assert resp.data["users"] == []
13 |
14 | def test_02_an_async_create_users(self):
15 | resp = self.get_async_result(schema.execute(
16 | create_user,
17 | ))
18 | assert resp.data["addUser"] == {'id': 1, 'name': 'test'}
19 |
20 | def test_03_an_async_create_users_again(self):
21 | resp = self.get_async_result(schema.execute(
22 | create_user,
23 | ))
24 | assert resp.data == {'addUser': {'message': 'User with this name already exists'}}
25 |
26 | def test_04_an_async_create_sticky_notes_relative_to_user(self):
27 | resp = self.get_async_result(schema.execute(
28 | create_sticky_notes,
29 | ))
30 |
31 | for key_sticky_note in resp.data['addStickynotes'].keys():
32 | assert key_sticky_note in ['id','createdDatetime','text','userId']
33 |
34 | def test_05_an_async_get_all_users_with_created_user(self):
35 | resp = self.get_async_result(schema.execute(
36 | get_users_query,
37 | ))
38 | assert len(resp.data["users"]) == 1
39 | for key_user in resp.data["users"][0].keys():
40 | assert key_user in ['id','name','stickynotes']
41 |
42 | for key_sticky_note in resp.data["users"][0]['stickynotes'][0].keys():
43 | assert key_sticky_note in ['id','createdDatetime','text','userId']
44 |
45 | def test_06_an_async_get_specific_user(self):
46 | resp = self.get_async_result(schema.execute(
47 | get_specific_user_query,
48 | ))
49 | for key_user in resp.data["user"].keys():
50 | assert key_user in ['id','name','stickynotes']
51 |
52 | for key_sticky_note in resp.data["user"]['stickynotes'][0].keys():
53 | assert key_sticky_note in ['id','createdDatetime','text','userId']
54 |
55 | def test_07_an_async_get_all_stickynotes(self):
56 | resp = self.get_async_result(schema.execute(
57 | get_sticky_notes_query,
58 | ))
59 |
60 | assert len(resp.data["stickynotes"]) == 1
61 | for key_sticky_note in resp.data['stickynotes'][0].keys():
62 | assert key_sticky_note in ['id','createdDatetime','text','userId']
63 |
64 | def test_08_an_async_get_specific_stickynote(self):
65 | resp = self.get_async_result(schema.execute(
66 | get_specific_sticky_note_query,
67 | ))
68 |
69 | for key_sticky_note in resp.data['stickynote'].keys():
70 | assert key_sticky_note in ['id','createdDatetime','text','userId']
71 |
72 | def test_09_an_async_update_specific_stickynote(self):
73 | resp = self.get_async_result(schema.execute(
74 | update_specific_sticky_note,
75 | variable_values={
76 | "stickynoteId": 1,
77 | "text": "new text",
78 | }
79 | ))
80 |
81 | assert resp.data["updateStickynote"]["text"] == "new text"
82 | for key_sticky_note in resp.data["updateStickynote"].keys():
83 | assert key_sticky_note in ['id','createdDatetime','text','userId']
84 |
85 | def test_10_an_async_delete_specific_stickynote(self):
86 | resp = self.get_async_result(schema.execute(
87 | delete_specific_sticky_note,
88 | ))
89 |
90 | assert resp.data == {"deleteStickynote": {"message": "Sticky Notes deleted"}}
91 |
92 | resp = self.get_async_result(schema.execute(
93 | delete_specific_sticky_note,
94 | ))
95 | assert resp.data == {"deleteStickynote": { "message": "Couldn't find sticky notes with the supplied id"}}
96 |
97 | def test_11_an_async_delete_specific_user(self):
98 | resp = self.get_async_result(schema.execute(
99 | delete_specific_user,
100 | ))
101 | assert resp.data == {"deleteUser": {"message": "User deleted"}}
102 | resp = self.get_async_result(schema.execute(
103 | delete_specific_user,
104 | ))
105 | assert resp.data == {"deleteUser": { "message": "Couldn't find user with the supplied id"}}
106 |
107 | def test_12_an_async_get_all_users(self):
108 | resp = self.get_async_result(schema.execute(
109 | get_users_query,
110 | ))
111 | assert resp.data["users"] == []
--------------------------------------------------------------------------------
/tests/load_test_env.py:
--------------------------------------------------------------------------------
1 | from pathlib import Path
2 | from dotenv import load_dotenv
3 |
4 | env_path = Path(".") / "envs/.env.test"
5 | load_dotenv(dotenv_path=env_path)
6 |
--------------------------------------------------------------------------------