├── .gitignore ├── README.md ├── poetry.lock ├── pyproject.toml └── src ├── bento_crew_demo ├── __init__.py ├── config │ ├── agents.yaml │ └── tasks.yaml ├── crew.py ├── main.py ├── service.py └── tools │ ├── __init__.py │ └── custom_tool.py ├── bentofile.yaml └── requirements.txt /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | share/python-wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | MANIFEST 28 | 29 | # PyInstaller 30 | # Usually these files are written by a python script from a template 31 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 32 | *.manifest 33 | *.spec 34 | 35 | # Installer logs 36 | pip-log.txt 37 | pip-delete-this-directory.txt 38 | 39 | # Unit test / coverage reports 40 | htmlcov/ 41 | .tox/ 42 | .nox/ 43 | .coverage 44 | .coverage.* 45 | .cache 46 | nosetests.xml 47 | coverage.xml 48 | *.cover 49 | *.py,cover 50 | .hypothesis/ 51 | .pytest_cache/ 52 | cover/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | .pybuilder/ 76 | target/ 77 | 78 | # Jupyter Notebook 79 | .ipynb_checkpoints 80 | 81 | # IPython 82 | profile_default/ 83 | ipython_config.py 84 | 85 | # pyenv 86 | # For a library or package, you might want to ignore these files since the code is 87 | # intended to run in multiple environments; otherwise, check them in: 88 | # .python-version 89 | 90 | # 
pipenv 91 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 92 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 93 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 94 | # install all needed dependencies. 95 | #Pipfile.lock 96 | 97 | # poetry 98 | # Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control. 99 | # This is especially recommended for binary packages to ensure reproducibility, and is more 100 | # commonly ignored for libraries. 101 | # https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control 102 | #poetry.lock 103 | 104 | # pdm 105 | # Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control. 106 | #pdm.lock 107 | # pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it 108 | # in version control. 109 | # https://pdm.fming.dev/latest/usage/project/#working-with-version-control 110 | .pdm.toml 111 | .pdm-python 112 | .pdm-build/ 113 | 114 | # PEP 582; used by e.g. 
github.com/David-OConnor/pyflow and github.com/pdm-project/pdm 115 | __pypackages__/ 116 | 117 | # Celery stuff 118 | celerybeat-schedule 119 | celerybeat.pid 120 | 121 | # SageMath parsed files 122 | *.sage.py 123 | 124 | # Environments 125 | .env 126 | .venv 127 | env/ 128 | venv/ 129 | ENV/ 130 | env.bak/ 131 | venv.bak/ 132 | 133 | # Spyder project settings 134 | .spyderproject 135 | .spyproject 136 | 137 | # Rope project settings 138 | .ropeproject 139 | 140 | # mkdocs documentation 141 | /site 142 | 143 | # mypy 144 | .mypy_cache/ 145 | .dmypy.json 146 | dmypy.json 147 | 148 | # Pyre type checker 149 | .pyre/ 150 | 151 | # pytype static type analyzer 152 | .pytype/ 153 | 154 | # Cython debug symbols 155 | cython_debug/ 156 | 157 | # PyCharm 158 | # JetBrains specific template is maintained in a separate JetBrains.gitignore that can 159 | # be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore 160 | # and can be added to the global gitignore or merged into this file. For a more nuclear 161 | # option (not recommended) you can uncomment the following to ignore the entire idea folder. 162 | .idea/ 163 | 164 | # CrewAI 165 | report.md 166 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # BentoCrewAI: Serving CrewAI Agent with BentoML 2 | 3 | Welcome to the BentoCrewAI project. This project demonstrates how to serve and deploy a [CrewAI](https://github.com/crewAIInc/crewAI) multi-agent application with the [BentoML](https://github.com/bentoml/BentoML) serving framework. 4 | 5 | See [here](https://docs.bentoml.com/en/latest/examples/overview.html) for a full list of BentoML example projects. 6 | 7 | ## Getting Started 8 | 9 | This project is a reference implementation designed to be hackable. 
Download the source code and use it as a playground to build your own agent APIs: 10 | 11 | Download the source code: 12 | ```bash 13 | git clone https://github.com/bentoml/BentoCrewAI.git 14 | cd BentoCrewAI/src 15 | ``` 16 | 17 | Ensure you have Python >=3.10,<=3.13 installed on your system. Install dependencies: 18 | ```bash 19 | # Create virtual env 20 | pip install virtualenv 21 | python -m venv venv 22 | source ./venv/bin/activate 23 | 24 | # Install dependencies 25 | pip install -r requirements.txt --no-deps 26 | ``` 27 | 28 | Set your **`OPENAI_API_KEY`** environment variable: 29 | ```bash 30 | export OPENAI_API_KEY='your_openai_key' 31 | ``` 32 | 33 | 34 | ## Launching the API server 35 | 36 | ```bash 37 | ./venv/bin/bentoml serve bento_crew_demo.service:CrewAgent 38 | ``` 39 | 40 | ## Calling the API 41 | 42 | ```bash 43 | curl -X POST http://localhost:3000/run \ 44 | -H 'Content-Type: application/json' \ 45 | -d '{"topic": "BentoML"}' 46 | ``` 47 | 48 | The `/run` API endpoint takes the "topic" input from the client and returns the final result. 49 | 50 | Use the `/stream` endpoint to stream all intermediate results from the Crew agent, providing full context of its planning and thinking process: 51 | 52 | ```bash 53 | curl -X POST http://localhost:3000/stream \ 54 | -H 'Content-Type: application/json' \ 55 | -d '{"topic": "Model Inference"}' 56 | ``` 57 | 58 | ## Containerize 59 | 60 | Make sure you have Docker installed and running. Build a Docker container image for deployment with BentoML: 61 | 62 | ```bash 63 | bentoml build . --version dev 64 | bentoml containerize crew_agent:dev 65 | ``` 66 | 67 | Follow the CLI output instructions to run the generated container image, e.g.: 68 | 69 | ```bash 70 | docker run --rm \ 71 | -e OPENAI_API_KEY=$OPENAI_API_KEY \ 72 | -p 3000:3000 \ 73 | crew_agent:dev 74 | ``` 75 | 76 | 77 | ## Customizing 78 | 79 | Follow the CrewAI docs to customize your agents and tasks. 
80 | 81 | - Modify `src/bento_crew_demo/config/agents.yaml` to define your agents 82 | - Modify `src/bento_crew_demo/config/tasks.yaml` to define your tasks 83 | - Modify `src/bento_crew_demo/crew.py` to add your own logic, tools, and specific arguments 84 | - Modify `src/bento_crew_demo/main.py` to add custom inputs for your agents and tasks 85 | 86 | ## Using Open-Source LLMs 87 | 88 | We recommend using [OpenLLM](https://github.com/bentoml/OpenLLM) on [BentoCloud](https://bentoml.com/) 89 | for fast and efficient private LLM deployment: 90 | ```bash 91 | # Install libraries 92 | pip install -U openllm bentoml 93 | openllm repo update 94 | 95 | # Log in / sign up to BentoCloud 96 | bentoml cloud login 97 | 98 | # Deploy Mistral 7B 99 | openllm deploy mistral:7b-4bit --instance-type gpu.t4.1.8x32 100 | ``` 101 | Follow the CLI output instructions to view deployment details in the BentoCloud UI, and copy your 102 | deployed endpoint URL. 103 | 104 | > 💡 For other open-source LLMs, try running the `openllm hello` command to explore more. 105 | 106 | Next, add the following custom LLM definition to the `BentoCrewDemoCrew` class, replacing the `base_url` with your deployed API endpoint URL: 107 | ```python 108 | from crewai import Agent, Crew, Process, Task, LLM 109 | from crewai.project import CrewBase, agent, crew, task, llm 110 | 111 | @CrewBase 112 | class BentoCrewDemoCrew(): 113 | ... 
114 | 115 | @llm 116 | def mistral(self) -> LLM: 117 | model_name="TheBloke/Mistral-7B-Instruct-v0.1-AWQ" 118 | return LLM( 119 | # add the `openai/` prefix to the model so litellm knows this is an OpenAI- 120 | # compatible endpoint and routes it through the OpenAI API client 121 | model=f"openai/{model_name}", 122 | api_key="na", 123 | base_url="https:///v1" 124 | ) 125 | ``` 126 | 127 | And modify the `config/agents.yaml` file wherever you want to use this LLM, e.g.: 128 | 129 | ```diff 130 | researcher: 131 | role: > 132 | {topic} Senior Data Researcher 133 | goal: > 134 | Uncover cutting-edge developments in {topic} 135 | backstory: > 136 | You're a seasoned researcher with a knack for uncovering the latest 137 | developments in {topic}. Known for your ability to find the most relevant 138 | information and present it in a clear and concise manner. 139 | + llm: mistral 140 | ``` 141 | 142 | 143 | ## Troubleshooting 144 | 145 | BentoML 1.3.x requires opentelemetry-api==1.20.0, while CrewAI requires opentelemetry-api>=1.27.0. You may ignore the dependency resolver issue and proceed with the 1.27 version that CrewAI requires; the BentoML team will update the package to support the newer opentelemetry libraries. 146 | 147 | 148 | ```bash 149 | # Create virtual env 150 | pip install virtualenv 151 | python -m venv venv 152 | source ./venv/bin/activate 153 | 154 | # Install CrewAI after BentoML to override conflicting dependency versions 155 | pip install -U bentoml aiofiles 156 | pip install -U crewai "crewai[tools]" 157 | 158 | # Export dependencies list 159 | pip freeze > requirements.txt 160 | ``` 161 | 162 | ## Community 163 | 164 | Join the [BentoML developer community](https://l.bentoml.com/join-slack) on Slack for more support and discussions! 
165 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [tool.poetry] 2 | name = "bento_crew_demo" 3 | version = "0.1.0" 4 | description = "bento-crew-demo using crewAI" 5 | authors = ["Your Name "] 6 | 7 | [tool.poetry.dependencies] 8 | python = ">=3.10,<=3.13" 9 | crewai = { extras = ["tools"], version = ">=0.67.1,<1.0.0" } 10 | opentelemetry-api = ">=1.22.0,<2.0.0" 11 | 12 | [tool.poetry.scripts] 13 | bento_crew_demo = "bento_crew_demo.main:run" 14 | run_crew = "bento_crew_demo.main:run" 15 | train = "bento_crew_demo.main:train" 16 | replay = "bento_crew_demo.main:replay" 17 | test = "bento_crew_demo.main:test" 18 | serve = "bento_crew_demo.main:serve" 19 | 20 | [build-system] 21 | requires = ["poetry-core"] 22 | build-backend = "poetry.core.masonry.api" 23 | -------------------------------------------------------------------------------- /src/bento_crew_demo/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/bentoml/BentoCrewAI/facafdb202252e2f68b4b7888a276204e69aa625/src/bento_crew_demo/__init__.py -------------------------------------------------------------------------------- /src/bento_crew_demo/config/agents.yaml: -------------------------------------------------------------------------------- 1 | researcher: 2 | role: > 3 | {topic} Senior Data Researcher 4 | goal: > 5 | Uncover cutting-edge developments in {topic} 6 | backstory: > 7 | You're a seasoned researcher with a knack for uncovering the latest 8 | developments in {topic}. Known for your ability to find the most relevant 9 | information and present it in a clear and concise manner. 
10 | # llm: mistral 11 | 12 | reporting_analyst: 13 | role: > 14 | {topic} Reporting Analyst 15 | goal: > 16 | Create detailed reports based on {topic} data analysis and research findings 17 | backstory: > 18 | You're a meticulous analyst with a keen eye for detail. You're known for 19 | your ability to turn complex data into clear and concise reports, making 20 | it easy for others to understand and act on the information you provide. 21 | # llm: mistral -------------------------------------------------------------------------------- /src/bento_crew_demo/config/tasks.yaml: -------------------------------------------------------------------------------- 1 | research_task: 2 | description: > 3 | Conduct thorough research about {topic}. 4 | Make sure you find any interesting and relevant information, given 5 | that the current year is 2024. 6 | expected_output: > 7 | A list with 10 bullet points of the most relevant information about {topic} 8 | agent: researcher 9 | 10 | reporting_task: 11 | description: > 12 | Review the context you got and expand each topic into a full section for a report. 13 | Make sure the report is detailed and contains any and all relevant information. 14 | expected_output: > 15 | A fully fledged report with the main topics, each with a full section of information. 
16 | Formatted as markdown without '```' 17 | agent: reporting_analyst 18 | -------------------------------------------------------------------------------- /src/bento_crew_demo/crew.py: -------------------------------------------------------------------------------- 1 | from crewai import Agent, Crew, Process, Task, LLM 2 | from crewai.project import CrewBase, agent, crew, task, llm 3 | 4 | # Uncomment the following line to use an example of a custom tool 5 | # from bento_crew_demo.tools.custom_tool import MyCustomTool 6 | 7 | # Check our tools documentation for more information on how to use them 8 | # from crewai_tools import SerperDevTool 9 | 10 | @CrewBase 11 | class BentoCrewDemoCrew(): 12 | """BentoCrewDemo crew""" 13 | @agent 14 | def researcher(self) -> Agent: 15 | return Agent( 16 | config=self.agents_config['researcher'], 17 | # tools=[MyCustomTool()], # Example of a custom tool, loaded at the beginning of the file 18 | verbose=True 19 | ) 20 | 21 | @agent 22 | def reporting_analyst(self) -> Agent: 23 | return Agent( 24 | config=self.agents_config['reporting_analyst'], 25 | verbose=True 26 | ) 27 | 28 | @task 29 | def research_task(self) -> Task: 30 | return Task( 31 | config=self.tasks_config['research_task'], 32 | ) 33 | 34 | @task 35 | def reporting_task(self) -> Task: 36 | return Task( 37 | config=self.tasks_config['reporting_task'], 38 | output_file='report.md' 39 | ) 40 | 41 | # # Uncomment the code below to use a privately deployed open-source LLM 42 | # @llm 43 | # def mistral(self) -> LLM: 44 | # model_name="TheBloke/Mistral-7B-Instruct-v0.1-AWQ" 45 | # return LLM( 46 | # # add the `openai/` prefix to the model so litellm knows this is an OpenAI- 47 | # # compatible endpoint and routes it through the OpenAI API client 48 | # model=f"openai/{model_name}", 49 | # api_key="na", 50 | # base_url="https:///v1" 51 | # ) 52 | 53 | @crew 54 | def crew(self) -> Crew: 55 | """Creates the BentoCrewDemo crew""" 56 | return Crew( 57 | agents=self.agents, # Automatically created by 
the @agent decorator 58 | tasks=self.tasks, # Automatically created by the @task decorator 59 | process=Process.sequential, 60 | verbose=True, 61 | # process=Process.hierarchical, # In case you want to use that instead https://docs.crewai.com/how-to/Hierarchical/ 62 | ) -------------------------------------------------------------------------------- /src/bento_crew_demo/main.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | import os 3 | import subprocess 4 | import sys 5 | from bento_crew_demo.crew import BentoCrewDemoCrew 6 | 7 | # This main file is intended to be a way for you to run your 8 | # crew locally, so refrain from adding unnecessary logic to this file. 9 | # Replace these with the inputs you want to test with; they will automatically 10 | # be interpolated into your task and agent definitions 11 | 12 | def run(): 13 | """ 14 | Run the crew. 15 | """ 16 | inputs = { 17 | 'topic': 'AI LLMs' 18 | } 19 | BentoCrewDemoCrew().crew().kickoff(inputs=inputs) 20 | 21 | 22 | def train(): 23 | """ 24 | Train the crew for a given number of iterations. 25 | """ 26 | inputs = { 27 | "topic": "AI LLMs" 28 | } 29 | try: 30 | BentoCrewDemoCrew().crew().train(n_iterations=int(sys.argv[1]), filename=sys.argv[2], inputs=inputs) 31 | 32 | except Exception as e: 33 | raise Exception(f"An error occurred while training the crew: {e}") 34 | 35 | def replay(): 36 | """ 37 | Replay the crew execution from a specific task. 38 | """ 39 | try: 40 | BentoCrewDemoCrew().crew().replay(task_id=sys.argv[1]) 41 | 42 | except Exception as e: 43 | raise Exception(f"An error occurred while replaying the crew: {e}") 44 | 45 | def test(): 46 | """ 47 | Test the crew execution and return the results. 
48 | """ 49 | inputs = { 50 | "topic": "AI LLMs" 51 | } 52 | try: 53 | BentoCrewDemoCrew().crew().test(n_iterations=int(sys.argv[1]), openai_model_name=sys.argv[2], inputs=inputs) 54 | 55 | except Exception as e: 56 | raise Exception(f"An error occurred while testing the crew: {e}") 57 | 58 | def serve(): 59 | """ 60 | Serve the crew as a REST API for client invocation 61 | """ 62 | working_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), '..')) 63 | 64 | # Construct the command 65 | command = f"bentoml serve bento_crew_demo.service:CrewAgent --working-dir {working_dir}" 66 | 67 | try: 68 | subprocess.run(command, shell=True, check=True) 69 | except subprocess.CalledProcessError as e: 70 | raise Exception(f"An error occurred while serving the crew: {e}") -------------------------------------------------------------------------------- /src/bento_crew_demo/service.py: -------------------------------------------------------------------------------- 1 | import os 2 | import asyncio 3 | import aiofiles 4 | from contextlib import redirect_stdout 5 | from typing import AsyncGenerator 6 | 7 | import bentoml 8 | from pydantic import Field 9 | 10 | from bento_crew_demo.crew import BentoCrewDemoCrew 11 | 12 | 13 | @bentoml.service( 14 | workers=1, 15 | resources={ 16 | "cpu": "2000m" 17 | }, 18 | traffic={ 19 | "concurrency": 16, 20 | "external_queue": True 21 | } 22 | ) 23 | class CrewAgent: 24 | 25 | @bentoml.task 26 | def run(self, topic: str = Field(default="LLM Agent")) -> str: 27 | return BentoCrewDemoCrew().crew().kickoff(inputs={"topic": topic}).raw 28 | 29 | # Streams the full Crew output to the client, including all intermediate steps. 
30 | @bentoml.api 31 | async def stream( 32 | self, topic: str = Field(default="LLM Agent") 33 | ) -> AsyncGenerator[str, None]: 34 | read_fd, write_fd = os.pipe() 35 | 36 | async def kickoff(): 37 | with os.fdopen(write_fd, "w", buffering=1) as write_file: 38 | with redirect_stdout(write_file): 39 | await BentoCrewDemoCrew().crew().kickoff_async( 40 | inputs={"topic": topic} 41 | ) 42 | 43 | asyncio.create_task(kickoff()) 44 | 45 | # Yield CrewAgent logs as they are written to the pipe 46 | async with aiofiles.open(read_fd, mode='r') as read_file: 47 | async for line in read_file: 48 | if not line: 49 | break 50 | yield line 51 | -------------------------------------------------------------------------------- /src/bento_crew_demo/tools/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/bentoml/BentoCrewAI/facafdb202252e2f68b4b7888a276204e69aa625/src/bento_crew_demo/tools/__init__.py -------------------------------------------------------------------------------- /src/bento_crew_demo/tools/custom_tool.py: -------------------------------------------------------------------------------- 1 | from crewai_tools import BaseTool 2 | 3 | 4 | class MyCustomTool(BaseTool): 5 | name: str = "Name of my tool" 6 | description: str = ( 7 | "Clear description of what this tool is useful for; your agent will need this information to use it." 8 | ) 9 | 10 | def _run(self, argument: str) -> str: 11 | # Implementation goes here 12 | return "this is an example of a tool output, ignore it and move along." 
13 | -------------------------------------------------------------------------------- /src/bentofile.yaml: -------------------------------------------------------------------------------- 1 | service: "bento_crew_demo.service:CrewAgent" 2 | labels: 3 | author: "bentoml-team" 4 | project: "crewai-example" 5 | include: 6 | - "bento_crew_demo/**/*.py" 7 | - "bento_crew_demo/**/*.yaml" 8 | python: 9 | lock_packages: false 10 | pip_args: "--no-deps" 11 | requirements_txt: "./requirements.txt" 12 | envs: 13 | - name: OPENAI_API_KEY 14 | docker: 15 | python_version: "3.11" 16 | 17 | -------------------------------------------------------------------------------- /src/requirements.txt: -------------------------------------------------------------------------------- 1 | aiofiles==24.1.0 2 | aiohappyeyeballs==2.4.3 3 | aiohttp==3.10.9 4 | aiosignal==1.3.1 5 | aiosqlite==0.20.0 6 | alembic==1.13.3 7 | aniso8601==9.0.1 8 | annotated-types==0.7.0 9 | anyio==4.6.0 10 | appdirs==1.4.4 11 | asgiref==3.8.1 12 | asttokens==2.4.1 13 | async-timeout==4.0.3 14 | attrs==23.2.0 15 | auth0-python==4.7.2 16 | backoff==2.2.1 17 | bcrypt==4.2.0 18 | beautifulsoup4==4.12.3 19 | bentoml==1.3.7 20 | blinker==1.8.2 21 | boto3==1.35.34 22 | botocore==1.35.34 23 | build==1.2.2.post1 24 | CacheControl==0.14.0 25 | cachetools==5.5.0 26 | cattrs==23.1.2 27 | certifi==2024.8.30 28 | cffi==1.17.1 29 | charset-normalizer==3.3.2 30 | chroma-hnswlib==0.7.3 31 | chromadb==0.4.24 32 | circus==0.18.0 33 | cleo==2.1.0 34 | click==8.1.7 35 | click-option-group==0.5.6 36 | cloudpickle==2.2.1 37 | cohere==5.11.0 38 | coloredlogs==15.0.1 39 | contourpy==1.3.0 40 | crashtest==0.4.1 41 | crewai==0.67.1 42 | crewai-tools==0.12.1 43 | cryptography==43.0.1 44 | cycler==0.12.1 45 | databricks-sdk==0.33.0 46 | dataclasses-json==0.6.7 47 | decorator==5.1.1 48 | deepmerge==2.0 49 | Deprecated==1.2.14 50 | deprecation==2.1.0 51 | dill==0.3.9 52 | distlib==0.3.8 53 | distro==1.9.0 54 | docker==7.1.0 55 | 
docstring_parser==0.16 56 | docx2txt==0.8 57 | dulwich==0.21.7 58 | durationpy==0.9 59 | embedchain==0.1.122 60 | exceptiongroup==1.2.2 61 | executing==2.1.0 62 | fastapi==0.115.0 63 | fastavro==1.9.7 64 | fastjsonschema==2.20.0 65 | filelock==3.16.1 66 | Flask==3.0.3 67 | flatbuffers==24.3.25 68 | fonttools==4.54.1 69 | frozenlist==1.4.1 70 | fs==2.4.16 71 | fsspec==2024.9.0 72 | gitdb==4.0.11 73 | GitPython==3.1.43 74 | google-api-core==2.20.0 75 | google-auth==2.35.0 76 | google-cloud-aiplatform==1.69.0 77 | google-cloud-bigquery==3.26.0 78 | google-cloud-core==2.4.1 79 | google-cloud-resource-manager==1.12.5 80 | google-cloud-storage==2.18.2 81 | google-crc32c==1.6.0 82 | google-pasta==0.2.0 83 | google-resumable-media==2.7.2 84 | googleapis-common-protos==1.65.0 85 | gptcache==0.1.44 86 | graphene==3.3 87 | graphql-core==3.2.4 88 | graphql-relay==3.2.0 89 | grpc-google-iam-v1==0.13.1 90 | grpcio==1.66.2 91 | grpcio-status==1.62.3 92 | grpcio-tools==1.62.3 93 | gunicorn==23.0.0 94 | h11==0.14.0 95 | h2==4.1.0 96 | hpack==4.0.0 97 | httpcore==1.0.6 98 | httptools==0.6.1 99 | httpx==0.27.2 100 | httpx-sse==0.4.0 101 | httpx-ws==0.6.1 102 | huggingface-hub==0.25.1 103 | humanfriendly==10.0 104 | hyperframe==6.0.1 105 | idna==3.10 106 | importlib-metadata==6.11.0 107 | importlib_resources==6.4.5 108 | inflection==0.5.1 109 | iniconfig==2.0.0 110 | installer==0.7.0 111 | instructor==1.3.3 112 | ipython==8.28.0 113 | itsdangerous==2.2.0 114 | jaraco.classes==3.4.0 115 | jedi==0.19.1 116 | Jinja2==3.1.4 117 | jiter==0.4.2 118 | jmespath==1.0.1 119 | joblib==1.4.2 120 | json_repair==0.25.3 121 | jsonpatch==1.33 122 | jsonpickle==3.3.0 123 | jsonpointer==3.0.0 124 | jsonref==1.1.0 125 | jsonschema==4.23.0 126 | jsonschema-specifications==2023.12.1 127 | keyring==24.3.1 128 | kiwisolver==1.4.7 129 | kubernetes==31.0.0 130 | lancedb==0.5.7 131 | langchain==0.2.16 132 | langchain-cohere==0.1.9 133 | langchain-community==0.2.17 134 | langchain-core==0.2.41 135 | 
langchain-experimental==0.0.65 136 | langchain-openai==0.1.25 137 | langchain-text-splitters==0.2.4 138 | langsmith==0.1.131 139 | litellm==1.48.17 140 | Mako==1.3.5 141 | Markdown==3.7 142 | markdown-it-py==3.0.0 143 | MarkupSafe==2.1.5 144 | marshmallow==3.22.0 145 | matplotlib==3.9.2 146 | matplotlib-inline==0.1.7 147 | mdurl==0.1.2 148 | mem0ai==0.1.17 149 | mlflow==2.16.2 150 | mlflow-skinny==2.16.2 151 | mmh3==5.0.1 152 | mock==4.0.3 153 | monotonic==1.6 154 | more-itertools==10.5.0 155 | mpmath==1.3.0 156 | msgpack==1.1.0 157 | multidict==6.1.0 158 | multiprocess==0.70.17 159 | mypy-extensions==1.0.0 160 | neo4j==5.25.0 161 | networkx==3.3 162 | nodeenv==1.9.1 163 | numpy==1.26.4 164 | nvidia-ml-py==11.525.150 165 | oauthlib==3.2.2 166 | onnxruntime==1.19.2 167 | openai==1.51.0 168 | opentelemetry-api==1.27.0 169 | opentelemetry-exporter-otlp-proto-common==1.27.0 170 | opentelemetry-exporter-otlp-proto-grpc==1.27.0 171 | opentelemetry-exporter-otlp-proto-http==1.27.0 172 | opentelemetry-instrumentation==0.48b0 173 | opentelemetry-instrumentation-aiohttp-client==0.41b0 174 | opentelemetry-instrumentation-asgi==0.48b0 175 | opentelemetry-instrumentation-fastapi==0.48b0 176 | opentelemetry-proto==1.27.0 177 | opentelemetry-sdk==1.27.0 178 | opentelemetry-semantic-conventions==0.48b0 179 | opentelemetry-util-http==0.48b0 180 | orjson==3.10.7 181 | outcome==1.3.0.post0 182 | overrides==7.7.0 183 | packaging==24.1 184 | pandas==2.2.3 185 | parameterized==0.9.0 186 | parso==0.8.4 187 | pathos==0.3.3 188 | pathspec==0.12.1 189 | pexpect==4.9.0 190 | pillow==10.4.0 191 | pip-requirements-parser==32.0.1 192 | pkginfo==1.11.1 193 | platformdirs==4.3.6 194 | pluggy==1.5.0 195 | poetry==1.8.3 196 | poetry-core==1.9.0 197 | poetry-plugin-export==1.8.0 198 | portalocker==2.10.1 199 | posthog==3.7.0 200 | pox==0.3.5 201 | ppft==1.7.6.9 202 | prometheus_client==0.21.0 203 | prompt_toolkit==3.0.48 204 | proto-plus==1.24.0 205 | protobuf==4.25.5 206 | psutil==6.0.0 207 | 
ptyprocess==0.7.0 208 | pulsar-client==3.5.0 209 | pure_eval==0.2.3 210 | py==1.11.0 211 | pyarrow==17.0.0 212 | pyasn1==0.6.1 213 | pyasn1_modules==0.4.1 214 | pycparser==2.22 215 | pydantic==2.9.2 216 | pydantic_core==2.23.4 217 | Pygments==2.18.0 218 | PyJWT==2.9.0 219 | pylance==0.9.18 220 | pyparsing==3.1.4 221 | pypdf==4.3.1 222 | PyPika==0.48.9 223 | pyproject_hooks==1.2.0 224 | pyright==1.1.383 225 | pysbd==0.3.4 226 | PySocks==1.7.1 227 | pytest==8.3.3 228 | python-dateutil==2.9.0.post0 229 | python-dotenv==1.0.1 230 | python-json-logger==2.0.7 231 | python-multipart==0.0.12 232 | pytube==15.0.0 233 | pytz==2024.2 234 | pyvis==0.3.2 235 | PyYAML==6.0.2 236 | pyzmq==26.2.0 237 | qdrant-client==1.11.3 238 | questionary==2.0.1 239 | rank-bm25==0.2.2 240 | RapidFuzz==3.10.0 241 | ratelimiter==1.2.0.post0 242 | referencing==0.35.1 243 | regex==2024.9.11 244 | requests==2.32.3 245 | requests-oauthlib==2.0.0 246 | requests-toolbelt==1.0.0 247 | retry==0.9.2 248 | rich==13.9.2 249 | rpds-py==0.20.0 250 | rsa==4.9 251 | s3transfer==0.10.2 252 | sagemaker==2.232.2 253 | sagemaker-core==1.0.10 254 | sagemaker-mlflow==0.1.0 255 | schema==0.7.7 256 | scikit-learn==1.5.2 257 | scipy==1.14.1 258 | selenium==4.25.0 259 | semver==3.0.2 260 | shapely==2.0.6 261 | shellingham==1.5.4 262 | simple-di==0.1.5 263 | six==1.16.0 264 | smdebug-rulesconfig==1.0.1 265 | smmap==5.0.1 266 | sniffio==1.3.1 267 | sortedcontainers==2.4.0 268 | soupsieve==2.6 269 | SQLAlchemy==2.0.35 270 | sqlparse==0.5.1 271 | stack-data==0.6.3 272 | starlette==0.38.6 273 | sympy==1.13.3 274 | tabulate==0.9.0 275 | tblib==3.0.0 276 | tenacity==8.5.0 277 | threadpoolctl==3.5.0 278 | tiktoken==0.7.0 279 | tokenizers==0.20.0 280 | tomli==2.0.2 281 | tomli_w==1.0.0 282 | tomlkit==0.13.2 283 | tornado==6.4.1 284 | tqdm==4.66.5 285 | traitlets==5.14.3 286 | trio==0.26.2 287 | trio-websocket==0.11.1 288 | trove-classifiers==2024.9.12 289 | typer==0.12.5 290 | types-requests==2.32.0.20240914 291 | 
typing-inspect==0.9.0 292 | typing_extensions==4.12.2 293 | tzdata==2024.2 294 | urllib3==2.2.3 295 | uv==0.4.18 296 | uvicorn==0.31.0 297 | uvloop==0.20.0 298 | virtualenv==20.26.6 299 | watchfiles==0.24.0 300 | wcwidth==0.2.13 301 | websocket-client==1.8.0 302 | websockets==13.1 303 | Werkzeug==3.0.4 304 | wrapt==1.16.0 305 | wsproto==1.2.0 306 | xattr==1.1.0 307 | yarl==1.13.1 308 | zipp==3.20.2 309 | --------------------------------------------------------------------------------