├── .gitignore
├── LICENSE
├── README.md
├── example.env
├── gemma.jpeg
├── langchain_gemma_ollama.py
└── requirements.txt

/.gitignore:
--------------------------------------------------------------------------------
.venv/
.env
__pycache__/
.chainlit/
.files/
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2023 Sudarshan Koirala

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# langchain-gemma-ollama-chainlit
Simple Chat UI using the Gemma model via Ollama, LangChain and Chainlit

### Open Source in Action 🚀
- [Gemma](https://ai.google.dev/gemma/docs/model_card) as the Large Language Model, served via [Ollama](https://ollama.com/)
- [LangChain](https://www.langchain.com/) as the LLM framework
- [LangSmith](https://smith.langchain.com/) for developing, collaborating, testing, deploying, and monitoring LLM applications
- [Chainlit](https://docs.chainlit.io/langchain) for the chat UI

## System Requirements

You must have Python 3.10 or later installed. Earlier versions of Python may not work.

## Steps to Replicate

1. Fork this repository and create a codespace in GitHub as shown in the YouTube video, OR clone it locally.
```
git clone https://github.com/sudarshan-koirala/langchain-gemma-ollama-chainlit.git
cd langchain-gemma-ollama-chainlit
```

2. Create a virtualenv and activate it.
```
python3 -m venv .venv && source .venv/bin/activate
```

3. OPTIONAL - Copy example.env to .env with `cp example.env .env` and fill in the environment variables from [LangSmith](https://smith.langchain.com/). You need to create an account on the LangSmith website if you haven't already.
```
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY="your-api-key"
LANGCHAIN_PROJECT="your-project"
```

4. Run the following command in the terminal to install the necessary Python packages:
```
pip install -r requirements.txt
```

5. Run the following command in your terminal to start the chat UI (make sure the Ollama server is running and the `gemma:2b` model used by `langchain_gemma_ollama.py` is available locally, e.g. via `ollama pull gemma:2b`):
```
chainlit run langchain_gemma_ollama.py
```

## Disclaimer
This is a test project, presented in my YouTube video to learn new things using the available open-source projects and models. It is not meant to be used in production as it is not production ready. You can modify the code and use it for your own use cases ✌️
--------------------------------------------------------------------------------
/example.env:
--------------------------------------------------------------------------------
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY="your-api-key"
LANGCHAIN_PROJECT="your-project"
--------------------------------------------------------------------------------
/gemma.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sudarshan-koirala/langchain-gemma-ollama-chainlit/d6c90eb7cc100446c68db9eba1290e47e7210590/gemma.jpeg
--------------------------------------------------------------------------------
/langchain_gemma_ollama.py:
--------------------------------------------------------------------------------
from langchain_community.llms import Ollama
from langchain.prompts import ChatPromptTemplate
from langchain.schema import StrOutputParser
from langchain.schema.runnable import Runnable
from langchain.schema.runnable.config import RunnableConfig

import chainlit as cl


@cl.on_chat_start
async def on_chat_start():

    # Sending an image with the local file path
    elements = [
        cl.Image(name="image1", display="inline", path="gemma.jpeg")
    ]
    await cl.Message(content="Hello there, I am Gemma. How can I help you?", elements=elements).send()

    # Build the chain: prompt -> Gemma (served by the local Ollama server) -> plain-string output
    model = Ollama(model="gemma:2b")
    prompt = ChatPromptTemplate.from_messages(
        [
            (
                "system",
                "You're a very knowledgeable historian who provides accurate and eloquent answers to historical questions.",
            ),
            ("human", "{question}"),
        ]
    )
    runnable = prompt | model | StrOutputParser()
    # Keep the chain in the user session so the message handler can reuse it
    cl.user_session.set("runnable", runnable)


@cl.on_message
async def on_message(message: cl.Message):
    runnable = cl.user_session.get("runnable")  # type: Runnable

    msg = cl.Message(content="")

    # Stream the model's answer token by token into the Chainlit message
    async for chunk in runnable.astream(
        {"question": message.content},
        config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
    ):
        await msg.stream_token(chunk)

    await msg.send()
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
langchain
chainlit
openai
--------------------------------------------------------------------------------
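
As a quick way to check the LangChain and Ollama pieces from `langchain_gemma_ollama.py` without starting the Chainlit UI, the same `prompt | model | StrOutputParser()` chain can be invoked directly from Python. This is only a minimal sketch and not part of the original repo: it assumes the Ollama server is running locally with the `gemma:2b` model already pulled (e.g. `ollama pull gemma:2b`), and the example question is made up.

```
# Minimal sketch (not part of the original repo): exercises the same chain
# as langchain_gemma_ollama.py outside Chainlit. Assumes a local Ollama
# server with the gemma:2b model already pulled.
from langchain_community.llms import Ollama
from langchain.prompts import ChatPromptTemplate
from langchain.schema import StrOutputParser

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You're a very knowledgeable historian who provides accurate and eloquent answers to historical questions.",
        ),
        ("human", "{question}"),
    ]
)
model = Ollama(model="gemma:2b")  # talks to the local Ollama server
chain = prompt | model | StrOutputParser()

# Blocking call; an example question, purely illustrative
print(chain.invoke({"question": "Who built the Taj Mahal?"}))
```

The Chainlit app streams the same chain with `astream()` and a `cl.LangchainCallbackHandler` instead of the blocking `invoke()` used here.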