├── LLMs.png ├── LLMs.pptx ├── LLMs.csv ├── README.md └── LICENSE /LLMs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/continuedev/what-llm-to-use/HEAD/LLMs.png -------------------------------------------------------------------------------- /LLMs.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/continuedev/what-llm-to-use/HEAD/LLMs.pptx -------------------------------------------------------------------------------- /LLMs.csv: -------------------------------------------------------------------------------- 1 | Name,Creator,Base,ParameterCount1,ParameterCount2,ParameterCount3,ParameterCount4,License,URL 2 | Claude 2,Anthropic,Claude,Unknown,N/A,N/A,N/A,Proprietary,https://www.anthropic.com/index/claude-2 3 | NexusRaven-13B,Nexusflow,CodeLlama,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/Nexusflow/NexusRaven-13B 4 | Codellama-13b-oasst-sft-v10,OpenAssistant,CodeLlama,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/OpenAssistant/codellama-13b-oasst-sft-v10 5 | Phind-CodeLlama-34B-Python-v1,Phind,CodeLlama,34B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/Phind/Phind-CodeLlama-34B-Python-v1 6 | Phind-CodeLlama-34B-v2,Phind,CodeLlama,34B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/Phind/Phind-CodeLlama-34B-v2 7 | WizardCoder-Python-V1.0,WizardLM,CodeLlama,34B,13B,7B,N/A,Llama 2 Community,https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0 8 | Falcon,TII UAE,Falcon,40B,7B,N/A,N/A,Apache 2.0,https://huggingface.co/collections/tiiuae/falcon-64fb432660017eeec9837b5a 9 | Falcon Instruct,TII UAE,Falcon,40B,7B,N/A,N/A,Apache 2.0,https://huggingface.co/collections/tiiuae/falcon-64fb432660017eeec9837b5a 10 | GPT-3.5 Turbo,OpenAI,GPT,Unknown,N/A,N/A,N/A,Proprietary,https://openai.com/blog/gpt-3-5-turbo-fine-tuning-and-api-updates 11 | GPT-4,OpenAI,GPT,Unknown,N/A,N/A,N/A,Proprietary,https://openai.com/research/gpt-4 12 | CodeUp-Llama-2,DeepSE,Llama 2,13B,7B,N/A,N/A,OpenRAIL++,https://huggingface.co/deepse/CodeUp-Llama-2-13b-chat-hf 13 | Vicuna-v1.5,LM Sys,Llama 2,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/lmsys/vicuna-13b-v1.5 14 | Llama-2,Meta,Llama 2,70B,13B,7B,N/A,Llama 2 Community,https://ai.meta.com/llama/ 15 | Llama 2 Chat,Meta,Llama 2,70B,13B,7B,N/A,Llama 2 Community,https://ai.meta.com/llama/ 16 | Nous-Hermes-Llama2,Nous Research,Llama 2,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/NousResearch/Nous-Hermes-13b 17 | Redmond-Puffin,Nous Research,Llama 2,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/NousResearch/Redmond-Puffin-13B 18 | NSQL-Llama-2-7B,Number Station,Llama 2,7B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/NumbersStation/nsql-llama-2-7B 19 | LLaMA 2 SFT v10,OpenAssistant,Llama 2,70B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/OpenAssistant/llama2-70b-oasst-sft-v10 20 | OpenChat V3.2,OpenChat,Llama 2,13B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/openchat/openchat_v3.2 21 | Llama 2 32K,Together,Llama 2,7B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/togethercomputer/LLaMA-2-7B-32K 22 | Llama-2-7B-32K-Instruct,Together,Llama 2,7B,N/A,N/A,N/A,Llama 2 Community,https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct 23 | WizardLM V1.2,WizardLM,Llama 2,70B,13B,7B,N/A,Llama 2 Community,https://huggingface.co/WizardLM/WizardLM-13B-V1.2 24 | CodeLlama,Meta,Llama 2,34B,13B,7B,N/A,Llama 2 Community,https://ai.meta.com/blog/code-llama-large-language-model-coding/ 25 | CodeLlama-Instruct,Meta,Llama 2,34B,13B,7B,N/A,Llama 2 Community,https://ai.meta.com/blog/code-llama-large-language-model-coding/ 26 | CodeLlama-Python,Meta,Llama 2,34B,13B,7B,N/A,Llama 2 Community,https://ai.meta.com/blog/code-llama-large-language-model-coding/ 27 | Mistral-7B-OpenOrca,Alignment Lab AI,Mistral,7B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca 28 | SQLCoder-7B,Defog,Mistral,7B,N/A,N/A,N/A,CC-BY-SA-4.0,https://huggingface.co/defog/sqlcoder-7b 29 | Mistral-7B,Mistral AI,Mistral,7B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/mistralai/Mistral-7B-v0.1 30 | Mistral-7B-Instruct,Mistral AI,Mistral,7B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF 31 | Mistralic-7B-1,SkunkworksAI,Mistral,7B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/SkunkworksAI/Mistralic-7B-1 32 | PaLM 2,Google,PaLM,340B,N/A,N/A,N/A,Proprietary,https://ai.google/discover/palm2/ 33 | Pythia,EleutherAI,Pythia,12B,7B,3B,1B,Apache 2.0,https://huggingface.co/EleutherAI/pythia-12b 34 | Open-Assistant Pythia SFT-4,OpenAssistant,Pythia,12B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 35 | Pythia-Chat-Base,Together,Pythia,7B,N/A,N/A,N/A,Apache 2.0,https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B 36 | Qwen,Alibaba Cloud,Qwen,14B,7B,N/A,N/A,Tongyi Qianwen,https://huggingface.co/Qwen/Qwen-14B 37 | Qwen-Chat,Alibaba Cloud,Qwen,14B,7B,N/A,N/A,Tongyi Qianwen,https://github.com/QwenLM/Qwen 38 | StableCode,Stability AI,Stable,3B,N/A,N/A,N/A,StableCode Research License,https://huggingface.co/collections/stabilityai/stablecode-64f9dfb4ebc8a1be0a3f7650 39 | Stable LM,Stability AI,Stable,7B,3B,N/A,N/A,CC-BY-SA-4.0,https://huggingface.co/collections/stabilityai/stable-lm-650852cfd55dd4e15cdcb30a 40 | Open-Assistant StableLM SFT-7,OpenAssistant,Stable LM,7B,N/A,N/A,N/A,CC-BY-SA-4.0,https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3 41 | StarCoder,BigCode,StarCoder,15B,N/A,N/A,N/A,BigCode OpenRAIL-M,https://huggingface.co/bigcode/starcoder 42 | SQLCoder2,Defog,StarCoder,15B,N/A,N/A,N/A,CC-BY-SA-4.0,https://huggingface.co/defog/sqlcoder2 43 | WizardCoder-15B-V1.0,WizardLM,StarCoder,15B,N/A,N/A,N/A,BigCode OpenRAIL-M,https://huggingface.co/WizardLM/WizardCoder-15B-V1.0 --------------------------------------------------------------------------------
/README.md: -------------------------------------------------------------------------------- 1 | # What LLM to use? A perspective from the DevAI space 2 | 3 | The DevAI space, a shorthand for the community of developers building software with the help of large language models (LLMs), is moving fast, so it can be challenging to figure out which model to use. 4 | 5 | We started this repository based on our experiences as part of the [Continue community](https://github.com/continuedev/continue). Feel free to suggest improvements and help us keep it up-to-date by opening a pull request! 6 | 7 | ## What LLMs are there? 8 | 9 | There are A LOT of LLMs. We’ve decided to focus on the ones that we see folks using now: 10 | 11 | ![LLMs graphic](LLMs.png) 12 | 13 | You can find a CSV that includes all of these models and information about them [here](./LLMs.csv). 14 | 15 | ## What LLMs are being used while coding? 16 | 17 | ### How do folks decide?
18 | 19 | The first choice you typically make is whether you are going to use an **open-source** or a **commercial** model: 20 | 21 | - You usually select an **open-source** LLM when you want to keep your code within your environment, have enough available memory, want to keep your costs low, or want to be able to manage and optimize everything end-to-end. 22 | - You usually select a **commercial** LLM when you want the best model, prefer an easy and reliable setup, don’t have a lot of available memory, don’t mind your code leaving your environment, or are not deterred by the cost. 23 | 24 | If you decide to use an **open-source** LLM, your next decision is whether to set up the model on your local machine or on a hosted model provider: 25 | 26 | - You usually opt to use an open-source LLM on your _local machine_ when you have enough available memory, want free usage, or want to be able to use the model without needing an Internet connection. 27 | - You usually opt to use an open-source LLM on a _hosted provider_ when you want the best open-source model, don’t have a lot of available memory on your local machine, or want the model to serve multiple people. 28 | 29 | We maintain a guide on how to deploy an open-source code LLM for your team [here](https://github.com/continuedev/deploy-os-code-llm). 30 | 31 | If you decide to use a **commercial** LLM, you'll typically obtain API keys and experiment with several of them for comparison (see the comparison sketch just before the Commercial section below). Both the quality of the suggestions and the cost of usage can be important criteria. 32 | 33 | ### Open Source 34 | 35 | This is a list of the **open-source** LLMs that developers are using while coding, roughly ordered from most popular to least popular, as of October 2023. 36 | 37 | #### 1. Code Llama 38 | 39 | [Code Llama](https://about.fb.com/news/2023/08/code-llama-ai-for-coding/) is an LLM trained by Meta for generating and discussing code. It is built on top of Llama 2. Even though it is below WizardCoder and Phind-CodeLlama on the [Big Code Models Leaderboard](https://huggingface.co/spaces/bigcode/bigcode-models-leaderboard), it is the base model for both of them. It also comes in a variety of sizes: 7B, 13B, and 34B, which makes it popular to use on local machines as well as with hosted providers. At this point, it is the most well-known open-source base model for coding and is leading the open-source effort to create coding-capable LLMs. 40 | 41 |
42 | Details 43 | 44 | Creator: Meta 45 | Date released: August 24th, 2023 46 | License: Llama 2 Community 47 | Base model: Llama 2 48 | Parameters: 7B, 13B, 34B 49 | 50 |
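If you want to try Code Llama on your own machine, here is a minimal sketch using the Hugging Face `transformers` library, assuming the 7B base checkpoint and enough memory for fp16 weights (the model ID, prompt, and generation settings are illustrative, not a recommendation):

```python
# Minimal sketch: local code completion with the Code Llama 7B base model.
# Assumes `pip install transformers torch accelerate` and ~14 GB of memory for fp16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # the 13B and 34B variants follow the same pattern
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Base models continue text, so we hand the model the start of the code we want.
prompt = "def fibonacci(n: int) -> int:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```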
51 | 52 | #### 2. WizardCoder 53 | 54 | [WizardCoder](https://github.com/nlpxucan/WizardLM/tree/main/WizardCoder) is an LLM built on top of Code Llama by the WizardLM team. The [Evol-Instruct method](https://x.com/WizardLM_AI/status/1705551243421090207?s=20) is adapted for coding tasks to create a training dataset, which is used to fine-tune Code Llama. It comes in the same sizes as Code Llama: 7B, 13B, and 34B. It is the most popular open-source instruction-tuned LLM for coding so far. 55 | 56 |
57 | Details 58 | 59 | Creator: WizardLM 60 | Date released: August 26th, 2023 61 | License: Llama 2 Community 62 | Base model: Code Llama 63 | Parameters: 7B, 13B, 34B 64 | 65 |
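Because WizardCoder is instruction-tuned, you prompt it with a request in plain English rather than a code prefix. A minimal sketch, assuming the `transformers` library and the Alpaca-style template shown on the WizardCoder model cards (treat the exact template wording as something to verify against the card for the checkpoint you use):

```python
# Minimal sketch: asking instruction-tuned WizardCoder for code.
# Assumes `pip install transformers torch accelerate` and enough memory for 13B weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WizardLM/WizardCoder-Python-13B-V1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Alpaca-style instruction template (check the model card for the exact wording).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Write a Python function that checks whether a string is a palindrome.\n\n"
    "### Response:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```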
66 | 67 | #### 3. Phind-CodeLlama 68 | 69 | [Phind-CodeLlama](https://www.phind.com/blog/code-llama-beats-gpt4) is an LLM built on top of Code Llama by Phind. A proprietary dataset of ~80k high-quality programming problems and solutions was used to fine-tune Code Llama. That fine-tuned model was then further fine-tuned on 1.5B additional tokens. It currently leads on the [Big Code Models Leaderboard](https://huggingface.co/spaces/bigcode/bigcode-models-leaderboard). However, it is only available as a 34B parameter model, so it requires more memory to run. 70 | 71 |
72 | Details 73 | 74 | Creator: Phind 75 | Date released: August 28th, 2023 76 | License: Llama 2 Community 77 | Base model: Code Llama 78 | Parameters: 34B 79 | 80 |
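Since the 34B size is the main obstacle to running Phind-CodeLlama locally, quantized loading is a common workaround: 4-bit loading roughly quarters the memory footprint at some cost in output quality. A minimal sketch, assuming `transformers` with `bitsandbytes` on a CUDA GPU; the prompt format below follows the style shown on the model card and should be verified there:

```python
# Minimal sketch: loading 34B Phind-CodeLlama in 4-bit to reduce memory use.
# Assumes `pip install transformers torch accelerate bitsandbytes` and a CUDA GPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Phind/Phind-CodeLlama-34B-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_4bit=True,   # quantize weights at load time (~1/4 of the fp16 footprint)
    device_map="auto",   # spread layers across the available devices
)

prompt = "### User Message\nWrite a binary search function in Python.\n\n### Assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```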
81 | 82 | #### 4. Mistral 83 | 84 | [Mistral](https://mistral.ai/news/announcing-mistral-7b) is a 7B parameter LLM trained by Mistral AI. It was released at the end of September 2023. Mistral AI says that it “approaches CodeLlama 7B performance on code, while remaining good at English tasks”. Despite being available in only this one small size, people have been quite excited about it in the first couple of weeks since its release. The first fine-tuned LLMs that use it as their base are now beginning to emerge, and we are likely to see more going forward. 85 | 86 |
87 | Details 88 | 89 | Creator: Mistral AI 90 | Date released: September 27th, 2023 91 | License: Apache 2.0 92 | Base model: Mistral 93 | Parameters: 7B 94 | 95 |
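To try the instruction-tuned variant of Mistral, here is a minimal sketch assuming a `transformers` version recent enough to ship `apply_chat_template`, which formats the conversation with the model's own chat template (for Mistral, the `[INST] ... [/INST]` tags):

```python
# Minimal sketch: chatting with Mistral-7B-Instruct via its chat template.
# Assumes `pip install transformers torch accelerate` (transformers >= ~4.34).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# apply_chat_template wraps the user turn in the model's expected [INST] tags.
messages = [{"role": "user", "content": "Explain Python list comprehensions with one example."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```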
96 | 97 | #### 5. StarCoder 98 | 99 | [StarCoder](https://huggingface.co/blog/starcoder) is a 15B parameter LLM trained by BigCode that was ahead of its time when it was released in May 2023. It was trained on 80+ programming languages from The Stack (v1.2) with opt-out requests excluded. It is not an instruction model, so commands like "Write a function that computes the square root" do not work well. However, by using the [Tech Assistant prompt](https://huggingface.co/datasets/bigcode/ta-prompt) you can make it more helpful. 100 | 101 |
102 | Details 103 | 104 | Creator: BigCode 105 | Date released: May 4th, 2023 106 | License: BigCode OpenRAIL-M 107 | Base model: StarCoder 108 | Parameters: 15B 109 | 110 |
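Because StarCoder is a base model, the natural way to use it is completion or fill-in-the-middle (FIM) rather than chat. A minimal sketch of FIM prompting, assuming `transformers`, access to the gated `bigcode/starcoder` repo on the Hub, and the FIM special tokens documented for StarCoder:

```python
# Minimal sketch: fill-in-the-middle with StarCoder (a base model, not a chat model).
# Assumes `pip install transformers torch accelerate` and access to the model on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigcode/starcoder"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# FIM special tokens ask StarCoder to fill the gap between a prefix and a suffix.
prefix = "def average(numbers):\n    "
suffix = "\n    return total / len(numbers)\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```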
111 | 112 | #### 6. DeepSeek Coder 113 | 114 | [DeepSeek Coder](https://deepseekcoder.github.io) is an LLM trained by DeepSeek AI on 2 trillion tokens. With a training dataset spanning more than 80 programming languages, it is the newest model on this list and has been reported to score quite high on various coding-related benchmarks. 115 | 116 |
117 | Details 118 | 119 | Creator: DeepSeek AI 120 | Date released: November 3rd, 2023 121 | License: DeepSeek License Agreement 122 | Base model: DeepSeek Coder 123 | Parameters: 1.3B, 6.7B, 33B 124 | 125 |
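The usage pattern mirrors the other `transformers` sketches above; here is a minimal sketch with the 6.7B base checkpoint (the model ID and prompt are illustrative, and `trust_remote_code` follows the model card's example, so verify it before running):

```python
# Minimal sketch: code completion with the DeepSeek Coder 6.7B base model.
# Assumes `pip install transformers torch accelerate`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-6.7b-base"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto", trust_remote_code=True
)

# Like other base models, it continues text, so we start the code we want completed.
prompt = "# Python function that merges two sorted lists into one sorted list\ndef merge(a, b):\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```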
126 | 127 | #### 7. Llama 2 128 | 129 | [Llama 2](https://ai.meta.com/llama/#inside-the-model) is an LLM trained by Meta on 2 trillion tokens. It is the most popular open-source LLM overall, so some developers use it, despite it not being as good as many of the models above at making code edits. It is also important because Code Llama, the most popular open-source LLM for coding, is built on top of it, and Code Llama in turn is the foundation for WizardCoder and Phind-CodeLlama. 130 | 131 |
132 | Details 133 | 134 | Creator: Meta 135 | Date released: July 18th, 2023 136 | License: Llama 2 Community 137 | Base model: Llama 2 138 | Parameters: 7B, 13B, 70B 139 | 140 |
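As mentioned in the decision section above, developers going the commercial route typically obtain API keys from several providers and compare them on the same prompts. Here is a minimal sketch of such a comparison, assuming the official `openai` (v1-style) and `anthropic` Python SDKs with keys set in the usual environment variables; model names and SDK interfaces change often, so check each provider's docs:

```python
# Minimal sketch: sending one coding prompt to two commercial APIs for comparison.
# Assumes `pip install openai anthropic` and OPENAI_API_KEY / ANTHROPIC_API_KEY set.
from openai import OpenAI
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

question = "Write a Python function that deduplicates a list while preserving order."

# OpenAI: chat completions endpoint (reads OPENAI_API_KEY from the environment).
gpt4 = OpenAI().chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": question}],
)
print("GPT-4:\n", gpt4.choices[0].message.content)

# Anthropic: 2023-era completions endpoint with the human/assistant prompt markers.
claude = Anthropic().completions.create(
    model="claude-2",
    max_tokens_to_sample=512,
    prompt=f"{HUMAN_PROMPT} {question}{AI_PROMPT}",
)
print("Claude 2:\n", claude.completion)
```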
141 | 142 | ### Commercial 143 | 144 | This is a list of the **commercial** LLMs that developers are using while coding, roughly ordered from most popular to least popular, as of October 2023. 145 | 146 | #### 1. GPT-4 147 | 148 | [GPT-4](https://openai.com/research/gpt-4) from OpenAI is generally considered to be the best LLM to use while coding. It is quite helpful when generating and discussing code. However, it requires you to send your code to OpenAI via their API and can be quite expensive. Nevertheless, it is the most popular LLM for coding overall and the majority of developers use it while coding at this point. All OpenAI API users who made a successful payment of $1 or more before July 6, 2023 were given access to GPT-4, and they plan to open up access to all developers soon. 149 | 150 | #### 2. GPT-4 Turbo 151 | 152 | [GPT-4 Turbo](https://openai.com/blog/new-models-and-developer-products-announced-at-devday) from OpenAI is cheaper and faster than GPT-4. It has a knowledge cutoff of April 2023 and a 128k context window. It is currently in preview, as of November 2023, but anyone with an OpenAI API account and existing GPT-4 access can use it. 153 | 154 | #### 3. GPT-3.5 Turbo 155 | 156 | [GPT-3.5 Turbo](https://platform.openai.com/docs/models/gpt-3-5) from OpenAI is cheaper and faster than GPT-4; however, its suggestions are not nearly as helpful. It also requires you to send your code to OpenAI via their API. It is the second most popular LLM for coding overall so far. All developers can use it now after signing up for an OpenAI account. 157 | 158 | #### 4. Claude 2 159 | 160 | [Claude 2](https://www.anthropic.com/index/claude-2) is an LLM trained by Anthropic, which has greatly improved coding skills compared to the first version of Claude. It especially excels, relative to other LLMs, when you provide a lot of context. It requires you to send your code to Anthropic via their API. You must apply to get access to Claude 2 at this point. 161 | 162 | #### 5. PaLM 2 163 | 164 | [PaLM 2](https://ai.google/discover/palm2) is an LLM trained by Google. To try it out, you must send your code to Google via the PaLM API after obtaining an API key via MakerSuite, both of which are currently in public preview. 165 | 166 | ## Contributing 167 | 168 | If you see a model missing or want to share an opinion, we welcome you to open a PR or an issue! We hope to maintain a community-driven and up-to-date index of the most helpful language models for coding. 169 | 170 | *If you liked this blog post and want to read more about DevAI–the community of folks building software with the help of LLMs–in the future, join our monthly newsletter [here](https://continue.dev#newsletter).* 171 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. 
For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. 
Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 
134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 
193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | --------------------------------------------------------------------------------