├── LICENSE └── README.md /LICENSE: -------------------------------------------------------------------------------- 1 | Creative Commons Legal Code 2 | 3 | CC0 1.0 Universal 4 | 5 | CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE 6 | LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN 7 | ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS 8 | INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES 9 | REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS 10 | PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM 11 | THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED 12 | HEREUNDER. 13 | 14 | Statement of Purpose 15 | 16 | The laws of most jurisdictions throughout the world automatically confer 17 | exclusive Copyright and Related Rights (defined below) upon the creator 18 | and subsequent owner(s) (each and all, an "owner") of an original work of 19 | authorship and/or a database (each, a "Work"). 20 | 21 | Certain owners wish to permanently relinquish those rights to a Work for 22 | the purpose of contributing to a commons of creative, cultural and 23 | scientific works ("Commons") that the public can reliably and without fear 24 | of later claims of infringement build upon, modify, incorporate in other 25 | works, reuse and redistribute as freely as possible in any form whatsoever 26 | and for any purposes, including without limitation commercial purposes. 27 | These owners may contribute to the Commons to promote the ideal of a free 28 | culture and the further production of creative, cultural and scientific 29 | works, or to gain reputation or greater distribution for their Work in 30 | part through the use and efforts of others. 
31 | 32 | For these and/or other purposes and motivations, and without any 33 | expectation of additional consideration or compensation, the person 34 | associating CC0 with a Work (the "Affirmer"), to the extent that he or she 35 | is an owner of Copyright and Related Rights in the Work, voluntarily 36 | elects to apply CC0 to the Work and publicly distribute the Work under its 37 | terms, with knowledge of his or her Copyright and Related Rights in the 38 | Work and the meaning and intended legal effect of CC0 on those rights. 39 | 40 | 1. Copyright and Related Rights. A Work made available under CC0 may be 41 | protected by copyright and related or neighboring rights ("Copyright and 42 | Related Rights"). Copyright and Related Rights include, but are not 43 | limited to, the following: 44 | 45 | i. the right to reproduce, adapt, distribute, perform, display, 46 | communicate, and translate a Work; 47 | ii. moral rights retained by the original author(s) and/or performer(s); 48 | iii. publicity and privacy rights pertaining to a person's image or 49 | likeness depicted in a Work; 50 | iv. rights protecting against unfair competition in regards to a Work, 51 | subject to the limitations in paragraph 4(a), below; 52 | v. rights protecting the extraction, dissemination, use and reuse of data 53 | in a Work; 54 | vi. database rights (such as those arising under Directive 96/9/EC of the 55 | European Parliament and of the Council of 11 March 1996 on the legal 56 | protection of databases, and under any national implementation 57 | thereof, including any amended or successor version of such 58 | directive); and 59 | vii. other similar, equivalent or corresponding rights throughout the 60 | world based on applicable law or treaty, and any national 61 | implementations thereof. 62 | 63 | 2. Waiver. 
To the greatest extent permitted by, but not in contravention 64 | of, applicable law, Affirmer hereby overtly, fully, permanently, 65 | irrevocably and unconditionally waives, abandons, and surrenders all of 66 | Affirmer's Copyright and Related Rights and associated claims and causes 67 | of action, whether now known or unknown (including existing as well as 68 | future claims and causes of action), in the Work (i) in all territories 69 | worldwide, (ii) for the maximum duration provided by applicable law or 70 | treaty (including future time extensions), (iii) in any current or future 71 | medium and for any number of copies, and (iv) for any purpose whatsoever, 72 | including without limitation commercial, advertising or promotional 73 | purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each 74 | member of the public at large and to the detriment of Affirmer's heirs and 75 | successors, fully intending that such Waiver shall not be subject to 76 | revocation, rescission, cancellation, termination, or any other legal or 77 | equitable action to disrupt the quiet enjoyment of the Work by the public 78 | as contemplated by Affirmer's express Statement of Purpose. 79 | 80 | 3. Public License Fallback. Should any part of the Waiver for any reason 81 | be judged legally invalid or ineffective under applicable law, then the 82 | Waiver shall be preserved to the maximum extent permitted taking into 83 | account Affirmer's express Statement of Purpose. 
In addition, to the 84 | extent the Waiver is so judged Affirmer hereby grants to each affected 85 | person a royalty-free, non transferable, non sublicensable, non exclusive, 86 | irrevocable and unconditional license to exercise Affirmer's Copyright and 87 | Related Rights in the Work (i) in all territories worldwide, (ii) for the 88 | maximum duration provided by applicable law or treaty (including future 89 | time extensions), (iii) in any current or future medium and for any number 90 | of copies, and (iv) for any purpose whatsoever, including without 91 | limitation commercial, advertising or promotional purposes (the 92 | "License"). The License shall be deemed effective as of the date CC0 was 93 | applied by Affirmer to the Work. Should any part of the License for any 94 | reason be judged legally invalid or ineffective under applicable law, such 95 | partial invalidity or ineffectiveness shall not invalidate the remainder 96 | of the License, and in such case Affirmer hereby affirms that he or she 97 | will not (i) exercise any of his or her remaining Copyright and Related 98 | Rights in the Work or (ii) assert any associated claims and causes of 99 | action with respect to the Work, in either case contrary to Affirmer's 100 | express Statement of Purpose. 101 | 102 | 4. Limitations and Disclaimers. 103 | 104 | a. No trademark or patent rights held by Affirmer are waived, abandoned, 105 | surrendered, licensed or otherwise affected by this document. 106 | b. Affirmer offers the Work as-is and makes no representations or 107 | warranties of any kind concerning the Work, express, implied, 108 | statutory or otherwise, including without limitation warranties of 109 | title, merchantability, fitness for a particular purpose, non 110 | infringement, or the absence of latent or other defects, accuracy, or 111 | the present or absence of errors, whether or not discoverable, all to 112 | the greatest extent permissible under applicable law. 113 | c. 
Affirmer disclaims responsibility for clearing rights of other persons 114 | that may apply to the Work or any use thereof, including without 115 | limitation any person's Copyright and Related Rights in the Work. 116 | Further, Affirmer disclaims responsibility for obtaining any necessary 117 | consents, permissions or other rights required for any use of the 118 | Work. 119 | d. Affirmer understands and acknowledges that Creative Commons is not a 120 | party to this document and has no duty or obligation with respect to 121 | this CC0 or use of the Work. 122 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [![join community](https://pysemtec.org/img/join-community.svg "join community")](https://pysemtec.org) 2 | # Semantic Python Overview 3 | 4 | This repository aims to collect and curate a list of projects which are related to both Python and semantic technologies (RDF, OWL, SPARQL, Reasoning, ...). It is inspired by collections like [awesome lists](https://github.com/sindresorhus/awesome#readme). The list might be incomplete and biased, due to the limited knowledge of its authors. Improvements are very welcome. Feel free to file an issue or a pull request. Every section is alphabetically sorted. 5 | 6 | Furthermore, this repository might serve as a **crystallization point for a community** interested in such projects – and how they might productively interact. See [this discussion](https://github.com/cknoll/semantic-python-overview/discussions/1) for more information. 
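To make the list's scope concrete: the common core of most of these technologies is the RDF data model, in which a graph is a set of subject–predicate–object triples and querying amounts to pattern matching (which SPARQL formalizes). A minimal, library-free sketch of that idea (illustrative only; the projects listed below provide full-featured implementations):

```python
# Minimal illustration of the RDF data model: a graph as a set of
# (subject, predicate, object) triples, queried by pattern matching.
# Real libraries (rdflib, pyoxigraph, ...) add IRIs, literals, SPARQL, etc.

EX = "http://example.org/"

graph = {
    (EX + "alice", EX + "knows", EX + "bob"),
    (EX + "bob", EX + "knows", EX + "carol"),
    (EX + "alice", EX + "name", "Alice"),
}

def match(graph, s=None, p=None, o=None):
    """Yield triples matching the pattern; None acts as a wildcard,
    much like a variable in a SPARQL query."""
    for triple in graph:
        if all(want is None or want == got
               for want, got in zip((s, p, o), triple)):
            yield triple

# "SELECT ?o WHERE { ex:alice ex:knows ?o }" in miniature:
friends = {o for _, _, o in match(graph, s=EX + "alice", p=EX + "knows")}
```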
7 | 8 | 9 | ## Established Projects 10 | 11 | - [Bioregistry](https://github.com/biopragmatics/bioregistry) - The Bioregistry 12 | - docs: https://bioregistry.readthedocs.io 13 | - website: https://bioregistry.io/ 14 | - features: 15 | - Open source (and CC0) repository of prefixes, their associated metadata, and mappings to external registries' prefixes 16 | - Standardization of prefixes and CURIEs 17 | - Interconversion between CURIEs and IRIs 18 | - Generation of context-specific prefix maps for usage in RDF, LinkML, SSSOM, OWL, etc. 19 | - [brickschema](https://github.com/BrickSchema/py-brickschema) – Brick Ontology Python package 20 | - Brick is an open-source effort to standardize semantic descriptions of the physical, logical and virtual assets in buildings and the relationships between them. 21 | - docs: https://brickschema.readthedocs.io/en/latest/ 22 | - website: https://brickschema.org/ 23 | - features: 24 | - basic inference with different reasoners 25 | - web-based interaction (by means of [Yasgui](https://github.com/TriplyDB/Yasgui)) 26 | - translations from different formats (Haystack, VBIS) 27 | - [Cooking with Python and KBpedia](https://www.mkbergman.com/cooking-with-python-and-kbpedia/) 28 | - Tutorial series on "how to pick tools and then use Python for using and manipulating the KBpedia knowledge graph" 29 | - [Material in the form of Jupyter Notebooks](https://github.com/Cognonto/CWPK) 30 | - accompanying Python package [cowpoke](https://github.com/Cognonto/cowpoke) 31 | - [CubicWeb](https://www.cubicweb.org/) – a framework to build semantic web applications 32 | - website: https://www.cubicweb.org 33 | - docs: https://cubicweb.readthedocs.io/en/latest/ 34 | - features: 35 | - An engine driven by the explicit data model of the application 36 | - RQL, an intuitive query language close to the business vocabulary 37 | - An architecture that separates data selection and visualisation 38 | - Data security by design 39 | - Efficient data storage 40 | 
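Several entries above and below (Bioregistry in particular) revolve around interconversion between CURIEs (compact URIs such as `owl:Class`) and full IRIs. A minimal sketch of the underlying idea in plain Python (this is not any listed library's actual API, and the prefix map here is deliberately tiny):

```python
# Sketch of CURIE <-> IRI interconversion via a prefix map, as provided
# (far more robustly) by tools like Bioregistry. Hypothetical helper names.

prefix_map = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "owl": "http://www.w3.org/2002/07/owl#",
}

def curie_to_iri(curie: str) -> str:
    """Expand a CURIE like 'owl:Class' into a full IRI."""
    prefix, _, local = curie.partition(":")
    return prefix_map[prefix] + local

def iri_to_curie(iri: str) -> str:
    """Contract a full IRI back into a CURIE (longest-prefix match)."""
    for prefix, base in sorted(prefix_map.items(), key=lambda kv: -len(kv[1])):
        if iri.startswith(base):
            return f"{prefix}:{iri[len(base):]}"
    raise ValueError(f"no known prefix for {iri}")
```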
41 | - [Eddy](https://github.com/obdasystems/eddy) - graphical ontology editor 42 | - website: https://www.obdasystems.com/eddy 43 | - features: 44 | - graphical ontology editing 45 | - uses a bespoke Graphol format but has an OWL 2 export 46 | - visualization built on PyQt5 47 | - literature references: 48 | - [*Lembo, D and Pantaleone, D and Santarelli, V and Savo, DF: **Eddy: A Graphical Editor for OWL 2 Ontologies**. IJCAI 2016; 4252-4253*](https://cs.unibg.it/savo/papers/LPSS-IJCAI-16.pdf) 49 | - [fastobo-py](https://github.com/fastobo/fastobo-py): Python bindings for *fastobo* (Rust library to parse OBO 1.4) 50 | - features: 51 | - load, edit and serialize ontologies in the OBO 1.4 format 52 | - [FunOwl](https://github.com/hsolbrig/funowl) – functional OWL syntax for Python 53 | - features: 54 | - provides a Pythonic API that follows the OWL functional model for constructing OWL 55 | - [Gastrodon](https://github.com/paulhoule/gastrodon) - puts RDF data at your fingertips in Pandas; gateway to matplotlib, scikit-learn and other visualization tools. 
56 | - features: 57 | - interpolate variables into SPARQL queries 58 | - access local RDFlib graphs and remote SPARQL protocol endpoints 59 | - convert SPARQL result sets to pandas dataframes 60 | - understandable error messages 61 | - input/output graphs in Turtle form 62 | - conversion between RDF collections and Python collections 63 | - Sphinx domain to incorporate RDF data into documentation 64 | - [gizmos](https://github.com/ontodev/gizmos) – Utilities for ontology development 65 | - features: 66 | - modules for "export", "extract", "tree"-rendering 67 | - [Jabberwocky](https://github.com/sap218/jabberwocky) – a toolkit for ontologies 68 | - features: 69 | - text mining using an ontology's terms & synonyms 70 | - tf-idf for synonym curation, then adding those synonyms to an ontology 71 | - [kglab](https://github.com/DerwenAI/kglab) - Graph Data Science 72 | - docs: https://derwen.ai/docs/kgl/ 73 | - tutorial: https://derwen.ai/docs/kgl/tutorial/ 74 | - features: 75 | - an abstraction layer in Python for building knowledge graphs, integrated with popular graph libraries 76 | - perspective: there are several "camps" of graph technologies, with little discussion between them 77 | - focus on supporting "Hybrid AI" approaches that combine two or more graph technologies with other ML work 78 | - PyData stack – e.g., Pandas, scikit-learn, etc. – allows for graph work within data science workflows 79 | - scale-out tools – e.g., RAPIDS, Arrow/Parquet, Dask – provide for scaling graph computation (not necessarily databases) 80 | - graph algorithm libraries include NetworkX, iGraph, cuGraph – plus related visualization libraries in PyVis, Cairo, etc. 81 | - W3C libraries in Python also lacked full integration: RDFlib, pySHACL, OWL-RL, etc. 
82 | - pslpython provides for _probabilistic soft logic_, working with uncertainty in probabilistic graphs 83 | - additional integration paths and examples show how to work with deep learning (PyG) 84 | - import paths from graph databases, such as Neo4j 85 | - import paths from note-taking tools, such as Roam Research 86 | - usage in [MkRefs](https://github.com/DerwenAI/mkrefs) to add semantic features into MkDocs so that open source projects can federate bibliographies, shared glossaries, etc. 87 | - the kglab team provides hands-on workshops at technology conferences for people to gain experience with these different graph approaches 88 | - [KGX](https://github.com/biolink/kgx) - Library for building and exchanging knowledge graphs 89 | - docs: https://kgx.readthedocs.io/ 90 | - features: 91 | - Load graphs into an in-memory model to facilitate data integration, validation, and graph operations 92 | - Provides an easy way to bring data into Biolink Model, a high-level data model for biomedical knowledge graphs 93 | - The core data structure is a Property Graph (PG), represented internally using a `networkx.MultiDiGraph` 94 | - Supports various input and output formats, including: 95 | - RDF serializations 96 | - SPARQL endpoints 97 | - Neo4j endpoints 98 | - CSV/TSV and JSON 99 | - OWL 100 | - OBOGraph JSON format 101 | - SSSOM 102 | - [LangChain](https://github.com/langchain-ai/langchain)'s GraphSparqlQAChain – A LangChain module for making RDF and OWL accessible via natural language 103 | - docs: https://python.langchain.com/docs/use_cases/graph/graph_sparql_qa 104 | - features: 105 | - Generates SPARQL SELECT and UPDATE queries from natural language 106 | - Runs the generated queries against local files, endpoints, or triple stores 107 | - Returns natural language responses 108 | - [LinkML](https://github.com/linkml/linkml) – Linked Open Data Modeling Language 109 | - features: 110 | - A simple, high-level way of specifying data models, optionally enhanced with 
semantic annotations 111 | - A Python framework for compiling these data models to JSON-LD, JSON Schema, ShEx, SHACL, OWL, and SQL DDL 112 | - A Python framework for data conversion and validation, as well as generated Python dataclasses 113 | - [Macleod](https://github.com/thahmann/macleod) – Ontology development environment for Common Logic (CL) 114 | - features: 115 | - Translating a CLIF file to formats supported by FOL reasoners 116 | - Extracting an OWL approximation of a CLIF ontology 117 | - Verifying (non-trivial) logical consistency of a CLIF ontology 118 | - Proving theorems/lemmas, such as properties of concepts and relations or competency questions 119 | - GUI (alpha state) 120 | - [Morph-KGC](https://github.com/oeg-upm/morph-kgc) – System to create RDF and RDF-star knowledge graphs from heterogeneous sources with R2RML, RML and RML-star 121 | - docs: https://morph-kgc.readthedocs.io 122 | - features: 123 | - support for relational databases, tabular files (e.g. CSV, Excel, Parquet) and hierarchical files (XML and JSON) 124 | - generates RDF and RDF-star knowledge graphs by running through the command line or as a library 125 | - integrates with RDFlib and Oxigraph to load the generated RDF directly to those libraries 126 | - [nxontology](https://github.com/related-sciences/nxontology) – NetworkX-based library for representing ontologies 127 | - features: 128 | - load ontologies into a `networkx.DiGraph` or `MultiDiGraph` from `.obo`, `.json`, or `.owl` formats 129 | (powered by pronto / fastobo) 130 | - compute information content scores for nodes and semantic similarity scores for node pairs 131 | - [obonet](https://github.com/dhimmel/obonet) – read OBO-formatted ontologies into NetworkX 132 | - features: 133 | - Load an `.obo` file into a `networkx.MultiDiGraph` 134 | - Users should try [nxontology](https://github.com/related-sciences/nxontology) first, as a more general-purpose successor to this project 135 | - 
[OnToology](https://github.com/OnToology/OnToology) – System for a collaborative ontology development process 136 | - docs: http://ontoology.linkeddata.es/stepbystep 137 | - live version: http://ontoology.linkeddata.es/ 138 | - citable reference: https://doi.org/10.1016/j.websem.2018.09.003 139 | - [OntoPilot](https://github.com/stuckyb/ontopilot) – software for ontology development and deployment 140 | - docs: https://github.com/stuckyb/ontopilot/wiki 141 | - features: 142 | - support end users in ontology development, documentation and maintenance 143 | - convert spreadsheet data (one entity per row) to OWL files 144 | - call a reasoner before triple-store insertion 145 | - [ontospy](https://github.com/lambdamusic/Ontospy) – Python library and command-line interface for inspecting and visualizing RDF models 146 | - docs: http://lambdamusic.github.io/Ontospy/ 147 | - features: 148 | - extract and print out any ontology-related information 149 | - convert different OWL syntax variants 150 | - generate HTML documentation for an ontology 151 | - [ontor](https://github.com/felixocker/ontor) – Python library for manipulating and visualizing OWL ontologies 152 | - features: 153 | - tool set based on owlready2 and networkx 154 | - [owlready2](https://bitbucket.org/jibalamy/owlready2/src/master/README.rst) – ontology-oriented programming in Python 155 | - docs: https://owlready2.readthedocs.io/en/latest/index.html 156 | - features: 157 | - parse OWL files (RDF/XML or OWL/XML) 158 | - parse SWRL rules 159 | - call a reasoner (via Java) 160 | - literature references: 161 | - [*Lamy, JB: Owlready: **Ontology-oriented programming in Python with automatic classification and high level constructs for biomedical ontologies**. 
Artificial Intelligence In Medicine 2017;80:11-28*](http://www.lesfleursdunormal.fr/_downloads/article_owlready_aim_2017.pdf) 162 | - [*Lamy, JB: **Ontologies with Python**, Apress, 2020*](https://www.apress.com/fr/book/9781484265512) 163 | - accompanying material: 164 | - [Oxrdflib](https://github.com/oxigraph/oxrdflib) – provides rdflib stores using pyoxigraph (Rust-based) 165 | - could be used as drop-in replacements for the rdflib default ones 166 | - [pronto](https://github.com/althonos/pronto): library to parse, browse, create, and export ontologies 167 | - features: 168 | - supports several ontology languages and formats 169 | - docs: https://pronto.readthedocs.io/en/latest/api.html 170 | - [pyfactxx](https://github.com/tilde-lab/pyfactxx) – Python bindings for the FaCT++ OWL 2 C++ reasoner 171 | - features: 172 | - well-optimized reasoner for SROIQ(D) description logic, with additional improvements 173 | - [rdflib](https://github.com/RDFLib/rdflib) integration 174 | - easy cross-platform installation 175 | - [PyFuseki](https://github.com/yubinCloud/pyfuseki) – Library that interacts with Jena Fuseki (SPARQL server) 176 | - docs: https://yubincloud.github.io/pyfuseki/ 177 | 178 | - [PyKEEN](https://github.com/pykeen/pykeen) (**Py**thon **K**nowl**E**dge **E**mbeddi**N**gs) – Python package to train and evaluate knowledge graph embedding models 179 | - features: 180 | - 44 Models 181 | - 37 Datasets 182 | - 5 Inductive Datasets 183 | - support for multi-modal information 184 | - [PyLD](https://github.com/digitalbazaar/pyld) - A JSON-LD processor written in Python 185 | - conforms to: 186 | - JSON-LD 1.1, W3C Candidate Recommendation, 2019-12-12 or newer 187 | - JSON-LD 1.1 Processing Algorithms and API, W3C Candidate Recommendation, 2019-12-12 or newer 188 | - JSON-LD 1.1 Framing, W3C Candidate Recommendation, 2019-12-12 or newer 189 | - [pyLoDStorage](https://github.com/WolfgangFahl/pyLoDStorage) – Python library to interchange data between SPARQL, JSON, and 
SQL endpoints 190 | - features: 191 | - Integration of the [tabulate library](https://pypi.org/project/tabulate/) 192 | - QueryManager class for handling named queries 193 | - Basic data structure: **l**ists of **d**icts (thus: "LoD") 194 | - docs: https://wiki.bitplan.com/index.php/PyLoDStorage 195 | - [PyOBO](https://github.com/pyobo/pyobo) 196 | - docs: https://pyobo.readthedocs.io 197 | - features: 198 | - Provides unified, high-level access to names, descriptions, synonyms, xrefs, hierarchies, properties, relationships, etc. in ontologies from many sources listed in the Bioregistry 199 | - Converts databases into OWL and OBO ontologies 200 | - Wrapper around ROBOT for using Java tooling to convert between OBO and OWL 201 | - Internal DSL for generating OBO ontologies 202 | - [Pyoxigraph](https://oxigraph.org/pyoxigraph/stable/index.html) – Python graph database library implementing the SPARQL standard. 203 | - built on top of [Oxigraph](https://github.com/oxigraph/oxigraph) using [PyO3](https://pyo3.rs/) 204 | - docs: https://oxigraph.org/pyoxigraph/stable/index.html 205 | - two stores with SPARQL 1.1 capabilities 
(in-memory and disk-based) 206 | - [PyRes](https://github.com/eprover/PyRes) 207 | - resolution-based theorem provers for first-order logic 208 | - focus on good comprehensibility of the code 209 | - Literature: [Teaching Automated Theorem Proving by Example](https://link.springer.com/chapter/10.1007/978-3-030-51054-1_9) 210 | - [pystardog](https://github.com/stardog-union/pystardog) 211 | - Python bindings for the [Stardog Knowledge Graph platform](https://www.stardog.com/) 212 | - [Quit Store](https://github.com/AKSW/QuitStore) – workspace for distributed collaborative Linked Data knowledge engineering ("Quads in Git") 213 | - features: 214 | - read and write RDF Datasets 215 | - create multiple branches of the Dataset 216 | - literature references: 217 | - [*Decentralized Collaborative Knowledge Management using Git*](https://natanael.arndt.xyz/bib/arndt-n-2018--jws) 218 | by Natanael Arndt, Patrick Naumann, Norman Radtke, Michael Martin, and Edgard Marx in Journal of Web Semantics, 2018 219 | [[@sciencedirect](https://www.sciencedirect.com/science/article/pii/S1570826818300416)] [[@arXiv](https://arxiv.org/abs/1805.03721)] 220 | 221 | - [RaiseWikibase](https://github.com/UB-Mannheim/RaiseWikibase) – A tool for speeding up multilingual knowledge graph construction with Wikibase 222 | - fast inserts into a Wikibase instance: creates up to a million entities and wikitexts per hour 223 | - docs: https://ub-mannheim.github.io/RaiseWikibase/ 224 | - ships with `docker-compose.yml` for Wikibase (Database, PHP-code) 225 | - publication: https://link.springer.com/chapter/10.1007%2F978-3-030-80418-3_11 226 | - [Reasonable](https://github.com/gtfierro/reasonable) – An OWL 2 RL reasoner with reasonable performance 227 | - written in Rust with Python bindings (via [pyo3](https://pyo3.rs/)) 228 | - [ROBOT](https://github.com/ontodev/robot) – Java tool for automating ontology workflows with several reasoners (ELK, HermiT, ...) 
and a Python interface 229 | - General docs: https://robot.obolibrary.org/ 230 | - Python interfaces: https://robot.obolibrary.org/python 231 | - Docs on reasoning: https://robot.obolibrary.org/reason 232 | - [rdflib](https://github.com/RDFLib/rdflib) – Python package for working with RDF 233 | - docs: https://rdflib.readthedocs.io/ 234 | - graphical package overview: https://rdflib.dev/ 235 | - features: 236 | - parsers and serializers for RDF/XML, NTriples, Turtle, JSON-LD and more 237 | - a graph interface which can be backed by any one of a number of store implementations 238 | - store implementations for in-memory storage and persistent storage 239 | - a SPARQL 1.1 implementation – supporting SPARQL 1.1 Queries and Update statements 240 | - [rdflib-endpoint](https://github.com/vemonet/rdflib-endpoint) – Python package for easily deploying SPARQL endpoints for RDFLib Graphs 241 | - features: 242 | - exposing machine learning models or any other logic implemented in Python through a SPARQL endpoint, using custom functions 243 | - serving local RDF files using the command line interface 244 | - [serd](https://gitlab.com/drobilla/python-serd) – Python serd module, providing bindings for Serd, a lightweight C library for working with RDF data 245 | - docs: https://drobilla.gitlab.io/python-serd/singlehtml/ 246 | - [sparqlfun](https://github.com/linkml/sparqlfun) 247 | - LinkML-based SPARQL template library and execution engine 248 | - modularized core library of SPARQL templates 249 | - Fully FAIR description of templates 250 | - Rich, expressive language for modeling templates 251 | - uses [LinkML](https://linkml.io/linkml/) as base language 252 | - optional Python bindings / [object model](https://github.com/linkml/sparqlfun/blob/main/sparqlfun/model.py) using LinkML 253 | - supports both SELECT and CONSTRUCT 254 | - optional export to TSV, JSON, YAML, RDF 255 | - extensive [endpoint metadata](https://github.com/linkml/sparqlfun/tree/main/sparqlfun/config) 256 | - 
[SPARQL kernel](https://github.com/paulovn/sparql-kernel) for Jupyter 257 | - features: 258 | - sending queries to a SPARQL endpoint 259 | - fetching and presenting the results in a notebook 260 | - [SPARQLing Unicorn QGIS Plugin](https://github.com/sparqlunicorn/sparqlunicornGoesGIS) – QGIS plugin which adds a GeoJSON layer from SPARQL endpoint queries 261 | - docs: https://sparqlunicorn.github.io/sparqlunicornGoesGIS/ 262 | - QGIS plugin page: https://plugins.qgis.org/plugins/sparqlunicorn/ 263 | - features: 264 | - Querying geospatial vector layers from SPARQL endpoints 265 | - Conversion of geoformats (GeoJSON, SHP, KML, GML, etc.) to geospatial RDF 266 | - Conversion of RDF geodata (GeoSPARQL-formatted) from one coordinate reference system to another 267 | - SHACL validation of geospatial RDF graphs including validation of geoliteral (WKT, GML) contents 268 | - [SPARQLWrapper](https://github.com/RDFLib/sparqlwrapper) – A wrapper for a remote SPARQL endpoint 269 | - docs: https://sparqlwrapper.readthedocs.io/en/latest/index.html 270 | - features: 271 | - Creating a query invocation 272 | - Optionally converting the result into a more manageable format 273 | - [WikidataIntegrator](https://github.com/SuLab/WikidataIntegrator) – Library for reading and writing to Wikidata/Wikibase 274 | - features: 275 | - high integration with the Wikidata SPARQL endpoint 276 | 277 | 278 | ## Probably Stalled or Outdated Projects 279 | 280 | - [Athene](https://github.com/dityas/Athene) – DL reasoner in pure Python 281 | - "[C]urrent version is a beta and only supports ALC. But it can easily be extended by adding tableau rules." 282 | - Last update: 2017 283 | - [cwm](https://en.wikipedia.org/wiki/Cwm_(software)) 284 | - Self-description: "\[cwm is a\] forward chaining semantic reasoner that can be used for querying, checking, transforming and filtering information". 
285 | - Created in 2000 by Tim Berners-Lee and Dan Connolly, see [w3.org](https://www.w3.org/2000/10/swap/doc/cwm) 286 | - [air-reasoner](https://github.com/mit-dig/air-reasoner) 287 | - Self-description: "Reasoner for the AIR policy language, based on cwm" 288 | - based on cwm 289 | - Last update: 2013 290 | - [FuXi](https://pypi.org/project/FuXi/) 291 | - Self-description: "An OWL / N3-based in-memory, logic reasoning system for RDF" 292 | - based on cwm 293 | - Last update: 2013 294 | - see also (hg-repo) 295 | - [pysumo](https://github.com/pySUMO/pysumo) 296 | - Ontology IDE for the Suggested Upper Merged Ontology (SUMO) 297 | - Docs: https://pysumo.readthedocs.io/ 298 | - Last update: 2015 299 | 300 | 301 | ## Further Projects / Links 302 | 303 | - [ontology](https://github.com/ozekik/awesome-ontology) – A curated list of ontology things (with some Python-related entries) 304 | - [awesome-semantic-web#python](https://github.com/semantalytics/awesome-semantic-web#python) – Python section of an awesome list for semantic-web-related projects 305 | - [github-semantic-web-python](https://github.com/topics/semantic-web?l=python) – GitHub project search with `topic=semantic-web` and `language=python` 306 | - "Graph Thinking" – Talk by Paco Nathan ([@ceteri](https://github.com/ceteri)) at PyData Global 2021; [slides](https://derwen.ai/s/kcgh#84), [video](https://www.youtube.com/watch?v=bqku2a7ScXg) 307 | - [Hydra Ecosystem](https://github.com/HTTP-APIs) - Semantically Linked REST APIs 308 | - docs: https://www.hydraecosystem.org/ 309 | - tutorials: the stack has three major layers ([server](https://github.com/HTTP-APIs/hydrus), [client](https://github.com/HTTP-APIs/hydra-python-agent), [GUI](https://github.com/HTTP-APIs/hydra-python-agent-gui)); each repo has its own README 310 | - features: 311 | - deploy a server automatically from API Documentation (JSON-LD and W3C Hydra) 312 | - client automatically reads the documentation and provides access to endpoints 313 | - GUI allows 
visualization of the network generated by the servers and external resources 314 | - a [parser](https://github.com/HTTP-APIs/hydra-openapi-parser) for OpenAPI specs translation 315 | - notes: 316 | - under development, experimental 317 | - part of Google Summer of Code 318 | - [Pywikibot](https://github.com/wikimedia/pywikibot) 319 | - Library to interact with the Wikidata and Wikimedia APIs 320 | - see also: https://www.wikidata.org/wiki/Wikidata:Creating_a_bot#Pywikibot 321 | - [semantic](https://github.com/crm416/semantic) – Python library for extracting semantic information from text, such as dates and numbers 322 | - [Solving Einstein Puzzle](https://github.com/cknoll/demo-material/blob/main/expertise_system/einstein-zebra-puzzle-owlready-solution1.ipynb) – Jupyter notebook demonstrating how to use owlready2 to solve a logic puzzle 323 | - [W3C-Link-List1](https://www.w3.org/2001/sw/wiki/SemanticWebTools#Python_Developers) – link list "SemanticWebTools", section "Python_Developers" (wiki page) 324 | - might be outdated 325 | - [W3C-Link-List2](https://www.w3.org/2001/sw/wiki/Python) – list of tools usable from, or with, Python (wiki page) 326 | - [wikidata-mayors](https://github.com/njanakiev/wikidata-mayors) 327 | - Python code to ask Wikidata for European mayors and where they were born 328 | - Article: https://towardsdatascience.com/where-do-mayors-come-from-querying-wikidata-with-python-and-sparql-91f3c0af22e2 329 | - [yamlpyowl](https://github.com/cknoll/yamlpyowl) – read a YAML-specified ontology into Python by means of owlready2 (experimental)  330 | - [Notebook which generates quiz questions from Wikidata](https://gist.github.com/ak314/fc6c6f911cb4f39453b575854cdc4869) 331 | - [related presentation slides](https://www.slideshare.net/robertoturrin/how-to-turn-wikipedia-into-a-quiz-game) 332 | --------------------------------------------------------------------------------