{"id":33133033,"url":"https://github.com/pysemtec/semantic-python-overview","last_synced_at":"2025-11-16T07:00:42.201Z","repository":{"id":37962911,"uuid":"307041542","full_name":"pysemtec/semantic-python-overview","owner":"pysemtec","description":"(subjective) overview of projects which are related both to python and semantic technologies (RDF, OWL, Reasoning, ...)","archived":false,"fork":false,"pushed_at":"2025-06-22T22:22:02.000Z","size":94,"stargazers_count":524,"open_issues_count":3,"forks_count":34,"subscribers_count":20,"default_branch":"main","last_synced_at":"2025-06-22T23:24:54.482Z","etag":null,"topics":["collection","community-driven","datalog","knowledge-graph","ontology","owl","python","rdf","semantic-web","semantics","sparql","swrl"],"latest_commit_sha":null,"homepage":"","language":null,"has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"cc0-1.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pysemtec.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2020-10-25T06:53:21.000Z","updated_at":"2025-06-22T22:22:06.000Z","dependencies_parsed_at":"2023-01-25T14:00:32.083Z","dependency_job_id":"bba24655-2961-4981-8515-41d6c53e1b3f","html_url":"https://github.com/pysemtec/semantic-python-overview","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/pysemtec/semantic-python-overview","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pysemtec%2Fsemantic-python-overview","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pysemtec%2Fsemantic-python-overview/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/G
itHub/repositories/pysemtec%2Fsemantic-python-overview/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pysemtec%2Fsemantic-python-overview/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pysemtec","download_url":"https://codeload.github.com/pysemtec/semantic-python-overview/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pysemtec%2Fsemantic-python-overview/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":284672648,"owners_count":27044736,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-16T02:00:05.974Z","response_time":65,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["collection","community-driven","datalog","knowledge-graph","ontology","owl","python","rdf","semantic-web","semantics","sparql","swrl"],"created_at":"2025-11-15T10:00:31.644Z","updated_at":"2025-11-16T07:00:42.192Z","avatar_url":"https://github.com/pysemtec.png","language":null,"readme":"[![join community](https://pysemtec.org/img/join-community.svg \"join community\")](https://pysemtec.org)\n# Semantic Python Overview\n\nThis repository aims to collect and curate a list of projects which are related both to python and semantic technologies (RDF, OWL, SPARQL, Reasoning, ...). 
It is inspired by collections like [awesome lists](https://github.com/sindresorhus/awesome#readme). The list might be incomplete and biased due to the limited knowledge of its authors. Improvements are very welcome. Feel free to file an issue or a pull request. Every section is alphabetically sorted.\n\nFurthermore, this repository might serve as a **crystallization point for a community** interested in such projects – and how they might productively interact. See [this discussion](https://github.com/cknoll/semantic-python-overview/discussions/1) for more information.\n\n\n## Established Projects\n\n- [Bioregistry](https://github.com/biopragmatics/bioregistry) - The Bioregistry\n  - docs: https://bioregistry.readthedocs.io\n  - website: https://bioregistry.io/\n  - features:\n    - Open source (and CC 0) repository of prefixes, their associated metadata, and mappings to external registries' prefixes\n    - Standardization of prefixes and CURIEs\n    - Interconversion between CURIEs and IRIs\n    - Generation of context-specific prefix maps for usage in RDF, LinkML, SSSOM, OWL, etc.\n- [brickschema](https://github.com/BrickSchema/py-brickschema) – Brick Ontology Python package\n    - Brick is an open-source effort to standardize semantic descriptions of the physical, logical and virtual assets in buildings and the relationships between them.\n    - docs: https://brickschema.readthedocs.io/en/latest/\n    - website: https://brickschema.org/\n    - features:\n        - basic inference with different reasoners\n        - web-based interaction (by means of [Yasgui](https://github.com/TriplyDB/Yasgui))\n        - translations from different formats (Haystack, VBIS)\n- [Cooking with Python and KBpedia](https://www.mkbergman.com/cooking-with-python-and-kbpedia/)\n    - Tutorial series on \"how to pick tools and then use Python for using and manipulating the KBpedia knowledge graph\"\n    - [Material in the form of Jupyter Notebooks](https://github.com/Cognonto/CWPK),\n    - 
accompanying Python package [cowpoke](https://github.com/Cognonto/cowpoke),\n- [CubicWeb](https://www.cubicweb.org/) – a framework to build semantic web applications\n  - website: https://www.cubicweb.org\n  - docs: https://cubicweb.readthedocs.io/en/latest/\n  - features:\n    - An engine driven by the explicit data model of the application\n    - RQL, an intuitive query language close to the business vocabulary\n    - An architecture that separates data selection and visualisation\n    - Data security by design\n    - Efficient data storage\n\n- [Eddy](https://github.com/obdasystems/eddy) - graphical ontology editor\n  - website: https://www.obdasystems.com/eddy\n  - features:\n    - graphical ontology editing\n    - uses the bespoke Graphol format but offers OWL 2 export\n    - visualization built on PyQt5\n  - literature references:\n    - [*Lembo, D and Pantaleone, D and Santarelli, V and Savo, DF: **Eddy: A Graphical Editor for OWL 2 Ontologies**. IJCAI 2016; 4252-4253*](https://cs.unibg.it/savo/papers/LPSS-IJCAI-16.pdf)\n- [fastobo-py](https://github.com/fastobo/fastobo-py): Python bindings for *fastobo* (Rust library to parse OBO 1.4)\n    - features:\n        - load, edit and serialize ontologies in the OBO 1.4 format\n- [FunOwl](https://github.com/hsolbrig/funowl) – functional OWL syntax for Python\n  - features:\n    - provides a pythonic API that follows the OWL functional model for constructing OWL\n- [Gastrodon](https://github.com/paulhoule/gastrodon) - puts RDF data at your fingertips in Pandas; gateway to matplotlib, scikit-learn and other visualization tools.\n  - features:\n    - interpolate variables into SPARQL queries\n    - access local RDFlib graphs and remote SPARQL protocol endpoints\n    - convert SPARQL result sets to pandas dataframes\n    - understandable error messages\n    - input/output graphs in Turtle form\n    - conversion between RDF collections and Python collections\n    - Sphinx domain to incorporate RDF data into documentation\n- 
[gizmos](https://github.com/ontodev/gizmos) – Utilities for ontology development\n    - features:\n        - modules for \"export\", \"extract\", \"tree\"-rendering\n- [Jabberwocky](https://github.com/sap218/jabberwocky) – a toolkit for ontologies\n    - features:\n        - text mining using ontology terms \u0026 synonyms\n        - tf-idf for synonym curation, then adding those synonyms into the ontology\n- [kglab](https://github.com/DerwenAI/kglab) - Graph Data Science\n    - docs: https://derwen.ai/docs/kgl/\n    - tutorial: https://derwen.ai/docs/kgl/tutorial/\n    - features:\n        - an abstraction layer in Python for building knowledge graphs, integrated with popular graph libraries\n        - perspective: there are several \"camps\" of graph technologies, with little discussion between them\n        - focus on supporting \"Hybrid AI\" approaches that combine two or more graph technologies with other ML work\n        - PyData stack – e.g., Pandas, scikit-learn, etc. – allows for graph work within data science workflows\n        - scale-out tools – e.g., RAPIDS, Arrow/Parquet, Dask – provide for scaling graph computation (not necessarily databases)\n        - graph algorithm libraries include NetworkX, iGraph, cuGraph – plus related visualization libraries in PyVis, Cairo, etc.\n        - W3C-related Python libraries also lack full integration: RDFlib, pySHACL, OWL-RL, etc.\n        - pslpython provides for _probabilistic soft logic_, working with uncertainty in probabilistic graphs\n        - additional integration paths and examples show how to work with deep learning (PyG)\n        - import paths from graph databases, such as Neo4j\n        - import paths from note-taking tools, such as Roam Research\n        - usage in [MkRefs](https://github.com/DerwenAI/mkrefs) to add semantic features into MkDocs so that open source projects can federate bibliographies, shared glossaries, etc.\n        - the kglab team provides hands-on workshops at technology conferences for people to gain experience with these 
different graph approaches\n- [KGX](https://github.com/biolink/kgx) - Library for building and exchanging knowledge graphs\n    - docs: https://kgx.readthedocs.io/\n    - features:\n        - Load graphs into an in-memory model to facilitate data integration, validation, and graph operations\n        - Provides an easy way to bring data into Biolink Model, a high-level data model for biomedical knowledge graphs\n        - The core data structure is a Property Graph (PG), represented internally using a `networkx.MultiDiGraph`\n        - Supports various input and output formats, including:\n            - RDF serializations\n            - SPARQL endpoints\n            - Neo4j endpoints\n            - CSV/TSV and JSON\n            - OWL\n            - OBOGraph JSON format\n            - SSSOM\n- [LangChain](https://github.com/langchain-ai/langchain)'s GraphSparqlQAChain – A LangChain module for making RDF and OWL accessible via natural language\n    - docs: https://python.langchain.com/docs/use_cases/graph/graph_sparql_qa\n    - features:\n        - Generates SPARQL SELECT and UPDATE queries from natural language\n        - Runs the generated queries against local files, endpoints, or triple stores\n        - Returns natural language responses\n- [LinkML](https://github.com/linkml/linkml) – Linked Open Data Modeling Language\n    - features:\n        - A simple, high-level way of specifying data models, optionally enhanced with semantic annotations\n        - A Python framework for compiling these data models to JSON-LD, JSON Schema, ShEx, SHACL, OWL, and SQL DDL\n        - A Python framework for data conversion and validation, as well as generated Python dataclasses\n- [Macleod](https://github.com/thahmann/macleod) – Ontology development environment for Common Logic (CL)\n    - features:\n        - Translating a CLIF file to formats supported by FOL reasoners\n        - Extracting an OWL approximation of a CLIF ontology\n        - Verifying (non-trivial) logical 
consistency of a CLIF ontology\n        - Proving theorems/lemmas, such as properties of concepts and relations or competency questions\n        - GUI (alpha state)\n- [Morph-KGC](https://github.com/oeg-upm/morph-kgc) – System to create RDF and RDF-star knowledge graphs from heterogeneous sources with R2RML, RML and RML-star\n  - docs: https://morph-kgc.readthedocs.io\n  - features:\n    - support for relational databases, tabular files (e.g. CSV, Excel, Parquet) and hierarchical files (XML and JSON)\n    - generates RDF and RDF-star knowledge graphs via the command line or as a library\n    - integrates with RDFlib and Oxigraph to load the generated RDF directly into those libraries\n- [nxontology](https://github.com/related-sciences/nxontology) – NetworkX-based library for representing ontologies\n  - features:\n    - load ontologies into a `networkx.DiGraph` or `MultiDiGraph` from `.obo`, `.json`, or `.owl` formats\n      (powered by pronto / fastobo)\n    - compute information content scores for nodes and semantic similarity scores for node pairs\n- [obonet](https://github.com/dhimmel/obonet) – read OBO-formatted ontologies into NetworkX\n  - features:\n    - Load an `.obo` file into a `networkx.MultiDiGraph`\n    - Users should try [nxontology](https://github.com/related-sciences/nxontology) first, as a more general-purpose successor to this project\n- [OnToology](https://github.com/OnToology/OnToology) – System for a collaborative ontology development process\n    - docs: http://ontoology.linkeddata.es/stepbystep\n    - live version: http://ontoology.linkeddata.es/\n    - citable reference: https://doi.org/10.1016/j.websem.2018.09.003\n- [OntoPilot](https://github.com/stuckyb/ontopilot) – software for ontology development and deployment\n  - docs: https://github.com/stuckyb/ontopilot/wiki\n  - features:\n    - support end users in ontology development, documentation and maintenance\n    - convert spreadsheet data (one entity per row) to OWL 
files\n        - call a reasoner before triple-store insertion\n- [ontospy](https://github.com/lambdamusic/Ontospy) – Python library and command-line interface for inspecting and visualizing RDF models\n  - docs: http://lambdamusic.github.io/Ontospy/\n  - features:\n    - extract and print out any ontology-related information\n    - convert different OWL syntax variants\n    - generate HTML documentation for an ontology\n- [ontor](https://github.com/felixocker/ontor) – Python library for manipulating and visualizing OWL ontologies\n  - features:\n    - tool set based on owlready2 and networkx\n- [owlready2](https://bitbucket.org/jibalamy/owlready2/src/master/README.rst) – ontology-oriented programming in Python\n  - docs: https://owlready2.readthedocs.io/en/latest/index.html\n  - features:\n    - parse OWL files (RDF/XML or OWL/XML)\n    - parse SWRL rules\n    - call a reasoner (via Java)\n  - literature references:\n    - [*Lamy, JB: Owlready: **Ontology-oriented programming in Python with automatic classification and high level constructs for biomedical ontologies**. 
Artificial Intelligence In Medicine 2017;80:11-28*](http://www.lesfleursdunormal.fr/_downloads/article_owlready_aim_2017.pdf)\n    - [*Lamy, JB: **Ontologies with Python**, Apress, 2020*](https://www.apress.com/fr/book/9781484265512)\n        - accompanying material: \u003chttps://github.com/Apress/ontologies-w-python\u003e\n- [Oxrdflib](https://github.com/oxigraph/oxrdflib) – provides rdflib stores using pyoxigraph (Rust-based)\n    - can be used as drop-in replacements for the rdflib default stores\n- [pronto](https://github.com/althonos/pronto): library to parse, browse, create, and export ontologies\n    - features:\n        - supports several ontology languages and formats\n    - docs: https://pronto.readthedocs.io/en/latest/api.html\n- [pycottas](https://github.com/arenas-guerrero-julian/pycottas) – Library for working with compressed COTTAS files\n  - docs: https://pycottas.readthedocs.io\n  - features:\n    - compress RDF files to COTTAS format\n    - evaluate triple patterns over compressed RDF\n    - integrates with RDFlib as a store backend to query COTTAS files with SPARQL\n- [pyfactxx](https://github.com/tilde-lab/pyfactxx) – Python bindings for the FaCT++ OWL 2 C++ reasoner\n    - features:\n        - well-optimized reasoner for SROIQ(D) description logic, with additional improvements\n        - [rdflib](https://github.com/RDFLib/rdflib) integration\n        - easy cross-platform installation\n- [PyFuseki](https://github.com/yubinCloud/pyfuseki) – Library that interacts with Jena Fuseki (SPARQL server)\n    - docs: https://yubincloud.github.io/pyfuseki/\n\n- [PyKEEN](https://github.com/pykeen/pykeen) (**Py**thon **K**nowl**E**dge **E**mbeddi**N**gs) – Python package to train and evaluate knowledge graph embedding models\n    - features:\n        - 44 models\n        - 37 datasets\n        - 5 inductive datasets\n        - support for multi-modal information\n- [PyLD](https://github.com/digitalbazaar/pyld) - A JSON-LD processor written in Python\n   
 - conforms:\n        - JSON-LD 1.1, W3C Candidate Recommendation, 2019-12-12 or newer\n        - JSON-LD 1.1 Processing Algorithms and API, W3C Candidate Recommendation, 2019-12-12 or newer\n        - JSON-LD 1.1 Framing, W3C Candidate Recommendation, 2019-12-12 or newer\n- [pyLoDStorage](https://github.com/WolfgangFahl/pyLoDStorage) – Python library to interchange data between SPARQL, JSON and SQL endpoints\n    - features:\n        - Integration of the [tabulate library](https://pypi.org/project/tabulate/)\n        - QueryManager class for handling named queries\n        - Basic data structure: **l**ists of **d**icts (thus: \"LoD\")\n    - docs: https://wiki.bitplan.com/index.php/PyLoDStorage\n- [PyOBO](https://github.com/pyobo/pyobo)\n  - docs: https://pyobo.readthedocs.io\n  - features:\n    - Provides unified, high-level access to names, descriptions, synonyms, xrefs, hierarchies, properties, relationships, etc. in ontologies from many sources listed in the Bioregistry\n    - Converts databases into OWL and OBO ontologies\n    - Wrapper around ROBOT for using Java tooling to convert between OBO and OWL\n    - Internal DSL for generating OBO ontologies\n- [Pyoxigraph](https://oxigraph.org/pyoxigraph/stable/index.html) – Python graph database library implementing the SPARQL standard.\n    - built on top of [Oxigraph](https://github.com/oxigraph/oxigraph) using [PyO3](https://pyo3.rs/)\n    - docs: https://oxigraph.org/pyoxigraph/stable/index.html\n    - two stores with SPARQL 1.1 capabilities: 
in-memory and disk-based\n- [PyRes](https://github.com/eprover/PyRes)\n    - resolution-based theorem provers for first-order logic\n    - focus on good comprehensibility of the code\n    - Literature: [Teaching Automated Theorem Proving by Example](https://link.springer.com/chapter/10.1007/978-3-030-51054-1_9)\n- [pystardog](https://github.com/stardog-union/pystardog)\n    - Python bindings for the [Stardog Knowledge Graph platform](https://www.stardog.com/)\n- [Quit Store](https://github.com/AKSW/QuitStore) – workspace for distributed collaborative Linked Data knowledge engineering (\"Quads in Git\")\n    - features:\n        - read and write RDF Datasets\n        - create multiple branches of the Dataset\n    - literature references:\n        - [*Decentralized Collaborative Knowledge Management using Git*](https://natanael.arndt.xyz/bib/arndt-n-2018--jws) by Natanael Arndt, Patrick Naumann, Norman Radtke, Michael Martin, and Edgard Marx in Journal of Web Semantics, 2018 [[@sciencedirect](https://www.sciencedirect.com/science/article/pii/S1570826818300416)] [[@arXiv](https://arxiv.org/abs/1805.03721)]\n\n- [RaiseWikibase](https://github.com/UB-Mannheim/RaiseWikibase) – A tool for speeding up multilingual knowledge graph construction with Wikibase\n    - fast inserts into a Wikibase instance: creates up to a million entities and wikitexts per hour\n    - docs: https://ub-mannheim.github.io/RaiseWikibase/\n    - ships with `docker-compose.yml` for Wikibase (database, PHP code)\n    - publication: https://link.springer.com/chapter/10.1007%2F978-3-030-80418-3_11\n- [Reasonable](https://github.com/gtfierro/reasonable) – An OWL 2 RL reasoner with reasonable performance\n    - written in Rust with Python bindings (via [pyo3](https://pyo3.rs/))\n- [ROBOT](https://github.com/ontodev/robot) – Java tool for automating ontology workflows with several reasoners (ELK, HermiT, ...) 
and Python interface\n    - General docs: https://robot.obolibrary.org/\n    - Python interfaces: https://robot.obolibrary.org/python\n    - Docs on reasoning: https://robot.obolibrary.org/reason\n- [rdflib](https://github.com/RDFLib/rdflib) – Python package for working with RDF\n  - docs: https://rdflib.readthedocs.io/\n  - graphical package overview: https://rdflib.dev/\n  - features:\n    - parsers and serializers for RDF/XML, NTriples, Turtle, JSON-LD and more\n    - a graph interface which can be backed by any one of a number of store implementations\n    - store implementations for in-memory storage and persistent storage\n    - a SPARQL 1.1 implementation – supporting SPARQL 1.1 queries and update statements\n- [rdflib-endpoint](https://github.com/vemonet/rdflib-endpoint) – Python package for easily deploying SPARQL endpoints for RDFLib Graphs\n  - features:\n    - exposing machine learning models or any other logic implemented in Python through a SPARQL endpoint, using custom functions\n    - serving local RDF files using the command line interface\n- [serd](https://gitlab.com/drobilla/python-serd) – Python serd module, providing bindings for Serd, a lightweight C library for working with RDF data\n  - docs: https://drobilla.gitlab.io/python-serd/singlehtml/\n- [sparqlfun](https://github.com/linkml/sparqlfun)\n    - LinkML-based SPARQL template library and execution engine\n        - modularized core library of SPARQL templates\n        - fully FAIR description of templates\n        - rich, expressive language for modeling templates\n            - uses [LinkML](https://linkml.io/linkml/) as the base language\n        - optional Python bindings / [object model](https://github.com/linkml/sparqlfun/blob/main/sparqlfun/model.py) using LinkML\n        - supports both SELECT and CONSTRUCT\n        - optional export to TSV, JSON, YAML, RDF\n        - extensive [endpoint metadata](https://github.com/linkml/sparqlfun/tree/main/sparqlfun/config)\n- [SPARQL 
kernel](https://github.com/paulovn/sparql-kernel) for Jupyter\n    - features:\n        - sending queries to a SPARQL endpoint\n        - fetching and presenting the results in a notebook\n- [SPARQLing Unicorn QGIS Plugin](https://github.com/sparqlunicorn/sparqlunicornGoesGIS) – QGIS plugin which adds a GeoJSON layer from SPARQL endpoint queries\n    - docs: https://sparqlunicorn.github.io/sparqlunicornGoesGIS/\n    - QGIS plugin page: https://plugins.qgis.org/plugins/sparqlunicorn/\n    - features:\n        - Querying geospatial vector layers from SPARQL endpoints\n        - Conversion of geoformats (GeoJSON, SHP, KML, GML, etc.) to geospatial RDF\n        - Conversion of RDF geodata (GeoSPARQL-formatted) from one coordinate reference system to another\n        - SHACL validation of geospatial RDF graphs, including validation of geoliteral (WKT, GML) contents\n- [SPARQLWrapper](https://github.com/RDFLib/sparqlwrapper) – A wrapper for a remote SPARQL endpoint\n    - docs: https://sparqlwrapper.readthedocs.io/en/latest/index.html\n    - features:\n        - Creating a query invocation\n        - Optionally converting the result into a more manageable format\n- [WikidataIntegrator](https://github.com/SuLab/WikidataIntegrator) – Library for reading and writing to Wikidata/Wikibase\n    - features:\n        - close integration with the Wikidata SPARQL endpoint\n\n\n## Probably Stalled or Outdated Projects\n\n- [Athene](https://github.com/dityas/Athene) – DL reasoner in pure Python\n    - \"[C]urrent version is a beta and only supports ALC. 
But it can easily be extended by adding tableau rules.\"\n    - Last update: 2017\n- [cwm](https://en.wikipedia.org/wiki/Cwm_(software))\n    - Self-description: \"\\[cwm is a\\] forward chaining semantic reasoner that can be used for querying, checking, transforming and filtering information\".\n    - Created in 2000 by Tim Berners-Lee and Dan Connolly, see [w3.org](https://www.w3.org/2000/10/swap/doc/cwm)\n- [air-reasoner](https://github.com/mit-dig/air-reasoner)\n    - Self-description: \"Reasoner for the AIR policy language, based on cwm\"\n    - based on cwm\n    - Last update: 2013\n- [FuXi](https://pypi.org/project/FuXi/)\n    - Self-description: \"An OWL / N3-based in-memory, logic reasoning system for RDF\"\n    - based on cwm\n    - Last update: 2013\n    - see also \u003chttp://code.google.com/p/python-dlp/wiki/FuXi\u003e \u003chttp://code.google.com/p/fuxi/source/browse/\u003e (hg repo)\n- [pysumo](https://github.com/pySUMO/pysumo)\n    - Ontology IDE for the Suggested Upper Merged Ontology (SUMO)\n    - Docs: https://pysumo.readthedocs.io/\n    - Last update: 2015\n\n\n## Further Projects / Links\n\n- [awesome-ontology](https://github.com/ozekik/awesome-ontology) – A curated list of ontology things (with some Python-related entries)\n- [awesome-semantic-web#python](https://github.com/semantalytics/awesome-semantic-web#python) – Python section of an awesome list for semantic-web-related projects\n- [github-semantic-web-python](https://github.com/topics/semantic-web?l=python) – GitHub project search with `topic=semantic-web` and `language=python`\n- \"Graph Thinking\" – talk by Paco Nathan ([@ceteri](https://github.com/ceteri)) at PyData Global 2021; [slides](https://derwen.ai/s/kcgh#84), [video](https://www.youtube.com/watch?v=bqku2a7ScXg)\n- [Hydra Ecosystem](https://github.com/HTTP-APIs) - Semantically Linked REST APIs\n    - docs: https://www.hydraecosystem.org/\n    - tutorials: the stack has three major layers ([server](https://github.com/HTTP-APIs/hydrus), 
[client](https://github.com/HTTP-APIs/hydra-python-agent), [GUI](https://github.com/HTTP-APIs/hydra-python-agent-gui)); each repo has its own README\n    - features:\n        - deploy a server automatically from API documentation (JSON-LD and W3C Hydra)\n        - client automatically reads the documentation and provides access to endpoints\n        - GUI allows visualization of the network generated by the servers and external resources\n        - a [parser](https://github.com/HTTP-APIs/hydra-openapi-parser) for OpenAPI spec translation\n    - notes:\n        - under development, experimental\n        - part of Google Summer of Code\n- [Pywikibot](https://github.com/wikimedia/pywikibot)\n    - Library to interact with the Wikidata and Wikimedia APIs\n    - see also: https://www.wikidata.org/wiki/Wikidata:Creating_a_bot#Pywikibot\n- [semantic](https://github.com/crm416/semantic) – Python library for extracting semantic information from text, such as dates and numbers\n- [Solving Einstein Puzzle](https://github.com/cknoll/demo-material/blob/main/expertise_system/einstein-zebra-puzzle-owlready-solution1.ipynb) – Jupyter notebook demonstrating how to use owlready2 to solve a logic puzzle\n- [W3C-Link-List1](https://www.w3.org/2001/sw/wiki/SemanticWebTools#Python_Developers) – link list \"SemanticWebTools\", section \"Python_Developers\" (wiki page)\n  - might be outdated\n- [W3C-Link-List2](https://www.w3.org/2001/sw/wiki/Python) – list of tools usable from, or with, Python (wiki page)\n- [wikidata-mayors](https://github.com/njanakiev/wikidata-mayors)\n    - Python code to query Wikidata for European mayors and where they were born\n    - Article: https://towardsdatascience.com/where-do-mayors-come-from-querying-wikidata-with-python-and-sparql-91f3c0af22e2\n- [yamlpyowl](https://github.com/cknoll/yamlpyowl) – read a YAML-specified ontology into Python by means of owlready2 (experimental)\n- [Notebook that generates quiz questions from 
wikidata](https://gist.github.com/ak314/fc6c6f911cb4f39453b575854cdc4869)\n    - [related presentation slides](https://www.slideshare.net/robertoturrin/how-to-turn-wikipedia-into-a-quiz-game)\n","funding_links":[],"categories":["Programming","Related"],"sub_categories":["Python","OWL-aware libraries"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpysemtec%2Fsemantic-python-overview","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpysemtec%2Fsemantic-python-overview","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpysemtec%2Fsemantic-python-overview/lists"}