Dragoman

Rule Interpreter

An interpreter for functional mappings in RML that optimizes their execution by transforming rules containing functions into equivalent function-free ones.

An Optimized Interpreter for RML Functional Mappings!
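The snippet below is only an illustrative sketch of that rewriting idea, not code taken from this repository: it assumes a toy transformation function (`to_upper`) and a hypothetical helper (`materialize_function`) that pre-computes the function over the source rows, so a mapping rule that invoked the function can instead reference the materialized attribute and be executed by a plain, function-free RML engine.

    # Hypothetical illustration only; the names below are not Dragoman's API.
    def to_upper(value):
        """Stands in for a declared transformation function (e.g. an FnO function)."""
        return value.upper()

    def materialize_function(rows, source_attr, target_attr, fn):
        """Evaluate fn once per row and store the result as a new attribute,
        so the mapping rule can reference target_attr instead of calling fn."""
        for row in rows:
            row[target_attr] = fn(row[source_attr])
        return rows

    # Functional rule (informally):  objectMap -> to_upper($.name)
    # Function-free rewrite:         objectMap -> reference "name_upper"
    rows = [{"name": "alice"}, {"name": "bob"}]
    materialize_function(rows, "name", "name_upper", to_upper)
    print(rows)  # [{'name': 'alice', 'name_upper': 'ALICE'}, {'name': 'bob', 'name_upper': 'BOB'}]

Evaluating the function once per source row, rather than every time a rule fires, is the kind of duplicate-work elimination such a rewrite makes possible.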

GitHub

6 stars
5 watching
6 forks
Language: Python
last commit: over 1 year ago
Linked from 1 awesome list


Related projects:

sdm-tib/interpretme (25 stars): An interpretable machine learning pipeline tool that integrates knowledge graphs with machine learning methods to generate insights and human-readable decisions.
datamllab/xdeep (42 stars): Provides tools for interpreting deep neural networks.
bogdanp/django_dramatiq (349 stars): An integration layer for Django to work with Dramatiq's task queue system.
jianbo-lab/l2x (124 stars): A Python framework for learning to interpret models using information-theoretic methods.
mthom/shentong (162 stars): An implementation of a Lisp-family programming language in Haskell.
kanaka/warpy (66 stars): An interpreter and runtime environment for WebAssembly in RPython.
quchen/stgi (527 stars): A visual interpreter for the STG machine that helps explain Haskell's execution model.
phipsgabler/operajonal (6 stars): A JavaScript implementation of an operational-monad style for recursive program interpretation.
sicara/tf-explain (1,018 stars): A library providing interpretability methods for TensorFlow 2.x models.
luogen1996/lavin (508 stars): An open-source implementation of a vision-language instruction-tuned large language model.
digikar99/py4cl2 (41 stars): A Common Lisp library for executing Python code and integrating it with existing Lisp systems.
cofinalsubnets/wisp (115 stars): A Haskell-based interpreted Lisp language with lexical closures and continuations, designed to be easily embedded in other programs.
dr-leo/pandasdmx (127 stars): Provides tools to access and manipulate SDMX-compliant data in various formats.
svenssonjoel/lispbm (90 stars): An interpreter for a concurrent, functional programming language with message passing and pattern matching.
applieddatasciencepartners/xgboostexplainer (252 stars): Provides tools to understand and interpret the decisions made by XGBoost models.