LogicCheckGPT
An open-source framework for detecting and mitigating object hallucinations in large vision-language models by analyzing logical consistency.
[ACL 2024] Logical Closed Loop: Uncovering Object Hallucinations in Large Vision-Language Models. Detects and mitigates object hallucinations in an LVLM using the model itself, via logical closed loops.
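The paper's pipeline has several stages (object extraction, question raising, consistency checking, mitigation), but the core closed-loop idea can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the repository's actual API: `closed_loop_check`, the `query_lvlm` hook, and the color-based probe questions are all hypothetical stand-ins.

```python
"""Hypothetical sketch of a logical closed-loop consistency check."""
from typing import Callable, List


def closed_loop_check(
    objects: List[str],
    query_lvlm: Callable[[str], str],
) -> List[str]:
    """Return the objects whose question loop fails to close.

    For each object the LVLM claimed to see, ask a forward question
    (object -> attribute) and a backward question (attribute -> object),
    then treat the object as a likely hallucination if the backward
    answer never returns to it.
    """
    suspects: List[str] = []
    for obj in objects:
        # Forward: probe an attribute of the claimed object.
        attribute = query_lvlm(f"What color is the {obj} in the image?")
        # Backward: ask which object carries that attribute.
        answer = query_lvlm(f"Which object in the image is {attribute}?")
        # The loop "closes" only if the answer mentions the object again.
        if obj.lower() not in answer.lower():
            suspects.append(obj)
    return suspects


if __name__ == "__main__":
    # Toy stand-in for an LVLM: answers come from a fixed script.
    script = {
        "What color is the dog in the image?": "brown",
        "Which object in the image is brown?": "the dog",
        "What color is the frisbee in the image?": "red",
        "Which object in the image is red?": "the dog's collar",
    }
    print(closed_loop_check(["dog", "frisbee"], lambda q: script.get(q, "")))
    # -> ['frisbee']: the backward answer never returns to the frisbee,
    # so the loop does not close and the object is flagged.
```

In the paper the flagged objects then drive mitigation, e.g. rewriting the response to drop unsupported objects; the sketch stops at detection.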
19 stars
4 watching
1 fork
Language: Python
Last commit: 8 months ago

Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A framework to detect and mitigate hallucinations in multimodal large language models | 48 |
| | A tool for writing and validating temporal logic specifications of software behavior | 174 |
| | Automates fine-grained hallucination detection in large language model outputs | 325 |
| | A method to correct hallucinations in multimodal large language models without requiring retraining | 617 |
| | A tool to measure and enforce the cognitive complexity of functions in Python code | 69 |
| | A tool to check Python code quality in Jupyter notebooks | 28 |
| | Evaluates answers generated by large vision-language models to assess hallucinations | 27 |
| | A tool to check Python docstrings for consistency with function signatures or implementations | 159 |
| | Provides tools and utilities for generating and verifying proofs in a zkSNARK proof system | 159 |
| | A module that provides fast collision checking for arbitrary 3D shapes | 64 |
| | Detects potential performance issues in Go code caused by nested contexts in loops or function literals | 13 |
| | An approach to reduce object hallucinations in large vision-language models by contrasting output distributions derived from original and distorted visual inputs | 222 |
| | A CLI tool that checks for dead hyperlinks in files across multiple markup languages | 70 |
| | A tool for detecting common bugs in binary executables | 1,155 |
| | Improves the performance of large language models by intervening in their internal workings to reduce hallucinations | 83 |