Show HN: Omni-NLI – A multi-interface server for natural language inference

  • Posted 7 hours ago by habedi0
  • 1 point
Hi everyone,

I've made an open-source tool called Omni-NLI for natural language inference (NLI). Given two pieces of text, it can use different models to check whether one (the premise) supports the other (the hypothesis). The main applications of a tool like this are soft fact-checking and consistency checking between pieces of text, such as sentences.
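To make the premise/hypothesis terminology concrete, here is a small self-contained sketch of the three standard NLI labels. The example pairs and labels below are illustrative and hand-written, not output from Omni-NLI or any model:

```python
# Standard NLI assigns one of three labels to a (premise, hypothesis) pair.
# These pairs and labels are illustrative, not model output.
examples = [
    # Premise supports the hypothesis.
    ("The meeting was moved to Friday.",
     "The meeting is on Friday.", "entailment"),
    # Premise and hypothesis cannot both be true.
    ("The meeting was moved to Friday.",
     "The meeting is on Monday.", "contradiction"),
    # Premise neither supports nor refutes the hypothesis.
    ("The meeting was moved to Friday.",
     "The meeting is in room 4.", "neutral"),
]

for premise, hypothesis, label in examples:
    print(f"{label}: {premise!r} vs {hypothesis!r}")
```

A consistency check between two sentences is just this classification applied to each pair, with "contradiction" flagging an inconsistency.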

Currently, Omni-NLI has the following features:

    - Can be installed as a Python package with `pip install "omni-nli[huggingface]"` (the quotes keep the extras spec intact in shells like zsh).

    - Can be used on your own computer, so your data stays local and private.

    - Has an MCP interface (for agents) and a REST API for conventional use as a microservice.

    - Supports using models from different sources (Ollama, OpenRouter, and HuggingFace).

    - Can be used to check whether a model appears to contradict itself.

    - Supports showing the model's reasoning, so you can see why it judged a claim to be wrong.
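To give a feel for the microservice use case, here is a hedged sketch of calling a locally running instance over HTTP. The endpoint path, payload fields, and response shape below are my assumptions for illustration, not the project's actual REST API; see the documentation linked below for the real interface:

```python
import json
import urllib.request

# Everything here is a sketch: the URL, request fields, and response
# shape are assumptions, not taken from Omni-NLI's documentation.
NLI_URL = "http://localhost:8000/nli"  # hypothetical local endpoint

def check_entailment(premise: str, hypothesis: str) -> dict:
    """POST a premise/hypothesis pair to a locally running NLI server."""
    payload = json.dumps({"premise": premise, "hypothesis": hypothesis}).encode()
    req = urllib.request.Request(
        NLI_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response shape, e.g. {"label": "contradiction", ...}
        return json.load(resp)

# An example pair you might check. The call itself needs a running
# server, so it is left commented out here.
premise = "The tool runs locally and keeps all data on your machine."
hypothesis = "User data is uploaded to a third-party service."
# print(check_entailment(premise, hypothesis))
```

Since the data never has to leave your machine, this pattern fits the local/private deployment mentioned above.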

In any case, if you'd like to know more, the links below have the details:

Project's GitHub repo: https://github.com/CogitatorTech/omni-nli

Project's documentation: https://cogitatortech.github.io/omni-nli/
