# mmBERT Fact-Check Classifier (Merged for Rust)

This is a merged mmBERT model for fact-check classification, optimized for Rust inference using the candle framework.
## Model Details
- Base Model: jhu-clsp/mmBERT-base
- Task: Binary classification (fact-check needed vs not needed)
- Languages: Multilingual (1800+ languages)
- Training: LoRA fine-tuned then merged with base model
- Inference: Optimized for Rust candle-binding
## Usage with Rust (candle-binding)

```rust
use candle_semantic_router::model_architectures::traditional::TraditionalModernBertClassifier;

// Load the merged model from a local directory.
let classifier = TraditionalModernBertClassifier::load_from_directory("path/to/model", true)?;

// Returns the predicted class ID and a confidence score.
let (class, confidence) = classifier.classify_text("The moon is made of cheese")?;
let needs_fact_check = class == 1;
```
## Classes
| ID | Label |
|---|---|
| 0 | NO_FACT_CHECK_NEEDED |
| 1 | FACT_CHECK_NEEDED |
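The class IDs above can be turned into labels with a small helper. This is a hypothetical sketch (not part of the model repo): the `interpret` function and the 0.5 confidence threshold are assumptions for illustration, not values mandated by the model.

```rust
// Hypothetical helper mapping the classifier's numeric output to its
// label. Predictions below an assumed confidence threshold are
// reported as uncertain rather than acted on.
fn interpret(class: usize, confidence: f32, threshold: f32) -> &'static str {
    if confidence < threshold {
        return "UNCERTAIN";
    }
    match class {
        0 => "NO_FACT_CHECK_NEEDED",
        1 => "FACT_CHECK_NEEDED",
        _ => "UNKNOWN",
    }
}

fn main() {
    // e.g. a confident positive prediction
    println!("{}", interpret(1, 0.97, 0.5)); // FACT_CHECK_NEEDED
    // a low-confidence prediction is flagged instead of trusted
    println!("{}", interpret(0, 0.42, 0.5)); // UNCERTAIN
}
```

Thresholding like this is optional; for routing use cases it lets low-confidence inputs fall through to a default path.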
## Training Configuration
- LoRA Rank: 32
- LoRA Alpha: 64
- Epochs: 10
- Batch Size: 64
- Learning Rate: 2e-5
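Since the adapter was merged into the base weights, there is no separate LoRA file at inference time. The merge follows the standard LoRA formula `W' = W + (alpha / rank) * B * A`; with rank 32 and alpha 64 the update is scaled by 2.0. The sketch below illustrates that arithmetic on toy row-major matrices; the function name and shapes are illustrative, not the model's actual merge code or dimensions.

```rust
// Sketch of merging a LoRA adapter into a base weight matrix:
// W_merged = W + (alpha / rank) * B * A.
// w is d x k, b is d x r, a is r x k, all row-major Vec<f32>.
fn merge_lora(
    w: &[f32], b: &[f32], a: &[f32],
    d: usize, r: usize, k: usize,
    alpha: f32,
) -> Vec<f32> {
    let scale = alpha / r as f32; // 64 / 32 = 2.0 for this model
    let mut merged = w.to_vec();
    for i in 0..d {
        for j in 0..k {
            // delta = (B * A)[i][j]
            let mut delta = 0.0f32;
            for t in 0..r {
                delta += b[i * r + t] * a[t * k + j];
            }
            merged[i * k + j] += scale * delta;
        }
    }
    merged
}
```

After the merge, the classifier loads like any plain mmBERT checkpoint, which is what makes the candle path straightforward.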
## License
Apache 2.0