PyLate

This is a PyLate model trained on the train dataset. It maps sentences and paragraphs to sequences of 128-dimensional dense vectors (one vector per token) and can be used for semantic textual similarity via the MaxSim operator.

Model Details

Model Description

  • Model Type: PyLate model
  • Document Length: 512 tokens
  • Query Length: 32 tokens
  • Output Dimensionality: 128 dimensions per token
  • Similarity Function: MaxSim
  • Training Dataset: train
  • Language: en

Model Sources

Full Model Architecture

ColBERT(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity', 'use_residual': False})
)
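
To make the output format and similarity function concrete, here is a minimal sketch of how MaxSim scores a query against a document. The query text, document text, and the conversion of encode outputs to tensors are illustrative assumptions; pylate_model_id is a placeholder for this model's id:

import torch

from pylate import models

model = models.ColBERT(model_name_or_path="pylate_model_id")

# encode returns one per-token embedding matrix per input text
query_embeddings = model.encode(["what is late interaction?"], is_query=True)
document_embeddings = model.encode(
    ["Late interaction scores queries against documents token by token."],
    is_query=False,
)

q = torch.as_tensor(query_embeddings[0])     # shape: (n_query_tokens, 128)
d = torch.as_tensor(document_embeddings[0])  # shape: (n_doc_tokens, 128)

# MaxSim: match every query token to its best document token, then sum.
score = (q @ d.T).max(dim=1).values.sum()
print(q.shape, d.shape, float(score))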

Usage

First install the PyLate library:

pip install -U pylate

Retrieval

Use this model with PyLate to index and retrieve documents. The index uses FastPLAID for efficient similarity search.

Indexing documents

Load the ColBERT model and initialize the PLAID index, then encode and index your documents:

from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
model = models.ColBERT(
    model_name_or_path="pylate_model_id",
)

# Step 2: Initialize the PLAID index
index = indexes.PLAID(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)

Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:

# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.PLAID(
    index_folder="pylate-index",
    index_name="index",
)

Retrieving top-k documents for queries

Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the best matches:

# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
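
A short sketch of reading the results follows; it assumes each match is returned as a dict carrying the document id and its relevance score:

# Each query gets back a ranked list of matches.
for query_matches in scores:
    for match in query_matches:
        print(match["id"], match["score"])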

Reranking

If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the rank.rerank function and pass the queries and documents to rerank:

from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path="pylate_model_id",
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
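
As with retrieval, rerank returns one ranked list per query. A minimal sketch of printing it, assuming each entry exposes the document id and its MaxSim score:

for query, ranking in zip(queries, reranked_documents):
    print(query)
    for entry in ranking:
        print("  ", entry["id"], entry["score"])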

Evaluation

Metrics

PyLate Information Retrieval

  • Dataset: ['NanoClimateFEVER', 'NanoDBPedia', 'NanoFEVER', 'NanoFiQA2018', 'NanoHotpotQA', 'NanoMSMARCO', 'NanoNFCorpus', 'NanoNQ', 'NanoQuoraRetrieval', 'NanoSCIDOCS', 'NanoArguAna', 'NanoSciFact', 'NanoTouche2020']
  • Evaluated with pylate.evaluation.pylate_information_retrieval_evaluator.PyLateInformationRetrievalEvaluator
| Metric | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| MaxSim_accuracy@1 | 0.26 | 0.78 | 0.88 | 0.5 | 0.92 | 0.56 | 0.46 | 0.48 | 0.92 | 0.46 | 0.2 | 0.7 | 0.8163 |
| MaxSim_accuracy@3 | 0.56 | 0.9 | 0.98 | 0.7 | 0.98 | 0.72 | 0.58 | 0.72 | 1.0 | 0.66 | 0.56 | 0.78 | 0.9592 |
| MaxSim_accuracy@5 | 0.64 | 0.9 | 1.0 | 0.74 | 1.0 | 0.76 | 0.6 | 0.8 | 1.0 | 0.76 | 0.64 | 0.8 | 0.9592 |
| MaxSim_accuracy@10 | 0.78 | 0.94 | 1.0 | 0.8 | 1.0 | 0.88 | 0.66 | 0.9 | 1.0 | 0.86 | 0.82 | 0.88 | 0.9796 |
| MaxSim_precision@1 | 0.26 | 0.78 | 0.88 | 0.5 | 0.92 | 0.56 | 0.46 | 0.48 | 0.92 | 0.46 | 0.2 | 0.7 | 0.8163 |
| MaxSim_precision@3 | 0.1933 | 0.66 | 0.3467 | 0.32 | 0.52 | 0.24 | 0.38 | 0.2467 | 0.4067 | 0.34 | 0.1867 | 0.28 | 0.7075 |
| MaxSim_precision@5 | 0.16 | 0.616 | 0.212 | 0.24 | 0.34 | 0.152 | 0.32 | 0.168 | 0.26 | 0.256 | 0.128 | 0.18 | 0.6204 |
| MaxSim_precision@10 | 0.1 | 0.55 | 0.108 | 0.142 | 0.174 | 0.088 | 0.272 | 0.094 | 0.136 | 0.19 | 0.082 | 0.098 | 0.5102 |
| MaxSim_recall@1 | 0.1383 | 0.0908 | 0.8167 | 0.3094 | 0.46 | 0.56 | 0.0432 | 0.46 | 0.8073 | 0.0967 | 0.2 | 0.675 | 0.0552 |
| MaxSim_recall@3 | 0.25 | 0.169 | 0.9433 | 0.4745 | 0.78 | 0.72 | 0.08 | 0.68 | 0.9587 | 0.2107 | 0.56 | 0.765 | 0.1474 |
| MaxSim_recall@5 | 0.3217 | 0.2405 | 0.9633 | 0.5476 | 0.85 | 0.76 | 0.0945 | 0.77 | 0.9827 | 0.2627 | 0.64 | 0.8 | 0.2144 |
| MaxSim_recall@10 | 0.3907 | 0.3687 | 0.97 | 0.6148 | 0.87 | 0.88 | 0.1209 | 0.85 | 0.9933 | 0.3877 | 0.82 | 0.87 | 0.3318 |
| MaxSim_ndcg@10 | 0.3204 | 0.6662 | 0.9252 | 0.545 | 0.839 | 0.7162 | 0.3347 | 0.6686 | 0.9579 | 0.3717 | 0.495 | 0.7789 | 0.5946 |
| MaxSim_mrr@10 | 0.4228 | 0.8415 | 0.935 | 0.6069 | 0.9507 | 0.665 | 0.526 | 0.6211 | 0.96 | 0.5761 | 0.3926 | 0.7514 | 0.8878 |
| MaxSim_map@100 | 0.2494 | 0.5146 | 0.9004 | 0.4785 | 0.775 | 0.6725 | 0.1468 | 0.61 | 0.9378 | 0.2775 | 0.3971 | 0.7535 | 0.4401 |

NanoBEIR

  • Dataset: NanoBEIR_mean
  • Evaluated with pylate.evaluation.nano_beir_evaluator.NanoBEIREvaluator
| Metric | Value |
|---|---|
| MaxSim_accuracy@1 | 0.6105 |
| MaxSim_accuracy@3 | 0.7769 |
| MaxSim_accuracy@5 | 0.8153 |
| MaxSim_accuracy@10 | 0.8846 |
| MaxSim_precision@1 | 0.6105 |
| MaxSim_precision@3 | 0.3713 |
| MaxSim_precision@5 | 0.281 |
| MaxSim_precision@10 | 0.1957 |
| MaxSim_recall@1 | 0.3625 |
| MaxSim_recall@3 | 0.5183 |
| MaxSim_recall@5 | 0.5729 |
| MaxSim_recall@10 | 0.6514 |
| MaxSim_ndcg@10 | 0.6318 |
| MaxSim_mrr@10 | 0.7028 |
| MaxSim_map@100 | 0.5502 |
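
The aggregate numbers above come from the evaluator listed in the bullet points. A hedged sketch of rerunning it; the constructor and call signature are assumed to mirror the sentence-transformers NanoBEIREvaluator it is modeled on:

from pylate import models
from pylate.evaluation.nano_beir_evaluator import NanoBEIREvaluator

model = models.ColBERT(model_name_or_path="pylate_model_id")
evaluator = NanoBEIREvaluator()  # assumption: defaults cover the NanoBEIR datasets
results = evaluator(model)
print(results)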

Training Details

Training Dataset

train

  • Dataset: train at 1a1ffe7
  • Size: 640,000 training samples
  • Columns: query_id, document_ids, and scores
  • Approximate statistics based on the first 1000 samples:
    Column types: query_id (int), document_ids (list), scores (list)
    query_id details (each distinct ID accounts for ~0.10% of the first 1000 samples):
    • 836: ~0.10%
    • 3582: ~0.10%
    • 4599: ~0.10%
    • 4645: ~0.10%
    • 4853: ~0.10%
    • 5154: ~0.10%
    • 7504: ~0.10%
    • 12283: ~0.10%
    • 12335: ~0.10%
    • 12916: ~0.10%
    • 14049: ~0.10%
    • 14828: ~0.10%
    • 15674: ~0.10%
    • 15813: ~0.10%
    • 16728: ~0.10%
    • 22006: ~0.10%
    • 23675: ~0.10%
    • 24199: ~0.10%
    • 25323: ~0.10%
    • 28517: ~0.10%
    • 29213: ~0.10%
    • 32344: ~0.10%
    • 34071: ~0.10%
    • 34604: ~0.10%
    • 35424: ~0.10%
    • 35445: ~0.10%
    • 36148: ~0.10%
    • 37078: ~0.10%
    • 37826: ~0.10%
    • 38185: ~0.10%
    • 40855: ~0.10%
    • 42077: ~0.10%
    • 43614: ~0.10%
    • 45073: ~0.10%
    • 46289: ~0.10%
    • 47507: ~0.10%
    • 48005: ~0.10%
    • 48629: ~0.10%
    • 48785: ~0.10%
    • 49216: ~0.10%
    • 49636: ~0.10%
    • 49970: ~0.10%
    • 52075: ~0.10%
    • 52725: ~0.10%
    • 54142: ~0.10%
    • 54210: ~0.10%
    • 55032: ~0.10%
    • 59546: ~0.10%
    • 60087: ~0.10%
    • 60862: ~0.10%
    • 60941: ~0.10%
    • 61037: ~0.10%
    • 61762: ~0.10%
    • 62649: ~0.10%
    • 63333: ~0.10%
    • 64197: ~0.10%
    • 64879: ~0.10%
    • 67608: ~0.10%
    • 67627: ~0.10%
    • 69463: ~0.10%
    • 70002: ~0.10%
    • 70429: ~0.10%
    • 72166: ~0.10%
    • 72518: ~0.10%
    • 72607: ~0.10%
    • 72791: ~0.10%
    • 73325: ~0.10%
    • 74078: ~0.10%
    • 74857: ~0.10%
    • 75323: ~0.10%
    • 75816: ~0.10%
    • 76929: ~0.10%
    • 77845: ~0.10%
    • 77889: ~0.10%
    • 78077: ~0.10%
    • 78256: ~0.10%
    • 78401: ~0.10%
    • 78798: ~0.10%
    • 80871: ~0.10%
    • 81089: ~0.10%
    • 82179: ~0.10%
    • 82883: ~0.10%
    • 84168: ~0.10%
    • 86891: ~0.10%
    • 88282: ~0.10%
    • 89346: ~0.10%
    • 89386: ~0.10%
    • 90699: ~0.10%
    • 90795: ~0.10%
    • 91367: ~0.10%
    • 91795: ~0.10%
    • 92070: ~0.10%
    • 92523: ~0.10%
    • 92597: ~0.10%
    • 92753: ~0.10%
    • 92787: ~0.10%
    • 96382: ~0.10%
    • 96455: ~0.10%
    • 97274: ~0.10%
    • 97603: ~0.10%
    • 100904: ~0.10%
    • 101205: ~0.10%
    • 101305: ~0.10%
    • 102707: ~0.10%
    • 103074: ~0.10%
    • 105437: ~0.10%
    • 108207: ~0.10%
    • 109776: ~0.10%
    • 112056: ~0.10%
    • 112955: ~0.10%
    • 112977: ~0.10%
    • 113635: ~0.10%
    • 115280: ~0.10%
    • 115551: ~0.10%
    • 116098: ~0.10%
    • 117658: ~0.10%
    • 120255: ~0.10%
    • 120298: ~0.10%
    • 121437: ~0.10%
    • 123429: ~0.10%
    • 125043: ~0.10%
    • 125979: ~0.10%
    • 126851: ~0.10%
    • 128218: ~0.10%
    • 128804: ~0.10%
    • 129598: ~0.10%
    • 131299: ~0.10%
    • 132114: ~0.10%
    • 133696: ~0.10%
    • 134460: ~0.10%
    • 137602: ~0.10%
    • 137679: ~0.10%
    • 138121: ~0.10%
    • 138260: ~0.10%
    • 138823: ~0.10%
    • 139039: ~0.10%
    • 140392: ~0.10%
    • 140651: ~0.10%
    • 142305: ~0.10%
    • 145653: ~0.10%
    • 145683: ~0.10%
    • 145763: ~0.10%
    • 150202: ~0.10%
    • 151135: ~0.10%
    • 152307: ~0.10%
    • 152675: ~0.10%
    • 153693: ~0.10%
    • 154470: ~0.10%
    • 155587: ~0.10%
    • 157602: ~0.10%
    • 157779: ~0.10%
    • 158565: ~0.10%
    • 159177: ~0.10%
    • 159224: ~0.10%
    • 159341: ~0.10%
    • 159892: ~0.10%
    • 161881: ~0.10%
    • 162414: ~0.10%
    • 163765: ~0.10%
    • 165888: ~0.10%
    • 168048: ~0.10%
    • 168425: ~0.10%
    • 168894: ~0.10%
    • 169991: ~0.10%
    • 170731: ~0.10%
    • 171705: ~0.10%
    • 176165: ~0.10%
    • 176798: ~0.10%
    • 180259: ~0.10%
    • 181243: ~0.10%
    • 182102: ~0.10%
    • 182660: ~0.10%
    • 183426: ~0.10%
    • 183930: ~0.10%
    • 184045: ~0.10%
    • 184676: ~0.10%
    • 185294: ~0.10%
    • 186475: ~0.10%
    • 187155: ~0.10%
    • 188198: ~0.10%
    • 191383: ~0.10%
    • 192165: ~0.10%
    • 193507: ~0.10%
    • 194207: ~0.10%
    • 195056: ~0.10%
    • 197377: ~0.10%
    • 198224: ~0.10%
    • 198546: ~0.10%
    • 202122: ~0.10%
    • 203519: ~0.10%
    • 206220: ~0.10%
    • 209739: ~0.10%
    • 210554: ~0.10%
    • 212638: ~0.10%
    • 213096: ~0.10%
    • 213410: ~0.10%
    • 214255: ~0.10%
    • 217541: ~0.10%
    • 219718: ~0.10%
    • 220993: ~0.10%
    • 223241: ~0.10%
    • 224657: ~0.10%
    • 227101: ~0.10%
    • 227497: ~0.10%
    • 227726: ~0.10%
    • 228099: ~0.10%
    • 228451: ~0.10%
    • 230413: ~0.10%
    • 231416: ~0.10%
    • 233312: ~0.10%
    • 234348: ~0.10%
    • 235869: ~0.10%
    • 237784: ~0.10%
    • 240739: ~0.10%
    • 246495: ~0.10%
    • 246821: ~0.10%
    • 248675: ~0.10%
    • 249798: ~0.10%
    • 249962: ~0.10%
    • 249977: ~0.10%
    • 250019: ~0.10%
    • 250548: ~0.10%
    • 251089: ~0.10%
    • 254878: ~0.10%
    • 255183: ~0.10%
    • 255727: ~0.10%
    • 256321: ~0.10%
    • 258276: ~0.10%
    • 260993: ~0.10%
    • 261247: ~0.10%
    • 262123: ~0.10%
    • 262508: ~0.10%
    • 266047: ~0.10%
    • 267089: ~0.10%
    • 267192: ~0.10%
    • 268642: ~0.10%
    • 269025: ~0.10%
    • 273171: ~0.10%
    • 273864: ~0.10%
    • 274521: ~0.10%
    • 274586: ~0.10%
    • 275037: ~0.10%
    • 275643: ~0.10%
    • 276744: ~0.10%
    • 277212: ~0.10%
    • 277990: ~0.10%
    • 279931: ~0.10%
    • 280012: ~0.10%
    • 281699: ~0.10%
    • 282128: ~0.10%
    • 283298: ~0.10%
    • 284268: ~0.10%
    • 285697: ~0.10%
    • 285905: ~0.10%
    • 287456: ~0.10%
    • 287506: ~0.10%
    • 288154: ~0.10%
    • 289046: ~0.10%
    • 292211: ~0.10%
    • 292588: ~0.10%
    • 293357: ~0.10%
    • 293661: ~0.10%
    • 294123: ~0.10%
    • 299287: ~0.10%
    • 300622: ~0.10%
    • 302135: ~0.10%
    • 303224: ~0.10%
    • 304353: ~0.10%
    • 304820: ~0.10%
    • 310215: ~0.10%
    • 310236: ~0.10%
    • 310409: ~0.10%
    • 311231: ~0.10%
    • 312821: ~0.10%
    • 314244: ~0.10%
    • 314415: ~0.10%
    • 314745: ~0.10%
    • 316385: ~0.10%
    • 316883: ~0.10%
    • 317442: ~0.10%
    • 318639: ~0.10%
    • 318652: ~0.10%
    • 320855: ~0.10%
    • 321867: ~0.10%
    • 322114: ~0.10%
    • 323196: ~0.10%
    • 324868: ~0.10%
    • 327581: ~0.10%
    • 329337: ~0.10%
    • 331572: ~0.10%
    • 331650: ~0.10%
    • 331993: ~0.10%
    • 332500: ~0.10%
    • 334757: ~0.10%
    • 336561: ~0.10%
    • 336791: ~0.10%
    • 337002: ~0.10%
    • 338332: ~0.10%
    • 338456: ~0.10%
    • 339065: ~0.10%
    • 339870: ~0.10%
    • 340599: ~0.10%
    • 341156: ~0.10%
    • 342121: ~0.10%
    • 342381: ~0.10%
    • 343411: ~0.10%
    • 344860: ~0.10%
    • 345924: ~0.10%
    • 346421: ~0.10%
    • 346425: ~0.10%
    • 348157: ~0.10%
    • 351281: ~0.10%
    • 351858: ~0.10%
    • 352641: ~0.10%
    • 353748: ~0.10%
    • 357399: ~0.10%
    • 359787: ~0.10%
    • 359893: ~0.10%
    • 360094: ~0.10%
    • 360168: ~0.10%
    • 361127: ~0.10%
    • 362220: ~0.10%
    • 362560: ~0.10%
    • 366835: ~0.10%
    • 367185: ~0.10%
    • 369045: ~0.10%
    • 371113: ~0.10%
    • 376044: ~0.10%
    • 376524: ~0.10%
    • 377231: ~0.10%
    • 377735: ~0.10%
    • 378574: ~0.10%
    • 379749: ~0.10%
    • 379953: ~0.10%
    • 381834: ~0.10%
    • 384039: ~0.10%
    • 384364: ~0.10%
    • 384398: ~0.10%
    • 384751: ~0.10%
    • 385758: ~0.10%
    • 385893: ~0.10%
    • 386098: ~0.10%
    • 387205: ~0.10%
    • 387374: ~0.10%
    • 388450: ~0.10%
    • 388589: ~0.10%
    • 388593: ~0.10%
    • 389571: ~0.10%
    • 389572: ~0.10%
    • 391531: ~0.10%
    • 391857: ~0.10%
    • 393174: ~0.10%
    • 393426: ~0.10%
    • 396601: ~0.10%
    • 396905: ~0.10%
    • 397801: ~0.10%
    • 398011: ~0.10%
    • 398132: ~0.10%
    • 398721: ~0.10%
    • 399016: ~0.10%
    • 401601: ~0.10%
    • 403876: ~0.10%
    • 403897: ~0.10%
    • 404830: ~0.10%
    • 406102: ~0.10%
    • 406397: ~0.10%
    • 407151: ~0.10%
    • 409373: ~0.10%
    • 410084: ~0.10%
    • 410859: ~0.10%
    • 411693: ~0.10%
    • 411984: ~0.10%
    • 412214: ~0.10%
    • 412560: ~0.10%
    • 413117: ~0.10%
    • 416391: ~0.10%
    • 417066: ~0.10%
    • 417198: ~0.10%
    • 417751: ~0.10%
    • 417778: ~0.10%
    • 420257: ~0.10%
    • 420787: ~0.10%
    • 421001: ~0.10%
    • 421045: ~0.10%
    • 421354: ~0.10%
    • 428114: ~0.10%
    • 429057: ~0.10%
    • 429459: ~0.10%
    • 430319: ~0.10%
    • 431215: ~0.10%
    • 431332: ~0.10%
    • 431488: ~0.10%
    • 432097: ~0.10%
    • 432283: ~0.10%
    • 434131: ~0.10%
    • 434934: ~0.10%
    • 435353: ~0.10%
    • 437793: ~0.10%
    • 438297: ~0.10%
    • 438806: ~0.10%
    • 439016: ~0.10%
    • 439129: ~0.10%
    • 439217: ~0.10%
    • 439755: ~0.10%
    • 440343: ~0.10%
    • 440506: ~0.10%
    • 441030: ~0.10%
    • 441509: ~0.10%
    • 443408: ~0.10%
    • 443686: ~0.10%
    • 445516: ~0.10%
    • 445999: ~0.10%
    • 447039: ~0.10%
    • 447219: ~0.10%
    • 447298: ~0.10%
    • 453040: ~0.10%
    • 453745: ~0.10%
    • 454869: ~0.10%
    • 456224: ~0.10%
    • 456251: ~0.10%
    • 457065: ~0.10%
    • 459890: ~0.10%
    • 460010: ~0.10%
    • 463716: ~0.10%
    • 465235: ~0.10%
    • 470470: ~0.10%
    • 471875: ~0.10%
    • 472462: ~0.10%
    • 474016: ~0.10%
    • 479266: ~0.10%
    • 479360: ~0.10%
    • 480621: ~0.10%
    • 483014: ~0.10%
    • 484553: ~0.10%
    • 485031: ~0.10%
    • 485828: ~0.10%
    • 486664: ~0.10%
    • 488266: ~0.10%
    • 489488: ~0.10%
    • 490992: ~0.10%
    • 491894: ~0.10%
    • 491983: ~0.10%
    • 492620: ~0.10%
    • 493035: ~0.10%
    • 493461: ~0.10%
    • 494255: ~0.10%
    • 496473: ~0.10%
    • 496474: ~0.10%
    • 496516: ~0.10%
    • 496813: ~0.10%
    • 496853: ~0.10%
    • 499553: ~0.10%
    • 499565: ~0.10%
    • 499737: ~0.10%
    • 500057: ~0.10%
    • 500546: ~0.10%
    • 501510: ~0.10%
    • 501978: ~0.10%
    • 503905: ~0.10%
    • 510559: ~0.10%
    • 511473: ~0.10%
    • 512440: ~0.10%
    • 513832: ~0.10%
    • 514106: ~0.10%
    • 514902: ~0.10%
    • 515053: ~0.10%
    • 515507: ~0.10%
    • 516205: ~0.10%
    • 517903: ~0.10%
    • 518096: ~0.10%
    • 520796: ~0.10%
    • 521570: ~0.10%
    • 522112: ~0.10%
    • 523814: ~0.10%
    • 525505: ~0.10%
    • 525583: ~0.10%
    • 525764: ~0.10%
    • 528105: ~0.10%
    • 530985: ~0.10%
    • 532014: ~0.10%
    • 534952: ~0.10%
    • 538836: ~0.10%
    • 539326: ~0.10%
    • 539504: ~0.10%
    • 541861: ~0.10%
    • 542925: ~0.10%
    • 543525: ~0.10%
    • 544853: ~0.10%
    • 545091: ~0.10%
    • 546527: ~0.10%
    • 546753: ~0.10%
    • 548007: ~0.10%
    • 548100: ~0.10%
    • 554548: ~0.10%
    • 555064: ~0.10%
    • 560255: ~0.10%
    • 560711: ~0.10%
    • 561084: ~0.10%
    • 561114: ~0.10%
    • 561329: ~0.10%
    • 561838: ~0.10%
    • 561946: ~0.10%
    • 564894: ~0.10%
    • 566884: ~0.10%
    • 568110: ~0.10%
    • 569541: ~0.10%
    • 570881: ~0.10%
    • 571286: ~0.10%
    • 571515: ~0.10%
    • 571577: ~0.10%
    • 572354: ~0.10%
    • 573015: ~0.10%
    • 573283: ~0.10%
    • 577767: ~0.10%
    • 578249: ~0.10%
    • 578805: ~0.10%
    • 580872: ~0.10%
    • 581072: ~0.10%
    • 581684: ~0.10%
    • 582341: ~0.10%
    • 583169: ~0.10%
    • 583322: ~0.10%
    • 583889: ~0.10%
    • 584173: ~0.10%
    • 585406: ~0.10%
    • 585523: ~0.10%
    • 585660: ~0.10%
    • 587005: ~0.10%
    • 587399: ~0.10%
    • 588010: ~0.10%
    • 588337: ~0.10%
    • 590946: ~0.10%
    • 593319: ~0.10%
    • 595246: ~0.10%
    • 597157: ~0.10%
    • 597215: ~0.10%
    • 597368: ~0.10%
    • 597453: ~0.10%
    • 598538: ~0.10%
    • 601120: ~0.10%
    • 604762: ~0.10%
    • 605111: ~0.10%
    • 605547: ~0.10%
    • 606244: ~0.10%
    • 606935: ~0.10%
    • 607099: ~0.10%
    • 609731: ~0.10%
    • 609910: ~0.10%
    • 610485: ~0.10%
    • 613040: ~0.10%
    • 614720: ~0.10%
    • 615525: ~0.10%
    • 616416: ~0.10%
    • 618280: ~0.10%
    • 619151: ~0.10%
    • 619170: ~0.10%
    • 622593: ~0.10%
    • 622755: ~0.10%
    • 623529: ~0.10%
    • 625333: ~0.10%
    • 625780: ~0.10%
    • 626317: ~0.10%
    • 626670: ~0.10%
    • 628299: ~0.10%
    • 628510: ~0.10%
    • 629166: ~0.10%
    • 630995: ~0.10%
    • 632641: ~0.10%
    • 634324: ~0.10%
    • 634750: ~0.10%
    • 636542: ~0.10%
    • 637420: ~0.10%
    • 641046: ~0.10%
    • 643232: ~0.10%
    • 643901: ~0.10%
    • 644517: ~0.10%
    • 645962: ~0.10%
    • 647293: ~0.10%
    • 647443: ~0.10%
    • 648173: ~0.10%
    • 649204: ~0.10%
    • 650521: ~0.10%
    • 651961: ~0.10%
    • 652493: ~0.10%
    • 655888: ~0.10%
    • 656535: ~0.10%
    • 658715: ~0.10%
    • 659035: ~0.10%
    • 659593: ~0.10%
    • 660535: ~0.10%
    • 662154: ~0.10%
    • 662784: ~0.10%
    • 663142: ~0.10%
    • 666319: ~0.10%
    • 666386: ~0.10%
    • 666561: ~0.10%
    • 668151: ~0.10%
    • 668862: ~0.10%
    • 670341: ~0.10%
    • 671801: ~0.10%
    • 673081: ~0.10%
    • 673634: ~0.10%
    • 673875: ~0.10%
    • 673881: ~0.10%
    • 674082: ~0.10%
    • 675319: ~0.10%
    • 675492: ~0.10%
    • 676147: ~0.10%
    • 676238: ~0.10%
    • 676318: ~0.10%
    • 676431: ~0.10%
    • 677459: ~0.10%
    • 678468: ~0.10%
    • 679216: ~0.10%
    • 679307: ~0.10%
    • 680354: ~0.10%
    • 681098: ~0.10%
    • 681873: ~0.10%
    • 684800: ~0.10%
    • 685613: ~0.10%
    • 685690: ~0.10%
    • 686886: ~0.10%
    • 689687: ~0.10%
    • 689748: ~0.10%
    • 694425: ~0.10%
    • 694466: ~0.10%
    • 698130: ~0.10%
    • 702137: ~0.10%
    • 703138: ~0.10%
    • 704067: ~0.10%
    • 704460: ~0.10%
    • 705420: ~0.10%
    • 706199: ~0.10%
    • 706878: ~0.10%
    • 708333: ~0.10%
    • 710580: ~0.10%
    • 710897: ~0.10%
    • 713539: ~0.10%
    • 713584: ~0.10%
    • 714733: ~0.10%
    • 718172: ~0.10%
    • 719545: ~0.10%
    • 719580: ~0.10%
    • 720471: ~0.10%
    • 720690: ~0.10%
    • 722394: ~0.10%
    • 723568: ~0.10%
    • 724334: ~0.10%
    • 724700: ~0.10%
    • 727908: ~0.10%
    • 728088: ~0.10%
    • 729096: ~0.10%
    • 730499: ~0.10%
    • 730711: ~0.10%
    • 733963: ~0.10%
    • 734912: ~0.10%
    • 736431: ~0.10%
    • 738012: ~0.10%
    • 738173: ~0.10%
    • 739026: ~0.10%
    • 739605: ~0.10%
    • 740181: ~0.10%
    • 742066: ~0.10%
    • 742298: ~0.10%
    • 745799: ~0.10%
    • 748392: ~0.10%
    • 748838: ~0.10%
    • 749148: ~0.10%
    • 751762: ~0.10%
    • 752092: ~0.10%
    • 752527: ~0.10%
    • 753568: ~0.10%
    • 755386: ~0.10%
    • 756558: ~0.10%
    • 756736: ~0.10%
    • 758706: ~0.10%
    • 759523: ~0.10%
    • 760550: ~0.10%
    • 762688: ~0.10%
    • 762918: ~0.10%
    • 763569: ~0.10%
    • 763766: ~0.10%
    • 765769: ~0.10%
    • 766789: ~0.10%
    • 768119: ~0.10%
    • 768537: ~0.10%
    • 773106: ~0.10%
    • 775589: ~0.10%
    • 775964: ~0.10%
    • 776055: ~0.10%
    • 777088: ~0.10%
    • 777529: ~0.10%
    • 778375: ~0.10%
    • 781066: ~0.10%
    • 782328: ~0.10%
    • 783231: ~0.10%
    • 784413: ~0.10%
    • 785781: ~0.10%
    • 786250: ~0.10%
    • 786845: ~0.10%
    • 788012: ~0.10%
    • 791857: ~0.10%
    • 792788: ~0.10%
    • 793182: ~0.10%
    • 794187: ~0.10%
    • 794308: ~0.10%
    • 794318: ~0.10%
    • 796097: ~0.10%
    • 796117: ~0.10%
    • 797182: ~0.10%
    • 798215: ~0.10%
    • 802050: ~0.10%
    • 802669: ~0.10%
    • 804168: ~0.10%
    • 804253: ~0.10%
    • 804461: ~0.10%
    • 805743: ~0.10%
    • 808416: ~0.10%
    • 808455: ~0.10%
    • 810577: ~0.10%
    • 811702: ~0.10%
    • 811843: ~0.10%
    • 815923: ~0.10%
    • 816475: ~0.10%
    • 818312: ~0.10%
    • 818521: ~0.10%
    • 819278: ~0.10%
    • 820890: ~0.10%
    • 821615: ~0.10%
    • 823136: ~0.10%
    • 823735: ~0.10%
    • 829476: ~0.10%
    • 830591: ~0.10%
    • 832433: ~0.10%
    • 832597: ~0.10%
    • 833053: ~0.10%
    • 835043: ~0.10%
    • 835759: ~0.10%
    • 837731: ~0.10%
    • 837942: ~0.10%
    • 839448: ~0.10%
    • 840228: ~0.10%
    • 840417: ~0.10%
    • 841851: ~0.10%
    • 843327: ~0.10%
    • 843622: ~0.10%
    • 844870: ~0.10%
    • 846084: ~0.10%
    • 846807: ~0.10%
    • 847076: ~0.10%
    • 847535: ~0.10%
    • 847977: ~0.10%
    • 848075: ~0.10%
    • 848326: ~0.10%
    • 852725: ~0.10%
    • 853465: ~0.10%
    • 856427: ~0.10%
    • 857186: ~0.10%
    • 858377: ~0.10%
    • 858543: ~0.10%
    • 860426: ~0.10%
    • 863804: ~0.10%
    • 866039: ~0.10%
    • 866406: ~0.10%
    • 867180: ~0.10%
    • 868280: ~0.10%
    • 872156: ~0.10%
    • 872791: ~0.10%
    • 872953: ~0.10%
    • 872959: ~0.10%
    • 875015: ~0.10%
    • 876522: ~0.10%
    • 878407: ~0.10%
    • 878710: ~0.10%
    • 878855: ~0.10%
    • 880495: ~0.10%
    • 882732: ~0.10%
    • 884335: ~0.10%
    • 884941: ~0.10%
    • 885893: ~0.10%
    • 886713: ~0.10%
    • 887068: ~0.10%
    • 887751: ~0.10%
    • 888027: ~0.10%
    • 890152: ~0.10%
    • 891137: ~0.10%
    • 891890: ~0.10%
    • 892662: ~0.10%
    • 892973: ~0.10%
    • 893360: ~0.10%
    • 893915: ~0.10%
    • 893976: ~0.10%
    • 894324: ~0.10%
    • 895709: ~0.10%
    • 897065: ~0.10%
    • 898387: ~0.10%
    • 899291: ~0.10%
    • 899604: ~0.10%
    • 900513: ~0.10%
    • 900619: ~0.10%
    • 901170: ~0.10%
    • 902794: ~0.10%
    • 903238: ~0.10%
    • 904294: ~0.10%
    • 904520: ~0.10%
    • 904992: ~0.10%
    • 907212: ~0.10%
    • 908062: ~0.10%
    • 908561: ~0.10%
    • 911034: ~0.10%
    • 911982: ~0.10%
    • 913716: ~0.10%
    • 914819: ~0.10%
    • 915750: ~0.10%
    • 915766: ~0.10%
    • 916125: ~0.10%
    • 916648: ~0.10%
    • 917285: ~0.10%
    • 918194: ~0.10%
    • 926035: ~0.10%
    • 927726: ~0.10%
    • 929821: ~0.10%
    • 930300: ~0.10%
    • 930796: ~0.10%
    • 931617: ~0.10%
    • 932719: ~0.10%
    • 933784: ~0.10%
    • 934378: ~0.10%
    • 935900: ~0.10%
    • 936118: ~0.10%
    • 936336: ~0.10%
    • 937231: ~0.10%
    • 938420: ~0.10%
    • 939184: ~0.10%
    • 939567: ~0.10%
    • 941588: ~0.10%
    • 944093: ~0.10%
    • 944912: ~0.10%
    • 945069: ~0.10%
    • 945659: ~0.10%
    • 946110: ~0.10%
    • 950044: ~0.10%
    • 954101: ~0.10%
    • 954147: ~0.10%
    • 958697: ~0.10%
    • 959530: ~0.10%
    • 961721: ~0.10%
    • 963582: ~0.10%
    • 964471: ~0.10%
    • 965026: ~0.10%
    • 966573: ~0.10%
    • 967330: ~0.10%
    • 968346: ~0.10%
    • 970649: ~0.10%
    • 970873: ~0.10%
    • 971636: ~0.10%
    • 971664: ~0.10%
    • 973555: ~0.10%
    • 973851: ~0.10%
    • 974207: ~0.10%
    • 976896: ~0.10%
    • 981402: ~0.10%
    • 983723: ~0.10%
    • 984358: ~0.10%
    • 984653: ~0.10%
    • 987107: ~0.10%
    • 987167: ~0.10%
    • 994360: ~0.10%
    • 995049: ~0.10%
    • 1002688: ~0.10%
    • 1004305: ~0.10%
    • 1004650: ~0.10%
    • 1004849: ~0.10%
    • 1005118: ~0.10%
    • 1005614: ~0.10%
    • 1005626: ~0.10%
    • 1005669: ~0.10%
    • 1006835: ~0.10%
    • 1011008: ~0.10%
    • 1012299: ~0.10%
    • 1014010: ~0.10%
    • 1014030: ~0.10%
    • 1016549: ~0.10%
    • 1017016: ~0.10%
    • 1017335: ~0.10%
    • 1018386: ~0.10%
    • 1020640: ~0.10%
    • 1021041: ~0.10%
    • 1021411: ~0.10%
    • 1025341: ~0.10%
    • 1025423: ~0.10%
    • 1025767: ~0.10%
    • 1026066: ~0.10%
    • 1026434: ~0.10%
    • 1027516: ~0.10%
    • 1027703: ~0.10%
    • 1028119: ~0.10%
    • 1028642: ~0.10%
    • 1031554: ~0.10%
    • 1032300: ~0.10%
    • 1033639: ~0.10%
    • 1033660: ~0.10%
    • 1034832: ~0.10%
    • 1035274: ~0.10%
    • 1037432: ~0.10%
    • 1037536: ~0.10%
    • 1037759: ~0.10%
    • 1039860: ~0.10%
    • 1041131: ~0.10%
    • 1041892: ~0.10%
    • 1043066: ~0.10%
    • 1044326: ~0.10%
    • 1044905: ~0.10%
    • 1047848: ~0.10%
    • 1048534: ~0.10%
    • 1049477: ~0.10%
    • 1050531: ~0.10%
    • 1052073: ~0.10%
    • 1052617: ~0.10%
    • 1054049: ~0.10%
    • 1055142: ~0.10%
    • 1056933: ~0.10%
    • 1057358: ~0.10%
    • 1057911: ~0.10%
    • 1061411: ~0.10%
    • 1062328: ~0.10%
    • 1062485: ~0.10%
    • 1062534: ~0.10%
    • 1062794: ~0.10%
    • 1063269: ~0.10%
    • 1063467: ~0.10%
    • 1064568: ~0.10%
    • 1064868: ~0.10%
    • 1065481: ~0.10%
    • 1065565: ~0.10%
    • 1067970: ~0.10%
    • 1068014: ~0.10%
    • 1070203: ~0.10%
    • 1070708: ~0.10%
    • 1072038: ~0.10%
    • 1072214: ~0.10%
    • 1074885: ~0.10%
    • 1075308: ~0.10%
    • 1078872: ~0.10%
    • 1078979: ~0.10%
    • 1079266: ~0.10%
    • 1079736: ~0.10%
    • 1080075: ~0.10%
    • 1081716: ~0.10%
    • 1137391: ~0.10%
    • 1138530: ~0.10%
    • 1139697: ~0.10%
    • 1140119: ~0.10%
    • 1140869: ~0.10%
    • 1141527: ~0.10%
    • 1144693: ~0.10%
    • 1145425: ~0.10%
    • 1149162: ~0.10%
    • 1149207: ~0.10%
    • 1150086: ~0.10%
    • 1150398: ~0.10%
    • 1150731: ~0.10%
    • 1151256: ~0.10%
    • 1151403: ~0.10%
    • 1152236: ~0.10%
    • 1153693: ~0.10%
    • 1155859: ~0.10%
    • 1156918: ~0.10%
    • 1158007: ~0.10%
    • 1158559: ~0.10%
    • 1158952: ~0.10%
    • 1159165: ~0.10%
    • 1161242: ~0.10%
    • 1163227: ~0.10%
    • 1166023: ~0.10%
    • 1166231: ~0.10%
    • 1167002: ~0.10%
    • 1169844: ~0.10%
    • 1170663: ~0.10%
    • 1171580: ~0.10%
    • 1172072: ~0.10%
    • 1172083: ~0.10%
    • 1173371: ~0.10%
    • 1173809: ~0.10%
    • 1174049: ~0.10%
    • 1175044: ~0.10%
    • 1175745: ~0.10%
    • 1176061: ~0.10%
    • 1176414: ~0.10%
    • 1176993: ~0.10%
    • 1177449: ~0.10%
    • 1178311: ~0.10%
    • 1179029: ~0.10%
    • 1179069: ~0.10%
    • 1180579: ~0.10%
    • 1181077: ~0.10%
    • 1183293: ~0.10%
    • 1184313: ~0.10%
    • 1185090: ~0.10%
    • 1185669: ~0.10%
    • document_ids: size 16 elements
    • scores: size 16 elements
  • Samples:
    | query_id | document_ids | scores |
    |---|---|---|
    | 685613 | [7546874, 1176459, 197677, 2306318, 8541504, ...] | [0.9999999992804947, 0.24845418756716053, 0.7594154013647826, 0.26644182105618575, 0.390668914839766, ...] |
    | 237784 | [6366584, 4034101, 2325374, 6914618, 6042146, ...] | [0.9999999991784339, 0.42233632827946693, 0.5956354295491569, 0.12644415907455164, 0.6636713730105909, ...] |
    | 904294 | [448408, 8743975, 49600, 7339401, 2714261, ...] | [0.9999999991841937, 0.877629062381539, 0.8330146583389045, 0.3116634796692611, 0.4633524534142185, ...] |
  • Loss: pylate.losses.distillation.Distillation
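
For intuition, the distillation objective pushes the student's MaxSim scores over each query's candidate documents toward the teacher scores stored in the scores column. A minimal illustrative sketch, not PyLate's exact implementation:

import torch
import torch.nn.functional as F


def maxsim(query_emb: torch.Tensor, doc_emb: torch.Tensor) -> torch.Tensor:
    # query_emb: (n_query_tokens, dim); doc_emb: (n_doc_tokens, dim)
    return (query_emb @ doc_emb.T).max(dim=1).values.sum()


def distillation_loss(query_emb, candidate_doc_embs, teacher_scores):
    # candidate_doc_embs: one (n_tokens, dim) tensor per candidate document
    # teacher_scores: (n_candidates,) relevance scores from the teacher
    student_scores = torch.stack([maxsim(query_emb, d) for d in candidate_doc_embs])
    return F.kl_div(
        F.log_softmax(student_scores, dim=-1),
        F.softmax(teacher_scores, dim=-1),
        reduction="sum",
    )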

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • learning_rate: 4e-06
  • max_steps: 20000
  • fp16: True
  • dataloader_drop_last: True
  • dataloader_num_workers: 8
  • ddp_find_unused_parameters: False
  • torch_compile: True
  • torch_compile_backend: inductor
  • eval_on_start: True
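
These settings map directly onto standard sentence-transformers / Hugging Face training arguments, on which PyLate's trainer builds. A hedged sketch of declaring them (output_dir is a placeholder):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    eval_strategy="steps",
    per_device_train_batch_size=16,
    learning_rate=4e-6,
    max_steps=20_000,
    fp16=True,
    dataloader_drop_last=True,
    dataloader_num_workers=8,
    ddp_find_unused_parameters=False,
    torch_compile=True,
    torch_compile_backend="inductor",
    eval_on_start=True,
)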

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 4e-06
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3.0
  • max_steps: 20000
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 8
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: False
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: True
  • torch_compile_backend: inductor
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: True
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}