arxiv:2503.20841

In Vitro to In Vivo: Bidirectional and High-Precision Generation of In Vitro and In Vivo Neuronal Spike Data

Published on Mar 26, 2025
Authors: Shimono, M.

Abstract

Transformer model applied to neural data with Dice loss enables accurate cross-domain neural activity generation and supports ethical research principles.

AI-generated summary

Neurons encode information in a binary manner and process complex signals. However, predicting or generating diverse neural activity patterns remains challenging. In vitro and in vivo studies provide distinct advantages, yet no robust computational framework seamlessly integrates both data types. We address this by applying the Transformer model, widely used in large-scale language models, to neural data. To handle binary data, we introduced Dice loss, enabling accurate cross-domain neural activity generation. Structural analysis revealed how Dice loss enhances learning and identified key brain regions facilitating high-precision data generation. Our findings support the 3Rs principle in animal research, particularly Replacement, and establish a mathematical framework bridging animal experiments and human clinical studies. This work advances data-driven neuroscience and neural activity modeling, paving the way for more ethical and effective experimental methodologies.
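The abstract notes that Dice loss was introduced to handle the binary nature of spike data. The paper does not publish its exact loss implementation here, but a minimal sketch of a soft Dice loss on binary spike trains (assuming the model outputs spike probabilities in [0, 1], with `eps` as an assumed smoothing term) could look like this:

```python
import numpy as np

def dice_loss(pred, target, eps=1e-7):
    """Soft Dice loss for binary spike data.

    pred:   predicted spike probabilities in [0, 1]
    target: ground-truth binary spikes {0, 1}
    Returns 1 - Dice coefficient, so 0 means perfect overlap.
    """
    pred = np.asarray(pred, dtype=float).ravel()
    target = np.asarray(target, dtype=float).ravel()
    intersection = np.sum(pred * target)
    dice = (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
    return 1.0 - dice

# Example: a perfect prediction yields a loss near zero,
# while predicting the complement yields a loss near one.
spikes = np.array([0, 1, 0, 1, 1])
loss_perfect = dice_loss(spikes, spikes)
loss_wrong = dice_loss(1 - spikes, spikes)
```

Because the Dice coefficient is driven by the overlap of active (spiking) bins rather than by per-bin accuracy, it is less dominated by the many silent time bins than a plain cross-entropy would be, which is a plausible motivation for using it on sparse binary spike trains.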

Community

The peer-reviewed version of this work is now available:

Shimono, M. (2026). In Vitro to In Vivo: Bidirectional and High-Precision Generation of In Vitro and In Vivo Neuronal Spike Data. Algorithms, 19(4), 305. https://doi.org/10.3390/a19040305

Code:
https://github.com/ShimonoMLab/GenerativeNeurosci_ML-TrDic

Hugging Face dataset entry:
https://huggingface.co/datasets/Nori778/InVitro-InVivo-SpikeTrain-Generation

Mendeley Data:
https://doi.org/10.17632/kf65cvmtbz.1

This work introduces a Transformer + Dice loss framework for bidirectional neural-domain transfer between unpaired in vitro and in vivo multineuronal spike trains.

