A Method to Present and Analyze Ensembles of Information Sources
Entropy (IF 2.1), Pub Date: 2020-05-21, DOI: 10.3390/e22050580
Nicholas M. Timme, David Linsenbardt, Christopher C. Lapish

Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to estimate the information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate whether the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine whether two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine if the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine if the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
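The abstract describes two analyses: testing an ensemble of mutual information values against shuffled-data surrogates, and comparing two ensembles with a randomization test. The following is a minimal Python sketch of those two steps, not the authors' released code; the plug-in histogram MI estimator, bin count, surrogate counts, and the use of the ensemble mean as the test statistic are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the authors' released code)
# of the two ensemble analyses described in the abstract.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)

def mutual_info(x, y, bins=8):
    """Plug-in mutual information estimate (bits) for two 1-D signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    return entropy(px, base=2) + entropy(py, base=2) - entropy(joint.ravel(), base=2)

def ensemble_mi(data, bins=8):
    """MI for every variable pair in data (n_vars x n_samples): the ensemble."""
    n = data.shape[0]
    return np.array([mutual_info(data[i], data[j], bins)
                     for i in range(n) for j in range(i + 1, n)])

def null_ensemble_mi(data, n_surrogates=100, bins=8):
    """Null MI ensembles from independently shuffled copies of each variable."""
    nulls = []
    for _ in range(n_surrogates):
        shuffled = np.array([rng.permutation(row) for row in data])
        nulls.append(ensemble_mi(shuffled, bins))
    return np.array(nulls)

def ensemble_vs_noise_p(data, n_surrogates=100, bins=8):
    """P-value that the observed mean MI exceeds the shuffled-data null."""
    observed = ensemble_mi(data, bins).mean()
    null_means = null_ensemble_mi(data, n_surrogates, bins).mean(axis=1)
    return (np.sum(null_means >= observed) + 1) / (n_surrogates + 1)

def two_ensemble_p(mi_a, mi_b, n_perm=1000):
    """Permutation test: do the sources in ensemble A carry more information than B?"""
    observed = mi_a.mean() - mi_b.mean()
    pooled = np.concatenate([mi_a, mi_b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        count += pooled[:mi_a.size].mean() - pooled[mi_a.size:].mean() >= observed
    return (count + 1) / (n_perm + 1)

# Example on toy data: 5 correlated "neurons", 2000 samples (assumed setup).
shared = rng.normal(size=2000)
data = np.vstack([shared + 0.5 * rng.normal(size=2000) for _ in range(5)])
print(ensemble_vs_noise_p(data))   # small p-value: ensemble exceeds the noise floor
print(two_ensemble_p(ensemble_mi(data), null_ensemble_mi(data, 1)[0]))
```

Shuffling each variable independently preserves its marginal distribution while destroying dependencies, which is one common way to build the null surrogate ensemble; the pooled permutation test treats the two sets of MI values as exchangeable under the null of equal information.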

Updated: 2020-05-21