Learning metrics on spectrotemporal modulations reveals the perception of musical instrument timbre
Nature Human Behaviour (IF 29.9), Pub Date: 2020-11-30, DOI: 10.1038/s41562-020-00987-5
Etienne Thoret, Baptiste Caramiaux, Philippe Depalle, Stephen McAdams

Humans excel at using sounds to make judgements about their immediate environment. In particular, timbre is an auditory attribute that conveys crucial information about the identity of a sound source, especially for music. While timbre has been primarily considered to occupy a multidimensional space, unravelling the acoustic correlates of timbre remains a challenge. Here we re-analyse 17 datasets from studies published between 1977 and 2016 and observe that the original results are only partially replicable. We use a data-driven computational account to reveal the acoustic correlates of timbre. Human dissimilarity ratings are simulated with metrics learned on acoustic spectrotemporal modulation models inspired by cortical processing. We observe that timbre has both generic and experiment-specific acoustic correlates. These findings provide a broad overview of former studies on musical timbre and identify its relevant acoustic substrates according to biologically inspired models.
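
To make the method description above concrete, the following is a minimal Python sketch of the general idea the abstract states: a metric is learned on spectrotemporal-modulation (STM) features so that model distances between sounds match human dissimilarity ratings. This is not the authors' implementation. The cortical-model front end that produces the STM features is assumed to exist elsewhere, and the names stm, human_dissim, weighted_distances and fit_metric, as well as the toy data, are hypothetical and introduced only for illustration.

# Minimal sketch (not the authors' code): learn non-negative per-feature weights
# on precomputed spectrotemporal-modulation (STM) features so that weighted
# distances between sounds correlate with human dissimilarity ratings.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import pearsonr


def weighted_distances(weights, stm):
    """Pairwise weighted Euclidean distances between STM feature vectors."""
    n = stm.shape[0]
    dists = []
    for i in range(n):
        for j in range(i + 1, n):
            diff = stm[i] - stm[j]
            dists.append(np.sqrt(np.sum(weights * diff ** 2)))
    return np.array(dists)


def fit_metric(stm, human_dissim, n_restarts=5, seed=0):
    """Fit feature weights that maximise the correlation between model
    distances and the human dissimilarity ratings (upper-triangle vector)."""
    rng = np.random.default_rng(seed)
    n_feat = stm.shape[1]

    def loss(w):
        d = weighted_distances(np.abs(w) + 1e-12, stm)  # keep weights non-negative
        r, _ = pearsonr(d, human_dissim)
        return -r  # minimise negative correlation

    best = None
    for _ in range(n_restarts):
        res = minimize(loss, rng.random(n_feat), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return np.abs(best.x), -best.fun


if __name__ == "__main__":
    # Toy stand-ins for real STM features and ratings (10 sounds, 6 features).
    rng = np.random.default_rng(1)
    stm = rng.random((10, 6))
    true_w = np.array([3.0, 0.1, 0.1, 2.0, 0.1, 0.1])
    human_dissim = weighted_distances(true_w, stm) + 0.05 * rng.standard_normal(45)
    weights, corr = fit_metric(stm, human_dissim)
    print("learned weights:", np.round(weights, 2), "| correlation:", round(corr, 3))

A per-feature weighting is the simplest learnable metric and is used here only to illustrate the principle; the published study works with full cortical-model spectrotemporal modulation representations and may differ in the exact form of the learned metric.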



Updated: 2020-12-01