Towards automated quantification of vocal communication during social behaviors in songbirds
dc.contributor.author
Tomka, Tomas
dc.contributor.supervisor
Hahnloser, Richard H.R.
dc.contributor.supervisor
Lipkind, Dina
dc.contributor.supervisor
Grewe, Benjamin
dc.date.accessioned
2023-03-29T15:01:54Z
dc.date.available
2023-03-22T13:06:06Z
dc.date.available
2023-03-23T07:43:15Z
dc.date.available
2023-03-29T15:01:54Z
dc.date.issued
2023
dc.identifier.uri
http://hdl.handle.net/20.500.11850/604555
dc.identifier.doi
10.3929/ethz-b-000604555
dc.description.abstract
Vocalizations are produced by highly specialized motor gestures and regulate social interactions in many species. Vocal learners, such as songbirds and humans, acquire their vocal repertoire through cultural transmission with intriguing computational efficiency. The zebra finch, a highly social songbird, has been a model organism of outstanding importance for our understanding of vocal learning. Primarily reductionist research has generated valuable insights into molecular, neural, and behavioral aspects of vocal learning, but the combined effect of social factors on cultural transmission remains largely unknown. Today, the field is transitioning towards more holistic inquiries at the social level, using big-data paradigms to uncover systemic principles. However, multiple challenges must be solved to enable conclusive longitudinal studies of entire animal groups.
Reliable vocal detection in large-scale sound data has been a longstanding problem and has served as a playground for many machine learning efforts, but benchmark animal datasets with labelled vocalization boundaries are scarce. Creating such datasets requires tedious screening for vocalizations missed by machine-based approaches. The challenge of faithfully annotating vocal data is aggravated when studying interactive behaviors, due to overlapping individual vocalizations and noises from animal interactions. Additionally, contextualizing vocal interactions with relatively rare and brief non-vocal events, such as copulations, previously required strenuous and time-consuming inspection of video data. Lastly, correlation-based hypotheses need to be tested for causality, which requires experimental control over individual social interactions. We tackle these challenges in three ways.
First, we introduce a benchmark dataset of vocal segments from single zebra finches at different developmental stages. We test how well zebra finch vocalizations can be retrieved as vocal neighbors of each other in spectrographic space, using different distance measures. Interestingly, the Spearman distance outperforms other popular distance measures such as the cosine and Euclidean distances. We find excellent performance for adults (F1 score of 0.93 ± 0.07) using 50 labelled examples (templates), but not for juveniles (F1 score of 0.64 ± 0.18), which produce highly variable vocalizations. For juveniles, retrieval is improved when searching with equally sized overlapping template slices (F1 score of 0.72 ± 0.10), compared to searches with entire templates. As an addition to a growing array of computational tools for vocal communication research, our vocal retrieval method is useful for proofreading human- or computer-annotated datasets.
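The retrieval idea described above can be sketched in a few lines: rank-transform two flattened spectrogram vectors and use one minus their rank correlation as the Spearman distance, then return the labelled templates nearest to a query segment. This is a minimal illustration only; the function names, the tie-free rank computation, and the flattened-vector representation are assumptions, not the thesis' actual implementation.

```python
import numpy as np

def spearman_distance(a, b):
    # Spearman distance = 1 - Pearson correlation of the rank-transformed
    # vectors. Double argsort yields ranks; ties are ignored for simplicity
    # (continuous spectrogram values rarely tie exactly).
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return 1.0 - np.corrcoef(ra, rb)[0, 1]

def retrieve_neighbors(query, templates, k=5):
    # Rank labelled templates (flattened spectrograms) by Spearman distance
    # to the query segment and return the indices of the k nearest ones.
    dists = [spearman_distance(query, t) for t in templates]
    return np.argsort(dists)[:k]
```

Swapping `spearman_distance` for a cosine or Euclidean distance in `retrieve_neighbors` reproduces the kind of distance-measure comparison the abstract reports.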
Second, we introduce a dataset of interacting mixed-sex zebra finch couples engaging in copulations. We found that animal-borne wireless sensors, originally introduced to assign vocalizations to individuals, are highly suitable for automated copulation detection. We observed that the carrier frequency of the female's radio transmitter is modulated by the physical mounting of the flying male. Copulation attempts are detected by the joint occurrence of this modulation and male wing flaps. By annotating vocal and non-vocal behaviors, we find behavioral signatures that signal solicited copulations roughly 25-30 s in advance: for instance, frequent female nest/whine calls, or changes in courtship song tempo and composition. Monitoring, or even predicting, copulations based on behavioral signatures could benefit animal caretaking and wildlife conservation programs.
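The joint-occurrence rule above can be illustrated as a simple coincidence detector: given two binary event streams on a common time grid (carrier-frequency modulation on the female's transmitter, wing flaps on the male's), flag an attempt wherever both are active within a short window. The function name, the binary preprocessing, and the window length are hypothetical, chosen only to make the idea concrete.

```python
import numpy as np

def detect_copulation_attempts(freq_modulated, wing_flaps, window=10):
    # freq_modulated, wing_flaps: boolean arrays sampled on a common time
    # grid (hypothetical preprocessing of the two sensor channels).
    # Dilate each event stream by `window` samples, then flag time points
    # where both dilated streams are simultaneously active.
    kernel = np.ones(window)
    mod = np.convolve(freq_modulated.astype(float), kernel, mode="same") > 0
    flap = np.convolve(wing_flaps.astype(float), kernel, mode="same") > 0
    return mod & flap
```

Requiring coincidence of two independent sensor channels is what makes the detection robust: either signal alone (a frequency glitch, an isolated wing flap) does not trigger a detection.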
Third, our group has developed a system for real-time control of vocal interactions among separately housed, digitally connected animals. We characterized vocal interactions between pairs of connected birds by their cross-covariance function and showed that birds engage in reliable vocal interactions constrained by the imposed network topology. In the future, our system and analysis could be applied to probe detailed causal relationships in vocal interactions among songbird couples or during vocal learning in juvenile birds.
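The cross-covariance characterization mentioned above can be sketched as follows: given two mean-subtracted call-rate signals from connected birds, evaluate their covariance at a range of time lags; a peak at a positive lag indicates that one bird's calls reliably follow the other's. This is a generic cross-covariance estimator under assumed signal names, not the thesis' analysis pipeline.

```python
import numpy as np

def cross_covariance(x, y, max_lag=50):
    # Cross-covariance c(l) ~ mean of x(t) * y(t + l), evaluated at lags
    # -max_lag..max_lag, for two equal-length, regularly sampled signals
    # (e.g. binned call rates of two connected birds).
    x = x - x.mean()
    y = y - y.mean()
    n = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = []
    for l in lags:
        if l >= 0:
            cc.append(np.dot(x[:n - l], y[l:]) / n)
        else:
            cc.append(np.dot(x[-l:], y[:n + l]) / n)
    return lags, np.array(cc)
```

If bird B tends to answer bird A after a fixed delay, `cross_covariance(a_calls, b_calls)` peaks at that positive lag, which is the kind of reliable, topology-constrained interaction the abstract describes.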
Taken together, our main contribution is to democratize access to large-scale curated zebra finch datasets, which can be used in the future to train machine-based solutions that detect vocalizations or predict reproductive behaviors. Additionally, we provide a computational tool for proofreading existing datasets and a system for manipulating vocal interactions in real time. With these efforts, we aim to accelerate systemic insights into the structure, development, and function of vocal expressions, and to positively impact human coexistence with animal wildlife.
en_US
dc.format
application/pdf
en_US
dc.language.iso
en
en_US
dc.publisher
ETH Zurich
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
Songbird
en_US
dc.subject
Vocal learning
en_US
dc.subject
Vocal communication
en_US
dc.subject
Social behavior
en_US
dc.subject
Search algorithms
en_US
dc.subject
Courtship Behavior
en_US
dc.subject
Vocal analysis
en_US
dc.title
Towards automated quantification of vocal communication during social behaviors in songbirds
en_US
dc.type
Doctoral Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
dc.date.published
2023-03-23
ethz.size
97 p.
en_US
ethz.code.ddc
DDC - DDC::5 - Science::500 - Natural sciences
en_US
ethz.identifier.diss
29067
en_US
ethz.publication.place
Zurich
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Dep. Inf.technologie und Elektrotechnik / Dep. of Inform.Technol. Electrical Eng.::02533 - Institut für Neuroinformatik / Institute of Neuroinformatics::03774 - Hahnloser, Richard H.R. / Hahnloser, Richard H.R.
en_US
ethz.leitzahl.certified
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Dep. Inf.technologie und Elektrotechnik / Dep. of Inform.Technol. Electrical Eng.::02533 - Institut für Neuroinformatik / Institute of Neuroinformatics::03774 - Hahnloser, Richard H.R. / Hahnloser, Richard H.R.
en_US
ethz.date.deposited
2023-03-22T13:06:06Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
ethz.date.embargoend
2024-03-23
ethz.rosetta.installDate
2023-03-23T07:43:16Z
ethz.rosetta.lastUpdated
2024-02-02T21:25:24Z
ethz.rosetta.exportRequired
true
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Towards%20automated%20quantification%20of%20vocal%20communication%20during%20social%20behaviors%20in%20songbirds&rft.date=2023&rft.au=Tomka,%20Tomas&rft.genre=unknown&rft.btitle=Towards%20automated%20quantification%20of%20vocal%20communication%20during%20social%20behaviors%20in%20songbirds