Note

This entry is currently being processed.


dc.contributor.author
Kahlbacher, Fabian
dc.contributor.supervisor
Marx, Alexander
dc.contributor.supervisor
Immer, Alexander
dc.contributor.supervisor
Bühlmann, Peter
dc.date.accessioned
2024-06-17T07:48:22Z
dc.date.available
2024-06-15T10:47:58Z
dc.date.available
2024-06-17T07:48:22Z
dc.date.issued
2023-09-11
dc.identifier.uri
http://hdl.handle.net/20.500.11850/678414
dc.description.abstract
Causal discovery concerns the problem of learning the causal structures between variables of a system from observational data. To tackle the computational complexity when multiple variables are involved, continuous optimization is a growing area. However, most approaches rely on additive noise, and no systematic evaluation of non-additive location-scale noise (LSN) has been performed. Modeling LSN or heteroscedasticity is important as it is common in many real-world systems. If heteroscedasticity is not modeled, this can lead to predicting the wrong causal structure. We build upon recent advances in continuous optimization for structure learning and provide extensions to a current method to work in the presence of location-scale noise. Further, we consider an existing method that models LSN and has not been evaluated on synthetic data. We provide extensive synthetic experiments demonstrating the superiority of LSN methods over current continuous methods that do not model LSN. The existing method modeling LSN achieves the best performance on all experiment types except one, where our proposed methods perform better. On the other hand, we do not observe performance improvements by modeling LSN on real-world data sets. Finally, we discuss the limitations and insights of learning the causal structure through continuous optimization-based approaches and propose multiple ideas to improve our methods.
en_US
dc.language.iso
en
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.subject
Causal discovery
en_US
dc.subject
neural networks
en_US
dc.subject
Machine Learning
en_US
dc.subject
Heteroscedastic noise
en_US
dc.title
Continuous Optimization DAG Learning in the Presence of Location Scale Noise: A Systematic Evaluation of Different Frameworks
en_US
dc.type
Master Thesis
dc.rights.license
In Copyright - Non-Commercial Use Permitted
ethz.code.ddc
DDC - DDC::0 - Computer science, information & general works::000 - Generalities, science
en_US
ethz.code.jel
JEL - JEL::C - Mathematical and Quantitative Methods::C1 - Econometric and Statistical Methods and Methodology: General::C13 - Estimation: General
en_US
ethz.publication.status
unpublished
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02000 - Dep. Mathematik / Dep. of Mathematics::02537 - Seminar für Statistik (SfS) / Seminar for Statistics (SfS)
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning
en_US
ethz.date.deposited
2024-06-15T10:47:58Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Closed access
en_US
ethz.rosetta.exportRequired
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Continuous%20Optimization%20DAG%20Learning%20in%20the%20Presence%20of%20Location%20Scale%20Noise:%20A%20Systematic%20Evaluation%20of%20Different%20Frameworks&rft.date=2023-09-11&rft.au=Kahlbacher,%20Fabian&rft.genre=unknown&rft.btitle=Continuous%20Optimization%20DAG%20Learning%20in%20the%20Presence%20of%20Location%20Scale%20Noise:%20A%20Systematic%20Evaluation%20of%20Different%20Frameworks

Files in this item


There are no files associated with this item.
