Meta-Learning via Hypernetworks
dc.contributor.author
Zhao, Dominic
dc.contributor.author
Kobayashi, Seijin
dc.contributor.author
Sacramento, João
dc.contributor.author
von Oswald, Johannes
dc.date.accessioned
2022-04-12T14:09:06Z
dc.date.available
2021-01-27T08:08:37Z
dc.date.available
2021-01-27T08:28:16Z
dc.date.available
2021-01-27T08:38:53Z
dc.date.available
2022-04-11T13:44:40Z
dc.date.available
2022-04-12T14:09:06Z
dc.date.issued
2020-12
dc.identifier.uri
http://hdl.handle.net/20.500.11850/465883
dc.identifier.doi
10.3929/ethz-b-000465883
dc.description.abstract
Recent developments in few-shot learning have shown that, during fast adaptation, gradient-based meta-learners mostly rely on embedding features of powerful pretrained networks. This leads us to research ways to effectively adapt features and utilize the meta-learner's full potential. Here, we demonstrate the effectiveness of hypernetworks in this context. We propose a soft row-sharing hypernetwork architecture and show that training the hypernetwork with a variant of MAML is tightly linked to meta-learning a curvature matrix used to condition gradients during fast adaptation. We achieve results similar to state-of-the-art model-agnostic methods in the overparametrized case, while outperforming many MAML variants in the compressive regime without resorting to different optimization schemes. Furthermore, we empirically show that hypernetworks do leverage the inner-loop optimization for better adaptation, and analyse how, on a toy problem, they naturally try to learn the shared curvature of constructed tasks when trained with our proposed algorithm.
en_US
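The abstract describes fast adaptation in which a hypernetwork produces the target network's weights and gradients are conditioned through the hypernetwork during the inner loop. The following is a minimal illustrative sketch of that general idea, not the paper's actual architecture or code: it assumes a purely linear hypernetwork `H` mapping a task embedding `e` to target weights `w = H @ e`, and adapts only the embedding in the inner loop (the names `target_loss` and `inner_adapt` are invented for this example).

```python
# Minimal sketch (assumed setup, not the authors' method): a linear
# "hypernetwork" H maps a task embedding e to target-network weights
# w = H @ e; fast adaptation runs gradient steps on e only, so the
# effective gradient on w is conditioned by the matrix H @ H.T.
import numpy as np

rng = np.random.default_rng(0)

def target_loss(w, X, y):
    # mean squared error of a linear target network
    return 0.5 * np.mean((X @ w - y) ** 2)

def loss_grad_w(w, X, y):
    # gradient of target_loss with respect to the target weights w
    return X.T @ (X @ w - y) / len(y)

def inner_adapt(H, e, X, y, lr=0.1, steps=5):
    # inner loop: chain rule through w = H @ e gives dL/de = H.T @ dL/dw
    for _ in range(steps):
        w = H @ e
        e = e - lr * (H.T @ loss_grad_w(w, X, y))
    return e

# toy regression task: recover a random linear map from 32 samples
X = rng.normal(size=(32, 4))
w_true = rng.normal(size=4)
y = X @ w_true

H = rng.normal(size=(4, 8)) * 0.3   # hypernetwork weights (meta-learned in practice)
e0 = np.zeros(8)                    # initial task embedding

loss_before = target_loss(H @ e0, X, y)
loss_after = target_loss(H @ inner_adapt(H, e0, X, y), X, y)
```

In a full MAML-style setup, an outer loop would additionally backpropagate the post-adaptation loss into `H` across tasks; this sketch only shows the inner-loop step and how the hypernetwork shapes the effective gradient on the generated weights.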
dc.format
application/pdf
en_US
dc.language.iso
en
en_US
dc.publisher
NeurIPS
en_US
dc.rights.uri
http://rightsstatements.org/page/InC-NC/1.0/
dc.title
Meta-Learning via Hypernetworks
en_US
dc.type
Conference Paper
dc.rights.license
In Copyright - Non-Commercial Use Permitted
ethz.size
11 p.
en_US
ethz.version.deposit
publishedVersion
en_US
ethz.event
4th Workshop on Meta-Learning at NeurIPS 2020 (MetaLearn 2020)
en_US
ethz.event.location
Online
en_US
ethz.event.date
December 11, 2020
en_US
ethz.notes
Due to the Coronavirus (COVID-19) pandemic, the conference was conducted virtually.
Accepted version replaced with published version. The number of authors and the author order have been changed.
en_US
ethz.grant
Probabilistic learning in deep cortical networks
en_US
ethz.publication.place
s.l.
en_US
ethz.publication.status
published
en_US
ethz.leitzahl
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Dep. Inf.technologie und Elektrotechnik / Dep. of Inform.Technol. Electrical Eng.::02533 - Institut für Neuroinformatik / Institute of Neuroinformatics::09479 - Grewe, Benjamin / Grewe, Benjamin
en_US
ethz.leitzahl.certified
ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02140 - Dep. Inf.technologie und Elektrotechnik / Dep. of Inform.Technol. Electrical Eng.::02533 - Institut für Neuroinformatik / Institute of Neuroinformatics::09479 - Grewe, Benjamin / Grewe, Benjamin
en_US
ethz.grant.agreementno
186027
ethz.grant.fundername
SNF
ethz.grant.funderDoi
10.13039/501100001711
ethz.grant.program
Ambizione
ethz.date.deposited
2021-01-27T08:08:45Z
ethz.source
FORM
ethz.eth
yes
en_US
ethz.availability
Open access
en_US
ethz.rosetta.installDate
2021-01-27T08:28:25Z
ethz.rosetta.lastUpdated
2023-02-07T00:47:37Z
ethz.rosetta.exportRequired
true
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Meta-Learning%20via%20Hypernetworks&rft.date=2020-12&rft.au=Zhao,%20Dominic&Kobayashi,%20Seijin&Sacramento,%20Jo%C3%A3o&von%20Oswald,%20Johannes&rft.genre=proceeding&rft.btitle=Meta-Learning%20via%20Hypernetworks