Which one to use?

Tags: CS 330, Meta-Learning Methods

The computational graph perspective

In all three approaches, the computational graph is very similar; only the inner loop differs. You can also mix and match components across approaches, for example running MAML where the parameters being adapted are the embedding of a prototypical network (a sketch follows below).
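
As a concrete illustration of such a hybrid, here is a minimal sketch of a MAML-style inner loop whose adaptation loss is a prototypical-network nearest-prototype classifier. The embedding network `embed`, the step size `alpha`, and the shapes are illustrative assumptions, not the lecture's implementation.

```python
import torch
import torch.nn.functional as F


def proto_loss(embed, params, x, y, n_classes):
    """ProtoNet-style loss with functional weights `params` (assumes every class appears in the batch)."""
    z = torch.func.functional_call(embed, params, (x,))                  # embed the points
    protos = torch.stack([z[y == c].mean(0) for c in range(n_classes)])  # class prototypes
    logits = -torch.cdist(z, protos)                                     # negative distance = similarity
    return F.cross_entropy(logits, y)


def inner_adapt(embed, params, x_support, y_support, n_classes, alpha=0.1, steps=5):
    """MAML-style inner loop: gradient steps on the embedding w.r.t. the ProtoNet loss."""
    for _ in range(steps):
        loss = proto_loss(embed, params, x_support, y_support, n_classes)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {k: p - alpha * g for (k, p), g in zip(params.items(), grads)}
    return params


# Usage sketch: start from the meta-parameters, adapt on the support set.
# params = dict(embed.named_parameters())
# fast_params = inner_adapt(embed, params, x_support, y_support, n_classes=5)
```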

Adaptability & practical considerations

Black-box is easy to combine with other problem settings like RL. Optimization-based is model-agnostic, so it is very convenient to apply to any architecture. Unfortunately, non-parametric is limited to classification so far.

Black-box can be data-inefficient and less explainable.

Non-parametric can be hard to scale with the number of points we want to compare against (see the sketch below). In contrast, optimization-based handles a varying number of points very well (adaptation is literally just training).
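
To make the scaling point concrete, here is a tiny sketch; the shapes and the five-way setup are made-up numbers. Matching-network-style comparison scores every query against every support point, while prototype averaging collapses the support set to one vector per class.

```python
import torch

n_query, n_support, n_classes, dim = 64, 5000, 5, 128
z_query = torch.randn(n_query, dim)                 # embedded query points
z_support = torch.randn(n_support, dim)             # embedded support points
y_support = torch.randint(0, n_classes, (n_support,))

# Comparing against every support point: the score matrix grows linearly
# with the support set size.
pairwise_scores = z_query @ z_support.t()           # (64, 5000)

# Prototype averaging keeps only one comparison target per class.
prototypes = torch.stack([z_support[y_support == c].mean(0) for c in range(n_classes)])
proto_scores = z_query @ prototypes.t()             # (64, 5)
```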

Non-parametric and black-box are both entirely feed-forward. Optimization-based relies on second-order optimization, so it is compute- and memory-intensive (see the sketch below).
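
For a sense of where that cost comes from, here is a hedged sketch; `loss_fn`, `params`, and the step size are placeholders. With `create_graph=True` the entire inner-loop graph is retained so the outer loss can differentiate through the update, which is what makes second-order methods memory-hungry; dropping it gives a first-order approximation (as in first-order MAML) that is much cheaper.

```python
import torch


def inner_step(loss_fn, params, support_batch, alpha=0.1, first_order=False):
    """One inner gradient step on the support loss.

    create_graph=True keeps the inner computation graph alive so the outer
    (meta) loss can backpropagate through this update: second-order, costly
    in memory. With first_order=True the inner gradients are treated as
    constants, trading some accuracy for far less compute and memory.
    """
    loss = loss_fn(params, support_batch)
    grads = torch.autograd.grad(loss, list(params.values()),
                                create_graph=not first_order)
    return {k: p - alpha * g for (k, p), g in zip(params.items(), grads)}
```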

Expressivity

Expressive power is the ability of a model to represent a range of functions. This is important because it determines the potential for learning.

Black-box is the most expressive. Optimization-based can be expressive for deep models, and non-parametric is also expressive.

Consistency

This is the property that more data yields better performance. A consistent method also tends to be more robust, for example to tasks that differ from those seen during meta-training.

Black-box is not consistent: adaptation is just a learned forward pass, so nothing guarantees that more data improves performance. Optimization-based is very consistent, since the inner loop reduces to ordinary gradient-descent training, and it tends to perform better on out-of-distribution tasks. Non-parametric is consistent under certain conditions.

Uncertainty awareness

This is the ability to reason about ambiguity during learning, a kind of meta-cognitive awareness. It is important in high-risk situations, and for active learning, RL, etc.