Sketching the Journey of Learning

Bali's notes & demos.


Notes for Prof. Hung-Yi Lee's ML Lecture: RNN

RNN

Notes for Prof. Hung-Yi Lee's ML Lecture: Support Vector Machine

My Intuitive Explanation

Notes for Prof. Hung-Yi Lee's ML Lecture: Transfer Learning

Transfer learning is about scenarios where we have data that is not directly related to the task considered. Besides, the usage of terminology in the field of transfer learning is somewhat chaotic, so the meaning of some terms may vary between articles.

Notes for Prof. Hung-Yi Lee's ML Lecture: Deep Generative Model

Component by Component

Notes for Prof. Hung-Yi Lee's ML Lecture: More about Auto-Encoder

More than minimizing reconstruction error

Notes for Prof. Hung-Yi Lee's ML Lecture 16: Auto-Encoder

With an auto-encoder, we train an NN encoder and an NN decoder simultaneously. The encoder is a function that generates a code from the input, and the decoder is a function that generates an output from a code. The code is also called the embedding, the latent representation, or the latent code. To train the encoder and the decoder, we cascade the decoder after the encoder, and train the cascaded neural network to produce outputs that are as close as possible to the inputs.
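The idea above can be sketched with a toy example (an assumption for illustration, not code from the lecture): a one-layer linear encoder and a one-layer linear decoder, trained jointly by gradient descent so that the decoder's output reconstructs the encoder's input.

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(200, 8))   # toy data: 200 samples, 8 features
d = 3                           # dimension of the code / latent representation

# encoder and decoder weights, initialized small
W_enc = rng.normal(scale=0.1, size=(8, d))
W_dec = rng.normal(scale=0.1, size=(d, 8))
lr = 0.05

for _ in range(3000):
    code = X @ W_enc            # encoder: input -> latent code
    recon = code @ W_dec        # decoder: code -> reconstruction
    err = recon - X             # reconstruction error
    # gradients of the mean squared reconstruction error
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(mse)  # reconstruction error, well below the initial error of about 1.0
```

Because both layers are linear, this sketch can at best recover the top principal subspace of the data; the lectures use nonlinear NN encoders and decoders, which are strictly more expressive.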

Notes for Prof. Hung-Yi Lee's ML Lecture 15: Neighbor Embedding

Manifold Learning

Notes for Prof. Hung-Yi Lee's ML Lecture 14: Word Embedding

Approaches to Represent Words

Notes for Prof. Hung-Yi Lee's ML Lecture 13: Unsupervised Learning - Linear Methods

Unsupervised Learning

Notes for Prof. Hung-Yi Lee's ML Lecture 12: Semi-Supervised Learning

In supervised learning, all the training data are labelled, i.e. we have