👋 Welcome to Scott’s Tech Doc

Hi, this is Scott. I document my learning notes in this blog. I’m currently a PhD student at Purdue; see my research homepage for more information.

Loss functions in neural networks

Kullback-Leibler divergence. Information theory quantifies the intuition that: likely events should have low information content; less likely events should have higher information content; and independent events should have additive information. For example, finding out that a tossed coin has come up heads twice should convey twice as much information as finding out it has come up heads once. Self-information, $I(x)=-\log P(x)$, deals only with a single outcome. Shannon entropy, $H(\mathrm{x})=\mathbb{E}_{\mathrm{x} \sim P}[I(x)]=-\mathbb{E}_{\mathrm{x} \sim P}[\log P(x)]$, quantifies the amount of uncertainty in an entire probability distribution. KL divergence and cross-entropy measure how different two distributions over the same random variable $\mathrm{x}$ are: $D_{\mathrm{KL}}(P \| Q)=\mathbb{E}_{\mathrm{x} \sim P}\left[\log \frac{P(x)}{Q(x)}\right]=\mathbb{E}_{\mathrm{x} \sim P}[\log P(x)-\log Q(x)]$. Properties: non-negative....
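The quantities in this excerpt can be sketched directly for discrete distributions. A minimal illustration (not from the original post; the function names and toy coin distributions are my own):

```python
import math

def self_information(p):
    # I(x) = -log P(x): rarer events carry more information (in nats)
    return -math.log(p)

def entropy(P):
    # H(x) = -E_{x~P}[log P(x)]: uncertainty of the whole distribution
    return -sum(p * math.log(p) for p in P if p > 0)

def kl_divergence(P, Q):
    # D_KL(P || Q) = E_{x~P}[log P(x) - log Q(x)]
    return sum(p * (math.log(p) - math.log(q)) for p, q in zip(P, Q) if p > 0)

P = [0.5, 0.5]   # fair coin
Q = [0.9, 0.1]   # biased coin

# Two independent heads convey twice the information of one (additivity):
print(self_information(0.25), 2 * self_information(0.5))
print(entropy(P))              # entropy of a fair coin = log 2 nats
print(kl_divergence(P, P))     # zero for identical distributions
print(kl_divergence(P, Q))     # strictly positive (non-negativity)
```

Note that `kl_divergence(P, Q) != kl_divergence(Q, P)` in general, which is why KL divergence is not a true distance metric.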

October 13, 2020 · 2 min · Scott Du

Deploying Jupyter Notebook on a server

April 22, 2020 · 1 min · Scott Du

Continuous-integration writing and syncing with Hugo

January 22, 2020 · 2 min · Scott Du

Go Hugo!

January 21, 2020 · 2 min · Scott Du