Today we finally get to implement softmax regression. Think of it as logistic regression's big brother: the output is simply a probability distribution over multiple classes instead of two. In this article we will grind through the math step by step.
Softmax function
For an input $x \in \mathbb{R}^n$ and $K$ classes, softmax maps the scores $z_k = {\theta^{(k)}}^T x$ to a probability distribution:

$$\sigma(z)_k = \frac{e^{z_k}}{\sum_{j=1}^{K} e^{z_j}}, \qquad k = 1, \dots, K$$

When $K = 2$, this reduces to the logistic sigmoid, which is why softmax regression is the multiclass generalization of logistic regression.

The partial derivatives split into two cases, $j \neq k$ and $j = k$:

$$\frac{\partial \sigma(z)_k}{\partial z_j} = \begin{cases} \sigma(z)_k \left(1 - \sigma(z)_k\right) & j = k \\ -\sigma(z)_j \, \sigma(z)_k & j \neq k \end{cases}$$

Naively exponentiating large scores overflows, so for numerical stability we multiply the numerator and denominator by a constant $C$. A common choice is

$$\log C = - \max_j {\theta^{(j)}}^T x^{(i)}$$

i.e. subtract the largest score before exponentiating; the probabilities are unchanged but every exponent is at most zero.

With the indicator function $1\{\text{True}\} = 1,\ 1\{\text{False}\} = 0$, the cost function is the cross-entropy over $m$ training examples:

$$J(\theta) = - \sum_{i=1}^{m} \sum_{k=1}^{K} 1\{y^{(i)} = k\} \log \frac{e^{{\theta^{(k)}}^T x^{(i)}}}{\sum_{j=1}^{K} e^{{\theta^{(j)}}^T x^{(i)}}}$$
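Putting the formulas above together, here is a minimal NumPy sketch of softmax regression with the $\log C = -\max$ stability trick and batch gradient descent on the cross-entropy. The function names (`softmax`, `fit_softmax`, `predict`) and hyperparameters are my own choices for illustration, not code from the original article.

```python
import numpy as np

def softmax(scores):
    # Subtract the per-row max before exponentiating (the log C = -max trick),
    # so np.exp never overflows; the resulting probabilities are unchanged.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # Mean negative log-likelihood; the indicator 1{y^(i) = k}
    # picks out exactly one probability per example.
    m = y.shape[0]
    return -np.log(probs[np.arange(m), y] + 1e-12).mean()

def fit_softmax(X, y, K, lr=0.1, epochs=200):
    # Batch gradient descent on theta, shape (n features, K classes).
    m, n = X.shape
    theta = np.zeros((n, K))
    onehot = np.eye(K)[y]                  # 1{y^(i) = k} as an (m, K) matrix
    for _ in range(epochs):
        probs = softmax(X @ theta)
        grad = X.T @ (probs - onehot) / m  # gradient of the cross-entropy
        theta -= lr * grad
    return theta

def predict(theta, X):
    # The max score wins; softmax is monotone, so no need to normalize here.
    return np.argmax(X @ theta, axis=1)
```

The gradient `X.T @ (probs - onehot) / m` is exactly the derivative of $J(\theta)$ obtained by combining the two Jacobian cases above, which is why no explicit case split appears in the code.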
References
- https://en.wikipedia.org/wiki/Softmax_function
- https://www.kdnuggets.com/2016/07/softmax-regression-related-logistic-regression.html
- http://ufldl.stanford.edu/tutorial/supervised/SoftmaxRegression/
- https://math.stackexchange.com/questions/1428344/what-is-the-derivation-of-the-derivative-of-softmax-regression-or-multinomial-l
- https://houxianxu.github.io/2015/04/23/logistic-softmax-regression/