From 610e937db9457f73bfafcf98dbb38239333c24cd Mon Sep 17 00:00:00 2001
From: SJ2050cn
Date: Sun, 21 Nov 2021 15:17:37 +0800
Subject: [PATCH] Finish deriving the formulas of the softmax method.

---
 homework_04_logistic_regression/homework/README.md | 12 ++++++++----
 1 file changed, 8 insertions(+), 4 deletions(-)

diff --git a/homework_04_logistic_regression/homework/README.md b/homework_04_logistic_regression/homework/README.md
index 79298f9..16d9756 100644
--- a/homework_04_logistic_regression/homework/README.md
+++ b/homework_04_logistic_regression/homework/README.md
@@ -22,11 +22,13 @@ g(z_i)=g(\theta_i^T \mathbf{x})=\frac{e^{\theta_i^T\mathbf{x}}}{\sum\limits_{j=1
 $$
 Construct the likelihood function; suppose there are $m$ training samples:
 $$
-\begin{align}
+\begin{equation}
+\begin{split}
 L(\Theta)&=p(\mathbf{y}|\mathbf{X};\Theta) \\
 & = \prod\limits_{i=1}^{m} p(y^{i}|\mathbf{x}^{i};\Theta) \\
 & = \prod_{i=1}^m h_{\theta_i}(\mathbf{x})
-\end{align}
+\end{split}
+\end{equation}
 $$
 Take the logarithm of the likelihood function, which becomes:
 $$
@@ -41,10 +43,12 @@ $$
 $$
 Take the partial derivative of the transformed likelihood function with respect to $\theta$; here we use the case of a single training sample as an example:
 $$
-\begin{align}
+\begin{equation}
+\begin{split}
 \frac{\partial}{\partial\theta_k}l(\Theta)&=\frac{\partial l(\Theta)}{\partial{z_k}}\cdot \frac{\partial z_k}{\partial \theta_k} \\
 &=(y_k-h_{\theta_k}(\mathbf{x}))\mathbf{x}
-\end{align}
+\end{split}
+\end{equation}
 $$
 The expression for $y_k$ in the equation above is as follows:
 $$
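
The second hunk ends with the single-sample gradient $(y_k-h_{\theta_k}(\mathbf{x}))\mathbf{x}$. Below is a minimal numerical sketch of that result, assuming $\mathbf{y}$ is the one-hot indicator the README defines right after this hunk; the NumPy setup and the helper names `softmax` and `log_likelihood` are illustrative and not taken from the homework code.

```python
# Sketch: numerically check the derived gradient (y_k - h_{theta_k}(x)) * x
# for the softmax log-likelihood of a single sample (names are illustrative).
import numpy as np

def softmax(z):
    z = z - z.max()              # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def log_likelihood(Theta, x, y):
    # l(Theta) = sum_k y_k * log h_{theta_k}(x), with y one-hot
    return float(y @ np.log(softmax(Theta @ x)))

rng = np.random.default_rng(0)
K, D = 4, 3                      # number of classes, number of features
Theta = rng.normal(size=(K, D))  # one parameter vector theta_k per class
x = rng.normal(size=D)           # a single training sample
y = np.eye(K)[1]                 # one-hot label (assumed form of y_k)

# Analytic gradient from the derivation: row k equals (y_k - h_{theta_k}(x)) * x
analytic = np.outer(y - softmax(Theta @ x), x)

# Central finite-difference gradient for comparison
eps = 1e-6
numeric = np.zeros_like(Theta)
for k in range(K):
    for d in range(D):
        Tp, Tm = Theta.copy(), Theta.copy()
        Tp[k, d] += eps
        Tm[k, d] -= eps
        numeric[k, d] = (log_likelihood(Tp, x, y) - log_likelihood(Tm, x, y)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-5))   # expected: True
```

The central-difference gradient should match the analytic `np.outer(y - softmax(Theta @ x), x)` entry for entry, which is a quick way to confirm the chain-rule step $\partial l/\partial z_k = y_k - h_{\theta_k}(\mathbf{x})$ used in the derivation.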