
Commit

notation fix
dgedon committed Nov 9, 2023
1 parent 05a17e4 commit 0169693
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions preparatory_notebooks/F3_logistic_regression.ipynb
@@ -150,7 +150,7 @@
"Using this model, we can compute the average misclassification loss given a set of parameters $\\theta$. This will be our cost function:\n",
"\n",
"$$\n",
- "J_{\\text{misclass}}(w) = \\frac{1}{N} \\sum_{i=1}^N \\ell_{\\text{misclass}}(y_i, \\hat{y}_i ; \\theta)\n",
+ "J_{\\text{misclass}}(\\theta) = \\frac{1}{N} \\sum_{i=1}^N \\ell_{\\text{misclass}}(y_i, \\hat{y}_i ; \\theta)\n",
"$$\n",
"\n",
"where $N$ is the number of samples in the dataset.\n",
@@ -196,7 +196,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Below, we plot the decision boundary for a classifier with the initialized $w$ parameters, alongside our data points which are colored according to their true label. Moreover, the misclassification cost is also calculated for the classifier and printed.\n",
+ "Below, we plot the decision boundary for a classifier with the initialized $\\theta$ parameters, alongside our data points which are colored according to their true label. Moreover, the misclassification cost is also calculated for the classifier and printed.\n",
"\n",
"Tasks:\n",
"1. Play around with the parameters $\\theta_1$ and $\\theta_2$ to: \n",
@@ -231,10 +231,10 @@
"\n",
"In this section, we will look at the same classifier as before, but this time we will use the logistic loss instead of the misclassification loss. We will also visualize the loss surface in addition to the decision boundary.\n",
"\n",
- "Remembering our definition of the logistic loss, we can compute the average logistic loss given a set of parameters $w$. This will be our cost function:\n",
+ "Remembering our definition of the logistic loss, we can compute the average logistic loss given a set of parameters $\\theta$. This will be our cost function:\n",
"\n",
"$$\n",
- "J_{\\text{logistic}}(w) = \\frac{1}{N} \\sum_{i=1}^N \\ell_{\\text{logistic}}(y_i, \\hat{y}_i ; \\theta)\n",
+ "J_{\\text{logistic}}(\\theta) = \\frac{1}{N} \\sum_{i=1}^N \\ell_{\\text{logistic}}(y_i, \\hat{y}_i ; \\theta)\n",
"$$\n",
"\n",
"where $N$ is the number of samples in the dataset.\n",
@@ -283,7 +283,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Below, we again draw the decision boundary of our classifier for the same dataset. However, this time we calculate and print the logistic loss instead of the misclassification loss. Moreover, the loss surface is also plotted, where you can see how the loss changes for different values of $w_1$ and $w_2$, for this specific dataset.\n",
+ "Below, we again draw the decision boundary of our classifier for the same dataset. However, this time we calculate and print the logistic loss instead of the misclassification loss. Moreover, the loss surface is also plotted, where you can see how the loss changes for different values of $\\theta_1$ and $\\theta_2$, for this specific dataset.\n",
"\n",
"Task:\n",
"1. Try again to minimize the cost by changing $\\theta_1$ and $\\theta_2$ in order to separate the data points as best as possible. Note how the best decision boundary does not yield a cost of 0, but rather a small value now. What does this mean for the classifier?\n",
@@ -346,7 +346,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.7.3"
+ "version": "3.7.4"
}
},
"nbformat": 4,