r0.17: [MXNet] Unpin matplotlib==3.4 Dep & Fix #2046 (#2080)
* Sync d2l lib

* [MXNet] Fix explicit NumPy coercing for matplotlib>3.4
AnirudhDagar authored Mar 23, 2022
1 parent 61e5ed8 commit e770440
Showing 5 changed files with 38 additions and 7 deletions.
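The recurring change across the notebook files below is explicit NumPy coercion: per the commit message, matplotlib releases newer than 3.4 no longer implicitly coerce MXNet ndarrays, so each one is converted with `.asnumpy()` before being handed to a plotting call. A framework-free sketch of that pattern (`FakeNDArray` and `to_plottable` are hypothetical stand-ins for illustration, not MXNet's or d2l's actual API):

```python
class FakeNDArray:
    """Hypothetical stand-in for a framework ndarray that matplotlib
    cannot consume directly (NOT MXNet's actual class)."""

    def __init__(self, values):
        self._values = list(values)

    def asnumpy(self):
        # In MXNet this returns a numpy.ndarray; a plain list stands in
        # here for "data matplotlib understands".
        return list(self._values)


def to_plottable(x):
    # The commit's pattern: coerce explicitly via .asnumpy() before
    # plotting; plain sequences pass through unchanged.
    return x.asnumpy() if hasattr(x, "asnumpy") else x


print(to_plottable(FakeNDArray([1, 2, 3])))
```

Every `x.asnumpy()` call added in the hunks below is an instance of this conversion, applied at the plotting boundary rather than inside the math.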
```diff
@@ -352,7 +352,7 @@ z = np.exp(- x**2 - y**2)
 # Plot function
 ax = d2l.plt.figure().add_subplot(111, projection='3d')
-ax.plot_wireframe(x, y, z)
+ax.plot_wireframe(x.asnumpy(), y.asnumpy(), z.asnumpy())
 d2l.plt.xlabel('x')
 d2l.plt.ylabel('y')
 d2l.plt.xticks([-2, -1, 0, 1, 2])
```
```diff
@@ -553,8 +553,10 @@ w = np.exp(-1)*(-1 - (x + 1) + (x + 1)**2 + y**2)
 # Plot function
 ax = d2l.plt.figure().add_subplot(111, projection='3d')
-ax.plot_wireframe(x, y, z, **{'rstride': 10, 'cstride': 10})
-ax.plot_wireframe(x, y, w, **{'rstride': 10, 'cstride': 10}, color='purple')
+ax.plot_wireframe(x.asnumpy(), y.asnumpy(), z.asnumpy(),
+                  **{'rstride': 10, 'cstride': 10})
+ax.plot_wireframe(x.asnumpy(), y.asnumpy(), w.asnumpy(),
+                  **{'rstride': 10, 'cstride': 10}, color='purple')
 d2l.plt.xlabel('x')
 d2l.plt.ylabel('y')
 d2l.set_figsize()
```
16 changes: 14 additions & 2 deletions chapter_linear-networks/linear-regression.md
````diff
@@ -342,7 +342,7 @@ rather than writing costly for-loops in Python.**)
 %matplotlib inline
 from d2l import mxnet as d2l
 import math
-import numpy as np
+from mxnet import np
 import time
 ```
````

````diff
@@ -481,7 +481,19 @@ def normal(x, mu, sigma):
 We can now (**visualize the normal distributions**).
 
 ```{.python .input}
-#@tab all
+#@tab mxnet
+# Use numpy again for visualization
+x = np.arange(-7, 7, 0.01)
+# Mean and standard deviation pairs
+params = [(0, 1), (0, 2), (3, 1)]
+d2l.plot(x.asnumpy(), [normal(x, mu, sigma).asnumpy() for mu, sigma in params],
+         xlabel='x', ylabel='p(x)', figsize=(4.5, 2.5),
+         legend=[f'mean {mu}, std {sigma}' for mu, sigma in params])
+```
+
+```{.python .input}
+#@tab pytorch, tensorflow
+# Use numpy again for visualization
 x = np.arange(-7, 7, 0.01)
````
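For context, the `normal(x, mu, sigma)` helper named in the hunk header above computes the Gaussian probability density. A plain-Python scalar equivalent of what it evaluates (a sketch only; the book's version is vectorized over arrays):

```python
import math

def normal(x, mu, sigma):
    # Gaussian probability density function:
    # p(x) = 1 / sqrt(2*pi*sigma^2) * exp(-(x - mu)^2 / (2*sigma^2))
    p = 1 / math.sqrt(2 * math.pi * sigma**2)
    return p * math.exp(-0.5 / sigma**2 * (x - mu)**2)

# Density of a standard normal (mu=0, sigma=1) at its mean,
# i.e. 1 / sqrt(2*pi).
print(round(normal(0, 0, 1), 4))
```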
19 changes: 18 additions & 1 deletion chapter_optimization/optimization-intro.md
````diff
@@ -150,7 +150,24 @@ annotate('saddle point', (0, -0.2), (-0.52, -5.0))
 Saddle points in higher dimensions are even more insidious, as the example below shows. Consider the function $f(x, y) = x^2 - y^2$. It has its saddle point at $(0, 0)$. This is a maximum with respect to $y$ and a minimum with respect to $x$. Moreover, it *looks* like a saddle, which is where this mathematical property got its name.
 
 ```{.python .input}
-#@tab all
+#@tab mxnet
+x, y = d2l.meshgrid(
+    d2l.linspace(-1.0, 1.0, 101), d2l.linspace(-1.0, 1.0, 101))
+z = x**2 - y**2
+ax = d2l.plt.figure().add_subplot(111, projection='3d')
+ax.plot_wireframe(x.asnumpy(), y.asnumpy(), z.asnumpy(),
+                  **{'rstride': 10, 'cstride': 10})
+ax.plot([0], [0], [0], 'rx')
+ticks = [-1, 0, 1]
+d2l.plt.xticks(ticks)
+d2l.plt.yticks(ticks)
+ax.set_zticks(ticks)
+d2l.plt.xlabel('x')
+d2l.plt.ylabel('y');
+```
+
+```{.python .input}
+#@tab pytorch, tensorflow
 x, y = d2l.meshgrid(
     d2l.linspace(-1.0, 1.0, 101), d2l.linspace(-1.0, 1.0, 101))
 z = x**2 - y**2
````
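The prose in this hunk states that $f(x, y) = x^2 - y^2$ has a saddle at $(0, 0)$: a minimum along $x$ and a maximum along $y$. That claim is easy to check numerically without any plotting backend:

```python
def f(x, y):
    return x**2 - y**2

# Fix y = 0: moving away from x = 0 increases f, so x = 0 is a minimum.
assert f(0, 0) < f(0.1, 0) and f(0, 0) < f(-0.1, 0)
# Fix x = 0: moving away from y = 0 decreases f, so y = 0 is a maximum.
assert f(0, 0) > f(0, 0.1) and f(0, 0) > f(0, -0.1)
print("(0, 0) is a saddle point of f")
```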
2 changes: 1 addition & 1 deletion setup.py
```diff
@@ -4,7 +4,7 @@
 requirements = [
     'jupyter==1.0.0',
     'numpy==1.21.5',
-    'matplotlib==3.4',
+    'matplotlib==3.5.1',
     'requests==2.25.1',
     'pandas==1.2.4'
 ]
```
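The setup.py change bumps the exact pin from `matplotlib==3.4` to `matplotlib==3.5.1`. One reason such pins are compared component-wise rather than as strings (a stdlib-only sketch; real resolvers such as pip use `packaging.version`, which also handles pre-releases):

```python
def parse_version(v):
    # "3.5.1" -> (3, 5, 1); purely numeric release segments only,
    # which suffices for the pins in this setup.py.
    return tuple(int(part) for part in v.split("."))

assert parse_version("3.5.1") > parse_version("3.4")
# Lexicographic string comparison gets multi-digit components wrong:
assert "3.10" < "3.4" and parse_version("3.10") > parse_version("3.4")
print("numeric comparison ordered the pins correctly")
```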
