Hi all, thanks for such a still useful project.
The convention adopted in autograd for complex numbers is generally that parameters can be updated in the direction of `conj(g)` to minimize a function. `linalg.norm` follows the opposite convention, which prevents it from being combined with other autograd functions in the same computation.
This was mentioned in #393 alongside some potential fixes.
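For context, here is a minimal sketch (not from the original report) of the convention the rest of autograd follows, using `anp.abs`, which composes correctly: for a real-valued loss of a complex parameter, stepping along `-conj(g)` descends.

```python
import numpy as np
import autograd.numpy as anp
from autograd import grad


def f(z):
    # real-valued loss of a complex parameter; minimized at z = 0
    return anp.sum(anp.abs(z) ** 2)


z = np.array([1.0 + 2.0j, -3.0 + 0.5j])
g = grad(f)(z)

# under the convention described above, -conj(g) is the descent direction
z_next = z - 0.1 * np.conj(g)
assert f(z_next) < f(z)
```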
Minimal reproducer:
```python
import numpy as np
import autograd.numpy as anp
from autograd import value_and_grad

rng = np.random.default_rng(42)
x = rng.normal(size=10) + 1j * rng.normal(10)
y = rng.normal(size=10) + 1j * rng.normal(10)

def foo(x, y):
    # simple loss function
    return anp.linalg.norm(x - y)

def foo2(x, y):
    # alt impl
    d = x - y
    return anp.sum(anp.abs(d) ** 2) ** 0.5

# use foo2 for correct behavior
foo_vag = value_and_grad(foo)

for _ in range(10):
    v, g = foo_vag(x, y)
    # or remove conj here to make foo work
    x -= 0.05 * np.conj(g)
    # value should initially decrease
    print(v)

# 5.761837589211455
# 5.785341294273737
# 5.8095148223718915
# 5.834344369254469
# 5.8598161062729055
# 5.885916205539433
# 5.912630863722038
# 5.939946324450408
# 5.967848899318491
# 5.996324987479975
```
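As a sanity check (not part of the original report), running the same update loop with `foo2` instead, reusing `y`, `rng`, and the imports from the reproducer above:

```python
# the alternative implementation descends under the same update rule
foo2_vag = value_and_grad(foo2)
x2 = rng.normal(size=10) + 1j * rng.normal(size=10)  # fresh starting point

for _ in range(10):
    v, g = foo2_vag(x2, y)
    x2 -= 0.05 * np.conj(g)
    print(v)  # now decreases at each step, as expected
```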
autograd v1.7.0.