
linalg.norm has mismatched convention for complex numbers #666

Open
jcmgray opened this issue Dec 16, 2024 · 0 comments
jcmgray commented Dec 16, 2024

Hi all, thanks for such a still-useful project.

The convention generally adopted in autograd for complex numbers is that parameters can be updated as x -= lr * conj(g) to minimize a function, where g is the returned gradient. linalg.norm follows the opposite convention, which prevents it from being combined with other autograd functions in the same computation.
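
To make the convention concrete, here is a quick check against sum(abs(x)**2), which does follow it (the expected value in the comment is what the conj(g) update rule implies):

import autograd.numpy as anp
from autograd import grad

x = anp.array([1.0 + 2.0j, 3.0 - 1.0j])

# f(x) = sum(|x|^2) has steepest-descent direction -2x, so under the
# "update with -conj(g)" convention the returned gradient should be 2*conj(x).
g = grad(lambda x: anp.sum(anp.abs(x) ** 2))(x)
print(g)  # expected: [2.-4.j, 6.+2.j]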

This was mentioned in #393 alongside some potential fixes.

Minimal reproducer:

import numpy as np
import autograd.numpy as anp
from autograd import value_and_grad

rng = np.random.default_rng(42)

x = rng.normal(size=10) + 1j * rng.normal(size=10)
y = rng.normal(size=10) + 1j * rng.normal(size=10)

def foo(x, y):
    # simple loss function
    return anp.linalg.norm(x - y)

def foo2(x, y):
    # alternative implementation that follows the usual conj(g) convention
    d = x - y
    return anp.sum(anp.abs(d)**2)**0.5

# use foo2 for correct behavior
foo_vag = value_and_grad(foo)

for _ in range(10):
    v, g = foo_vag(x, y)
    # or remove conj here to make foo work
    x -= 0.05 * np.conj(g)
    # value should initially decrease
    print(v)
# 5.761837589211455
# 5.785341294273737
# 5.8095148223718915
# 5.834344369254469
# 5.8598161062729055
# 5.885916205539433
# 5.912630863722038
# 5.939946324450408
# 5.967848899318491
# 5.996324987479975

autograd v1.7.0.
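
In the meantime, the workaround in foo2 can be factored out into a helper (a sketch covering only the default 2-norm / Frobenius case; norm2 is a hypothetical name, not part of autograd):

import autograd.numpy as anp

def norm2(x, axis=None):
    # 2-norm built from functions that follow the usual conj(g) convention,
    # usable as a drop-in for the default case of anp.linalg.norm.
    return anp.sum(anp.abs(x) ** 2, axis=axis) ** 0.5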
