Can't run your sample code #254

Open
KajiMaCN opened this issue Dec 4, 2024 · 0 comments
KajiMaCN commented Dec 4, 2024

I can't run your sample code:

import torch
from dig.threedgraph.dataset import QM9_3D
from dig.threedgraph.method import SphereNet
from dig.threedgraph.evaluation import threedEvaluator
from dig.threedgraph.method import run

device = torch.device("cuda:0")
# Load the dataset and split
dataset = QM9_3D(root='dataset/')
target = 'U0'
dataset.data.y = dataset.data[target]
split_idx = dataset.get_idx_split(len(dataset.data.y), train_size=110000, valid_size=10000, seed=42)
train_dataset, valid_dataset, test_dataset = dataset[split_idx['train']], dataset[split_idx['valid']], dataset[split_idx['test']]

# Define model, loss, and evaluation
model = SphereNet(energy_and_force=False, cutoff=5.0, num_layers=4,
                  hidden_channels=128, out_channels=1, int_emb_size=64,
                  basis_emb_size=8, out_emb_channels=256,
                  num_spherical=3, num_radial=6, envelope_exponent=5,
                  num_before_skip=1, num_after_skip=2, num_output_layers=3)
loss_func = torch.nn.L1Loss()
evaluation = threedEvaluator()

# Train and evaluate
run3d = run()
run3d.run(device, train_dataset, valid_dataset, test_dataset, model, loss_func, evaluation,
          epochs=20, batch_size=32, vt_batch_size=32, lr=0.0005, lr_decay_factor=0.5, lr_decay_step_size=15)

Environment: torch 2.4.1 (GPU build), Python 3.12.7
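
The failure does not seem specific to the script above: with this torch version, the same RuntimeError can be reproduced standalone, which points to newer PyTorch rejecting out= arguments on tensors that require grad. A minimal check of my own (not part of the DIG sample):

import torch

# My own minimal check, assuming the failure is the out= arange on a Parameter:
# recent PyTorch raises when the out= target requires grad while grad mode is enabled.
freq = torch.nn.Parameter(torch.empty(6))
torch.arange(1, freq.numel() + 1, out=freq)
# RuntimeError: aten::arange(): functions with out=... arguments don't support
# automatic differentiation, but one of the arguments requires grad.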

ERROR:

/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/dig/threedgraph/dataset/QM9_pyg.py:67: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  self.data, self.slices = torch.load(self.processed_paths[0])
/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/torch_geometric/data/in_memory_dataset.py:300: UserWarning: It is not recommended to directly access the internal storage format `data` of an 'InMemoryDataset'. If you are absolutely certain what you are doing, access the internal storage via `InMemoryDataset._data` instead to suppress this warning. Alternatively, you can access stacked individual attributes of every graph via `dataset.{attr_name}`.
  warnings.warn(msg)
Traceback (most recent call last):
  File "/home/asus/stuFile1/mzh/PythonProjects/3DGNN/main.py", line 16, in <module>
    model = SphereNet(energy_and_force=False, cutoff=5.0, num_layers=4,
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/dig/threedgraph/method/spherenet/spherenet.py", line 258, in __init__
    self.emb = emb(num_spherical, num_radial, self.cutoff, envelope_exponent)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/dig/threedgraph/method/spherenet/spherenet.py", line 28, in __init__
    self.dist_emb = dist_emb(num_radial, cutoff, envelope_exponent)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/dig/threedgraph/method/spherenet/features.py", line 179, in __init__
    self.reset_parameters()
  File "/home/asus/anaconda3/envs/DIG/lib/python3.12/site-packages/dig/threedgraph/method/spherenet/features.py", line 182, in reset_parameters
    torch.arange(1, self.freq.numel() + 1, out=self.freq).mul_(PI)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: aten::arange(): functions with out=... arguments don't support automatic differentiation, but one of the arguments requires grad.
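
A possible workaround, assuming the only problem is the in-place arange on the frequency Parameter in dist_emb.reset_parameters (features.py, line 182 in the traceback): running that initialization under torch.no_grad() avoids the autograd check. Below is a sketch of a monkey-patch applied before constructing SphereNet; the names are taken from the traceback and this is untested on my side:

import math
import torch
from dig.threedgraph.method.spherenet import features

def _reset_parameters_no_grad(self):
    # Same initialization as the original reset_parameters, but run under
    # no_grad so the out= arange on a Parameter does not trip the autograd check.
    with torch.no_grad():
        torch.arange(1, self.freq.numel() + 1, out=self.freq).mul_(math.pi)

# Patch the class before building the model so new dist_emb instances use it.
features.dist_emb.reset_parameters = _reset_parameters_no_grad

With this patch in place, SphereNet(...) should construct normally; the computed frequencies are the same values as before, only the autograd check is sidestepped.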