Merge pull request #4 from togethercomputer/fix_rotar_emb
Replace RotaryEmbedding with GPTNeoXRotaryEmbedding
azahed98 authored Aug 10, 2023
2 parents 43a7562 + a6553fe commit d0027d5
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/models/gpt_neox/modeling_gpt_neox.py
@@ -114,7 +114,7 @@ def __init__(self, config):
          self._init_bias(config.max_position_embeddings)

          self.register_buffer("masked_bias", torch.tensor(-1e9), persistent=False)
-         self.rotary_emb = RotaryEmbedding(
+         self.rotary_emb = GPTNeoXRotaryEmbedding(
              self.rotary_ndims, config.max_position_embeddings, base=config.rotary_emb_base, device = 'cuda:0'
          )
          self.register_buffer(
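Both the old `RotaryEmbedding` and the `GPTNeoXRotaryEmbedding` it is swapped for here implement the same underlying idea: each pair of hidden dimensions is rotated by a position-dependent angle drawn from a geometric frequency schedule controlled by `base` (the commit passes `config.rotary_emb_base`). The following is a minimal pure-Python sketch of that schedule and rotation; the helper names are illustrative and not part of the transformers API.

```python
import math

def rotary_inv_freq(dim, base=10000.0):
    """Inverse frequencies for rotary embeddings.

    inv_freq[i] = 1 / base**(2*i / dim) for i in 0 .. dim//2 - 1,
    mirroring the schedule rotary embedding modules precompute.
    """
    return [1.0 / (base ** (2 * i / dim)) for i in range(dim // 2)]

def rotate_pair(x, y, theta):
    """Rotate one (x, y) pair of hidden dims by angle theta.

    At position p with frequency f, theta = p * f; this 2-D rotation
    is the core operation rotary embeddings apply to queries and keys.
    """
    c, s = math.cos(theta), math.sin(theta)
    return x * c - y * s, x * s + y * c
```

In the actual module, the angles for all positions and frequencies are precomputed into cos/sin caches once, which is why `max_position_embeddings` and `device` appear in the constructor call above.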
