
Commit 888a63a

fix an issue for DP on Megatron-DeepSpeed (#368)
1 parent ebe8025

File tree: 1 file changed (+2 −2 lines)


pretrain_gpt.py (2 additions, 2 deletions)
@@ -36,8 +36,8 @@ def model_provider(pre_process=True, post_process=True):
 
     args = get_args()
     config = core_transformer_config_from_args(args)
-    if hasattr(mpu, 'get_sequence_parallel_group'):
-        dpg = mpu.get_sequence_parallel_group()
+    if hasattr(mpu, 'get_sequence_data_parallel_group'):
+        dpg = mpu.get_sequence_data_parallel_group()
     elif hasattr(mpu, 'get_data_parallel_group'):
         dpg = mpu.get_data_parallel_group()
     else:
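
For context, a minimal sketch of the compatibility pattern this fix relies on: probe the mpu module with hasattr and prefer the newer combined sequence-data-parallel group, falling back to the plain data-parallel group on older builds. The standalone helper and its name get_dp_group are hypothetical; only the two mpu accessor names come from the diff above.

# Hedged sketch; get_dp_group is a hypothetical wrapper, but the mpu
# accessor names match the diff above.
def get_dp_group(mpu):
    """Return the process group used for data-parallel (DP) communication.

    Newer Megatron-DeepSpeed builds expose get_sequence_data_parallel_group()
    (the combined sequence + data parallel group); older ones only provide
    get_data_parallel_group(). Probing with hasattr keeps a single code path
    working across both versions.
    """
    if hasattr(mpu, 'get_sequence_data_parallel_group'):
        return mpu.get_sequence_data_parallel_group()
    elif hasattr(mpu, 'get_data_parallel_group'):
        return mpu.get_data_parallel_group()
    raise AttributeError('mpu exposes no known data-parallel group accessor')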
