distributed_train.sh seems to support single-machine multi-GPU training. Does this project support multi-machine multi-GPU training?

Replies: 1 comment
@imchinfei Yes, you need to modify distributed_train.sh to include the master node address/port and the per-node rank information. You can find examples of multi-node launch scripts in the PyTorch code and docs (e.g. torch.distributed.launch / torchrun).
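For reference, here is a minimal sketch of what such a modification could look like. It assumes distributed_train.sh wraps torchrun around train.py, as is common in PyTorch projects; `NNODES`, `NODE_RANK`, `MASTER_ADDR`, and `MASTER_PORT` are placeholder environment variables you set per machine, not names taken from this repo:

```bash
#!/bin/bash
# Hypothetical multi-node variant of distributed_train.sh (a sketch, not the
# repo's actual script). Set NNODES, NODE_RANK, MASTER_ADDR, and MASTER_PORT
# on each machine; NODE_RANK must be 0 on the master node and 1..NNODES-1
# on the worker nodes.
NUM_PROC=$1   # number of GPUs on this machine, passed as the first argument
shift
torchrun \
    --nproc_per_node="$NUM_PROC" \
    --nnodes="${NNODES:-2}" \
    --node_rank="${NODE_RANK:?set NODE_RANK (0 on the master node)}" \
    --master_addr="${MASTER_ADDR:?set to the IP of the rank-0 node}" \
    --master_port="${MASTER_PORT:-29500}" \
    train.py "$@"
```

You would run the same command on every machine, changing only `NODE_RANK`, e.g. `NNODES=2 NODE_RANK=0 MASTER_ADDR=10.0.0.1 ./distributed_train.sh 8 <train args>` on the master node and `NODE_RANK=1` on the second.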