Article Details

Scaling to trillion-parameter model training on AWS - Amazon Science

Retrieved on: 2022-09-26 19:13:09


Original article: https://www.amazon.science/blog/scaling-to-trillion-parameter-model-training-on-aws

Excerpt

AWS recently announced a preview of Amazon EC2 P4de GPU instances powered by 400 Gbps networking and 80GB GPU memory. P4de provides twice as much ...
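The instance attributes the excerpt mentions (network bandwidth and per-GPU memory) can be checked programmatically with the EC2 DescribeInstanceTypes API. Below is a minimal sketch using boto3; the instance size "p4de.24xlarge" and the region are assumptions, not taken from the article, and the query only works in regions where P4de is offered.

```python
import boto3

# Sketch: look up P4de hardware details via the EC2 DescribeInstanceTypes API.
# Assumes AWS credentials are configured and that "p4de.24xlarge" is available
# in the chosen region (it is a preview instance type, so availability varies).
ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.describe_instance_types(InstanceTypes=["p4de.24xlarge"])
info = resp["InstanceTypes"][0]

gpu = info["GpuInfo"]["Gpus"][0]
print("GPU:", gpu["Manufacturer"], gpu["Name"])                # GPU model
print("GPU count:", gpu["Count"])                              # GPUs per instance
print("Memory per GPU (MiB):", gpu["MemoryInfo"]["SizeInMiB"]) # e.g. ~80 GB
print("Network performance:", info["NetworkInfo"]["NetworkPerformance"])
```

The same call works for "p4d.24xlarge", which makes it easy to compare the two generations' GPU memory side by side.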

Article found on: www.amazon.science

