Industry analysts predict that Deep Learning will account for the majority of cloud workloads, and that training Deep Learning models will represent the majority of server applications in the next few years. Among Deep Learning workloads, foundation models, a new class of AI models trained on broad data (typically via self-supervision) with billions of parameters, are expected to consume most of this infrastructure.
It’s a fascinating topic that the SNIA Cloud Storage Technologies Initiative (CSTI) will tackle on March 15, 2023, in our live webcast, “Training Deep Learning Models in the Cloud.” Our SNIA Deep Learning experts, Milind Pandit of Habana (an Intel company) and Seelam Seetharami of IBM, will discuss how Deep Learning models are gaining prominence across industries and provide examples of the benefits of AI adoption.
We’ll enumerate considerations for selecting Deep Learning infrastructure in on-premises and cloud data centers. The session will assess various solution approaches and identify the challenges enterprises face in adopting AI and Deep Learning technologies. We’ll answer questions like:
- What benefits are enterprises enjoying from innovations in AI, Machine Learning, and Deep Learning?
- How are organizations leveraging cloud-native platforms such as Kubernetes to manage the complexity of rapidly evolving AI software stacks (TensorFlow, PyTorch, etc.)? (See the sketch after this list for a simple example.)
- What are the challenges in operationalizing Deep Learning infrastructure?
- How can Deep Learning solutions scale?
- Besides cost, time-to-train, data storage capacity and data bandwidth, what else should be considered when designing and selecting a Deep Learning infrastructure?
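To make that Kubernetes question a bit more concrete, here is a minimal sketch (not part of the webcast material) of how a team might use the official Kubernetes Python client to submit a containerized PyTorch training run as a batch Job. The container image, training script name, and GPU resource request are illustrative placeholders, and a real deployment would add storage volumes, distributed-training configuration, and monitoring.

```python
# Minimal sketch: submitting a containerized PyTorch training run as a
# Kubernetes batch Job using the official Kubernetes Python client.
# The image, command, and GPU request below are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() when running inside the cluster

container = client.V1Container(
    name="pytorch-train",
    image="pytorch/pytorch:latest",      # placeholder training image
    command=["python", "train.py"],      # placeholder training script
    resources=client.V1ResourceRequirements(
        limits={"nvidia.com/gpu": "1"}   # GPU request; requires the NVIDIA device plugin
    ),
)

template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "dl-training"}),
    spec=client.V1PodSpec(restart_policy="Never", containers=[container]),
)

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="dl-training-job"),
    spec=client.V1JobSpec(template=template, backoff_limit=2),
)

# Submit the Job; Kubernetes schedules the pod and restarts it on failure
# up to backoff_limit times.
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```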
Our presenters will be available to answer your questions. Register here to reserve your spot. We look forward to seeing you on March 15th.