Ceph Q&A

In a little over a month, more than 1,500 people have viewed the SNIA Cloud Storage Technologies Initiative (CSTI) live webinar, “Ceph: The Linux of Storage Today,” with SNIA experts Vincent Hsu and Tushar Gohad. If you missed it, you can watch it on-demand at the SNIA Educational Library. The live audience was extremely engaged with our presenters, asking several interesting questions. As promised, Vincent and Tushar have answered them here.

Given the high level of interest in this topic, the CSTI is planning additional sessions on Ceph. Please follow us @SNIACloud or on SNIA LinkedIn for dates.

Q: How many snapshots can Ceph support per cluster?

A: There is no per-cluster limit. In the Ceph filesystem (cephfs), snapshots can be created on a per-path basis, and the configurable default limit is currently 100 snapshots per path. Ceph block storage (rbd) does not impose a limit on the number of snapshots; however, the native Linux kernel rbd client is limited to 510 snapshots per image.

Q: Does Ceph provide deduplication? If so, is it across object, file, and block storage? Read More
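For readers who want to experiment with the rbd snapshot behavior described above, here is a minimal sketch using Ceph’s Python bindings (the rados and rbd modules). The config file path, pool name (“rbd”), image name (“myimage”), and snapshot name are placeholder assumptions for illustration, not values from the webinar.

```python
# Minimal sketch: create and list RBD snapshots with Ceph's Python bindings.
# Assumes the rados/rbd Python packages are installed, the cluster is reachable,
# and the pool "rbd" and image "myimage" (placeholder names) already exist.
import rados
import rbd

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")  # path is a placeholder
cluster.connect()
try:
    ioctx = cluster.open_ioctx("rbd")            # pool name is a placeholder
    try:
        image = rbd.Image(ioctx, "myimage")      # image name is a placeholder
        try:
            image.create_snap("before-upgrade")  # point-in-time snapshot of the image
            for snap in image.list_snaps():      # each entry is a dict with id/name/size
                print(snap["id"], snap["name"], snap["size"])
        finally:
            image.close()
    finally:
        ioctx.close()
finally:
    cluster.shutdown()
```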

Here’s Why Ceph is the Linux of Storage Today

Data is one of the most critical resources of our time. Storage for data has always been a critical architectural element of every data center, requiring careful consideration of storage performance, scalability, reliability, data protection, durability, and resilience. A decade ago, the market was aggressively embracing public cloud storage because of its agility and scalability. In the last few years, people have been rethinking that approach, moving toward on-premises storage with cloud consumption models. The new on-premises cloud native architecture promises the traditional data center’s security and reliability combined with cloud agility and scalability.

Ceph, an Open Source project for enterprise unified software-defined storage, represents a compelling solution for this cloud native on-premises architecture and will be the topic of our next SNIA Cloud Storage Technologies Initiative webinar, “Ceph: The Linux of Storage Today.”

This webinar will discuss: Read More

Edge AI Q&A

At our recent SNIA Cloud Storage Technologies Initiative (CSTI) webinar, “Why Distributed Edge Data is the Future of AI,” our expert speakers, Rita Wouhaybi and Heiko Ludwig, explained what’s new and different about edge data, highlighted use cases and phases of AI at the edge, covered Federated Learning, discussed privacy for edge AI, and provided an overview of the many other challenges and complexities being created by increasingly large AI models and algorithms. It was a fascinating session. If you missed it, you can access it on-demand, along with a PDF of the slides, at the SNIA Educational Library.

Our live audience asked several interesting questions. Here are answers from our presenters.

Q. With the rise of large language models (LLMs) what role will edge AI play? Read More

Confidential AI Q&A

Confidential AI is a new collaborative platform for data and AI teams to work with sensitive data sets and run AI models in a confidential environment. It includes infrastructure, software, and workflow orchestration to create a secure, on-demand work environment that meets an organization’s privacy requirements and complies with regulatory mandates. It’s a topic the SNIA Cloud Storage Technologies Initiative (CSTI) covered in depth at our webinar, “The Rise in Confidential AI.” At this webinar, our experts, Parviz Peiravi and Richard Searle, provided a deep and insightful look at how this dynamic technology works to ensure data protection and data privacy. Here are their answers to the questions from our webinar audience.

Q. Are businesses using Confidential AI today?

A. Absolutely. We have seen a big increase in the adoption of Confidential AI, particularly in industries such as Financial Services, Healthcare, and Government, where it is helping organizations enhance risk mitigation, including cybercrime prevention, anti-money laundering, fraud prevention, and more.

Q: With compute capabilities on the Edge increasing, how do you see Trusted Execution Environments evolving?

Read More
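Since the question above concerns Trusted Execution Environments (TEEs), a brief illustration may help: before releasing sensitive data or model weights into a TEE, a relying party typically verifies the enclave’s attestation evidence. The sketch below is a deliberately simplified, hypothetical policy check over already-verified attestation claims; the claim names, values, and threshold are illustrative assumptions, and real deployments rely on vendor attestation services and cryptographic verification of the full evidence chain.

```python
# Hypothetical, simplified sketch of the policy check a relying party might apply
# to already-verified attestation claims before releasing data to a TEE.
# Claim names and values below are illustrative assumptions, not a real API.

EXPECTED_MEASUREMENT = "approved-enclave-build-hash"  # placeholder identity value

def claims_meet_policy(claims: dict) -> bool:
    """Return True only if the attested environment matches our policy."""
    checks = [
        claims.get("tee_type") in {"sgx", "sev-snp", "tdx"},  # accepted TEE families
        claims.get("measurement") == EXPECTED_MEASUREMENT,    # approved code identity
        claims.get("debug_enabled") is False,                 # no debug-mode enclaves
        claims.get("security_version", 0) >= 3,               # minimum TCB version
    ]
    return all(checks)

# Example usage with a made-up claims dictionary.
claims = {
    "tee_type": "sev-snp",
    "measurement": EXPECTED_MEASUREMENT,
    "debug_enabled": False,
    "security_version": 5,
}
print("release data to enclave:", claims_meet_policy(claims))
```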

How Edge Data is Impacting AI

AI is disrupting many domains and industries, and in the process AI models and algorithms are becoming increasingly large and complex. This complexity is driven by the proliferation in size and diversity of localized data everywhere, which creates the need for a unified data fabric and/or federated learning. It could be argued that whoever wins the data race will win the AI race, an argument that rests on two premises: 1) data is available in a central location so AI has full access to it, and 2) compute is centralized and abundant.
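As a rough illustration of the federated learning idea mentioned above, here is a minimal federated averaging (FedAvg-style) sketch in Python/NumPy: each edge site trains on its own private data, and only model weights, never raw data, are shared and averaged centrally. The toy linear model, site count, and learning rate are illustrative assumptions, not anything from the webinar.

```python
# Minimal FedAvg-style sketch (illustrative only): each edge site fits a local
# linear model on its own data, and only the model weights are averaged centrally.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A few steps of local gradient descent on one site's private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

# Simulated private datasets at three edge sites (these never leave the site).
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    sites.append((X, y))

global_w = np.zeros(2)
for _ in range(20):
    # Each site trains locally; only the resulting weights are averaged.
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    global_w = np.mean(local_ws, axis=0)

print("learned weights:", global_w)  # approaches true_w without pooling raw data
```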

The impact of edge AI is the topic of our next SNIA Cloud Storage Technologies Initiative (CSTI) live webinar, “Why Distributed Edge Data is the Future of AI,” on October 3, 2023. Centralized (or in-cloud) AI is a single superpower and super expert; edge AI is a community of many smart wizards whose cumulative knowledge can outweigh that central superpower. In this webinar, our SNIA experts will discuss: Read More

Training Deep Learning Models Q&A

The estimated impact of Deep Learning (DL) across all industries cannot be overstated. In fact, analysts predict that deep learning will account for the majority of cloud workloads, and that training of deep learning models will represent the majority of server applications, in the next few years. It’s the topic the SNIA Cloud Storage Technologies Initiative (CSTI) discussed at our webinar “Training Deep Learning Models in the Cloud.” If you missed the live event, it’s available on-demand at the SNIA Educational Library, where you can also download the presentation slides.

The audience asked our expert presenters, Milind Pandit from Habana Labs (an Intel company) and Seetharami Seelam from IBM, several interesting questions. Here are their answers:

Q. Where do you think most of the AI will run, especially training? Will it be in the public cloud, on-premises, or both? Read More