What the “T” Means in SNIA Cloud Storage Technologies

The SNIA Cloud Storage Initiative (CSI) has had a rebrand; we’ve added a T for Technologies into our name, and we’re now officially the Cloud Storage Technologies Initiative (CSTI).

That may not seem like a significant change, but there’s a good reason for it. Our old name reflected the push to gain acceptance of cloud storage, and that debate has been decisively won. One relatively small cloud service provider is currently storing 400PB of clients’ data. Twitter alone stores 300PB of data on Google’s cloud offering. Facebook, Amazon, Alibaba, Tencent – all have huge data storage numbers.

Enterprises of every size are storing data in the cloud. That’s why we added the word “technologies.” The expanded charter and new name reflect the need to support evolving cloud business models and architectures such as OpenStack, software-defined storage, Kubernetes and object storage. It also covers data services, orchestration and management, hyperscale requirements, and the role standards play.

So what do we do? The CSTI is an active group that publishes articles and white papers, speaks at industry conferences and presents highly rated webcasts that have been viewed by thousands. You can learn more about the CSTI and check out the Infographic for highlights on cloud storage trends and CSTI activities.

If you’re interested in cloud storage technologies, I encourage you to consider joining our group. We have multiple membership options for established vendors, startups, educational institutions, even individuals. Learn more about CSTI membership here.

Simplifying the Movement of Data from Cloud to Cloud

We are increasingly living in a multi-cloud world, with potentially multiple private, public and hybrid cloud implementations supporting a single enterprise. Organizations want to leverage the agility of public cloud resources to run existing workloads without having to re-plumb or re-architect them and their processes. In many cases, applications and data have been moved individually to the public cloud. Over time, some applications and data might need to be moved back on premises, or moved, partially or entirely, from one cloud to another.

That means simplifying the movement of data from cloud to cloud. Data movement and data liberation – the seamless transfer of data from one cloud to another – have become major requirements.
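As a rough illustration of what “data liberation” can look like in practice, here is a minimal sketch that copies objects between two S3-compatible endpoints using boto3. The endpoint URLs, bucket names and credentials are placeholders, and a real migration would also need retries, bandwidth management and metadata mapping.

```python
import boto3

# Hypothetical endpoints and buckets; substitute your own clouds and credentials.
source = boto3.client("s3", endpoint_url="https://source-cloud.example.com")
target = boto3.client("s3", endpoint_url="https://target-cloud.example.com")

SRC_BUCKET, DST_BUCKET = "app-data", "app-data-migrated"

# Stream every object from the source cloud to the target cloud.
paginator = source.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SRC_BUCKET):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        body = source.get_object(Bucket=SRC_BUCKET, Key=key)["Body"]
        # upload_fileobj handles multipart uploads for larger objects.
        target.upload_fileobj(body, DST_BUCKET, key)
        print(f"copied {key}")
```

Even in this toy form, it shows why standard interfaces matter: the same few calls work against any provider that speaks the S3 API.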

On August 7, 2018, the SNIA Cloud Storage Technologies Initiative will tackle this issue in a live webcast, “Cloud Mobility and Data Movement.” We will explore some of these data movement and mobility issues and include real-world examples from the University of Michigan. We’ll discuss:

  • How do we secure data both at-rest and in-transit?
  • What are the steps that can be followed to import data securely? What cloud processes and interfaces should we use to make data movement easier?
  • How should we organize our data to simplify its mobility? Should we use block, file or object technologies?
  • Should the application of the data influence how (and even if) we move the data?
  • How can data in the cloud be leveraged for multiple use cases?

Register now for this live webcast. Our SNIA experts will be on hand to answer your questions.

AI, Machine Learning and Natural Language Processing in Action

SNIA Cloud Storage recently hosted a fascinating webcast on the real-world use of IBM Watson – the computer that mesmerized viewers on “Jeopardy!” by answering questions accurately and faster than its human competitors. Our webcast, “Customer Support through Natural Language Processing and Machine Learning,” detailed how Watson is being used as a virtual support assistant, named Elio, at NetApp. We received many interesting questions during the live event, which is now available on-demand. Here are answers to them all from our expert presenters, who have been driving the success of Elio – Ross Ackerman from NetApp and Robin Marcenac from IBM. Read More

Marketing Your New Website From the Cloud

Launching a new website is an exciting venture, but it’s crucial to have a robust marketing strategy in place to ensure it reaches its intended audience. With cloud-based tools and services at your disposal, marketing your new website can be both efficient and effective. Here are some strategies to consider:

Leverage Cloud Analytics:

Data-Driven Insights: Cloud-based analytics platforms like Google Analytics provide invaluable insights into your website’s performance. Track visitor behavior, demographics, and engagement metrics to fine-tune your marketing efforts.

SEO Optimization: Use cloud-based SEO tools to optimize your website’s content and structure. Identify relevant keywords, monitor rankings, and ensure your site is search engine-friendly.

Social Media Marketing:

Content Scheduling: Cloud-based social media management tools like Hootsuite and Buffer allow you to schedule posts, ensuring a consistent online presence. Create engaging content that promotes your website’s value.

Paid Advertising: Platforms like Facebook Ads and Google Ads offer cloud-based advertising solutions. Target specific demographics, interests, and behaviors to reach your ideal audience.

Email Marketing:

Email Campaigns: Cloud-based email marketing platforms like Mailchimp or SendinBlue can help you create and automate email campaigns. Build a subscriber list and send personalized content, including newsletters, product updates, or exclusive offers.

Content Creation and Collaboration:

Cloud Storage: Use cloud storage solutions like Google Drive or Dropbox to collaborate with your team on content creation. Share documents, images, and videos effortlessly.

Content Management Systems (CMS): Many CMS platforms, such as WordPress or Joomla, are cloud-based. They provide user-friendly interfaces for updating your website’s content regularly.

Performance Monitoring:

Uptime Monitoring: Cloud-based website monitoring services like UptimeRobot notify you immediately if your site experiences downtime. Ensuring your site is always accessible is crucial for user experience.

Load Testing: Perform load testing using cloud-based tools to simulate heavy traffic and ensure your website can handle increased user loads without slowing down.
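If you don’t have a dedicated load-testing service handy, even a small script can give you a first impression of how a page holds up under concurrent requests. The sketch below is only an illustration (the URL and request counts are placeholders); cloud-based tools go much further, simulating geographic distribution and realistic user journeys.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://www.example.com/"   # placeholder: your new website
REQUESTS, CONCURRENCY = 100, 10

def fetch(_):
    """Fetch the page once and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

# Issue the requests concurrently and report simple latency statistics.
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    latencies = sorted(pool.map(fetch, range(REQUESTS)))

print(f"median: {latencies[len(latencies) // 2]:.3f}s  "
      f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")
```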

Security and Backups:

Cloud Security: Protect your website from cyber threats with cloud-based security solutions. These services offer real-time threat detection and mitigation.

Automated Backups: Use cloud backup services to automatically back up your website’s data and files. This ensures you can quickly recover in case of data loss.

Scalability:

Cloud Hosting: Consider hosting your website in the cloud for scalability. Cloud hosting services like AWS, Azure, or Google Cloud can accommodate traffic spikes without performance issues.

Marketing your new website from the cloud offers a plethora of tools and services that can streamline your efforts. By leveraging cloud analytics, social media marketing, email campaigns, content creation, performance monitoring, security, and scalability solutions, you can reach your target audience effectively and ensure your website’s long-term success. Stay agile, adapt your strategies based on data insights, and continuously optimize your online presence to stay ahead in the competitive digital landscape.

Watson: From Jeopardy! to Digital Support Assistant

When IBM Watson premiered on “Jeopardy!”, viewers were mesmerized by Watson’s ability to answer the quiz show’s questions and, most of the time, beat the human contestants. Fast-forward to today, and its real-world applications extend well beyond playing trivia games. Watson is being deployed in a variety of medical and business scenarios.

In fact, NetApp is now using Watson as part of Elio, a virtual support assistant that responds to queries in natural language. Elio is built using Watson’s cognitive computing capabilities, which enable Elio to analyze unstructured data by using natural language processing to understand grammar and context, interpret complex questions, and evaluate all possible meanings to determine what is being asked. Elio then reasons and identifies the best answers to questions with help from experts who monitor the quality of answers and continue to train Elio on more subjects. It’s a fascinating application of artificial intelligence (AI) that we will discuss in detail at our SNIA Cloud Storage webcast on February 22, 2018, “Customer Support through Natural Language Processing and Machine Learning.”
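To make the “retrieve the best answer” idea concrete, here is a toy sketch of the general pattern: turn a free-form question into features, compare it against a knowledge base, and return the best-ranked entries. This is not how Watson or Elio is implemented, and the knowledge-base snippets are invented for the example; it simply illustrates the shape of the problem.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical knowledge-base articles a support assistant might search.
kb = [
    "How to expand an aggregate when a volume runs out of space",
    "Troubleshooting slow NFS performance on a storage cluster",
    "Steps to replace a failed disk and trigger a rebuild",
]

vectorizer = TfidfVectorizer(stop_words="english")
kb_vectors = vectorizer.fit_transform(kb)

def answer(question, top_k=2):
    """Return the KB entries most similar to the user's question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, kb_vectors)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [(kb[i], float(scores[i])) for i in ranked]

print(answer("my NFS mounts feel very slow lately"))
```

Systems like Watson add far more sophistication, including deep language understanding and continuous training by human experts, but the retrieve-and-rank loop above is a useful mental model.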

Elio and Watson represent an innovative and novel use of large quantities of unstructured data to help solve problems, on average, four times faster than traditional methods. Join us at this webcast, where those on the front lines of this innovative application will discuss:

  • The challenges of utilizing large quantities of valuable yet unstructured data
  • How Watson and Elio continuously learn as more data arrives and navigate an ever-growing volume of technical information
  • How Watson understands customer language and provides understandable responses

Learn how these new and exciting technologies are changing the way we look at and interact with large volumes of traditionally hard-to-analyze data. Register now! We look forward to seeing you on February 22nd.

Evaluator Group to Share Hybrid Cloud Research

In a recent survey of enterprise hybrid cloud users, the Evaluator Group saw that nearly 60% of respondents indicated that lack of interoperability is a significant technology issue that they must overcome in order to move forward. In fact, lack of interoperability was the number one issue, surpassing public cloud security and network security as significant inhibitors.

The SNIA Cloud Storage Initiative (CSI) is pleased that John Webster, Senior Partner at Evaluator Group, will join us on December 12th for a live webcast to dive into the findings of that research. In this webcast, Multi-Cloud Storage: Addressing the Need for Portability and Interoperability, my SNIA Cloud colleague, Mark Carlson, and John will discuss enterprise hybrid cloud objectives and barriers to adoption. John and Mark will focus on cloud interoperability within the storage domain and the CSI’s work that promotes interoperability and portability of data stored in the cloud. Read More

Expert Answers to Cloud Object Storage and Gateways Questions

In our most recent SNIA Cloud webcast, “Cloud Object Storage and the Use of Gateways,” we discussed market trends toward the adoption of object storage and the use of gateways to execute on a cloud strategy.  If you missed the live event, it’s now available on-demand together with the webcast slides. There were many good questions at the live event and our expert, Dan Albright, has graciously answered them in this blog.

Q. Can object storage be accessed by tools for use with big data?

A. Yes. Big data tools can access object storage in near real-time through HDFS connectors such as the S3 connector, but performance is conditional on latency: if the object store is built on local hard drives, it should not be used as primary storage because it would run very slowly. The guidance is to use hard-drive-based object storage either as an online archive or as a backup target for HDFS.
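As a concrete, hedged example of the connector approach, here is how a Spark job might read data directly from an S3-compatible object store through the Hadoop s3a connector. The bucket name, endpoint and credentials are placeholders, and the appropriate hadoop-aws libraries must be on the classpath.

```python
from pyspark.sql import SparkSession

# Placeholder endpoint and credentials for an S3-compatible object store.
spark = (
    SparkSession.builder.appName("object-storage-analytics")
    .config("spark.hadoop.fs.s3a.endpoint", "https://objects.example.com")
    .config("spark.hadoop.fs.s3a.access.key", "ACCESS_KEY")
    .config("spark.hadoop.fs.s3a.secret.key", "SECRET_KEY")
    .getOrCreate()
)

# Read log data straight from the object store and run a simple aggregation.
logs = spark.read.json("s3a://analytics-bucket/logs/2018/")
logs.groupBy("status").count().show()
```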

Q. Will current block storage or NAS be replaced with cloud object storage + gateway?

A. Yes and no. It depends on the use case. For ILM (Information Lifecycle Management), only aged and infrequently accessed data is moved to the gateway-plus-cloud object storage to take advantage of a lower-cost tier, while more recent and active data remains on the primary block or file storage. For file sync and share, small-office and remote-office data is moved off the local NAS, then consolidated, centralized, and managed on the gateway file system. In practice, these methods will vary based on the enterprise’s requirements.
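For the ILM case, many object stores let you express “move aged data to a cheaper tier” declaratively. The sketch below uses boto3 to attach a lifecycle rule that transitions objects to a colder storage class after 90 days; the bucket name, prefix, retention period and storage class are placeholders and will depend on your provider.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical ILM policy: archive infrequently accessed data after 90 days.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-aged-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "projects/"},
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 2555},  # roughly seven years, then delete
        }
    ]
}

s3.put_bucket_lifecycle_configuration(
    Bucket="enterprise-archive", LifecycleConfiguration=lifecycle
)
```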

Q. Can we use cloud object storage for IoT storage that may require high IOPS?

A. High-IOPS workloads are best supported by local SSD-based object, block or NAS storage. Remote or hard-drive-based object storage is better suited to low-IOPS workloads.

Q. What about software defined storage?

A. Cloud object storage may be implemented as SDS (Software Defined Storage) but may also be implemented in dedicated appliances. Most cloud object storage services are SDS-based.

Q. Can you please define NAS?

A. The SNIA Dictionary defines Network Attached Storage (NAS) as:

1. [Storage System] A term used to refer to storage devices that connect to a network and provide file access services to computer systems. These devices generally consist of an engine that implements the file services, and one or more devices, on which data is stored.

2. [Network] A class of systems that provide file services to host computers using file access protocols such as NFS or CIFS.

Q. What are the challenges with NAS gateways into object storage? Aren’t there latency issues that NAS requires that aren’t available in a typical Object store solution?

A. The key factor to consider is workload. If the workload of applications accessing data residing on NAS involves a high frequency of reads and writes, then that data is not a good candidate for remote or hard-drive-based object storage. However, it is commonly known that up to 80% of data residing on NAS is infrequently accessed. It is this data that is best suited for migration to remote object storage.

Thanks for all the great questions. Please check out our library of SNIA Cloud webcasts to learn more. And follow us on Twitter @SNIACloud for announcements of future webcasts.

How Gateways Benefit Cloud Object Storage

The use of cloud object storage is ramping up sharply, especially in the public cloud, where its simplicity can significantly reduce capital budgets and operating expenses. And while it makes good economic sense, enterprises are challenged with legacy applications that do not support standard protocols for moving data to and from the cloud.

That’s why the SNIA Cloud Storage Initiative is hosting a live webcast on September 26th, “Cloud Object Storage and the Use of Gateways.”

Object storage is a secure, simple, scalable, and cost-effective means of managing the explosive growth of unstructured data enterprises generate every day. Enterprises have developed data strategies specific to the public cloud: improved data protection, long-term archive, application development, DevOps, data science, and cognitive artificial intelligence, to name a few.

However, these same organizations have legacy applications and infrastructure that are not object storage friendly, but instead use file protocols like NFS and SMB. Gateways enable SMB and NFS data transfers to be converted to Amazon’s S3 protocol while optimizing data with deduplication, providing QoS (quality of service), and adding efficiencies on the data path to the cloud.
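A full gateway adds caching, deduplication and QoS, but the core translation it performs, presenting a file interface on one side and speaking S3 on the other, can be sketched in a few lines. The mount point, bucket and endpoint below are placeholders, and this is only an illustration of the file-to-object mapping, not a gateway implementation.

```python
import boto3
from pathlib import Path

# Placeholder NFS/SMB mount point and S3-compatible target bucket.
MOUNT = Path("/mnt/nfs_share")
BUCKET = "gateway-backed-bucket"

s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

# Walk the file share and store each file as an object, keeping the
# relative path as the object key (the mapping a gateway maintains).
for path in MOUNT.rglob("*"):
    if path.is_file():
        key = str(path.relative_to(MOUNT))
        s3.upload_file(str(path), BUCKET, key)
        print(f"uploaded {key}")
```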

This webcast will highlight the market trends toward the adoption of object storage and the use of gateways to execute a cloud strategy, the benefits of object storage when gateways are deployed, and the use cases that are best suited to leverage this solution.

You will learn:

  • The benefits of object storage when gateways are deployed
  • Primary use cases for using object storage and gateways in private, public or hybrid cloud
  • How gateways can help achieve the goals of your cloud strategy without retooling your on-premises infrastructure and applications

We plan to share some pearls of wisdom on the challenges organizations are facing with object storage in the cloud from a vendor-neutral, SNIA perspective. If you need a firm background on cloud object storage before September 26th, I encourage you to watch the SNIA Cloud on-demand webcast, “Cloud Object Storage 101.” It will provide you with a foundation to get even more out of this upcoming webcast.

I hope you will join us on September 26th. Register now to save your spot.

A Q&A on Containers and Persistent Memory

The SNIA Cloud Storage Initiative recently hosted a live webcast, “Containers and Persistent Memory,” where my colleagues and I discussed persistent storage for containers, persistent memory for containers, infrastructure software changes for persistent memory-based containers, and what SNIA is doing to advance persistent memory. If you missed the live event, it’s now available on-demand. You can also download a PDF of the webcast slides.

As promised, we are providing answers to the questions we received during the live event.

Q. How is “Enterprise Server SAN” different from “Traditional” Server SAN?

A. Traditional Server SAN refers to individual servers connected to a dedicated, separate SAN storage solution (e.g., EMC VNX, NetApp FAS, etc.). Enterprise Server SAN refers to direct-attached storage that is aggregated across multiple connected servers to create a “virtual SAN.” Rather than being a separate storage solution, it uses the existing capacity contained within the application servers as a virtualized, shared pool to improve overall efficiency.

Q. Are there any performance studies done with Containers using Tier 1 apps/Business critical?

A. There have been performance characterizations done on Tier 1, Business Critical applications such as Oracle, MySQL and others. However, this would be vendor specific and the user would have to contact and work with each storage vendor to better understand their specific performance capabilities.

Q. Even though Linux and Microsoft support NVDIMM natively, does the MB/BIOS still need to have support?

A. Yes, the motherboard needs a BIOS that is enabled to recognize NVDIMMs, and it needs the ADR signal wired from the Intel CPU to the DIMM sockets. The motherboard needs to follow the JEDEC standard for NVDIMMs.

Q. If someone unplugs NVDIMM-N and moves it to another server… what will happen?

A. If the system crashes due to a power loss, the data in the NVDIMM will be saved. When the NVDIMM is plugged into another NVDIMM-enabled server, the BIOS will check whether there is saved data in the NVDIMM and restore that data to DRAM before the system continues to boot.

Q. Are traditional storage products able to support containerized applications?

A. Yes, assuming that they support container orchestration engines such as Docker Swarm or Kubernetes through a “container volume plugin.” However, the extent to which they support containerized applications varies from vendor to vendor, and there are also a number of new storage products that have been developed exclusively to support containerized applications (e.g., Veritas, Portworx, Robin Systems).
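As a minimal sketch of the volume-plugin mechanism, the Docker SDK for Python can create a named volume backed by a storage driver and then start a container that mounts it. The driver name and its options below are hypothetical; each storage vendor’s plugin defines its own, and Kubernetes uses the analogous concept of persistent volumes and storage classes.

```python
import docker

client = docker.from_env()

# Hypothetical vendor volume plugin; real driver names and options vary.
volume = client.volumes.create(
    name="db-data",
    driver="examplevendor/volume-plugin",
    driver_opts={"size": "20GiB", "replicas": "2"},
)

# Run a database container with the plugin-backed volume mounted.
container = client.containers.run(
    "postgres:10",
    detach=True,
    volumes={volume.name: {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
)
print(container.id)
```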

Q. How do the storage requirements for containers compare or differ from those of virtual machines?

A. Actually, production storage requirements are very similar, almost equivalent, between containerized applications and applications running within virtual machines; the main difference is that, due to the scalability potential of containers, these requirements are often amplified. Requirements common to both include data persistence, data recovery, data performance and data security.

Unlock the Power of Persistent Memory in Containers

Containers and persistent memory are both very hot topics these days. Containers are making it easier for developers to know that their software will run no matter where it is deployed and no matter what the underlying OS is, as both Linux and Windows are now fully supported. Persistent memory, a revolutionary data storage technology, will boost the performance of the next generation of applications and libraries packaged into containers. On July 27th, SNIA is hosting a live webcast “Containers and Persistent Memory.” Read More
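Ahead of the webcast, here is a rough sketch of what application-level persistent memory access can look like when an NVDIMM namespace is exposed as a DAX-mounted filesystem (the /mnt/pmem0 path is a placeholder). Production code would typically use libpmem/PMDK rather than plain mmap, but the idea, loads and stores against byte-addressable persistent media, is the same.

```python
import mmap
import os

# Placeholder: a file on a DAX-mounted filesystem backed by an NVDIMM namespace.
PMEM_FILE = "/mnt/pmem0/example.dat"
SIZE = 4096

fd = os.open(PMEM_FILE, os.O_CREAT | os.O_RDWR)
os.ftruncate(fd, SIZE)

with mmap.mmap(fd, SIZE) as buf:
    buf[0:13] = b"hello, pmem!\n"   # store data directly into the mapped region
    buf.flush()                      # msync: make the update durable
os.close(fd)
```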