Real-world Use Cases: Cloud Storage Workloads
When choosing any storage solution, it is crucial to consider data usage patterns and the workload. This goes beyond storage: application workloads drive server, network, and all IT infrastructure decisions. Most vendors will tell you that their product is the best solution for any workload, and that claim was reasonably accurate when choices were few. Today, however, there is a variety of offerings, each with strengths and weaknesses in different situations. This article reviews six workload scenarios and identifies where cloud storage is a good fit and where it is a poor fit.
Rapidly Changing Single-File Workloads
Examples of a rapidly changing single-file workload include the I/O patterns of an active spreadsheet, a source code repository, or a database. In this workload, there is either a single very powerful server or many users sharing a single file. In both cases, updates to that file are continuous and rapid, driving the requirement for tier-one storage. To serve this workload, the system should have a large amount of memory, fast hard drives, and the ability to create snapshots for immediate data protection. Today this market is well served by enterprise NAS vendors such as NetApp and EMC.
Data Ingestion Workloads
The best example of a data ingestion workload is video surveillance. Consider, for instance, the city of London and its thousands of cameras, each streaming write operations to storage. Every camera needs fast access and creates its own set of files. This is an excellent workload for private cloud storage. A private storage cloud has many storage nodes, each of which can ingest streams of data independently, so there is no data bottleneck. A camera-to-storage-node ratio can be established, say 10 cameras per node, and then scaled out to hundreds of nodes, enabling thousands of cameras. Because the cloud is centrally managed, the video surveillance storage for the entire city can be handled by a single administrator.
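The scale-out arithmetic above can be sketched in a few lines. This is a minimal illustration, not any vendor's placement algorithm: the 10-cameras-per-node ratio comes from the article, while the modulo assignment scheme and camera IDs are assumptions for the example.

```python
# Minimal sketch: sizing an ingest cluster and assigning cameras to nodes.
# The 10:1 camera-to-node ratio is from the article; the round-robin
# (modulo) placement is an illustrative assumption.

CAMERAS_PER_NODE = 10

def nodes_required(camera_count: int) -> int:
    """Number of storage nodes needed at the stated ratio (ceiling division)."""
    return -(-camera_count // CAMERAS_PER_NODE)

def assign_node(camera_id: int, node_count: int) -> int:
    """Deterministically map a camera to one ingest node."""
    return camera_id % node_count
```

With this scheme, 5,000 cameras need 500 nodes, and each camera always writes to the same node, so no single node becomes a bottleneck for the whole fleet.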
Read-Intensive Workloads
Video streaming and online video sharing are read-intensive workloads. Consider the example of the Beijing Olympics. There was incredible interest in online video of the events, and in the U.S. the focus was on men's swimming. When the U.S. relay team won by a fraction of a second, everyone wanted to watch. Millions of people flocked to the video, and web servers churned out millions of views. This creates a unique storage demand: with thousands of web servers trying to read a single file, the design must support concurrent reads. Cloud storage provides an ideal solution for read-intensive workloads, with hundreds of independent nodes serving out many copies of the same file.
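The "many copies, many nodes" idea can be sketched as a replica-selection step: each read picks one of the file's replicas so no single node absorbs every request. The replica map and node names here are illustrative assumptions; real systems track replica placement in a metadata service.

```python
# Minimal sketch: spreading concurrent reads across replicas of one file.
# Node names and the replica map are assumptions for illustration.
import random

REPLICAS = {
    "relay_final.mp4": ["node-03", "node-17", "node-42", "node-88"],
}

def pick_read_node(filename: str) -> str:
    """Choose one replica at random so reads fan out across nodes."""
    return random.choice(REPLICAS[filename])
```

A production system would typically weight the choice by node load or client locality, but even uniform random selection spreads thousands of concurrent readers across all replicas.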
High-Performance Computing (HPC) Workloads
HPC workloads are similar to data ingestion workloads with one significant difference: access to a single file. Rather than each client creating a unique file, hundreds or thousands of systems access a single file that is striped across many nodes for performance. This workload demands tight coordination between every node in the cluster to ensure cache coherence, file locking, and data integrity. HPC storage is used extensively in oil and gas exploration and financial modeling, where compute clusters process complex transactions. Established HPC storage vendors include Panasas, Isilon, and NetApp GX.
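Striping a single file across nodes can be made concrete with the address math: given a byte offset, which node holds it, and where? The 1 MiB stripe unit and 8-node round-robin layout below are illustrative assumptions, not parameters of any particular HPC file system.

```python
# Minimal sketch: locating a byte of a file striped round-robin across nodes.
# Stripe size and node count are illustrative assumptions.

STRIPE_SIZE = 1 << 20   # 1 MiB per stripe unit
NODE_COUNT = 8

def locate(byte_offset: int) -> tuple[int, int]:
    """Return (node index, offset within that node's local store)."""
    stripe_index = byte_offset // STRIPE_SIZE
    node = stripe_index % NODE_COUNT  # round-robin across nodes
    local_offset = (stripe_index // NODE_COUNT) * STRIPE_SIZE + (byte_offset % STRIPE_SIZE)
    return node, local_offset
```

Because consecutive stripe units land on different nodes, a large sequential read engages all eight nodes in parallel, which is exactly why this layout needs cluster-wide locking and cache coherence when many clients write the same file.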
Single Creator, Many Consumer Workloads
The NASA Phoenix Mars Lander discovered ice crystals on the surface of Mars. The world responded: scientists and religious organizations confirmed their unique theories about the universe, and everyone wanted access to the data. Given the challenges of landing on Mars and collecting soil samples, it is safe to say this is an example of a write-once, read-many workload. Other examples include quarterly company results and genomic sequence findings. All share a single creation event with demand for multiple points of read access. Cloud storage protects data by replicating files to one or more nodes. This same mechanism creates many access points, enabling a single creation event to be easily shared among many consumers.
Archive or Content Depot Workloads
In many instances, as data ages it becomes less active. Whether it is corporate information or media content, it is important that this data be kept accessible, but at a cost relative to its worth. The scale and economics of private cloud storage are designed to address this use case. Data can be replicated to the cloud to free up more expensive tier-one storage devices and delay costly infrastructure upgrades. Cloud storage can be expanded on demand using the latest (or oldest) commodity hardware and a few simple mouse clicks. Hardware can also be removed without downtime, maintaining access when it comes time to retire cloud equipment and enabling 50-year archives.
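The "cost relative to its worth" decision is, in practice, a tiering policy driven by access age. Here is a minimal sketch of such a policy; the 90-day threshold and tier names are assumptions for illustration, not a standard.

```python
# Minimal sketch: age-based tiering from tier-one storage to an archive cloud.
# The 90-day threshold and tier labels are illustrative assumptions.
from datetime import datetime, timedelta, timezone

ARCHIVE_AFTER = timedelta(days=90)

def target_tier(last_accessed: datetime, now: datetime) -> str:
    """Return which tier a file belongs on, based on how long it has been idle."""
    return "cloud-archive" if now - last_accessed > ARCHIVE_AFTER else "tier-one"
```

A background job would periodically evaluate this policy over the file catalog and migrate cold files to the cloud tier, freeing tier-one capacity without taking the data offline.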
What's Your Data Workload?
When considering storage options, ignore the "we can do everything" vendors and think about your workload. Once you understand your requirements and how the data will be used, the right solution will emerge.