How are varied industries unlocking innovation without being tied to a single cloud?

As companies of all sizes and in all industries learn to cope with a new “business-as-usual” approach that includes all-remote workforces and intense pressure to cut costs, faster innovation and data agility may appear out of immediate reach. Industry leaders who look to data analytics, artificial intelligence (AI), and machine learning are demonstrating that you can have it all, with the right planning and infrastructure.

Data-intensive applications, such as image processing, data analytics, and AI, depend on rapidly growing enterprise data. With that growth come architectural considerations. As datasets grow, network latency forces applications to sit closer and closer to the data. As more applications are added to the environment, they begin to generate data faster than it can be moved elsewhere without great cost and interruption, making migration almost impossible. This is the data gravity paradox that creates lock-in and introduces future business risk: the more you gather your data together, the harder it becomes to change how you handle it.
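To put rough numbers on that cost and interruption, here is a minimal back-of-envelope sketch in Python; the link speed, utilisation factor, and egress rate are illustrative assumptions, not any provider's actual pricing.

```python
# Back-of-envelope estimate of the time and egress fees involved in
# moving a large dataset out of a cloud. All rates are illustrative
# assumptions, not actual provider pricing.

def migration_estimate(dataset_tb: float,
                       link_gbps: float = 10.0,
                       egress_usd_per_gb: float = 0.05) -> None:
    dataset_gb = dataset_tb * 1_000
    effective_gbps = link_gbps * 0.7           # assume ~70% link utilisation
    seconds = dataset_gb * 8 / effective_gbps  # GB -> gigabits -> seconds
    days = seconds / 86_400
    fees = dataset_gb * egress_usd_per_gb
    print(f"{dataset_tb:,.0f} TB over {link_gbps} Gbps: "
          f"~{days:,.1f} days, ~${fees:,.0f} in egress fees")

migration_estimate(100)     # a modest data lake
migration_estimate(4_000)   # ~4 PB, the scale of the energy example below
```

Even at these optimistic rates, a petabyte-scale migration is measured in weeks and six-figure egress bills, which is why data gravity tends to win.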

So how are businesses unlocking the power of innovation without being tied to a single cloud? Successes (and challenges) exist across a variety of industries. Here, we’ll look at a few dynamic examples of how multi-cloud accelerates innovation, enhances data agility, and reduces costs.


Media and entertainment

Today’s media and entertainment landscape is increasingly composed of relatively small, specialised studios that meet the swelling content-production needs of the largest players, like Netflix and Hulu. To deliver blockbuster movies and award-winning TV shows, these geographically dispersed studios require efficient collaboration on animation, color correction, special effects, and editing. Multi-cloud solutions enable these teams to work together on the same projects, access their preferred production tools from various public clouds, and streamline approvals without the delays associated with moving large media files from one site to another. A high-throughput, low-latency data lake eliminates concerns that lag will inhibit productivity. Additionally, a central storage solution that attaches to multiple clouds reduces the large egress fees often associated with taking enormous video files out of public clouds.

Beyond the need for collaboration, other factors drive the growth of data, and of multi-cloud, within media and entertainment. Cameras and viewing devices have greater resolution, meaning that file sizes are larger than ever and demand more bandwidth across dispersed data centers than on-premises infrastructure can provide. Streaming services rely on data analytics to programmatically understand content popularity, decide what new content should be created, and determine which content should be shelved. Many of the processes behind these workflows increasingly utilise the public cloud because of the availability of complementary datasets and use-case-specific tools for handling different types of analytics.

Transportation and autonomous driving

Connected car and autonomous driving projects generate immense amounts of data from a variety of sensors. For example, Tesla’s Autopilot utilises eight cameras, twelve ultrasonic sensors, and one radar to interpret the car’s surroundings and make decisions about its path and how to avoid potential obstacles. Researchers in this field are trying to accommodate the hundreds of petabytes of video and still-image data used to retrain algorithms. These are still the early days for autonomous vehicles (AVs). When 20–50x more are on the road, handling more variants in driving situations (manoeuvring around any city street, any parking garage, and so on), an even greater amount of deep learning will be required. By 2030, autonomous vehicles on the road are predicted to create one zettabyte of data.
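A quick sketch shows how fleet-level arithmetic reaches numbers of that magnitude; the per-vehicle data rate, duty cycle, and fleet sizes below are assumptions for illustration, not figures from the article.

```python
# Rough estimate of fleet-scale AV data growth. The per-vehicle rate
# and fleet sizes are assumed for illustration; real sensor suites
# and duty cycles vary widely.

PER_CAR_TB_PER_DAY = 4.0       # assumed raw sensor output per vehicle
ZETTABYTE_TB = 1_000_000_000   # 1 ZB expressed in TB

def fleet_data_per_year(vehicles: int, duty_days: int = 300) -> float:
    """Total data in TB produced by a fleet in one year."""
    return vehicles * PER_CAR_TB_PER_DAY * duty_days

for fleet in (10_000, 500_000, 1_000_000):
    tb = fleet_data_per_year(fleet)
    print(f"{fleet:>9,} vehicles: {tb / ZETTABYTE_TB:.4f} ZB/year")
```

Under these assumptions, a fleet approaching a million vehicles produces data on the order of a zettabyte per year, consistent with the 2030 prediction above.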

Car manufacturers, public transportation agencies, and rideshare companies are among those motivated to take advantage of multi-cloud innovation: keeping data accessible across multiple clouds, without the risks of significant egress charges and slow transfers, while maintaining the freedom to leverage the optimal public cloud services for each project.


Energy sector

Within the energy sector, multi-cloud adoption can help lower the significant costs associated with finding and drilling for resources. In one example, an oil and gas services company had collected more than 4 petabytes of data, such as sonar scans of undersea floors, geospatial photos, and land surveys, for petrotechnical analytics and seismic processing. Engineers and data scientists at the company used machine learning (ML) analytics to identify places that merited more resources to prospect for oil, to gauge the environmental risks of new projects, and to improve safety.

By taking advantage of the services and processing power across multiple clouds, this company created efficiencies that can save millions of dollars. It did so by leveraging spot instances across multiple clouds at the same time, getting much faster results at a lower cost than when only a limited number of GPUs are available in any single cloud. By replicating its on-prem data lake and making the data available to multiple cloud services simultaneously, the organisation supported a wide set of applications and workloads. This demonstrates how oil and gas organisations can scale to petabytes of data without sacrificing time, while also delivering a new level of resilience by enabling cloud-based recovery.
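The scheduling idea behind that cost saving can be sketched in a few lines: given current spot prices, fan the GPU batch out to the cheapest capacity first. The cloud names, capacities, and prices below are hypothetical; in practice they would come from each provider's pricing API.

```python
# Minimal sketch of multi-cloud spot scheduling: assign a GPU batch
# to clouds in order of spot price. All offers are hypothetical.

spot_offers = [
    # (cloud, available GPU instances, $ per GPU-hour)
    ("cloud_a", 40, 0.90),
    ("cloud_b", 25, 0.75),
    ("cloud_c", 60, 1.10),
]

def plan_batch(instances_needed: int):
    """Greedily fill the batch from the cheapest spot capacity first."""
    assignments, remaining = [], instances_needed
    for cloud, capacity, price in sorted(spot_offers, key=lambda o: o[2]):
        take = min(capacity, remaining)
        if take:
            assignments.append((cloud, take, take * price))
            remaining -= take
        if not remaining:
            break
    return assignments, remaining

assignments, unplaced = plan_batch(100)
for cloud, n, hourly_cost in assignments:
    print(f"{cloud}: {n} instances, ${hourly_cost:.2f}/hour")
print("unplaced instances:", unplaced)
```

A real scheduler would also account for data locality and preemption risk, but the principle is the same: more aggregate spot capacity, at better prices, than any single cloud offers.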

Healthcare and life sciences

Healthcare is one of the industries lagging behind in multi-cloud adoption. This isn’t due to a lack of desire, but to the many challenges around the protection of data. The need to know where data lives, who has access to it, and who has accessed it, along with Health Insurance Portability and Accountability Act (HIPAA) regulations and Digital Advertising Alliance (DAA) guidelines, brings unique challenges to this field.

Even with those caveats, multi-cloud helps healthcare and life sciences unlock the power of innovation. This is clear in the realm of genomic analysis in particular, where analysis of huge datasets can help improve, or save, lives. FASTQ files contain the raw sequencing data of genomes: millions of snippets of DNA that need to be assembled like a jigsaw puzzle. These files then allow researchers to do variant analysis, identifying differences between individuals’ genomes. Analysing genomes is intensive from a storage standpoint. For example, to study the genomes of 150 cancer patients who received a particular treatment, comparing the DNA of those who were treated successfully, those who didn’t respond well, and the general population, variant analysis on thousands of genomes may be necessary. The ability to scale up across clouds and take advantage of spot instances, while sharing access to datasets with researchers around the world, is critical to making these workflows practical and accessible.
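A rough sizing sketch makes the storage point concrete; the per-genome figures below are common ballpark sizes for ~30x whole-genome sequencing and are assumptions, not exact values.

```python
# Rough storage footprint for the cancer-cohort study described above.
# Per-genome sizes are ballpark figures for ~30x whole-genome
# sequencing, assumed for illustration.

FASTQ_GB = 100   # raw reads per genome
BAM_GB   = 80    # aligned reads
VCF_GB   = 1     # called variants

def cohort_tb(genomes: int) -> float:
    """Total storage in TB for a cohort, raw plus derived files."""
    return genomes * (FASTQ_GB + BAM_GB + VCF_GB) / 1_000

print(f"150-patient cohort: ~{cohort_tb(150):.0f} TB")
print(f"5,000-genome comparison population: ~{cohort_tb(5_000):.0f} TB")
```

At hundreds of terabytes per study, copying the dataset into each cloud separately quickly becomes impractical, which is the argument for sharing a single copy across clouds.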

Multiply the innovation from your cloud strategy

A multi-cloud solution, in which the same copy of data is available to multiple clouds, allows users to take advantage of each cloud’s services (more than 500 available today). Multi-cloud storage can improve data agility, provide data proximity without vendor lock-in, and scale compute and storage on demand, independently of each other. Multi-cloud offers financial savings and eliminates many operational complexities, whether you have tens of terabytes or hundreds of petabytes of data.

Adopting a multi-cloud strategy today can future-proof your organisation for when you’re ready to tackle a new workload, without forcing you to copy or move data closer to the latest cloud capabilities. You may not have that use case today, but an effective multi-cloud strategy lets you act on it the moment it arrives.

Feature Image Credit: Rawpixel.com / Shutterstock

By Rebekah Dumouchelle, Sr. Product Marketing Manager, Faction

Sourced from ITProPortal

By Mark Bowen

Dave Russell, Vice President for Product Strategy at Veeam, outlines five intelligent data management needs CIOs need to know about in 2019.

The world of today has changed drastically due to data. Every process, whether an external client interaction or an internal employee task, leaves a trail of data. Human- and machine-generated data is growing 10 times faster than traditional business data, and machine data alone is growing at 50 times the rate of traditional business data.

With the way we consume and interact with data changing daily, innovations to enhance business agility and operational efficiency are also plentiful. In this environment, it is vital for enterprises to understand the demand for Intelligent Data Management in order to stay one step ahead and deliver enhanced services to their customers.

I’ve highlighted five hot trends for 2019 that decision-makers need to know. Keeping the Europe, Middle East and Africa (EMEA) market in mind, here are my views:

  1. Multi-Cloud usage and exploitation will rise

With companies operating across borders and the reliance on technology growing more prominent than ever, an expansion in multi-cloud usage is almost inevitable. IDC estimates that customers will spend US$554 billion on cloud computing and related services in 2021, more than double the level of 2016.

On-premises data and applications will not become obsolete, but the deployment models for your data will expand, with an increasing mix of on-prem, SaaS, IaaS, managed clouds and private clouds.

Over time, we expect more of the workload to shift off-premises, but this transition will take place over years, and we believe that it is important to be ready to meet this new reality today.

  2. Flash memory supply shortages, and prices, will improve in 2019

According to a Gartner report from October 2018, flash memory supply is expected to revert to a modest shortage in mid-2019, with prices expected to stabilise largely due to the ramping of Chinese memory production.

Greater supply and improved pricing will result in greater use of flash in the operational recovery tier, which typically hosts the most recent 14 days of backup and replica data. We see this greater flash capacity leading to broader usage of instant mounting of backed-up machine images (or copy data management).
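A simple sizing sketch shows what such a 14-day operational recovery tier might require; the protected capacity, daily change rate, and data-reduction ratio are illustrative assumptions.

```python
# Sizing sketch for a 14-day operational recovery tier on flash.
# All inputs are assumptions for illustration.

PROTECTED_TB   = 500    # primary data under protection
DAILY_CHANGE   = 0.03   # 3% of primary data changes per day
DATA_REDUCTION = 2.5    # combined dedupe + compression ratio
RETENTION_DAYS = 14

full_backup  = PROTECTED_TB / DATA_REDUCTION
incrementals = (PROTECTED_TB * DAILY_CHANGE
                * (RETENTION_DAYS - 1) / DATA_REDUCTION)
print(f"Flash tier needed: ~{full_backup + incrementals:.0f} TB "
      f"({full_backup:.0f} TB full + {incrementals:.0f} TB incrementals)")
```

Under these assumptions, roughly 278 TB of flash covers the tier, which helps explain why flash supply and pricing matter so much for this use case.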

Systems that offer copy data management capability will be able to deliver value beyond availability, along with better business outcomes. Example use cases for leveraging backup and replica data include DevOps, DevSecOps and DevTest, patch testing, analytics and reporting.

  3. Predictive analytics will become mainstream and ubiquitous

The predictive analytics market is forecast to reach $12.41 billion by 2022, roughly 2.7 times its 2017 size, at a CAGR of 22.1%.
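As a quick sanity check on those figures, compounding the stated CAGR over the five years from 2017 to 2022 (the 2017 baseline is implied by the forecast, not stated):

```python
# Sanity check: 22.1% CAGR compounded over 2017-2022 (5 years).
growth = 1.221 ** 5
print(f"Growth multiple: {growth:.2f}x")               # ~2.71x
print(f"Implied 2017 market: ${12.41 / growth:.2f}B")  # ~$4.57B
```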

Predictive analytics based on telemetry data, essentially Machine Learning (ML)-driven guidance and recommendations, is one of the categories most likely to become mainstream and ubiquitous.

Machine Learning predictions are not new, but we will begin to see them utilising signatures and fingerprints that contain best-practice configurations and policies, allowing the business to get more value out of the infrastructure you have deployed and are responsible for.
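As a deliberately simplified, non-ML illustration of the fingerprint idea, the sketch below compares a deployed configuration against a known-good signature and flags deviations; the setting names and recommended values are hypothetical.

```python
# Simplified sketch of fingerprint-style guidance: diff a deployed
# configuration against a best-practice signature. The keys and
# recommended values are hypothetical examples.

BEST_PRACTICE = {
    "backup_window_hours": 8,
    "retention_days": 14,
    "encryption": "aes-256",
}

def recommend(deployed: dict) -> list[str]:
    """Return recommendations for settings that deviate from the signature."""
    return [
        f"{key}: set to {want!r} (currently {deployed.get(key)!r})"
        for key, want in BEST_PRACTICE.items()
        if deployed.get(key) != want
    ]

print(recommend({"backup_window_hours": 12, "encryption": "aes-256"}))
```

In a real product the signature would be derived from telemetry across many environments rather than hard-coded, but the recommendation loop looks much the same.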

Predictive analytics, or diagnostics, will assist us in ensuring continuous operations, while reducing the administrative burden of keeping systems optimised. This capability becomes vitally important as IT organisations are required to manage an increasingly diverse environment, with more data, and with more stringent service level objectives.

As predictive analytics becomes more mainstream, service level agreements (SLAs) and service level objectives (SLOs) are rising, and businesses’ service level expectations (SLEs) are even higher. This means that we need more assistance, more intelligence, in order to deliver on what the business expects from us.

  4. The ‘versatalist’ (or generalist) role will increasingly become the new operating model for the majority of IT organisations

While the previous trends were technology-focused, the future of digital is still analogue: it’s people. Talent shortages, combined with on-premises infrastructure and public cloud plus SaaS collapsing into a single estate to manage, are leading to broader technicians with backgrounds in a wide variety of disciplines, and increasingly a greater business awareness as well.

Standardisation, orchestration and automation are contributing factors that will accelerate this, as more capable systems allow for administrators to take a more horizontal view rather than a deep specialisation.

Specialisation will of course remain important but as IT becomes more and more fundamental to business outcomes, it stands to reason that IT talent will likewise need to understand the wider business and add value across many IT domains.

Yet, while we see these trends challenging the status quo next year, some things will not change. There are always constants in the world, and we see two major factors that will remain top-of-mind for companies everywhere.

  1. Frustration with legacy backup approaches and solutions

The top three vendors in the market will continue to lose market share in 2019. In fact, the largest provider in the market has been losing share for 10 years. Companies are moving away from legacy providers and embracing more agile, dynamic, disruptive vendors, such as Veeam, that offer the capabilities needed to thrive in the data-driven age.

  2. The pain points of the Three Cs: cost, complexity and capability

These Three Cs continue to be why people in data centres are unhappy with solutions from other vendors. Broadly speaking, they are excessive cost, unnecessary complexity and a lack of capability, which manifests in the speed of backup, the speed of restoration, or the ability to instantly mount a virtual machine image. These three major criteria will continue to dominate the reasons why organisations augment or fully replace their backup solutions.

  5. The arrival of the first 5G networks will create new opportunities for resellers and CSPs to help collect, manage, store and process the higher volumes of data

In early 2019 we will witness the first 5G-enabled handsets hitting the market at CES in the US and MWC in Barcelona. I believe 5G will be adopted most quickly by businesses for Machine-to-Machine communication and Internet of Things (IoT) technology; with 4G, consumer mobile network speeds have probably already reached the point of being as fast as most of us need.

2019 will be more about the technology becoming fully standardised and tested, and about future-proofing devices to ensure they can work with 5G when it becomes more widely available and EMEA becomes a truly Gigabit Society.

For resellers and cloud service providers, excitement will centre on the new revenue opportunities that leverage 5G or the infrastructure supporting it. Processing higher volumes of data in real time and at faster speeds, new hardware and device requirements, and new applications for managing data will all present opportunities and help facilitate conversations around edge computing.

Feature Image: Dave Russell, Vice President for Product Strategy at Veeam

By Mark Bowen

Sourced from INTELLIGENT CIO