As digitisation continues to take hold within enterprises, and business data grows in value accordingly, the next 10 years could force enterprises to drastically shake up their hosting and cloud arrangements, it is claimed.
During a panel session at the NetEvents Global IT Summit in San Jose, Peter Burris, chief research officer and general manager at analyst house Wikibon, said the next decade or so is likely to see organisations rethink where they choose to store and process their data.
This process is likely to result in organisations developing a more edge-centric data management strategy, as they seek out ways to derive additional value from their data in the most efficient way possible. And that will require a considerable change in mindset when it comes to cloud, said Burris.
“For the past five, six or eight years, we have thought of the cloud [in terms of] ‘we’re going to move our data into the public cloud and we’re going to get access to a lot of services’,” he said.
“The quid pro quo is ‘give me your data – which locks you in – and then we’ll give you access to these interesting services’. But what enterprises want is to keep data where the action is going to take place, where it’s most secure, where intellectual protection or intellectual property protections are easiest to administer.”
And that means turning the notion of treating cloud as a destination that organisations move their data to in order to derive the most value from it completely on its head, Burris continued.
“This is not about moving data to the cloud. This is about moving the cloud and cloud services to the data. The natural organisation of the cloud in 10 years is going to reflect a natural organisation of data, whether it’s at the edge, in the core or in public cloud,” he said.
Fellow panellist Jean-Luc Valente, vice-president of product management in the cloud platforms and solutions group at networking giant Cisco, backed this point, adding that as the amount of data that enterprises have to work with grows ever greater, its portability tends to worsen.
This, in turn, is likely to fuel enterprise demand for edge computing environments in particular, where data can be processed closer to where it is created.
“The bigger the data, the more [solid it becomes],” said Valente. “If you take a terabyte of data, and try to move it to the [public] cloud from a private cloud, you can do it. It’s still going to take time and cost you money. But if you take an exabyte, which obviously today we would generate [that] volume of data, it’s almost impossible. It would cost $30m to egress that to a public cloud.”
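Valente's $30m figure is consistent with a simple back-of-envelope calculation. The sketch below assumes a bulk transfer rate of $0.03/GB; real cloud data-transfer pricing varies by provider, region and volume tier, so the rate is an illustrative assumption, not a quoted price.

```python
# Back-of-envelope cost of moving an exabyte of data between clouds.
# ASSUMPTION: a flat $0.03/GB transfer rate, chosen for illustration;
# actual per-GB pricing differs by provider and falls at higher volume tiers.

EXABYTE_IN_GB = 1_000_000_000   # 1 EB = 10^9 GB (decimal units)
ASSUMED_RATE_PER_GB = 0.03      # USD, hypothetical bulk rate

cost = EXABYTE_IN_GB * ASSUMED_RATE_PER_GB
print(f"Moving 1 EB at ${ASSUMED_RATE_PER_GB}/GB \u2248 ${cost:,.0f}")
# → Moving 1 EB at $0.03/GB ≈ $30,000,000
```

A terabyte at the same assumed rate would cost around $30, which is why Valente describes the smaller move as slow and costly but feasible, while the exabyte move is "almost impossible".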
As a result, there is a possibility that, in the years to come, organisations will adopt a tiered approach to the way they store and manage their data, which could see them keep pockets of data in private clouds, public clouds and edge environments, he said.
“We see a tiering of environment from private environments [by] customers to [on-premise] or hosted by partners, and obviously a very big explosion of the edge ultimately all the way to a device,” Valente added.
“That actually creates – both from a networking and security standpoint – a lot of new characteristics, challenges and complexity.”
So much so, Valente said, that enterprises will need to “fundamentally change the way” they approach networking to accommodate the fact that their data is liable to be spread across multiple environments, which will mean taking steps to ensure it can be accessed in a secure and timely way.
This is a view shared by another NetEvents panellist, Mansour Karam, CEO and founder of autonomous infrastructure software supplier Apstra, who also made the point that as enterprise datasets become more distributed, the data management effort involved will ramp up considerably.
“One consequence of all this is that managing networks like before no longer works. You can’t manage networks manually, by configuring devices by hand. It has to be done through software. It has to be done through automation,” he said.
“You need to have the ability to abstract out all of those network services across all of those domains and you need to have the ability to operate these networks, enforce those policies, set those configurations and verify them remotely. So you can’t have humans essentially do this in every location where the data resides.”
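Karam's argument, abstracting network services and verifying configuration remotely rather than configuring each device by hand, can be sketched with a toy example. All names below are hypothetical and the logic is deliberately minimal; intent-based systems such as Apstra's are far richer, but the principle is the same: one declared intent, rendered and verified in software across every location.

```python
# Toy sketch of intent-based network automation: a single declarative
# intent is rendered into per-site configurations, and each site is then
# verified against that intent rather than inspected by hand.
# All names and fields here are illustrative assumptions.

INTENT = {
    "vlan": 100,
    "mtu": 9000,
    "allowed_subnets": ["10.0.0.0/8"],
}

SITES = ["edge-factory-1", "core-dc", "public-cloud-vpc"]

def render_config(site: str, intent: dict) -> dict:
    """Render the shared intent into a site-specific configuration."""
    return {"site": site, **intent}

def verify(config: dict, intent: dict) -> bool:
    """Check that a site's configuration still matches the declared intent."""
    return all(config.get(key) == value for key, value in intent.items())

configs = {site: render_config(site, INTENT) for site in SITES}

# Simulate configuration drift at one edge site (e.g. a manual change);
# verification catches it automatically, with no human visiting the site.
configs["edge-factory-1"]["mtu"] = 1500
drifted = [site for site, cfg in configs.items() if not verify(cfg, INTENT)]
print(drifted)  # → ['edge-factory-1']
```

The point of the sketch is the shape of the workflow: policy is declared once, pushed everywhere data resides, and drift is detected remotely, which is what makes manual per-device configuration unnecessary.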