MWC Barcelona last month focused heavily on two fast-emerging technology trends: 5G and edge computing. Together, they will significantly impact businesses by enabling massive volumes of digital data to move between cloud servers in multiple regions around the world, as well as between IoT devices and edge nodes. This is made possible by the hyper-fast speed of 5G networks and by edge computing architectures that place micro-clouds and data centres closer to data-generating IoT devices.
To seize new opportunities and stay ahead of competitors, businesses are in the process of transforming their operational models to take advantage of 5G and edge computing.
Currently, the data generated by these devices is stored in the cloud, whether on-premises, in a public cloud such as Amazon Web Services (AWS), Azure or Google, or in a hybrid or multi-cloud setup. The edge can also be seen as a ‘mini-cloud’ where some data will surely reside to support endpoint applications. With the edge, an increasing number of data storage servers are emerging to host data. In a few years, large amounts of data will be scattered across clouds and edges located in different countries and continents.
However, this growing volume of digital data is governed by the regulations of many countries and regions, which assert data sovereignty and protect both general and sensitive information from external access and misuse. Last year, for example, the European Union implemented the General Data Protection Regulation (GDPR). Similarly, India, China and Brazil, among other nations, have introduced their own data protection bills.
The varied and growing number of regulations creates concerns for businesses in the midst of transformation driven by 5G and the edge. Businesses, including technology infrastructure vendors and service providers, will want ownership of consumer-generated data, whether it is produced locally or across borders.
The key question therefore is: how can data in multi-cloud and multi-node environments be managed? Will data sovereignty be a roadblock to latency-sensitive 5G use cases?
I came across one company, Kmesh, that is working on compelling solutions for data mobility in edge and multi-cloud scenarios, and got in touch with its CEO, Jeff Kim, to learn about the core of its technology.
Kmesh, founded only in 2018, already offers several solutions to the challenges of managing data across multiple clouds, countries and edge locations. Its SaaS offerings address data sovereignty, edge data and multi-cloud, and each provides a centralised software portal where users set up policies for how they wish to distribute data. These offerings allow organisations to transform centralised data into distributed data, operating across multiple clouds, countries and edges as a single global namespace.
Kmesh enables businesses to take full control of their data generated at various data centres and residing in different geographies. Businesses can also move or synchronise the data in real time. So how do their SaaS offerings work? “Using our SaaS, you install a Kmesh software agent on-premises and another Kmesh software agent on any cloud or clouds,” said Kim. “Then, using our SaaS, you control which data gets moved where. Push a button, and the data gets moved/synced in real time, with no effort by the customer.”
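The agent-and-policy workflow Kim describes can be modelled roughly as follows. This is a hypothetical sketch, not Kmesh's actual API: the class names, locations and dataset labels are invented for illustration, and the "agents" are simple in-memory stand-ins for software installed on-premises or in a cloud.

```python
from dataclasses import dataclass, field

@dataclass
class PlacementPolicy:
    """Hypothetical policy: which locations a dataset should be synced to."""
    dataset: str
    targets: list  # e.g. ["on-prem:frankfurt", "aws:eu-west-1"]

@dataclass
class Agent:
    """Stand-in for a Kmesh-style agent installed at one location."""
    location: str
    datasets: set = field(default_factory=set)

def sync(policies, agents):
    """Push each dataset to every location its policy names (one 'button press')."""
    by_location = {a.location: a for a in agents}
    for policy in policies:
        for target in policy.targets:
            by_location[target].datasets.add(policy.dataset)

# Illustrative usage: one on-prem agent, one cloud agent, one policy.
agents = [Agent("on-prem:frankfurt"), Agent("aws:eu-west-1")]
policies = [PlacementPolicy("sensor-logs", ["on-prem:frankfurt", "aws:eu-west-1"])]
sync(policies, agents)
print(agents[1].datasets)  # {'sensor-logs'}
```

The point of the sketch is the division of labour in the quote: agents sit at each location, while a central policy decides which data gets moved where.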
With this approach, Kmesh aims to deliver significant efficiency improvements in data operations by letting businesses orchestrate where data generated by end devices resides and is accessed across edge, multi-cloud and on-premises environments.
Kmesh also aims to offer agility and flexibility in application deployment when used with Kubernetes, the de facto technology for orchestrating where applications reside. Businesses gain the flexibility to deploy applications anywhere and to leverage data ponds placed at different locations. Like Kubernetes, Kmesh follows cloud-native design principles targeted at cloud, hybrid cloud and multi-cloud use cases.
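The Kubernetes analogy can be made concrete: just as pods are scheduled onto nodes whose labels match a selector, data could be assigned to data ponds whose labels match a selector. The sketch below illustrates only that matching idea; it is not Kubernetes' or Kmesh's real API, and the pond names and labels are invented.

```python
def select_ponds(ponds, selector):
    """Return ponds whose labels satisfy every key/value in the selector,
    mirroring how a Kubernetes nodeSelector matches nodes."""
    return [name for name, labels in ponds.items()
            if all(labels.get(k) == v for k, v in selector.items())]

# Illustrative pond inventory with location/tier labels.
ponds = {
    "pond-berlin": {"region": "eu", "tier": "edge"},
    "pond-oregon": {"region": "us", "tier": "cloud"},
}
print(select_ponds(ponds, {"region": "eu"}))  # ['pond-berlin']
```

The design choice mirrors declarative scheduling: the user states constraints, and the platform decides which concrete location satisfies them.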
Leading public clouds are known to have excellent artificial intelligence (AI) and machine learning (ML) capabilities for data provided to them. Kim explained how Kmesh can focus on data mobility in the age of AI and ML. “Enterprise customers still have their data predominantly on-premises,” he said. “Cloud providers have great AI/ML applications, such as TensorFlow and Watson, but moving data to the cloud and back again remains a challenge. Kmesh makes that data movement easy and eliminates those challenges, allowing customers to focus on what they want – the AI/ML application logic.”
Kmesh's offerings reduce the burden on network resources by eliminating the need to transfer huge amounts of data between the cloud and digital devices. In addition, businesses can substantially lower storage costs by avoiding data replication across different clouds.
I also asked if Kmesh could benefit telecom service providers in any way. “We can help in two ways, with them as partners and as customers,” said Kim. “As customers, telcos have massive amounts of data, and we can help them move it faster and more intelligently. As partners, if they offer cloud compute solutions, then they can resell Kmesh-based services to their enterprise customers.
“One early sales entry point to enterprises is by supporting data sovereignty in countries where the big clouds – AWS, Azure, Google – have little or no presence,” added Kim. “Many countries, particularly those with high GDPs, now have regulations that mandate citizen data remains in-country. Telcos in countries like Vietnam, Indonesia, Switzerland, Germany [and] Brazil can use Kmesh to offer data localisation compliance.”
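A data-localisation mandate of the kind Kim describes reduces to a routing check: before a record is replicated, verify that the destination satisfies the record's residency requirement. A minimal sketch of that check, assuming an illustrative rule table (the country codes and region names are invented and not drawn from any real regulation):

```python
# Hypothetical rules: citizen data from these countries must stay in-country.
RESIDENCY_RULES = {"VN": "VN", "ID": "ID", "CH": "CH", "DE": "DE", "BR": "BR"}

def allowed_destinations(record_country, candidate_regions):
    """Filter replication targets so in-country mandates are honoured.

    candidate_regions maps a region name to the country hosting it,
    e.g. {"telco-hanoi": "VN", "aws-eu-west-1": "IE"}.
    """
    required = RESIDENCY_RULES.get(record_country)
    if required is None:
        return list(candidate_regions)  # no mandate: any region is fine
    return [region for region, country in candidate_regions.items()
            if country == required]

regions = {"telco-hanoi": "VN", "aws-eu-west-1": "IE"}
print(allowed_destinations("VN", regions))  # ['telco-hanoi']
print(allowed_destinations("US", regions))  # ['telco-hanoi', 'aws-eu-west-1']
```

In this model, a telco's in-country region becomes the only legal target for regulated citizen data, which is exactly the opening Kim describes for markets where the big clouds have little presence.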
The technology world is looking for flexible IT infrastructure that can easily evolve to meet changing data and performance requirements as lucrative new use cases arrive. Kmesh is one company aiming to address data management and data sovereignty concerns while decreasing the costs associated with storage and network resources.