Network virtualization is the 3rd wave of digitization

Updated: Feb 10

Most of the current discussion about the future of networking centers on 5G technology, its features, and the line-up of phones and service providers supporting it. This near-term focus obscures the fact that network virtualization, a technology coming to market along with 5G, will have a much larger impact on the networking industry: it will revolutionize the networking business.


The virtualization of networking technology is the 3rd great digitization wave of the information processing industry.


The personal computer platform initiated the first wave: it liberated PC software from proprietary hardware architectures and created the foundation for a huge new software industry.


Cloud computing, the second great wave, virtualizes data. It liberates data from being tied to specific computers, and provides computing power and data storage anytime, anywhere, in any amount. It created the foundation for a tsunami of Internet services, for consumers as well as enterprises.


Up to now, however, connectivity has still required large amounts of specialty equipment that is expensive to acquire, share and manage. Network virtualization will change this and drive business changes of a magnitude similar to the first two waves of digitization.


Let’s take a look at the first two waves and their impacts, and then discuss how network virtualization will evolve.


The personal computer wave - the new platform


The personal computer wave was driven by three developments, each of which dramatically lowered the system cost and barriers to entry for new businesses supporting PCs.

Microprocessors emerged in the mid-1970s. Many small system offerings appeared, but none managed to grow beyond enthusiast users.

The introduction of the IBM PC validated the market and led to the standardization of system components, processor architecture and operating system. The publication of the AT bus specification created a level playing field for system component manufacturers. This finally drove the commercial breakout.


Economies of scale and savvy business decisions by Intel and Microsoft resulted in the consolidation of this foundation. For the major part of the PC era, the monopoly-like Wintel partnership harvested the lion’s share of the profits in the PC systems business.



The PC revolution caused large changes of fortune in the mainframe and minicomputer industry. Some incumbents, such as IBM and Unisys, stalled. Other leaders of the prior era, for instance Digital Equipment and Wang, disappeared. The new leaders of the PC era grew dramatically: Intel, Microsoft, Cisco, Compaq, and Dell came to represent 80% of the computer industry’s market value.


In 1987, the combined market capitalization of the industry incumbents was just over $93bn. Ten years later their value was about $113bn, a minor increase at best. The value of the newly created competitors increased dramatically: in 1987, their combined value was less than $12bn; by 1997, it was nearly $588bn. In other words, creative destruction caused industry value to increase nearly seven-fold over a decade of rapid innovation.
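A quick back-of-the-envelope check of that multiple, combining incumbents and newcomers (values in $bn):

\[
\frac{113 + 588}{93 + 12} \approx \frac{701}{105} \approx 6.7
\]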


Key factors driving the Personal Computer platform

  • The semiconductor cost reduction, as documented by Moore’s law, led to the emergence of microprocessors.

  • Standardization and modularization created markets for all system components, such as disk drives, displays and other peripherals.

  • Standardization based on a single processor architecture and operating system separated the software from the hardware, creating an efficient platform for application software to be built.


The personal computer wave – a new software industry is born



The very low barrier to entry in the application software business drove business volume and market value, effectively creating a new software industry. Radical system cost reductions made economically viable a huge number of applications that had been uneconomical in the mainframe and minicomputer era, when software was custom-built. Shrink-wrapped software came to be sold like ordinary consumer goods. Applications such as Microsoft Office, Lotus 1-2-3 and e-mail became available to employees of all enterprises and changed the way business was done. Consumers accessed the World Wide Web, started to use e-mail, played games, and adopted educational and many other applications.


While few software companies grew as large as Microsoft, the PC era spawned a huge software industry with some large software providers such as Intuit, Adobe, Symantec and Netscape, new Internet services such as Google, PayPal and eBay, and a huge number of smaller players focusing on specialized application niches.


The cloud computing wave - driven by virtualization



Just like the PC wave, the cloud computing wave is driven by massive cost reduction and standardization, now focused on data center cost. The growth of Internet services required the creation of large data centers supporting a great variety of applications and services. With Moore’s law reducing hardware cost, management cost became dominant. As many services had to be available globally, application mobility and remote management became important. Order-of-magnitude cost reductions were possible, particularly for the major cost components such as administration and storage.


Large Internet service providers such as Amazon and Google managed to reduce the operating cost of their gigantic data centers by more than an order of magnitude. Once they accomplished this, they leveraged this advantage by offering services to others, creating Amazon Web Services and Google Cloud. Using open source software reduced the cost and accelerated the evolution of the data center platforms.


The key technology enabling these services was the virtualization of the two main computing resources: storage and processing. Virtualization effectively creates software-defined resources on top of the actual hardware resources. Storage virtualization made data available reliably, on demand, in any amount, anywhere in the world, to any application. Virtual machine technology provided the same features to processing capacity. These virtualizations also enabled the separate provisioning and management of storage and processing resources.
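As a rough illustration, here is a minimal Python sketch using AWS’s boto3 SDK (one possible API among many; the bucket name, AMI ID and instance type are placeholders). Note how storage and processing are provisioned independently, with no reference to any particular physical machine:

```python
# Minimal sketch: provisioning virtualized storage and compute
# independently via the AWS boto3 SDK. Assumes AWS credentials are
# configured and the default region is us-east-1; the bucket name,
# AMI ID and instance type are placeholders.
import boto3

# Storage: a bucket lives "somewhere" in the cloud, detached from
# any one machine, available to any application.
s3 = boto3.client("s3")
s3.create_bucket(Bucket="example-virtualized-storage")
s3.put_object(Bucket="example-virtualized-storage",
              Key="sensor-data/sample.json",
              Body=b'{"reading": 42}')

# Processing: a virtual machine is requested on demand, sized by need
# rather than by what hardware happens to sit in a particular rack.
ec2 = boto3.client("ec2")
ec2.run_instances(ImageId="ami-0123456789abcdef0",  # placeholder AMI
                  InstanceType="t3.micro",
                  MinCount=1, MaxCount=1)
```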


The cloud wave – innovations


Infrastructure innovations


Following virtualization, additional innovations drove further cost reductions for the cloud platform providers, reshaping the cloud platform business.


  • Open source software accelerated development and standardization.

  • Standardization of the software stack created an open modular architecture with distinct layers – IaaS, PaaS, and SaaS. This structure enabled a new ecosystem of services on top of the base infrastructure.

  • Powerful automation dramatically reduced the cost of virtual machine creation and management.

  • Large numbers of diverse workloads can run on the same physical hardware.

  • Standardization of packaging simplified service acquisition and management, including users’ self-service.

  • Large economies of scale reduced all aspects of the platform operating costs.



User benefits


The cloud innovations translated into advantages that are driving cloud adoption by a vast range of customers large and small.


  • The granularity of available resources is independent of hardware capacity.

  • Dynamic resource management allows resource acquisition and management directly tied to the needs of the moment (see the sketch after this list).

  • Resources can be accessed from anywhere. The global nature of the infrastructure also improves availability.

  • Low entry cost for users supports emerging high-growth businesses.

  • ‘Free’ marginal use can create entirely new services.
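To make the dynamic-resource-management idea concrete, here is a toy Python sketch of capacity following demand from moment to moment; `provision` and `release` are hypothetical stand-ins for a real cloud API:

```python
# Toy autoscaler sketch: capacity tracks demand from moment to moment.
# provision()/release() are hypothetical stand-ins for a real cloud API.

def provision() -> None:
    print("provisioning one VM")   # placeholder for a real API call

def release() -> None:
    print("releasing one VM")      # placeholder for a real API call

def scale(active_vms: int, demand: int, per_vm_capacity: int = 100) -> int:
    """Return the number of VMs required for the current demand."""
    needed = -(-demand // per_vm_capacity)   # ceiling division
    for _ in range(max(0, needed - active_vms)):
        provision()   # acquire capacity only when demand requires it
    for _ in range(max(0, active_vms - needed)):
        release()     # hand capacity back and stop paying for it
    return needed

# Demand fluctuates; the number of paid-for VMs follows it.
vms = 0
for demand in [50, 250, 900, 120]:
    vms = scale(vms, demand)
```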



The cloud wave – industry impacts



The move to the cloud has driven a revolution in the business environment. The hardware layers have been commoditized. The tremendous economies of scale provided by the large data centers drove the consolidation of the cloud platform providers; only very large providers survive. The four largest providers, AWS, Microsoft, Google, and Alibaba, share about two-thirds of the cloud infrastructure market.


As none of these companies are pure-play cloud providers, market capitalization numbers can only be estimated. In early 2020 Gartner stated “We believe AWS would be $300B - $400B as a stand-alone company”, based on a $33B run rate (a revenue multiple of roughly 9x to 12x). Applying the same multiple to the other major cloud companies would give an approximate range of $620bn - $820bn for the industry.


Most users move their consumption from a capex model to an opex model, often with pay-as-you-go arrangements. This mode of resource acquisition aligns very well with many business models, in particular for start-ups and fast-growing companies. The greatest effect, however, was the dramatic drop in computing cost, which drove tremendous growth in the number of users and the creation of entirely new services.


For enterprise applications, the Software-as-a-Service (SaaS) industry emerged and has come to dominate software consumption. Today, 90% of companies are using cloud services.


With consumer Internet services, the value of the data collected by the service providers is often greater than the cost of computing needed to provide the service. This has led to a profusion of new service businesses, many of them “free”, or rather, paid for with consumer data. Most of the popular consumer services, from Facebook, Gmail and WhatsApp to Netflix, Spotify, Zoom, and TikTok, would not be possible without cloud computing.


Value creation has shifted from hardware and operations to services!

The combined market cap of the 12 largest pure-play US cloud-enabled businesses in September 2020 was more than $2.3trn, more than double the figure of just a year earlier.


The network virtualization wave


Network virtualization is now beginning to be deployed. The introduction of 5G technology will accelerate its adoption. The expanded bandwidth, increased connectivity and reduced latency of 5G enable a vast range of new applications. In addition, the implementation of 5G requires substantial rebuilding of the network infrastructure, creating the opportunity to implement virtualization at the same time.


We should expect that, just as with cloud computing, the barriers to entry at the upper layers of the stack will be lowered, shifting value creation towards the top of the stack, which will lead to business innovation and growth.




The network virtualization wave – innovations

Infrastructure innovations


The kinds of infrastructure innovations that enabled the cloud revolution in computing will now be brought to bear on networking technology.


Just like virtualization of computing resources, network virtualization creates functional layers, software-defined networks (SDNs), on top of the networking hardware. Some of these developments are already visible.
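As a rough sketch of this layering (the names and fields below are purely illustrative, not any vendor’s API), several virtual networks, each with its own topology and policy, can be expressed in software and mapped onto one shared set of physical links:

```python
# Hypothetical sketch of the SDN layering idea: virtual networks are
# software objects mapped onto shared physical infrastructure.
from dataclasses import dataclass, field

@dataclass
class VirtualNetwork:
    name: str
    # Virtual links as (node_a, node_b) pairs, defined purely in software.
    links: list[tuple[str, str]] = field(default_factory=list)
    max_latency_ms: float = 50.0   # per-network policy

# Two isolated virtual networks with very different requirements...
factory_net = VirtualNetwork("factory-floor",
                             [("robot1", "controller")],
                             max_latency_ms=5.0)
office_net = VirtualNetwork("office", [("laptop", "fileserver")])

# ...mapped by a controller onto the same physical switches.
physical_mapping = {
    ("robot1", "controller"): ["switch-A", "switch-B"],
    ("laptop", "fileserver"): ["switch-A", "switch-C"],
}
```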


At the lowest levels of the stack, chip technologies from Intel, Marvell, ARM, Qualcomm, AMD and Samsung are challenging the legacy offerings from the traditional equipment providers. Instead of relying on proprietary field-programmable gate arrays (FPGAs), next-generation chipsets include modules based on structured application-specific integrated circuits (ASICs), programmable digital signal processors (DSPs) and hardware-accelerated baseband processors. These are all generally available, significantly lowering the barriers to entry at the physical hardware level.


Modularity also permeates the system software layers. Since 2016 the Telecom Infra Project (TIP) has been working to accelerate innovation and commercialization of networking software. Initially targeting the Radio Access Network (RAN) domain with multi-vendor interoperable definitions of products and solutions, the project now includes the transport and core networks as well. Open, disaggregated technology is rapidly lowering the barriers to entry at this level, too.


At higher levels, modularization of automation and orchestration functionality improves operational agility and flexibility. Major efforts are now underway to design and develop the requirements, architectures and workflows necessary to automate the end-to-end lifecycle management of network service use cases as reusable building blocks.


User benefits


And just as with cloud computing, these innovations will translate into advantages that will drive service adoption by a vast range of customers large and small.


  • The granularity of available resources is independent of hardware capacity.

  • Dynamic resource management allows resource acquisition and management directly tied to immediate needs.

  • The ability to run multiple SDNs with diverse workloads on the same physical network makes hardware acquisition and management more economical.

  • While some of the hardware infrastructure will have to remain local, global access to virtual networks will be standard, driving a dramatic improvement in endpoint mobility.

  • Moving from capex to opex offerings will align networking costs with a broad range of business models, and ease consumption by start-ups and fast-growing companies.

  • Low entry cost for users supports emerging high-growth businesses. ‘Free’ marginal use can create entirely new services.


The network virtualization wave – industry impact



Infrastructure cost reduction lowers barriers to entry at higher levels of the stack as well, leveling the playing field and creating opportunities for new entrants to exploit.


Ecosystem accelerator centers, supporting network technology startups, are now active in Germany, Korea and the UK. In the US, investment dollars are flowing to newly established challengers using open technology.


Recently, many early-stage companies have seen funding rounds well above the $100 million mark. In 2019, Mavenir Networks raised $105.3 million, Affirmed Networks $155 million, and Altiostar $357.5 million, to mention just a few. In 2020 this trend continued, as next-generation network technology roll-outs ramped up, defying the general economic slowdown.


Some of these funds have been raised by major players in the venture capital industry, such as Eight Roads Ventures, Bessemer Venture Partners, Accel and Sequoia Capital. But startup funding is increasingly also coming from established network technology companies deciding to get in on the action: Qualcomm Ventures, Cisco Investments and Nokia Growth Partners are all looking at channeling significant funds into the open networking ecosystem.


Commercial deployments are not far behind. In Japan, Rakuten has built a 5G wireless network using open networking technologies from the ground up. In the US, Dish Network is looking to do the same. And in Europe, Vodafone is considering continent-wide deployment of open technology, aiming to increase efficiency and drive down costs with a fully virtualized network.


Conclusions


A major benefit of virtualization is the nearly seamless scaling of resources. In the past, we could see distinct borders between local area networks and wide area networks, between public networks and enterprise networks. Virtualizing network functionality will make these boundaries much less relevant: in a virtualized network environment, applications and services will scale almost seamlessly, from the personal area network to the globe-spanning Internet. Together with the cost reductions due to standardization and automation, these characteristics will enable many new applications.



Still, some major constraints remain: time and distance. Network latency is the time it takes for a request or data to travel from source to destination. Moving data to the cloud for storage does not require low latency; cloud storage can be physically very distant from the data source with little impact on operations.


Controlling machinery is another matter. Operating a robot arm at a conveyor belt, or a drill in a mining shaft, requires near real-time response. Data need to be captured, transferred and processed, and instructions need to be issued and acted upon almost instantly. If latency is too high, operations will suffer.
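A back-of-the-envelope calculation shows why distance matters. Light in optical fiber travels at roughly 200,000 km/s, about two-thirds of its speed in vacuum, so propagation alone costs about 5 µs per kilometer each way. For a data center 1,000 km from the machine,

\[
t_{\text{round trip}} \approx \frac{2 \times 1000\ \text{km}}{200{,}000\ \text{km/s}} = 10\ \text{ms}
\]

before any processing time. Keeping the compute within a few kilometers of the machinery cuts the propagation share to tens of microseconds.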


This will lead to highly distributed, many-tiered architectures. Where early cloud implementations located computing resources far from the network edges to gain economies of scale, recent developments point to a different distribution. With two-thirds of data generated at the edge and likely to remain there, computing resources are moving to the edge, where latency matters more than scale in determining resource allocation. Network virtualization creates the foundation for flexible edge computing environments.


As network virtualization penetrates the industry, we expect developments of a magnitude similar to the prior two digitization waves, for both business users and consumers. The specifics of these developments, however, will emerge only as the effects of virtualization take hold. As usual, it will be a case of “follow the money”.


Where economies of scale determine the outcome, industry will focus on consolidation. Investments will be allocated to centralized processing and storage centers. This suits providers of information at high volumes, such as media distributors and social network platforms. Virtual and augmented reality applications require great amounts of bandwidth, but only during actual use. Dynamic bandwidth scaling will allow many new applications to emerge.


In most manufacturing environments, low latency is the scarce resource, and physical proximity and computing power are the determinants of successful outcomes. Low-latency edge capability is expected to drive great benefits for Industry 4.0 applications. In some countries, 5G frequencies have already been reserved for industrial companies. Here, edge connectivity and processing requirements will drive investment.


In cities, the multitude of applications and points of data generation makes multipoint connectivity the scarce resource. Connecting a proliferation of sensors, capturing their data and turning it into actionable information are the key areas of interest and investments. An important application area for this is the management of public services and assets, including the support of interactions of mobile users – cars and people – with fixed infrastructure.

Few companies will have the resources and ability to pursue these different development areas simultaneously. Major network technology companies will allocate development resources trying to match their clients’ demands in many of the market segments. But addressing the full market potential is beyond even these giants.


Much more likely is a trend towards specialization. With the current low barriers to entry, low cost of capital, and high availability of both generic off-the-shelf hardware and open source software, we can expect to see a rapid increase in venture capital allocated to network technology start-ups.




About the authors


Mike Edholm’s focus is on developing network technology strategies. He has worked extensively with leading technology companies on all continents while living in the US, Europe, Asia and Africa. Mr. Edholm is a member of the IBM Industry Academy, currently focusing on the cross-industry impact of 5G, edge and cloud computing.


Martin Kienzle has done research and development on the digitization revolution for more than three decades. He developed PC media streaming prototypes in the nineties, IPTV portals in the 00s, and IoT technologies in the teens. While working with many technology companies, he always considered the perspective of the users to be paramount. Martin is a Senior Member of the IEEE.


