Living (and Thriving) at the Edge of Technology

Edge computing is fast gaining momentum. A recent Stratus Technologies report shows that 91% of today’s data is created and processed in traditional data centers. By 2025, Gartner estimates, 75% of all data will need analysis and action at the Edge. Edge computing, which locates processing power closer to the end user, has the potential to improve performance, reduce costs, and help fuel the growth of next-generation technologies and applications.

Gensler began designing facilities for digital streaming services more than a decade ago. At the time, a client was developing sites outside of major metropolitan areas to support local customers. The client was moving away from centralized enterprise computing to distributed computing, primarily to reduce latency (the time it takes for data to travel from its origin to the customer). These facilities were at the forefront of Edge computing.

These were (and still are) small data centers in underutilized industrial parks adjacent to rail lines on the outskirts of major markets. At the time, the term “Edge” was used because these facilities were usually on the edge of town, but more accurately, these satellite locations sit near the outer limit of a fiber network.

Typically, a customer will only wait three seconds for a website to load before moving on, so failing to address latency means losing customers. And there’s a great deal more to a streaming transaction than just data transfer: identifying and confirming the customer, checking product availability, selecting the product, confirming the credit account, and completing the transaction. All of this information must be exchanged rapidly before the entertainment can be delivered.
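As a back-of-envelope sketch (not from the original article), the distance component of latency can be estimated from signal speed in optical fiber, roughly 200,000 km/s, about two-thirds the speed of light in a vacuum. The distances chosen below are illustrative, not actual network figures:

```python
# Illustrative assumption: signal speed in optical fiber is
# roughly 200,000 km/s (~2/3 the speed of light in a vacuum).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds
    (ignores routing, queuing, and server processing time)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A remote data center 2,000 km away vs. an Edge site 100 km away.
# A streaming transaction needs several such round trips (identify
# the customer, confirm the account, authorize payment), so these
# delays multiply.
for distance in (2_000, 100):
    print(f"{distance:>5} km -> {round_trip_ms(distance):.1f} ms per round trip")
```

Propagation delay alone is small against a three-second patience budget, but each transaction step pays it again, on top of routing and processing overhead, which is why moving the exchange closer to the customer matters.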

Storing and computing data closer to the end user

At that time, this was the only application of Edge computing we knew. Optical fiber is provided in loops encircling cities for distributing data/content to local customers. Content is brought from a single source, usually a remote location, via long haul fiber, which is akin to an interstate highway. Optical fiber networks are optimized to carry signals in both directions along the fiber, which is referred to as backhaul.

If you were not near a major metropolitan area, it was almost impossible to receive a signal, because long haul optical fiber doesn’t stop at small towns along the way; it is a direct flight with no layovers, no exit ramps. Content is delivered into the fiber loop around a city. This loop, referred to as “metro fiber,” is meant for local delivery and stops at every customer along the way like a postal carrier.

Because fiber networks require backhaul for efficiency, there aren’t many hub-and-spoke networks where the signal originates at the hub and is delivered to the end of the spoke, especially if the signal needs to travel a great distance.

Edge computing is the middleman, located where long haul fiber delivers content and, in turn, delivering that content to the metro fiber loop. So, with the movie content example, the transaction is asymmetrical: you exchange a minuscule amount of information for a lot of information. Edge is a system for delivering content over “the last mile.”

New uses for Edge computing

Today, Edge computing has expanded, first to serve the proliferation of personal devices, and second, to meet consumers’ expectation of full functionality from those devices, at any time and in any place. In addition, more telecommunications, enterprise, and web-scale data centers began to be developed.

But as the consumer has become more dependent on this access to data, other uses for Edge have emerged. Concurrently, data centers were developed rapidly to take advantage of more powerful servers and to support online customer service and transmission of information. The production of content, applications, gaming, information (search), online services (financial), and e-commerce exploded to satisfy the appetite of the consumer.

Edge computing collects and analyzes data where it’s generated — where machines and people (via personal devices) produce or consume it. Rather than sending that data to the Cloud or to a traditional data center for processing, it is turned into actionable information on the spot.

The Cloud is simply servers that are accessed over the internet, along with the software and databases that run on those servers. Cloud servers sit in data centers across the world, but not necessarily near customers. Cloud computing allows users to run software applications from personal devices, computers, gaming consoles, machines, appliances, and automobiles, with a large part of the processing carried out in the data center. This means devices can be smaller and require less processing power and battery, while the Cloud backs up personal files, stores or archives information, and runs applications much faster.

An explosion of Edge growth

What is driving expansion now are technologies like the Internet of Things (IoT), autonomous machines, robotics, machine learning, machine-to-machine communication, AR/VR, as well as telco network functions that generally require processing in near real time for applications to be useful, desirable, or safe.

The real driver of Edge growth is fifth-generation wireless broadband technology, or 5G. 5G is at least 10 times faster than current 4G wireless networks. 5G’s transmission speeds will enable real-time, mission-critical applications — autonomous control of vehicles, smart grids, industrial automation, even remote surgery — provided Edge computing infrastructure is developed at the same time to support these advancements.

Consider, for instance: autonomous corridors along highways, which will require Edge data processing every few miles; smart metering for utilities; home appliances that use web-based applications to communicate with each other and with outside service providers; automobile and truck monitoring for tracking, diagnostics, and maintenance; tracking of goods and materials through the logistics and distribution chain; and more. Edge computing brings us not only these applications, but also new technologies we can’t yet conceive of today.

For media inquiries, email .

Brion Sargent
Brion, AIA, is a Regional Critical Facilities Practice Area Leader for Gensler’s South Central region. Brion has more than 40 years of experience and a deep portfolio of projects. His distinguished career includes delivering a variety of project types singularly focused on supporting clients’ critical business needs, including data centers, network operations centers, and e-commerce and logistics facilities. Brion’s work often involves repositioning existing properties and creating a rapid delivery model that allows these facilities to open in the fastest possible time. He also provides strategy development to navigate the maze of local development restrictions and to streamline the entitlement process. Brion is based in Dallas. Contact him at .