The Hidden Problem of Edge Computing

The New Edge

Edge computing is the true leading edge of information technology today. For decades, companies have been pushing computing closer to the worker, but still in a way that means the worker has to come to IT. First it was terminals, then PCs, then smartphones, and now increasingly IoT and "smart" things: vehicles, homes and offices, cities, even entire global systems. Now we're bringing IT to the work, to the real world, to what we call "the edge".


The Hidden Problem of Edge Computing

Did you know that productivity tools, and the gains they've produced, have missed about half the total labor force? We've done a great job with the office worker, but what about workers on factory and warehouse floors, and beyond, roaming rail yards, farms, and forests? These workers aren't sitting at PCs today, and they're not going there to find productivity gains. We need to bring IT to them through the smart things that leverage IoT and event processing, the things that justify edge computing. But calling this "edge" computing is really too simple a way to describe this new relationship with the real world.

Edge computing is really a complex web of components linked by a combination of event flows, made up of short messages from devices, and transaction flows to existing applications in the cloud and data center. The "edge" is really an extreme, geographically extended form of distributed computing, and both events and transactions are often processed, even stored, at many points along these flows and components.

There has to be a better way, and Atombeam has found it. Simply put, you code the data by creating a unique codeword to represent each frequent pattern within messages, and you "read" the data through the same codebook used to encode it. One concept, one coding process, and you can read without incurring massive processing and latency penalties.
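The codebook idea above can be sketched in a few lines. This is a minimal illustration, not Atombeam's actual format or algorithm: it assumes a hypothetical shared codebook that maps frequent byte patterns in messages to one-byte codewords, and shows that the same codebook both encodes and "reads" the data.

```python
# Hypothetical shared codebook: frequent message patterns -> short codewords.
# (Pattern choice and codeword format are illustrative assumptions.)
CODEBOOK = {
    b'"sensor_id":': b'\x02',
    b'"temp_c":': b'\x01',
    b'"status":"ok"': b'\x03',
}
REVERSE = {cw: pat for pat, cw in CODEBOOK.items()}

def encode(message: bytes) -> bytes:
    """Replace each known pattern with its one-byte codeword."""
    for pattern, codeword in CODEBOOK.items():
        message = message.replace(pattern, codeword)
    return message

def decode(coded: bytes) -> bytes:
    """'Read' the data back through the same codebook."""
    for codeword, pattern in REVERSE.items():
        coded = coded.replace(codeword, pattern)
    return coded

msg = b'{"sensor_id":17,"temp_c":21.5,"status":"ok"}'
coded = encode(msg)
assert decode(coded) == msg   # one codebook, encode and read
assert len(coded) < len(msg)  # short messages still shrink
```

Note the contrast with block compression: each substitution is independent, so a reader can scan for codewords directly without first reconstructing a whole compressed stream.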


The New Edge: Data as a Codebook

Transmission efficiency is important for many of those flows because of limits on link capacity, which are particularly critical in applications that rely on satellite or 3G/4G wireless connectivity. Security is always important too, and companies have used compression and encryption technology for decades to address both concerns. In edge computing, though, we also have to consider latency and its impact on applications and their users: if we have to decompress and decode at every point where our data is examined, we'll use up our latency budget before we do any processing.
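The latency-budget argument is easy to see with some back-of-envelope arithmetic. All numbers below are hypothetical assumptions chosen for illustration, not measurements: a fixed end-to-end budget, a per-hop cost to decompress and recompress a conventional stream, and a much smaller per-hop cost to read codewords directly.

```python
# Hypothetical figures for illustration only.
LATENCY_BUDGET_MS = 50   # assumed end-to-end latency budget
HOPS = 5                 # points in the flow where the data is examined

decompress_ms = 8        # assumed per-hop decompress + recompress cost
read_coded_ms = 0.5      # assumed per-hop cost of reading codewords directly

classic = HOPS * decompress_ms   # 40 ms: most of the budget gone before processing
coded = HOPS * read_coded_ms     # 2.5 ms: budget left for real work

print(f"decompress at every hop: {classic} ms of a {LATENCY_BUDGET_MS} ms budget")
print(f"read codewords directly: {coded} ms of a {LATENCY_BUDGET_MS} ms budget")
```

The point is not the specific numbers but the shape of the cost: per-hop decode overhead multiplies with the number of examination points, while codeword reading keeps that multiplier small.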

It's time to rethink edge computing based on its fundamentals, the movement and storage of real-world data. The application of data as codewords technology transforms edge computing, ensuring data efficiency and security for even short messages without increasing latency.

Atombeam supports the popular Docker container format for its Neurpac Storage & Retrieval reader functionality, which makes the reader compatible with any orchestration and hosting environment that's Docker-compatible, including Kubernetes. Specialized support for public cloud hosting, starting with Amazon's AWS, will be announced beginning in late 1H24. Any mix of hosting options can be supported for edge applications, and any number of reader processes can be deployed where message data must be analyzed. Because coding and reading add so little latency, this approach is ideal where an application has a strict latency budget. IoT control-loop processing is a good example of an application where violating latency constraints can have major real-world consequences, and Atombeam ensures that data transmission, storage, and security constraints are met without compromising latency management.

A reader component can be combined with a coder where intermediate processing may alter the data. In that case, the altered data is re-coded, and downstream users use the new codebook instead of the original. Of course, any edge application can generate a new message for coding as well. The codeword approach is highly efficient for IoT and telemetry events, but also for other short messages, even text and many transactions and responses.
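The reader-plus-coder combination above can be sketched as a three-step pipeline. Again this is a simplified, hypothetical illustration (the codebooks, patterns, and message formats are invented for the example): the stage reads a message through the original codebook, alters it, then re-codes the result against a new codebook that downstream users share.

```python
def encode(message: bytes, codebook: dict) -> bytes:
    """Replace each known pattern with its codeword."""
    for pattern, codeword in codebook.items():
        message = message.replace(pattern, codeword)
    return message

def decode(coded: bytes, codebook: dict) -> bytes:
    """Read coded data back through the same codebook."""
    for pattern, codeword in codebook.items():
        coded = coded.replace(codeword, pattern)
    return coded

# Hypothetical codebooks: one for the original data, one for the altered data.
ORIGINAL = {b'"temp_c":21.5': b'\x01'}
DOWNSTREAM = {b'"temp_f":70.7': b'\x01'}

coded = encode(b'{"temp_c":21.5}', ORIGINAL)
plain = decode(coded, ORIGINAL)                               # reader step
altered = plain.replace(b'"temp_c":21.5', b'"temp_f":70.7')   # processing alters the data
recoded = encode(altered, DOWNSTREAM)                         # coder step, new codebook
assert decode(recoded, DOWNSTREAM) == b'{"temp_f":70.7}'      # downstream reads the new book
```

The design point is that only stages which actually change the data pay for a re-code; pure pass-through hops keep working directly on codewords.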
