
Edge Computing is the New Black

The IT industry has been undergoing a major transformation for quite some time now. Most enterprises are focusing their investments on hybrid cloud solutions, while on-premises core data center infrastructure is shrinking in favor of the public cloud. At the same time, many have already realized that most data is created and consumed at the edge, which is now a fundamental component of every IT strategy.

What and Where is the Edge?

The definition of edge in edge computing is more complex than you might think. In general terms, we talk about bringing compute and data storage close to the edge of your network with the goal of improving response times while avoiding unnecessary data movement between the clients and the cloud. In practice, depending on who you speak with, edge computing is becoming a general term that also describes mobile computing and, to a certain extent, IoT. And, to be honest, there is some truth in it. IoT devices are becoming smarter and better able to form small networks and share their resources, while the latest mobile devices have plenty of CPU power and local storage, allowing them to handle increasingly complex operations locally.

In this context, edge computing is now everything that is not in a core data center or in the public cloud.

Edge Computing Enablers

Three of the biggest enablers of edge computing are modern networking (in the form of SD-WAN), automation, and hardware designed for these use cases.

SD-WAN is radically changing how enterprises design their wide area networks, replacing traditional, very expensive MPLS and CDN circuits with internet connections. The network layout, and the services on top of it, are virtualized and abstracted from the physical network. This allows organizations of all sizes to take advantage of connectivity that is cheaper, faster, more flexible, and easier to purchase, all without sacrificing security.

Let’s be clear here: edge computing has always existed in some form. Branch offices have always had small servers and some local storage with applications and data on them; it was just that everything was harder to manage and more expensive. Now, thanks to new technology and tools, managing large distributed infrastructures is much easier: there is no need, for example, to manually change backup tapes in the remote office or perform sysadmin activities on site. Everything can be deployed remotely and automatically with minimal manual intervention. Tools like Ansible, for instance, allow automated application deployment across several computers concurrently; the tool checks prerequisites and brings every component to the desired state. This, in conjunction with new hardware that is cheaper and sturdier than ever, allows users to build resilient infrastructures that run small sets of applications practically unattended, as demonstrated during Tech Field Day 20.
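As a minimal sketch of that desired-state approach, an Ansible playbook like the following could bring a whole fleet of remote edge nodes to the same configuration in one run. The host group and package names here are hypothetical, purely for illustration:

```yaml
# Hypothetical playbook: converge all edge nodes to a desired state.
# "edge_nodes" and "edge-app" are illustrative names, not from the article.
- name: Configure edge nodes
  hosts: edge_nodes
  become: true
  tasks:
    - name: Ensure the application package is installed
      ansible.builtin.package:
        name: edge-app
        state: present

    - name: Ensure the application service is running and enabled at boot
      ansible.builtin.service:
        name: edge-app
        state: started
        enabled: true
```

Because each task declares a desired state rather than a sequence of commands, re-running the playbook against hundreds of branch-office machines is safe: nodes already in the right state are left untouched, and drifted ones are corrected.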

This kind of low cost and highly automated infrastructure, coupled with a good SD-WAN solution, has the potential to make edge computing available to everybody and for a very large set of use cases.

Key Takeaways

Everybody is talking about edge computing. The latest trend seems to be Kubernetes at the edge! And I have to say that this is not totally wrong; it actually reflects the business need to keep this type of infrastructure highly automated and easy to manage at scale. I’m not sure that Kubernetes simplifies this type of deployment, but we will see how it pans out.

Low cost and ease of management are key elements in this type of discussion. SD-WAN is another critical aspect, not only from the cost perspective but also for the type of services that come with it, including enhanced security.



from Gigaom https://gigaom.com/2019/11/21/edge-computing-is-the-new-black/
