
Edge Computing is the New Black

The IT industry has been undergoing a major transformation for quite some time now. Most enterprises are focusing their investments on hybrid cloud solutions, while on-premises core data center infrastructures are shrinking in favor of the public cloud. At the same time, many have already realized that most data is created and consumed at the edge, which is now a fundamental component of every IT strategy.

What and Where is the Edge?

The definition of edge in edge computing is more complex than you might think. In general terms, it means bringing compute and data storage close to the edge of your network, with the goal of improving response times while avoiding unnecessary data movement between clients and the cloud. In practice, depending on who you speak with, edge computing is becoming a general term that also covers mobile computing and, to a certain extent, IoT. And, to be honest, there is some truth in that: IoT devices are becoming smarter and better able to form small networks and share their resources, while the latest mobile devices have enough CPU power and local storage to handle increasingly complex operations locally.

In this context, edge computing is now everything that is not in a core data center or in the public cloud.

Edge Computing Enablers

Three of the biggest enablers of edge computing are modern networking (in the form of SD-WAN), automation, and hardware designed for these use cases.

SD-WAN is radically changing how enterprises design their wide area networks, replacing traditional and very expensive MPLS and CDN circuits with internet connections. The network layout, and the services on top of it, are virtualized and abstracted from the physical network. This allows organizations of all sizes to take advantage of connectivity that is cheaper, faster, much more flexible, and easier to purchase, all without sacrificing security.
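To make the idea of abstraction more concrete, here is a purely hypothetical sketch of what a centrally defined SD-WAN policy might look like; it is not based on any real vendor's syntax, and the site names, uplink types, and application classes are placeholders. The point is that the overlay and its application-aware routing rules are declared once, centrally, and kept independent of the physical circuits at each branch.

```yaml
# Hypothetical SD-WAN policy sketch (not any real vendor's syntax):
# the overlay and its application-aware rules are defined centrally
# and abstracted from the physical circuits at each branch.
sites:
  - name: branch-01
    uplinks:
      - type: broadband        # replaces the old MPLS circuit
      - type: lte              # backup path
overlay:
  encryption: ipsec
  topology: hub-and-spoke
policies:
  - app: voip
    path: lowest-latency
  - app: backup-replication
    path: lowest-cost
  - app: default
    path: any-available
```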

Let’s be clear here: edge computing has always existed in some form. Branch offices have always had small servers and some local storage, with applications and data on them; it was just that everything was harder to manage and more expensive. Now, thanks to new technology and tools, managing large distributed infrastructures is much easier: there is no need, for example, to manually change backup tapes in a remote office or perform sysadmin activities on site. Everything can be deployed remotely and automatically, with minimal manual intervention. Tools like Ansible, for example, can automate application deployment across several machines concurrently, checking prerequisites and bringing every component to the desired state. This, in conjunction with new hardware that is cheaper and sturdier than ever, allows users to build resilient infrastructures that run small sets of applications practically unattended, as discussed during Tech Field Day 20.
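As a minimal sketch of what this kind of automation looks like, the hypothetical Ansible playbook below brings a group of edge hosts to a desired state. The host group, package, template, and service names are all placeholders, but the modules (ansible.builtin.package, ansible.builtin.template, ansible.builtin.service) are standard and only apply the changes needed to reach the declared state.

```yaml
# Minimal Ansible playbook sketch: bring a group of edge hosts to a desired
# state. "edge_nodes", "myapp", and the template name are placeholders.
- name: Deploy application to edge sites
  hosts: edge_nodes
  become: true
  tasks:
    - name: Ensure the application package is installed
      ansible.builtin.package:
        name: myapp                # hypothetical package name
        state: present

    - name: Push the application configuration
      ansible.builtin.template:
        src: myapp.conf.j2         # hypothetical configuration template
        dest: /etc/myapp/myapp.conf

    - name: Ensure the service is running and enabled at boot
      ansible.builtin.service:
        name: myapp                # hypothetical service name
        state: started
        enabled: true
```

Run against an inventory of remote branch machines, the same playbook can be applied to ten sites or a thousand, which is exactly the property that makes unattended edge infrastructure practical.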

This kind of low-cost and highly automated infrastructure, coupled with a good SD-WAN solution, has the potential to make edge computing accessible to everybody and applicable to a very large set of use cases.

Key Takeaways

Everybody is talking about edge computing. The latest trend seems to be Kubernetes at the edge! And I have to say that this is not totally wrong; it actually reflects the business need to keep this type of infrastructure highly automated and easy to manage at scale. I’m not sure that Kubernetes simplifies this type of deployment, but we will see how it pans out.
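For context, "Kubernetes at the edge" usually means describing the workload declaratively and letting the cluster keep it running on remote nodes. The manifest below is a generic sketch rather than anything from a specific edge product; the container image and the node label used to pin the workload to edge nodes are hypothetical.

```yaml
# Generic Kubernetes Deployment sketch for an edge workload; the image name
# and the edge-node label are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: edge-app
  template:
    metadata:
      labels:
        app: edge-app
    spec:
      nodeSelector:
        node-role.example.com/edge: "true"         # hypothetical label on edge nodes
      containers:
        - name: edge-app
          image: registry.example.com/edge-app:1.0 # hypothetical image
          resources:
            limits:
              cpu: "250m"
              memory: "256Mi"
```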

Low cost and ease of management are key elements in this discussion, while SD-WAN is another critical aspect, not only from a cost perspective but also for the type of services that come with it, including enhanced security.



from Gigaom https://gigaom.com/2019/11/21/edge-computing-is-the-new-black/
