In May 2023, Dell announced NativeEdge, an edge operations software platform. In the years leading up to the release, Dell had been talking with customers about the needs of technology operating at the edge.
To get into the details, I spoke with Aaron Chaisson, Dell Technologies’ vice president of telecom and edge solutions marketing, at Dell Technologies World in Las Vegas. The following is a transcript of my interview with Chaisson; the interview has been edited for length and clarity.
Jump to:
- Challenges of cloud spending and deployment
- Customers choose between edge and on-prem
- Not generative AI, but smart vision and data analytics
- How NativeEdge helps with secure device onboarding
Challenges of cloud spending and deployment
Megan Crouse: What decisions are you seeing customers or potential customers struggle with right now in terms of enterprise cloud purchasing that weren’t being talked about a year or three years ago?
Aaron Chaisson: One of the biggest things companies are looking to do is consume (cloud) in an as-a-service fashion. They want to take the experiences they are getting from hyperscalers and potentially bring those experiences on-prem, especially toward the edge. Customers want to leverage edge technologies to drive new business outcomes and to act upon data more rapidly. How do they take the capabilities, the features and the experiences that they get from a cloud and deliver those in edge environments?
One of the questions that we commonly see is: Are you taking established cloud technologies and moving them to the edge? Or are you really looking to use the best practices of cloud, of automation and orchestration-as-a-service, but to deliver them in a more purpose-built fashion that delivers unique value to the edge? And that’s really where NativeEdge is designed to deliver an edge experience, but in a customized way that targets the outcomes customers are looking for at the edge.
SEE: Don’t curb your enthusiasm: Trends and challenges in edge computing (TechRepublic)
Customers choose between edge and on-prem
Megan Crouse: Do you see customers deciding workflow-by-workflow, where they’re going to pull from the edge, and if so, how is Dell working on simplifying that process through something like NativeEdge?
Aaron Chaisson: It’s early days for the consultative conversation that comes out of that. As we were moving toward the cloud several years back, the question was always what workloads do I keep in IT? What workloads do I move to the cloud? Which applications work great? Which applications do I want to migrate? Which applications do I want to modernize? Which ones do I want to retire right away? We worked with customers by looking at all of their workloads and determining on a workload-by-workload basis what should live where and whether it should be virtualized, containerized or function-based.
I think that same approach is now going to start happening at the edge. As you look at your edge environments, do you want to run those workloads at the edge or in the cloud, or maybe across both? NativeEdge is doing two things on the application orchestration front: There’s lifecycle management of edge infrastructure and lifecycle management of workloads and applications. The focus right now is deploying edge workloads.
I might need to deploy the same workload to 1,000 retail outlets to run in-store inventory control or in-store security for loss prevention, right? So I need to be able to push that to all these edge locations. Or, I might need to push a centralized management console that manages those thousand workloads or reports against them or does model training, so I can continually make sure that my loss prevention AI running at the edge is using the most up-to-date model. That model training might run in AWS. That same tool needs to be able to deploy to all of these edge locations as well as a component in a cloud. We can work with the customers to understand [their needs] based on the workload they are looking to deploy.
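To make the pattern Chaisson describes concrete, here is a minimal, hypothetical sketch of pushing one workload to many edge sites plus a training component in a public cloud. Every name and function below is illustrative only and is not the NativeEdge API.

```python
# Hypothetical sketch: one blueprint deployed to many retail edge sites,
# plus a training component that lands in a cloud region. Names are
# placeholders, not a real orchestration API.
from dataclasses import dataclass


@dataclass
class Target:
    name: str
    location: str  # "edge" or "cloud"


def deploy(blueprint: str, target: Target) -> None:
    # Stand-in for whatever call actually pushes the workload.
    print(f"Deploying {blueprint} to {target.name} ({target.location})")


# A thousand stores run the loss-prevention inference workload at the edge...
stores = [Target(name=f"store-{i:04d}", location="edge") for i in range(1000)]
for store in stores:
    deploy("loss-prevention-inference", store)

# ...while model training for the same application runs in a cloud region.
deploy("loss-prevention-training", Target(name="aws-us-east-1", location="cloud"))
```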
SEE: Explore five important facts about edge computing. (TechRepublic video)
We also have customers who say, “Hey, do I deploy NativeEdge, or do I do Microsoft on-prem?” A lot of that comes down to a choice. Do they want a common set of cloud services from a single cloud vendor that extends from edge to cloud? That approach isn’t necessarily purpose-built for the edge, but it can simplify the consumption of those services by using a common cloud layer. Or do they really want to optimize for the edge and use an application management tool like NativeEdge that can manage those workloads, whether they’re in the cloud or at the edge?
It really comes down to what operating environment the customer prefers: something that is optimized for the edge, or something that is optimized for the cloud and extends to the edge. That’s a case-by-case conversation. Right now, it’s more preference-based, which is why we offer both.
Not generative AI, but smart vision and data analytics
Megan Crouse: What is the conversation around AI in your world right now?
Aaron Chaisson: In the world of telecom, I think it’s still very young. Telecom tends to move a little slower than enterprise IT environments, both because those generational changes take longer and because the availability and service requirements of the network tend to be much more stringent, so operators want to leverage proven technologies before they roll them out into production. That doesn’t mean they’re not talking about it, but I think it’s early days, and we’re starting to have those conversations.
On the enterprise front, I’ll focus not on generative AI, which is the top topic today. That has matured so fast in the last six months that everybody’s trying to get out in front of it, ourselves included. But it’s the traditional AI use cases of recent years that are driving the edge right now: computer vision for everything from security, to inventory control, to restocking of shelving, to managing robotics in a warehouse.
You name the industry, they’re looking to leverage AI to drive new services. That requires the ability to capture data, often analyze it in real time and store it as needed for model training. [They need to] selectively determine what data needs to be eliminated and what data needs to be kept. So they’re asking us what solutions we can provide. Right now, most of it is compute-centric.
In a lot of our APEX solutions today, we take our storage tech and run it in cloud data centers, so I could maybe capture the data off the gateways, buffer it in memory on the servers at the edge location, act on it in real time, and then move subsets of that data to a cloud provider to do model training to continue to improve the services we deliver at the edge.
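Here is a minimal, illustrative sketch of that pipeline: readings captured from edge gateways are buffered in memory, acted on locally in real time, and only a sampled subset is forwarded to the cloud for training. The function names and sampling rate are assumptions for illustration, not Dell APEX or NativeEdge APIs.

```python
# Illustrative edge pipeline: buffer locally, act in real time,
# ship only a subset of data to the cloud for model training.
import random
from collections import deque

buffer = deque(maxlen=10_000)  # in-memory buffer on the edge server


def act_locally(reading: dict) -> None:
    # Real-time decision at the edge (placeholder logic).
    if reading.get("anomaly_score", 0.0) > 0.9:
        print("alert: possible loss-prevention event", reading["camera_id"])


def send_to_cloud_training(reading: dict) -> None:
    # Placeholder for forwarding data to a cloud training pipeline.
    print("queued for cloud training set:", reading["camera_id"])


def on_gateway_reading(reading: dict) -> None:
    buffer.append(reading)
    act_locally(reading)
    if random.random() < 0.01:  # keep ~1% of readings for training
        send_to_cloud_training(reading)


# Example usage with a fake gateway reading:
on_gateway_reading({"camera_id": "cam-17", "anomaly_score": 0.95})
```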
The emergence of AI is the thing that’s driving edge more than any other workload that we’re seeing.
How NativeEdge helps with secure device onboarding
Megan Crouse: NativeEdge is intended to help with secure onboarding. Can you go into more detail about that?
Aaron Chaisson: One of the biggest challenges that the edge has over core data centers, from a security perspective, is that you don’t necessarily have physical control over the environment. It’s not behind lock and key. You don’t have a well-proven, established firewall protecting the network around you. The edge could literally be a server mounted on the wall of a storage room. It could be a gateway on a truck that you don’t have physical control over, right? And so being able to provide an increased level of security is going to be an absolute key constraint that we need to build for.
So that’s why we really are getting out in front of what’s happening in zero trust. We can actually certify that device and fingerprint it in the factory, which is one of the advantages of being a manufacturer.
When you get it onsite, we can then send the voucher of that fingerprint to your NativeEdge controller, so when you bring that server online, it can check the fingerprint. If there’s been any tampering at any point along that supply chain, (the server) would literally be a brick. It will never be able to come online.
The only way to provision that is to get a new server. The old-school (method) of, “Oh, I’m just going to format it and reinstall my operating system”? No, you can’t do that. We want to make sure that it’s completely tamper-proof along the entire chain.
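The onboarding pattern Chaisson describes can be sketched roughly as follows: the factory records a keyed fingerprint of the device, that fingerprint ships to the controller as a voucher, and the controller verifies it before the device is allowed to provision. This is a heavily simplified stand-in under assumed names; Dell’s actual mechanism likely involves hardware roots of trust and signed ownership vouchers, and this is not the NativeEdge implementation.

```python
# Simplified sketch of factory fingerprinting plus controller-side voucher
# verification. All keys, identities and function names are hypothetical.
import hashlib
import hmac

FACTORY_KEY = b"factory-signing-key"  # placeholder for a real signing secret


def fingerprint(device_identity: bytes) -> str:
    # Factory side: derive a keyed fingerprint of the device's identity.
    return hmac.new(FACTORY_KEY, device_identity, hashlib.sha256).hexdigest()


def controller_onboard(device_identity: bytes, voucher: str) -> bool:
    # Controller side: recompute the fingerprint and compare to the voucher.
    expected = fingerprint(device_identity)
    if not hmac.compare_digest(expected, voucher):
        return False  # tampered or unknown device: it never comes online
    return True       # device is provisioned and joins the edge estate


identity = b"service-tag:ABC1234|tpm-ek:deadbeef"
voucher = fingerprint(identity)                           # created at the factory
print(controller_onboard(identity, voucher))              # True: clean device
print(controller_onboard(b"tampered-identity", voucher))  # False: stays a brick
```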
More news from Dell Technologies World
- Dell brings more cloud products under APEX umbrella
- Dell’s Project Helix is a wide-reaching generative AI service
- Dell’s Project Helix heralds a move toward specifically trained generative AI
- Dell reveals new edge as-a-service portfolio, NativeEdge
- Dell VP on the changing world of DevOps, CloudOps, AI and multicloud by design
- Dell’s sustainable data center management strategy
- Dell Technologies World 2023: Interview with Rob Emsley on data protection, recovery and more
Disclaimer: Dell paid for my airfare, accommodations and some meals for the Dell Technologies World event held May 22-25 in Las Vegas.