F5 CEO on why AI application development will ‘accelerate’ APIs and edge security

In an interview, F5 CEO François Locoh-Donou told CRN that “securing these APIs to secure AI is absolutely critical.”


The rapid growth of AI-based applications thanks to the emergence of generative AI could create huge new opportunities for solution and service providers in securing APIs and edge deployments, F5 CEO François Locoh-Donou told CRN.

Locoh-Donou said in a recent interview that APIs are becoming increasingly important for enabling communications and other functions of GenAI applications, which means API security, an area of expertise for F5, will also grow in importance.

“Artificial intelligence will accelerate this,” he said. “All of these AI applications communicate with their back end – that is, the AI model, the AI factories or the enterprise data stores – and all of that communication happens through an API. So securing these APIs to secure AI is absolutely critical.”

(Related: 20 Coolest Website, Email, and App Security Companies in 2024)

Similarly, deployments of AI applications and large language models (LLMs) in edge environments are expected to increase significantly as more organizations seek to run AI close to where it is needed to improve performance and reduce latency, according to Locoh-Donou.

This also means major new security service opportunities for both F5 and its channel partners, he told CRN. “Wherever we run these things, we will have to secure them,” Locoh-Donou said.

The following is a condensed and edited portion of CRN’s interview with Locoh-Donou.

What are the most important things you want partners to know about F5 right now?

First, let me tell you what has not changed and is not changing. Our business model at F5 is very channel- and partner-focused: over 90 percent of our business goes through partners. And while there has been a lot of evolution in the F5 portfolio and the technology assets we hold, this part of our business model remains unchanged. In fact, it is the heart of our go-to-market.

Of course, the world around us has changed a lot in the last few years. ADCs (application delivery controllers) are a market that F5 pioneered and truly dominated. When I joined F5, the cloud had become mainstream and CIOs believed they could move all their applications to one cloud, and that it would be simple, fast and highly reliable. Things have turned out very differently over the last six or seven years: people now live in a hybrid, multi-cloud world, and all of our customers have multiple infrastructure environments. Because of this, the attack surface that attackers can exploit grows exponentially. People have more containers, more APIs and more distributed applications, and that, of course, creates a lot of complexity. Our customers ended up using appliances on-premises, ADCs in the cloud when they are in the cloud, and different security providers in different environments.

In terms of the investments the company has made over the last three years, we have focused on tackling that complexity and building a platform, a set of capabilities, that can really simplify our customers’ experience. Specifically, we are focused on enabling our customers to secure, deliver and optimize any application or API anywhere, in any infrastructure environment – public cloud or private cloud, on-premises or at the edge, or even, increasingly in the case of some AI applications, at the far edge. The goal is for our customers to be able to use the same security engine, the same delivery engine and a single policy across all of these environments – essentially a set of application security and delivery services that is abstracted from the underlying infrastructure and acts as a layer of simplification for our customers.

What do you think are the key moves that F5 has made to build this platform?

To this end, we have made several acquisitions. We acquired NGINX in 2019. We acquired Shape Security to strengthen our security capabilities. And we acquired Volterra, which became the basis of our SaaS platform; to do what I just described, we needed a SaaS-style form factor. Now we can provide all of these capabilities as hardware, software or SaaS. And the SaaS business is growing quite quickly. This is another area where I’m quite happy with our channel partners, because a significant portion of the SaaS opportunities we close come from our partners. For a company with a history in hardware appliances – and with many partners who grew up with us in the appliance business – it wasn’t necessarily obvious that our partner community would transform with us and be with us in selling software and SaaS and cultivating relationships with end customers.

Are most of your partners involved in SaaS at this point?

Of course, it varies. We have several thousand partners around the world, and not all of them move at the same pace or invest in skill sets at the same pace. But one of the indicators we look at is PIOs – partner-initiated opportunities. The percentage of our SaaS business that comes from PIOs is at least as high as, if not higher than, the percentage of our traditional business that does. That means the multiplier effect our partners deliver when closing new SaaS deals is at least as good as on traditional business, which is very encouraging.

In what new areas do you want to attract more of your partners?

I would like to touch on one key part of this, which is API security. For a long time, API security was something of a niche that struggled to get real attention. Now we’ve brought a complete solution to market, built partly organically and partly through acquisitions. We can go to customers and say: “We will discover and catalog all your APIs. We will find all the vulnerabilities and catalog them. We will mitigate those vulnerabilities with our WAF (web application firewall), bot protection or API security protection.” Customers love that the problem just goes away. And it coincides with customers increasingly telling us, “API security is a big problem for me,” because the number of APIs is exploding.
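The “discover and catalog” step he describes can be illustrated with a minimal sketch. This is a hypothetical simplification, not F5’s product logic: it builds an inventory of API endpoints from web-server access-log lines, collapsing numeric path segments so that different IDs map to the same endpoint.

```python
import re
from collections import Counter

def normalize_path(path: str) -> str:
    """Collapse numeric path segments so /users/42 and /users/7
    are catalogued as the same endpoint, /users/{id}."""
    return re.sub(r"/\d+(?=/|$)", "/{id}", path)

def catalog_apis(log_lines):
    """Build an inventory of (method, endpoint) pairs with call counts
    from access-log lines of the form 'METHOD /path STATUS'."""
    inventory = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) < 2:
            continue  # skip malformed lines
        method, path = parts[0], parts[1]
        inventory[(method, normalize_path(path))] += 1
    return inventory

logs = [
    "GET /api/users/42 200",
    "GET /api/users/7 200",
    "POST /api/orders 201",
]
print(catalog_apis(logs))
# Counter({('GET', '/api/users/{id}'): 2, ('POST', '/api/orders'): 1})
```

A real discovery product would of course work from live traffic and add vulnerability analysis on top, but the inventory it produces has roughly this shape: a deduplicated list of endpoints with usage counts.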

And honestly, AI will accelerate this. All of these AI applications communicate with their back end – that is, the AI model, the AI factories or the enterprise data stores – and all of that communication happens via an API. So securing these APIs to secure AI is absolutely critical. That’s why we’re starting to see more and more demand.

What other possibilities does F5 currently see in terms of artificial intelligence?

We’re starting to see large enterprises build AI factories, putting multiple GPU clusters in their data centers either to build sovereign AI models or to handle the very large amounts of data they want to process in their models. And this creates a new opportunity for F5 – or rather, a new opportunity that is an old opportunity. When we first entered the market, what really allowed F5 to thrive initially was website load balancing. Our first customers were online businesses, and we pioneered load balancing to direct traffic to the right server. The same use case now arises in AI: you have all of your GPU clusters, but if a request goes to a cluster that is fully occupied, you face long wait times. Load balancing – quickly and efficiently spreading the load across GPU clusters – is becoming an important need for improving performance. It’s a completely new opportunity, but it involves technology that we have been perfecting and building for 20 years. That is potentially really interesting for partners who work with customers building AI factories.
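The routing problem he describes can be sketched in a few lines. This is a generic least-utilization load balancer, a hypothetical illustration of the idea rather than F5’s implementation: each request goes to the cluster with the most spare capacity, so a busy cluster does not queue new work.

```python
class Cluster:
    """A GPU cluster with a fixed capacity and a count of in-flight requests."""
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.in_flight = 0

    @property
    def utilization(self):
        return self.in_flight / self.capacity

def route(clusters):
    """Least-utilization routing: pick the cluster with the lowest
    fraction of its capacity in use, and record the new request on it."""
    target = min(clusters, key=lambda c: c.utilization)
    target.in_flight += 1
    return target.name

clusters = [Cluster("gpu-east", capacity=8), Cluster("gpu-west", capacity=4)]
clusters[0].in_flight = 6   # gpu-east is already 75 percent busy
assignments = [route(clusters) for _ in range(3)]
print(assignments)
# ['gpu-west', 'gpu-west', 'gpu-west']
```

Production schedulers track queue depth, GPU memory and token throughput rather than a simple request count, but the principle is the same: steer work away from saturated clusters to cut waiting time.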

Another AI-related phenomenon that we expect, and are starting to see, is that AI applications will be very distributed. There are several reasons for this. One is data gravity: people don’t want to keep moving their data around. An AI application may sit in the public cloud or on-premises, but it accesses data located in multiple places. Another reason is inference: people want to run AI applications as close as possible to where the decision needs to be made, to minimize latency. That means manufacturers may want to run inference right where their machines are operating, retailers will want to run AI in stores right where the action is, and so on. This creates a need for application security and delivery services that are also highly distributed and can operate across all of these locations. And today there isn’t really a company built to address this use case, because even the biggest players – the big CDN players – can only provide their services in their PoPs, where they have a point of presence. They cannot do it at the far edge. We have built technology on our SaaS platform that can be deployed virtually anywhere there is compute infrastructure: in the trunk of a car, in a retail store, in an intensive care unit, or in a military vehicle in the field. We see that AI will need these highly distributed capabilities, and we believe that as companies move to inference for distributed AI applications, this capability will become a key differentiator.

It almost sounds like you expect AI to drive yet another expansion of your advantage?

We’ve started to see this with modern applications, and we think AI will only accelerate it. You’re already hearing phone manufacturers talk about running small LLMs on your phone. Now translate that into the business world: we will want to run our LLMs wherever necessary, and it won’t be limited to the public cloud or a few large data centers. Wherever we run these things, we will need to secure them and make sure they are delivered correctly and work 24 hours a day, 7 days a week. So the software stack that provides all of that security and delivery will need to run wherever these workloads reside. That is the capability we have built.