
AWS CEO Matt Garman on generative AI, open source, and shutdown services

It was quite a surprise when Adam Selipsky stepped down as CEO of Amazon’s AWS cloud computing unit. Perhaps an equally big surprise was that he was replaced by Matt Garman. Garman joined Amazon as an intern in 2005 and became a full-time employee in 2006, working on early AWS products. Few people know the company better than Garman, whose last position before becoming CEO was as senior vice president of sales, marketing and global services for AWS.

Garman told me in an interview last week that he hasn’t made any significant changes to the organization yet. “Not much has changed in the organization. The business is doing quite well, so there’s no need to make huge changes to anything we’re focusing on,” he said. He did, however, point out several areas where he believes the company needs to focus and where he sees opportunities for AWS.

Emphasizing startups and rapid innovation

One of them, somewhat surprisingly, is startups. “I think we have evolved as an organization. … In the early days of AWS, our main focus was on how to really attract developers and startups, and we got a lot of traction there from the beginning,” he explained. “And then we started thinking about how do we appeal to larger businesses, how do we appeal to governments, how do we appeal to regulated sectors around the world? And I think one of the things that I just emphasized again – it’s not really a change – but I just emphasized that we can’t lose focus on startups and developers. We have to do all these things.”

The second area he wants the team to focus on is keeping up with the whirlwind of change in the industry.

“I have really emphasized with the team how important it is for us to keep the lead we have today in terms of the set of services, capabilities, features and functions we offer – and to continue to lean forward and build a roadmap of real innovation,” he said. “I think the reason customers use AWS today is because we have the best and broadest set of services. The reason people turn to us today is because we continue to deliver industry-leading security and operational efficiency by far, and we help them innovate and move faster. We have to keep pushing on that roadmap. It’s not really a change per se, but it’s probably what I emphasized the most: how important it is for us to maintain that level of innovation and the speed at which we deliver products.”

When I asked him whether he thought the company hadn’t innovated fast enough in the past, he said no. “I think the pace of innovation is only going to accelerate, so it’s just important to emphasize that we also need to accelerate our pace of innovation. It’s not that we’re losing it; it’s simply an emphasis on how much we need to accelerate, given the pace of the technology.”

Generative AI at AWS

With generative AI emerging and technology changing rapidly, AWS needs to be “at the forefront of all of them,” he said.

Shortly after ChatGPT’s launch, many experts questioned whether AWS had been too slow to roll out generative AI tools of its own, leaving the door open to competitors like Google Cloud and Microsoft Azure. Garman, however, believes this was more perception than reality. He noted that AWS had long offered successful machine learning services like SageMaker, even before generative AI became a buzzword, and that the company has taken a more deliberate approach to generative AI than perhaps some of its competitors.

“We were looking at generative AI before it became a widely accepted thing, but I will say that when ChatGPT came out, there was kind of a discovery of a new area, of ways this technology could be applied. I think everybody was excited and got energized by it, right? … I think a bunch of people – our competitors – were kind of racing to put chatbots on top of everything and show that they were leading the way in generative AI,” he said.


Instead, Garman said, the AWS team wanted to take a step back and look at how its customers, whether startups or enterprises, could best integrate the technology into their applications and leverage their own, differentiated data. “They’re going to want a platform that they can freely build on and think of as a platform to build on, not an application that they’re going to adapt. And so we took the time to build that platform,” he said.

For AWS, that platform is Bedrock, which offers access to a wide range of open and proprietary models. Just doing that – and letting users chain different models together – was a bit controversial at the time, he said. “But for us, we thought that was probably where the world was going, and now it’s pretty much certain that that’s where the world is going,” he said. He thinks everyone will want customized models and will bring their own data to them.

Garman said Bedrock is “growing like a weed right now.”

One problem with generative AI that AWS still wants to solve is cost. “A lot of that is doubling down on our custom silicon and making some other changes to the models to make the inference that you’re going to be building into your applications much cheaper.”

Garman said the next generation of AWS’s custom Trainium chips, which the company debuted at the re:Invent conference in late 2023, will be launched later this year. “I’m really excited that we can really turn this cost curve around and start delivering real value to customers.”

One area where AWS hasn’t necessarily tried to compete with some of the other tech giants is in building its own large language models. When I asked Garman about this, he noted that it’s still something the company is “very focused on.” He thinks it’s important for AWS to have its own models while continuing to use third-party models as well. But he also wants to make sure AWS’s own models bring unique value and differentiation, whether by leveraging its own data or “through other areas where we see opportunities.”

Among those areas of opportunity is cost, but also agents, which everyone in the industry seems optimistic about at the moment. “Having models that are reliable, at a very high level of correctness, and that can call other APIs and go do things – I think there is some innovation that can be done in that area,” Garman said. Agents, he said, will unlock much more utility from generative AI by automating processes on behalf of their users.

Q, an AI-powered assistant

At its most recent re:Invent conference, AWS also unveiled Q, its generative AI-powered assistant. For now, it comes in essentially two flavors: Q Developer and Q Business.

Q Developer integrates with many of the most popular development environments and, among other things, offers code completion and tools for modernizing legacy Java applications.

“We really think about Q Developer in a broader sense, as helping across the developer life cycle,” Garman said. “I think a lot of the early developer tools have been very focused on coding, and we think more about how do we help with everything that’s painful and laborious for developers?”

At Amazon, teams used Q Developer to update 30,000 Java applications, saving $260 million and 4,500 years of developer labor, Garman said.

Q Business uses similar technologies under the hood, but it focuses on aggregating internal company data from many different sources and making all of that searchable through a ChatGPT-like Q&A service. The company is “seeing some real traction there,” Garman said.

Shutting down services

While Garman noted that not much has changed under his leadership, one thing that has happened recently at AWS is that the company announced plans to shut down some of its services. That’s not something AWS has traditionally done very often, but this summer it announced plans to close services like its Cloud9 web-based IDE, its GitHub competitor CodeCommit, CloudSearch and others.

“It’s a bit of a cleanup exercise where we looked at a bunch of these services and either, frankly, we’ve launched a better service that people should move to, or we launched one that we just never got right,” he explained. “And, by the way, there are some of these that we just don’t do well, where their traction was pretty light. We looked at those and said, ‘You know what? The partner ecosystem actually has a better solution out there and we’d rather lean into that.’ You can’t invest in everything. You can’t build everything. We don’t like to do it, and we take it seriously when companies choose to bet their business on us supporting things for the long term. That’s why we’re very careful about it.”

AWS and the open source ecosystem

One relationship that has long been difficult for AWS – or at least has been perceived as difficult – is its relationship with the open source ecosystem. That is changing, though, and just a few weeks ago, AWS contributed OpenSearch to the Linux Foundation and the newly formed OpenSearch Foundation.


“I think our view is pretty simple,” Garman said when I asked him what he thought about the future relationship between AWS and open source software. “We love open source. We rely on open source code. I think we’re trying to leverage the open source community AND make a huge contribution to the open source community. I think that’s what open source is all about – benefiting from the community – and that’s why we take it seriously.”

He noted that AWS has made key investments in open source software and many of its own open source projects.

“Most of the friction has come from companies that originally started open source projects and then decided to sort of take them out of open source, and I think they have the right to do that. But you know, that’s not the true spirit of open source. And so whenever we see people do that – take Elastic as an example – well, OpenSearch (AWS’s Elasticsearch fork) has become quite popular. … If there’s a Linux (Foundation) project, or an Apache project, or anything that we can build on top of, we want to build on top of it; we want to contribute to it. I think we’ve evolved and learned as an organization how to be a good steward of that community, and I hope that’s been noticed by others.”