
Army’s demand for GenAI surging, with focus on integration

Demand for generative artificial intelligence is surging within the Army — in fact, Army leaders are having to put a bit of an “appetite suppressant” on it. In the early stages of GenAI adoption, the service supported teams that had organic AI skill sets but lacked the necessary business process expertise, and resourcing quickly became an issue.

Now, the Army is focused on building a strong foundation for GenAI adoption servicewide, including having cost containment strategies and clear processes to help soldiers and civilian employees integrate GenAI into their work more strategically.

Due to some early successes of using large language models, however, Army leaders say they are now noticing that organizations tend to “jump to this as the answer” without fully understanding whether the technology is the most suitable option.

“We got more people at the gates than we can handle right now. We’re just making sure we set a foundation that makes sense and that one, we can help people make sure that they don’t jump in there, run out of money and not get what they’re looking for. Two, leverage models that aren’t going to be supportive of their requirements, which will be disappointing. Or three, I think the bigger thing is to start leveraging technology that really doesn’t augment the kind of work that they’re doing. So we’re trying to balance all that right now,” Army Chief Information Officer Leonel Garciga told Federal News Network.

Earlier this month, the Army deployed a generative AI capability to its secure cloud, cArmy. Developed by Ask Sage, the platform allows soldiers and civilian employees to securely access Azure Gov OpenAI services, enabling them to apply OpenAI models to existing Army data.

The platform, however, doesn’t just support one language model — a key feature of Ask Sage is that users can swap between models without having to retrain their data.

The platform also comes with existing plugins and agents that users can immediately deploy without having to “reinvent the wheel.”
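
To make the model-swapping idea concrete, here is a minimal sketch of the pattern the platform is described as following: documents are ingested once and the backing model is chosen by name at call time. The provider names, stub functions and retrieval step below are hypothetical illustrations, not Ask Sage's actual API.

# Illustrative sketch only: the provider names and interfaces here are
# hypothetical and do not reflect Ask Sage's actual API.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Document:
    title: str
    text: str

# A shared, already-ingested document set; it is indexed once and reused
# no matter which model answers the question.
LIBRARY = [Document("AR 25-2", "Army cybersecurity policy text ..."),
           Document("FAR Part 15", "Contracting by negotiation ...")]

def build_context(query: str) -> str:
    """Very naive retrieval: concatenate documents mentioning a query term."""
    hits = [d.text for d in LIBRARY
            if any(w.lower() in d.text.lower() for w in query.split())]
    return "\n".join(hits) or "(no matching documents)"

# Each "provider" is just a function from prompt text to completion text.
# In practice these would call different hosted models (for example an
# Azure OpenAI deployment or another vendor's endpoint).
def stub_model_a(prompt: str) -> str:
    return f"[model-a answer based on: {prompt[:60]}...]"

def stub_model_b(prompt: str) -> str:
    return f"[model-b answer based on: {prompt[:60]}...]"

PROVIDERS: Dict[str, Callable[[str], str]] = {
    "model-a": stub_model_a,
    "model-b": stub_model_b,
}

def ask(question: str, model: str) -> str:
    """Swap models by name; the retrieval step and data stay the same."""
    prompt = f"Context:\n{build_context(question)}\n\nQuestion: {question}"
    return PROVIDERS[model](prompt)

if __name__ == "__main__":
    print(ask("What does cybersecurity policy require?", model="model-a"))
    print(ask("What does cybersecurity policy require?", model="model-b"))

Because the data layer is decoupled from the model call, comparing two providers on the same question is a one-line change.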

“You want to be able to understand the benefits of each model and be able to swap without getting locked into one provider. And because Ask Sage can run anywhere and we deploy it on cArmy at Impact Level 5, and we can do it on Amazon, on Azure, on Google, but we also support 150 models with text, images, videos and audio — that gives a lot of options for the teams,” said Ask Sage CEO Nic Chaillan, who served as the Air Force’s first chief software officer from 2019 to 2021.

The new software is already helping the Army with software development, acquisition, cybersecurity and even aviation safety.

The Army Contracting Command, for instance, is already exploring how to use AI to write better contracts, identify commonalities across contracts and find opportunities to integrate with the broader contracting community and the defense industrial base.

“Just their first couple of runs, definitely got a lot of great feedback from Army Contracting Command on where they’re making some strides on understanding conflicts within contracts that they hadn’t seen before. We’ve been able to actually take that and turn that into some decisions and some changes to contracts, which, in the end, both save the government money and make it more efficient for the vendor to deliver,” said Garciga.

The Army has also seen immediate benefits in how its legal teams handle new policies and regulations. The software is already helping legal teams quickly sift through federal law and existing Army and DoD policies, which reduces manual work and significantly speeds up legal review.

And as the use cases continue to bubble up, one key takeaway has been the importance of using cloud-native application programming interfaces (APIs) for base models, which often come with built-in safeguards that ensure ethical use of the technology, said Garciga.
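
A rough sketch of what relying on those provider-side safeguards looks like from the caller's perspective: the application checks the safety verdict returned alongside the completion before releasing the output. The response fields ("flagged", "categories") and the call function are hypothetical stand-ins, not a specific vendor's schema.

# Hedged sketch: the response fields below ("flagged", "categories") are
# hypothetical stand-ins for the content-safety metadata that managed
# cloud model APIs typically return alongside a completion.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ModeratedCompletion:
    text: str
    flagged: bool = False
    categories: Dict[str, bool] = field(default_factory=dict)

def call_managed_model(prompt: str) -> ModeratedCompletion:
    """Stand-in for a cloud-hosted base-model call; a real service would
    run its own content filters server-side and report the verdict."""
    return ModeratedCompletion(text=f"[completion for: {prompt[:50]}...]",
                               flagged=False,
                               categories={"hate": False, "self_harm": False})

def safe_generate(prompt: str) -> str:
    """Only release output that passed the provider-side safeguards."""
    result = call_managed_model(prompt)
    if result.flagged:
        raise ValueError(f"Output blocked by content filter: {result.categories}")
    return result.text

if __name__ == "__main__":
    print(safe_generate("Summarize the new acquisition policy."))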

Additionally, centralizing data libraries and models has proven to be cost-effective — it helps the Army avoid duplicating efforts across different systems and paying multiple times for the same data.
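
The "ingest once, reuse everywhere" idea can be illustrated with a small sketch: a shared library keyed by content hash, so re-submitting the same document for a second model or system is a no-op rather than a second ingestion (and a second bill). The class and method names are illustrative assumptions.

# Minimal sketch of a centralized document library: content is stored
# (and paid for) once, regardless of which model or team consumes it.
import hashlib
from typing import Dict, List

class CentralLibrary:
    def __init__(self) -> None:
        self._docs: Dict[str, str] = {}  # content hash -> document text

    def ingest(self, text: str) -> str:
        """Store a document once; repeated ingests return the same key."""
        key = hashlib.sha256(text.encode("utf-8")).hexdigest()
        self._docs.setdefault(key, text)
        return key

    def fetch(self, keys: List[str]) -> List[str]:
        """Retrieve documents by key for any downstream model or pipeline."""
        return [self._docs[k] for k in keys]

if __name__ == "__main__":
    lib = CentralLibrary()
    k1 = lib.ingest("AR 25-2: Army cybersecurity policy ...")
    k2 = lib.ingest("AR 25-2: Army cybersecurity policy ...")  # duplicate upload
    assert k1 == k2              # stored only once
    print(len(lib.fetch([k1])))  # 1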

“One of the challenges of bouncing from model to model, and maybe even across cloud vendors has been replicating the data and paying for it a couple of times. We’ve been working a lot to make sure we understand what that looks like and how we maximize this idea of having centralized libraries or curated topics or even domains of documents, so we’re not replicating the data and paying for it,” said Garciga.

And developing personas—profiles of different user types, such as policy analysts or resource managers—is helping the service shorten the learning curve for new adopters as it expands AI tools to more people.
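
In practice, a persona often amounts to a reusable prompt preamble plus default settings tied to a role. The sketch below shows that shape; the role names, prompts and settings are examples made up for illustration, not Army-defined personas.

# Illustrative only: a "persona" here is just a reusable prompt preamble
# plus default settings; the role names are examples, not Army-defined.
from dataclasses import dataclass

@dataclass(frozen=True)
class Persona:
    name: str
    system_prompt: str
    temperature: float = 0.2

PERSONAS = {
    "policy_analyst": Persona(
        name="policy_analyst",
        system_prompt=("You summarize regulations and flag conflicts with "
                       "existing DoD and Army policy. Cite the source section."),
    ),
    "resource_manager": Persona(
        name="resource_manager",
        system_prompt=("You answer budgeting questions and show the "
                       "assumptions behind every figure."),
        temperature=0.0,
    ),
}

def build_request(persona_key: str, user_question: str) -> dict:
    """Package a persona and a question into a generic chat-style request."""
    p = PERSONAS[persona_key]
    return {
        "temperature": p.temperature,
        "messages": [
            {"role": "system", "content": p.system_prompt},
            {"role": "user", "content": user_question},
        ],
    }

if __name__ == "__main__":
    print(build_request("policy_analyst", "Does this memo conflict with AR 25-2?"))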

“Our goal here is to continue with these proofs of concepts over the next couple of months, really start understanding this space enough that’s going to drive a requirement, which eventually, at some point, will drive some acquisitions. I think just from what we’re learning right now, the good thing about this space is that nobody’s doing a one-size-fits-all, right. There’s so much work out here and opportunity, I think it’s going to be up to us to make sure that we’ve got the right use cases, and we’re really disciplined in how we compete to get our partners to work with us,” said Garciga.
