Hazelcast 5.5 Platform Introduces Vector Search, Performance, Resiliency, and Flexibility Improvements

Hazelcast, Inc., a leading software provider powering mission-critical applications that drive economies, is introducing vector search capabilities for its flagship product, the Hazelcast Platform. Hazelcast 5.5, which equips users with the tools they need to modernize in the AI era, enables a range of new use cases, including semantic search, fraud detection, and retrieval-augmented generation (RAG), and brings additional advancements in compute, resiliency, and continuity.

Vector search support enables organizations to deploy efficient, scalable pipelines that perform structured queries and unstructured data searches. The Hazelcast Platform now provides the agility needed to create vector data structures and embeddings from text and graph summaries, opening up a new world of productivity for data scientists, according to the company.
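For illustration, the sketch below shows what a minimal ingest path might look like with the vector collection API introduced in Hazelcast 5.5. The class and method names follow the 5.5 Java API but should be verified against the current documentation; the collection name, the three-dimensional toy vector, and the stored text are illustrative assumptions, and in practice embeddings would come from an external model.

```java
import com.hazelcast.config.Config;
import com.hazelcast.config.vector.Metric;
import com.hazelcast.config.vector.VectorCollectionConfig;
import com.hazelcast.config.vector.VectorIndexConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.vector.VectorCollection;
import com.hazelcast.vector.VectorDocument;
import com.hazelcast.vector.VectorValues;

public class VectorIngestSketch {
    public static void main(String[] args) {
        // Declare a vector collection with a single index. The tiny dimension (3)
        // is only for illustration; real embeddings are typically hundreds of floats.
        Config config = new Config();
        config.addVectorCollectionConfig(
                new VectorCollectionConfig("articles")
                        .addVectorIndexConfig(new VectorIndexConfig()
                                .setName("semantic-index")
                                .setMetric(Metric.COSINE)
                                .setDimension(3)));

        HazelcastInstance hz = Hazelcast.newHazelcastInstance(config);

        // Store the original text (reference data) alongside its embedding,
        // so a search result carries the surrounding context with it.
        VectorCollection<String, String> articles = VectorCollection.getCollection(hz, "articles");
        articles.putAsync("doc-1",
                VectorDocument.of("Fraud alert raised on account 42",        // stored value (illustrative)
                        VectorValues.of(new float[]{0.12f, 0.47f, 0.81f})))  // embedding (illustrative)
                .toCompletableFuture().join();

        hz.shutdown();
    }
}
```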

“Being able to have that context—having reference data alongside the actual vector imagery that you can then search—is a real benefit of having something like Hazelcast at its core,” said Avtar Raikmo, vice president of engineering at Hazelcast. “Being able to mine that data and figure out data science use cases is really critical. Data scientists typically don’t have the experience of building industrialized software that’s easy to use, easy to scale, efficient, resilient, and all the other benefits that a platform like Hazelcast can bring.”

In addition, the Hazelcast Platform showed significant performance gains over most competitors, especially in vector embedding and retrieval. Benchmarked against a dataset of 1 million OpenAI angular vectors, Hazelcast outperformed competing offerings, delivering single-digit millisecond latency for vector upload, indexing, and retrieval with 98% precision, according to the company.

“A lot of vendors have highly optimized algorithms for fetching data, but you pay a penalty for creating and inserting vectors,” Raikmo noted. “The algorithms we ended up with—based on approximate nearest neighbor on disk—really allow us to have a much more 50/50 split between the compute benefits and the storage and fetch benefits.”
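On the retrieval side, a query embedding is handed to the collection and the index returns the approximate nearest neighbors. The sketch below continues the earlier ingest example (it assumes the hypothetical "articles" collection populated there), with the same caveat that class and method names should be checked against the Hazelcast 5.5 documentation.

```java
import com.hazelcast.vector.SearchOptions;
import com.hazelcast.vector.SearchResults;
import com.hazelcast.vector.VectorCollection;
import com.hazelcast.vector.VectorValues;

public class VectorSearchSketch {
    // "articles" is the collection populated in the previous sketch; the query
    // embedding would come from the same model that produced the stored vectors.
    static void searchNearest(VectorCollection<String, String> articles, float[] queryEmbedding) {
        SearchResults<String, String> results = articles.searchAsync(
                        VectorValues.of(queryEmbedding),
                        SearchOptions.builder()
                                .limit(10)        // top-10 approximate nearest neighbors
                                .includeValue()   // return the stored reference data with each hit
                                .build())
                .toCompletableFuture().join();

        // Each hit carries its key, the stored value, and a similarity score.
        results.results().forEachRemaining(hit ->
                System.out.printf("%s score=%.3f value=%s%n",
                        hit.getKey(), hit.getScore(), hit.getValue()));
    }
}
```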

Alongside vector search support, Hazelcast 5.5 delivers two new capabilities that give organizations greater agility, resiliency, and performance for enterprise applications:

  • Jet Job Placement: Customers can decouple Hazelcast node compute from data storage components to increase agility and resiliency for compute-intensive workloads
  • Client multi-member routing: Increases resiliency, throughput, and control for applications connecting to geographically distributed clusters

Specifically, in reference to Jet Job Placement, Raikmo explained that “in the AI world, you want to have members in the cluster that are more efficient at doing the computation and others that are more efficient at storing data and making it available for retrieval. So, by being able to actually separate the computation and storage needs, vector search is really going to be a major beneficiary of that.”
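As a rough illustration of that split, the sketch below starts a regular data member next to a lite member, which owns no partition data and can therefore be sized purely for compute, and submits a small Jet pipeline standing in for a compute-heavy stage such as embedding generation. The specific job-placement controls added in 5.5 are configured per the Hazelcast documentation and are not shown here; the pipeline contents are purely illustrative.

```java
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.test.TestSources;

public class ComputeStorageSplitSketch {
    public static void main(String[] args) {
        // Data member: stores partitions (for example, a vector collection).
        Config dataConfig = new Config();
        dataConfig.getJetConfig().setEnabled(true);
        HazelcastInstance dataMember = Hazelcast.newHazelcastInstance(dataConfig);

        // Lite member: holds no partition data, so it can be provisioned for compute.
        Config computeConfig = new Config();
        computeConfig.setLiteMember(true);
        computeConfig.getJetConfig().setEnabled(true);
        HazelcastInstance computeMember = Hazelcast.newHazelcastInstance(computeConfig);

        // A trivial Jet pipeline standing in for a compute-heavy stage.
        Pipeline pipeline = Pipeline.create();
        pipeline.readFrom(TestSources.items("doc-1", "doc-2", "doc-3"))
                .map(String::toUpperCase)
                .writeTo(Sinks.logger());

        // Where the job's processors run is governed by cluster topology and the
        // 5.5 job-placement controls (see the Hazelcast 5.5 documentation);
        // this sketch only shows the lite-member/data-member split itself.
        computeMember.getJet().newJob(pipeline).join();

        Hazelcast.shutdownAll();
    }
}
```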

The Hazelcast 5.5 platform also introduces industry-leading three-year long-term support (LTS), allowing mission-critical customers to evolve their systems for the long term with streamlined updates, the company said.

To learn more about the latest Hazelcast update, visit https://hazelcast.com/.