Milvus 2.6: Built for Scale, Designed to Reduce Costs
REDWOOD CITY, Calif., June 12, 2025 (GLOBE NEWSWIRE) -- Zilliz, the company behind the open-source vector database Milvus, today announced the release of Milvus 2.6, furthering the company's long-standing mission to democratize AI through accessible and affordable vector data solutions. This latest release continues Zilliz's commitment to helping organizations scale their AI applications affordably, delivering substantial cost reductions across storage, compute, operations, and developer effort.
"I've long been passionate about democratizing AI by making the underlying technologies more accessible and affordable for everyone," said Charles Xie, CEO at Zilliz. "For over a year, I've emphasized that cost reduction is critical for widespread AI adoption. As data volumes continue to explode, organizations need vector database solutions that can scale efficiently while keeping costs under control. Milvus 2.6 furthers our commitment to this vision with innovations that streamline infrastructure requirements and optimize resource utilization—all while maintaining the high performance our users expect."
Continuing Our Mission: Cost Reduction Across Multiple Dimensions
Monetary Savings: Lower Infrastructure & Storage Bills
Milvus 2.6 introduces several innovations that directly reduce infrastructure costs:
- Tiered Storage with Hot/Cold Data Separation: Automatically moves frequently accessed vectors to high-performance storage while relegating less-used data to more economical options, reducing storage costs without compromising retrieval performance. Works seamlessly with leading storage providers including Cohesity, Pure Storage, MinIO, and NetApp.
- Int8 Vector Compression for HNSW Indexes: Stores dense vectors using 8-bit integers, substantially reducing memory requirements while maintaining search accuracy.
- RaBitQ 1-bit Quantization: Pushes quantization to an extreme to achieve comparable retrieval quality with only half the memory cost (a rough memory sketch follows this list).
- New Write-Ahead Log (WAL) with Woodpecker: Eliminates the need for external message queues like Kafka or Pulsar, featuring a diskless architecture that reduces infrastructure costs while improving write performance.
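For a rough sense of the memory math behind the Int8 and 1-bit options above, the following sketch walks through per-vector storage for an assumed 768-dimensional embedding and shows an illustrative pymilvus index configuration. The index type, quantization parameters, and the collection and field names here are assumptions for illustration, not the confirmed Milvus 2.6 API.

```python
from pymilvus import MilvusClient

# Back-of-the-envelope per-vector memory for an assumed 768-dimensional embedding.
DIM = 768
float32_bytes = DIM * 4   # 3,072 bytes per vector, uncompressed
int8_bytes = DIM          #   768 bytes per vector as 8-bit integers (~4x smaller)
one_bit_bytes = DIM // 8  #    96 bytes per vector as 1-bit codes (codes alone;
                          #    overall index savings are smaller once graph links
                          #    and any refinement data are included)

# Illustrative index configuration: HNSW with scalar quantization.
# The index type "HNSW_SQ", the "sq_type" parameter, and the names "docs"
# and "embedding" are assumptions for this sketch.
client = MilvusClient(uri="http://localhost:19530")
index_params = client.prepare_index_params()
index_params.add_index(
    field_name="embedding",
    index_type="HNSW_SQ",
    metric_type="COSINE",
    params={"M": 16, "efConstruction": 200, "sq_type": "SQ8"},
)
client.create_index(collection_name="docs", index_params=index_params)
```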
Organizations migrating from OpenSearch to Milvus have reported up to 8x cost reductions while maintaining or improving performance for vector search workloads.
Operational Efficiencies: Less Infrastructure to Manage
Milvus 2.6 simplifies operations with its innovative diskless architecture:
- Streaming Node: A new dedicated component for real-time data ingestion built directly into the platform, eliminating the need for costly external message queues on the write path.
- CDC + BulkInsert: Combines Change Data Capture (CDC) with bulk data ingestion, simplifying data replication across instances in different geolocations.
- Storage v2 Format: Optimized for performance and future compatibility with data processing frameworks such as Apache Spark.
- APT/YUM Deployment: Native package support simplifies installation and upgrades, reducing operational overhead.
Developer Time: More Built-In Tools, Less Plumbing
Milvus 2.6 boosts developer productivity with:
- Data-In, Data-Out: Enables direct ingestion of raw content (text, images, audio) with built-in inference capabilities, eliminating separate pre-processing pipelines (a brief sketch follows this list).
- Custom Reranker: Allows developers to apply custom scoring logic using scalar fields and user-defined functions.
- Text & JSON Search: Native support for advanced text processing capabilities, including:
  - Advanced tokenization for Asian languages (Japanese/Korean)
  - JSON path/flat/key indexing
  - Match & phrase queries (illustrated below)
- Sampling + Aggregation Queries: Built-in analytics tools for fast iteration during development.
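As a rough illustration of the Data-In, Data-Out idea, the sketch below attaches an embedding function to a collection schema so that raw text can be inserted and vectorized inside Milvus rather than in a separate pipeline. The function type, provider parameters, and field names are assumptions based on Milvus's embedding-function feature and may differ from the shipped 2.6 API.

```python
from pymilvus import DataType, Function, FunctionType, MilvusClient

# Schema with a raw-text field and a vector field populated by a built-in embedding function.
schema = MilvusClient.create_schema()
schema.add_field("id", DataType.INT64, is_primary=True, auto_id=True)
schema.add_field("document", DataType.VARCHAR, max_length=8192)
schema.add_field("embedding", DataType.FLOAT_VECTOR, dim=1536)

# Assumed function type and parameter keys; the provider and model are placeholders.
schema.add_function(Function(
    name="text_embedding",
    function_type=FunctionType.TEXTEMBEDDING,
    input_field_names=["document"],
    output_field_names=["embedding"],
    params={"provider": "openai", "model_name": "text-embedding-3-small"},
))
```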
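A similarly minimal sketch of the text-match capability, assuming a collection named "docs" whose "content" field has an analyzer enabled; TEXT_MATCH follows the filter syntax Milvus already documents, while the exact phrase-match syntax may differ.

```python
from pymilvus import MilvusClient

client = MilvusClient(uri="http://localhost:19530")

# Return rows whose analyzed "content" field matches the given terms.
hits = client.query(
    collection_name="docs",
    filter='TEXT_MATCH(content, "vector database")',
    output_fields=["id", "content"],
)
```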
Open Source & Trusted by Developers
Milvus is fully open source with permissive licensing (Apache 2.0) and developer-friendly architecture. With no black boxes, organizations can contribute, audit, and customize as needed—ensuring transparency and flexibility.
Already a Leader in Vector Search
Milvus has established itself as one of the most widely adopted vector databases in the world, powering AI applications at scale for more than 10,000 organizations across a wide range of use cases.
"The integration between Milvus 2.6 and MinIO AIStor creates a powerful solution for AI workloads requiring both performance and cost efficiency," said Anand Babu Periasamy, CEO and co-founder of MinIO. "By enabling data to be written directly to AIStor instead of using Kafka , Milvus 2.6 users can achieve significant infrastructure simplification, maintain performance for their vector search application, and support hot/cold tiered storage strategies."
For more information on Milvus 2.6 and its groundbreaking features, visit the Milvus website and read our blog post.
About Zilliz
Zilliz builds next-generation database technologies that help organizations unlock the value of unstructured data and rapidly develop AI and machine learning applications. By simplifying complex data infrastructure, Zilliz brings the power of AI within reach for enterprises, teams, and individual developers alike.
Headquartered in Redwood Shores, California, Zilliz is backed by leading investors including Aramco's Prosperity7 Ventures, Temasek's Pavilion Capital, Hillhouse Capital, 5Y Capital, Yunqi Partners, Trustbridge Partners, and others.
For more information about Milvus 2.6 or to download, visit https://milvus.io/.
Media Contact:
Chris Churillo, VP of Marketing
PR@zilliz.com
A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/9fdf77c1-1960-4c74-8088-57a78115c93e

