
Couchbase Announces New Features to Accelerate AI-Powered Adaptive Applications for Customers


Couchbase, Inc., the cloud database platform company, today introduced vector search as a new feature in Couchbase Capella Database-as-a-Service (DBaaS) and Couchbase Server to help businesses bring to market a new class of AI-powered adaptive applications that engage users in a hyper-personalized and contextualized way. Couchbase is the first database platform to announce it will offer vector search optimized for running on site, across clouds, and on mobile and IoT devices at the edge, paving the way for organizations to run adaptive applications anywhere.

“Adding vector search to our platform is the next step in enabling our customers to build a new wave of adaptive applications, and our ability to bring vector search from cloud to edge is game-changing,” said Scott Anderson, SVP of product management and business operations at Couchbase. “Couchbase is seizing this moment, bringing together vector search and real-time data analysis on the same platform. Our approach provides customers a safe, fast and simplified database architecture that’s multipurpose, real time and ready for AI.”


Vector Search and the Rise of Adaptive Applications

Businesses are racing to build hyper-personalized, high-performing and adaptive applications powered by generative AI that deliver exceptional experiences to their end users. Common use cases include chatbots, recommendation systems and semantic search. For example, a customer shopping for shoes to complement a particular outfit can narrow their online product search by uploading a photo of the outfit to a mobile application, along with a brand name, customer rating, price range and availability in a specific geographic area. This single interaction with an adaptive application involves a hybrid search spanning vectors, text, numerical ranges, an operational inventory query and geospatial matching.
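To make the shape of such a hybrid query concrete, the sketch below ranks a tiny in-memory catalog by cosine similarity to an uploaded photo's embedding while filtering on brand, rating, price range and distance. It is a framework-agnostic illustration, not Couchbase's query API; the embedding values, field names and the haversine helper are illustrative assumptions.

```python
import math

# Toy catalog: each product carries a (pretend) image embedding plus metadata.
# In a real system the embeddings would come from an image model and live in the database.
CATALOG = [
    {"name": "Loafer A", "brand": "Acme", "rating": 4.6, "price": 89.0,
     "lat": 40.74, "lon": -73.99, "embedding": [0.12, 0.80, 0.58]},
    {"name": "Sneaker B", "brand": "Acme", "rating": 4.1, "price": 65.0,
     "lat": 40.71, "lon": -74.01, "embedding": [0.90, 0.10, 0.42]},
    {"name": "Heel C", "brand": "Other", "rating": 4.8, "price": 120.0,
     "lat": 41.00, "lon": -73.50, "embedding": [0.15, 0.78, 0.60]},
]

def cosine(a, b):
    # Vector similarity between the query embedding and a product embedding.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance for the "available near me" geospatial predicate.
    radius = 6371.0
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlon / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

def hybrid_search(query_vec, brand, min_rating, price_range, near, radius_km):
    """Combine vector ranking with text, numeric-range and geospatial filters."""
    lo, hi = price_range
    hits = []
    for item in CATALOG:
        if item["brand"] != brand or item["rating"] < min_rating:
            continue  # text and numeric predicates
        if not (lo <= item["price"] <= hi):
            continue  # price-range predicate
        if haversine_km(near[0], near[1], item["lat"], item["lon"]) > radius_km:
            continue  # geospatial predicate
        hits.append((cosine(query_vec, item["embedding"]), item["name"]))
    return sorted(hits, reverse=True)  # most similar first

# The uploaded outfit photo stands in for an embedding produced by an image model.
outfit_vec = [0.14, 0.79, 0.59]
print(hybrid_search(outfit_vec, brand="Acme", min_rating=4.5,
                    price_range=(50, 100), near=(40.73, -74.00), radius_km=25))
```

In a database-backed version, the vector ranking and the metadata predicates would be pushed down into a single hybrid query rather than evaluated in application code.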

As more organizations build intelligence into applications that converse with large language models (LLMs), semantic search capabilities powered by vector search — and augmented by retrieval-augmented generation (RAG) — are critical to taming hallucinations and improving response accuracy. While vector-only databases aim to solve the challenges of processing and storing data for LLMs, having multiple standalone solutions adds complexity to the enterprise IT stack and slows application performance. Couchbase’s multipurpose capabilities eliminate that friction and deliver a simplified architecture to improve the accuracy of LLM results.
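The retrieval-augmented generation loop referenced above can be outlined in a few steps: embed the user's question, retrieve the most similar documents from a vector index, and pass them to the LLM as grounding context. The following is a minimal sketch under that assumption; embed, search and complete are hypothetical stand-ins for an embedding model, a vector search call against the database, and an LLM client, wired to stubs only so the example runs on its own.

```python
from typing import Callable, List, Sequence

def retrieve_and_generate(
    question: str,
    embed: Callable[[str], Sequence[float]],              # embedding model (placeholder)
    search: Callable[[Sequence[float], int], List[str]],  # vector search over stored documents (placeholder)
    complete: Callable[[str], str],                       # LLM completion client (placeholder)
    k: int = 4,
) -> str:
    """Minimal RAG loop: embed the question, retrieve k similar documents,
    then ask the LLM to answer using only the retrieved context."""
    query_vec = embed(question)
    context_docs = search(query_vec, k)
    prompt = (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        "Context:\n" + "\n---\n".join(context_docs)
        + f"\n\nQuestion: {question}\nAnswer:"
    )
    return complete(prompt)

# Stub wiring so the sketch runs on its own; a real application would call an
# embedding model, a vector search index and an LLM API here instead.
if __name__ == "__main__":
    docs = ["Return policy: 30 days with receipt.",
            "Standard shipping takes 3-5 business days."]
    fake_embed = lambda text: [float(len(text))]
    fake_search = lambda vec, k: docs[:k]
    fake_llm = lambda prompt: "(model answer grounded in the retrieved context)"
    print(retrieve_and_generate("What is the return policy?",
                                fake_embed, fake_search, fake_llm))
```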

Couchbase’s recent announcement of its columnar service, together with vector search, provides customers with a unique approach that delivers cost-efficiency and reduced complexity. By consolidating workloads on one cloud database platform, Couchbase makes it easier for development teams to build trustworthy, adaptive applications that run wherever they are needed. With vector search as a feature across all Couchbase products, customers gain:

  • Similarity and hybrid search, combining text, vector, range and geospatial search capabilities in a single query.
  • RAG to make AI-powered applications more accurate, safe and timely.
  • Enhanced performance, because all search patterns can be served from a single index, lowering response latency.

SOURCE: PRNewswire