Qlik Expands Agentic Data Engineering to Accelerate AI-Ready Data Delivery

Qlik has announced a major expansion of its agentic execution strategy, extending it into data engineering to help organizations build, manage, and deliver trusted data more efficiently. The new capabilities aim to reduce manual effort in pipeline development and ensure faster access to reliable, AI-ready data across enterprise environments.

The company’s latest release reflects a growing pressure on data teams, who are increasingly tasked with supporting AI initiatives while maintaining speed, reliability, and cost efficiency. Much of the challenge stems from repetitive engineering work—building pipelines, maintaining transformations, and troubleshooting data flows—that slows down delivery and limits scalability.

Qlik’s updated approach introduces agentic capabilities directly into engineering workflows, allowing teams to translate intent into functioning data assets while preserving control and governance required in production systems.

“Most companies do not struggle to imagine AI use cases. They struggle to deliver the trusted, current data those use cases depend on,” said Mike Capone, CEO of Qlik. “As demand rises, data engineering becomes the critical path. Qlik is helping teams reduce friction, protect trust, and keep pace with the business.”

The release includes several key enhancements. Declarative pipelines offer a more intuitive, guided way to construct data flows, reducing the complexity of building and evolving pipelines over time. An AI Assistant for Talend Studio, which can help create jobs, write SQL, and generate documentation, is expected to become available soon.

Qlik has also extended its real-time routing functionality to support agentic workflows, making it easier to connect large language models and retrieval-augmented generation pipelines. In addition, the company has recently integrated streaming data into its Open Lakehouse, bringing batch, change data capture (CDC), and real-time event handling together in one place.

Taken together, these capabilities aim to shift data engineering from a process-oriented, labor-intensive effort to an intent-driven, AI-powered activity.

“There is a big difference between an assistant that helps write code and a system that actually helps a data team move faster end to end,” said Robin Astle, Principal Developer, Valpak. “The interesting part of this announcement is the focus on pipeline creation, data quality, metadata, and stewardship together, because that is much closer to how real engineering work happens.”