Databricks rolls out no-code data prep

Databricks unveils Lakeflow Designer, a no-code, AI-native tool for visual data preparation, aiming to empower business users and streamline data workflows on its platform.

[Image: Lakeflow Designer's visual canvas for data preparation workflows, showing a pipeline of nodes and connections.]

Databricks is bringing a visual, no-code approach to data preparation with the public preview of Lakeflow Designer. This new offering aims to democratize data access by allowing analysts and domain experts, not just data engineers, to prepare and analyze data through a drag-and-drop interface and natural language prompts.

The platform is built directly on Databricks and governed by Unity Catalog, so data stays in place while lineage and permissions are enforced from the outset. This integration promises a more streamlined and secure data workflow.

Lakeflow Designer leverages Genie Code, Databricks' AI capability for data transformation. Users can describe desired transformations in plain English, and the AI generates or modifies the workflow accordingly. Unlike external tools, Genie Code can draw on rich metadata from Unity Catalog, making its AI suggestions more context-aware.

Visualizing AI-driven transformations

Each step in Lakeflow Designer is represented as a visual operator, offering users a clear, step-by-step preview of how data changes. This transparency aims to make AI-generated transformations easier to review and trust.

The tool generates production-ready Python code behind every visual transformation. This code can be reviewed, version-controlled, and integrated into existing production pipelines, reducing the need for IT to rebuild work done in separate systems.
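The article does not show the generated code itself, but as an illustration only, a simple visual operator such as "standardize and filter" might compile down to a reviewable, version-controllable function along these lines (the function, column names, and sample data below are invented for the sketch):

```python
# Hypothetical sketch: the sort of plain, reviewable Python a visual
# "standardize and filter" step might emit. All names are invented.

def clean_orders(rows):
    """Normalize column names, drop rows missing an order id,
    and cast amounts to float."""
    cleaned = []
    for row in rows:
        # normalize header casing and whitespace
        row = {key.strip().lower(): value for key, value in row.items()}
        if not row.get("order_id"):
            continue  # skip incomplete records
        row["amount"] = float(row["amount"])
        cleaned.append(row)
    return cleaned

sample = [
    {"Order_ID": "A1", "Amount": "19.99"},
    {"Order_ID": "", "Amount": "5.00"},  # dropped: missing order id
]
print(clean_orders(sample))  # [{'order_id': 'A1', 'amount': 19.99}]
```

Because the output is ordinary Python, it can sit in a Git repository and run inside an existing pipeline, which is the point the article makes about avoiding IT rebuilds.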

A key differentiator is the absence of per-user licensing. Databricks charges only for compute, removing a common barrier to adoption seen in other low-code data preparation tools.

Broad adoption potential

Databricks highlights that hundreds of teams are already using Lakeflow Designer for tasks ranging from cleaning client data in consulting, to regulatory reporting in financial services, to operational analytics for business teams.

The company sees it as a crucial component for scaling data engineering beyond core technical teams, enabling business users to prototype and iterate on data pipelines with greater autonomy.

Getting started is straightforward for existing Databricks users. The feature is available in all workspaces via the '+ New' button, though administrators may need to enable it.

The public preview of Lakeflow Designer is now available.

© 2026 StartupHub.ai. All rights reserved. Do not enter, scrape, copy, reproduce, or republish this article in whole or in part. Use as input to AI training, fine-tuning, retrieval-augmented generation, or any machine-learning system is prohibited without written license. Substantially-similar derivative works will be pursued to the fullest extent of applicable copyright, database, and computer-misuse laws. See our terms.