News

After 10 months in preview, Delta Live Tables (DLT) is now ready for prime time on production workloads. According to Databricks, the new service will enable data engineers and analysts to easily create batch and ...
Notably, the new Lakeflow Declarative Pipelines capabilities allow data engineers to build end-to-end production pipelines in SQL or Python without having to manage infrastructure.
The second part is Lakeflow Pipelines, which is essentially a version of Databricks’ existing Delta Live Tables framework for implementing data transformation and ETL in either SQL or Python.
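As a rough illustration of what such a declarative pipeline looks like, here is a minimal sketch using DLT's existing Python decorator API. The table names, storage path, and data-quality rule are hypothetical; the `spark` session and the `dlt` module are supplied by the Databricks pipeline runtime.

    import dlt
    from pyspark.sql.functions import col

    # `spark` is provided by the Databricks pipeline runtime.

    @dlt.table(comment="Raw orders landed from cloud storage (hypothetical path).")
    def raw_orders():
        return spark.read.format("json").load("/mnt/landing/orders/")

    @dlt.table(comment="Cleaned orders ready for downstream consumers.")
    @dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative data-quality rule
    def clean_orders():
        return dlt.read("raw_orders").select(
            col("order_id"), col("customer_id"), col("amount")
        )

From these definitions DLT infers the dependency graph (clean_orders reads raw_orders) and handles orchestration and the underlying infrastructure itself, which is the "without having to manage infrastructure" claim above.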
Join us for an exclusive webinar demonstrating how to easily build robust data pipelines on the Databricks Data Intelligence Platform with Prophecy. We'll equip you with the ...
Prophecy has launched an integration for Databricks that will allow users of the lakehouse to build data pipelines more easily.
Data + AI Summit -- Databricks, the Data and AI company, today announced it is open-sourcing its core declarative ETL framework as Apache Spark™ Declarative Pipelines. This initiative ...
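The framework being open-sourced is the core of DLT, so the declarative model is the same one sketched above. For the streaming side of that model, here is a minimal sketch in DLT's current Python syntax; the finalized Apache Spark API may differ in module and decorator names, and the source path and table name are hypothetical.

    import dlt

    # `spark` is provided by the pipeline runtime; the path is hypothetical.

    @dlt.table(comment="Events ingested incrementally as a streaming table.")
    def events_stream():
        # Auto Loader (cloudFiles) picks up new files as they arrive.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/events/")
        )

Because the table definition is declarative, the same pipeline code covers batch and streaming execution; the engine decides how to materialize and incrementally refresh the result.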
"Getting high-quality data to the right places accelerates the path to building intelligent applications," said Ali Ghodsi, Co-founder and CEO at Databricks. "Lakeflow Designer makes it possible for ...