Dataflow job builder now supports external Iceberg REST Catalogs as a source
## Feature
Dataflow job builder now supports external Apache Iceberg REST catalogs (IRC) as a source. You can ingest data from an external IRC directly into Google Cloud Lakehouse tables using Dataflow's job builder UI, without writing code. For more information, see [Import data from external Iceberg catalogs to Lakehouse using Dataflow](https://cloud.google.com/dataflow/docs/guides/iceberg-df-lakehouse-integration).
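The job builder produces an Apache Beam YAML pipeline under the hood. As a rough sketch only, a read from an external IRC might use Beam YAML's `ReadFromIceberg` transform along these lines; the catalog name, URI, warehouse, and table identifier below are illustrative placeholders, not values from this note:

```yaml
pipeline:
  transforms:
    - type: ReadFromIceberg
      config:
        # Fully qualified table to ingest (placeholder).
        table: "db.orders"
        # Logical name for the external REST catalog (placeholder).
        catalog_name: "external_irc"
        catalog_properties:
          type: "rest"
          # Endpoint of the external Iceberg REST catalog (placeholder).
          uri: "https://irc.example.com/catalog"
          warehouse: "gs://example-warehouse"
```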
## Feature
You can now add existing Apache Parquet files from cloud-based storage, for example, Cloud Storage or Amazon S3, to an Apache Iceberg table in Google Cloud Lakehouse using the Dataflow job builder. This process registers the files without moving or rewriting the underlying data. For more information, see [Import Parquet files from storage to Lakehouse using Dataflow](https://cloud.google.com/dataflow/docs/guides/parquet-df-lakehouse-integration).
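The "register without moving or rewriting" behavior described above mirrors Iceberg's standard file-registration operation. As an illustration of the concept only (not the Dataflow implementation), PyIceberg exposes the same idea through `Table.add_files`; the catalog name, table identifier, and file paths below are hypothetical:

```python
from pyiceberg.catalog import load_catalog

# Connect to an Iceberg catalog; the name "lakehouse" and its
# connection properties are placeholders, not values from this note.
catalog = load_catalog("lakehouse")
table = catalog.load_table("db.events")

# Register existing Parquet files as-is: only table metadata is
# written; the underlying Parquet data is neither moved nor rewritten.
table.add_files(file_paths=[
    "gs://example-bucket/events/part-0000.parquet",
    "gs://example-bucket/events/part-0001.parquet",
])
```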