
Google goes after real-time data ‘the cloud way’

Google has made Cloud Dataflow publicly available in beta, and has also unveiled new features for its BigQuery cloud analytics service.

The additions to BigQuery introduce row-level permissions, designed to give businesses more flexibility over data access, and raise the default ingestion limit to 100,000 rows per second, per table.
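On the client side, staying under a per-table rate limit like this typically means batching rows and throttling inserts. The sketch below is purely illustrative (the `StreamingInserter` class is hypothetical and not part of any Google client library); a real client would send each batch as an API request rather than appending it to a list:

```python
import time

# Hypothetical helper: batches rows and throttles streaming inserts so a
# single table stays under a default limit of 100,000 rows per second.
# Illustrative only -- not part of any Google client library.
class StreamingInserter:
    def __init__(self, max_rows_per_second=100_000, batch_size=500):
        self.max_rows_per_second = max_rows_per_second
        self.batch_size = batch_size
        self._window_start = time.monotonic()
        self._rows_in_window = 0
        self.batches = []  # stand-in for real streaming-insert calls

    def insert(self, rows):
        # Split the input into batches and throttle each one.
        for i in range(0, len(rows), self.batch_size):
            batch = rows[i:i + self.batch_size]
            self._throttle(len(batch))
            self.batches.append(batch)  # a real client would POST this batch

    def _throttle(self, n):
        # Reset the one-second window, or sleep if the limit would be exceeded.
        now = time.monotonic()
        if now - self._window_start >= 1.0:
            self._window_start, self._rows_in_window = now, 0
        if self._rows_in_window + n > self.max_rows_per_second:
            time.sleep(max(0.0, 1.0 - (now - self._window_start)))
            self._window_start, self._rows_in_window = time.monotonic(), 0
        self._rows_in_window += n

inserter = StreamingInserter(batch_size=2)
inserter.insert([{"id": i} for i in range(5)])
```

With a batch size of two, the five rows above are sent as three batches, each counted against the current one-second window.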

Another notable addition is geographic data isolation, for businesses that want their data stored in Google Cloud Platform's European zones.

Cloud Dataflow lets firms use its SDKs to write software that defines batch or streaming data-processing jobs.
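The core idea of the Dataflow model is that a job is defined once as a chain of transforms, and the same definition can then be executed over a finite batch or an unbounded stream. The plain-Python sketch below is purely illustrative of that idea; the `Pipeline` class and its methods are hypothetical, not the actual Dataflow SDK:

```python
# A toy, SDK-free illustration of the Dataflow idea: a job is a chain of
# transforms, and the same definition can consume a finite batch or an
# unbounded stream. All names here are hypothetical.
class Pipeline:
    def __init__(self):
        self.transforms = []

    def apply(self, fn):
        # Append a transform and return self to allow chaining.
        self.transforms.append(fn)
        return self

    def run(self, source):
        # 'source' may be a list (batch) or a generator (stream);
        # the pipeline definition itself does not change.
        for element in source:
            for fn in self.transforms:
                element = fn(element)
            yield element

# Define the job once...
p = Pipeline().apply(str.strip).apply(str.upper)

# ...then run it over a batch:
batch_result = list(p.run([" a ", " b "]))

# ...or over a (simulated) unbounded stream:
def stream():
    yield from [" c ", " d "]

stream_result = list(p.run(stream()))
```

The point of the sketch is the separation of concerns: the pipeline describes *what* to compute, while the source decides whether execution is batch or streaming.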


William Vambenepe, product manager, wrote on the Google blog: "Google Cloud Pub/Sub is designed to provide scalable, reliable and fast event delivery as a fully managed service."

"Along with BigQuery streaming ingestion and Cloud Dataflow stream processing, it completes the platform’s end-to-end support for low-latency data processing. Whether you’re processing customer actions, application logs or IoT events, Google Cloud Platform allows you to handle them in real time, the cloud way."

This article is from the CBROnline archive: some formatting and images may not be present.