Google launches Java and Scala Procedures for BigQuery

Using stored procedures for Apache Spark with Java or Scala

Christianlauer
CodeX


Photo by Claudel Rheault on Unsplash

Some time ago, Google made BigQuery capable of running Apache Spark. Spark is an excellent framework for executing queries on vast data sets from various sources with high performance, relying on a distributed architecture and cluster computing. Stored procedures for Apache Spark written in Python were already supported[1].
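To give an idea of what that looks like: a Python-based Spark procedure is defined with SQL DDL that embeds the PySpark code inline. The following is only a minimal sketch; the project, dataset, connection, and table names are placeholders.

```sql
-- Minimal sketch of a Python stored procedure for Apache Spark.
-- Project, dataset, connection, and output table names are placeholders.
CREATE OR REPLACE PROCEDURE my_dataset.spark_wordcount()
WITH CONNECTION `my-project.us.my-spark-connection`
OPTIONS (engine="SPARK", runtime_version="1.1")
LANGUAGE PYTHON AS R"""
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()

# Read a public BigQuery table, aggregate, and write the result back.
words = (spark.read.format("bigquery")
         .option("table", "bigquery-public-data:samples.shakespeare")
         .load())
counts = words.groupBy("word").sum("word_count")
(counts.write.format("bigquery")
 .option("writeMethod", "direct")
 .save("my_dataset.wordcount_output"))
"""
```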

Once created, they can be run from SQL just like regular SQL stored procedures. A stored procedure in BigQuery is a collection of statements that can be called from other queries or from other stored procedures, and it can accept input arguments and return values as output.
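As a simple illustration (the dataset, table, and column names below are made up), a plain SQL stored procedure with an input and an output parameter could be defined like this:

```sql
-- Sketch: a SQL stored procedure with an IN and an OUT parameter.
CREATE OR REPLACE PROCEDURE my_dataset.customer_count(IN country STRING, OUT cnt INT64)
BEGIN
  SET cnt = (
    SELECT COUNT(*)
    FROM my_dataset.customers
    WHERE country_code = country
  );
END;
```

It is then invoked with CALL, passing a variable that receives the output value:

```sql
DECLARE result INT64;
CALL my_dataset.customer_count("DE", result);
SELECT result;
```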

Now, Google has announced that you can also create stored procedures for Apache Spark written in Java or Scala, and that the PySpark editor in the Google Cloud console now lets you set options when creating Python stored procedures for Apache Spark[2].
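For Java or Scala, the Spark application is compiled into a jar, uploaded to Cloud Storage, and referenced from the procedure definition. The sketch below shows roughly what this could look like; the connection, bucket, and class names are placeholders, so the official documentation should be consulted for the exact options.

```sql
-- Sketch of a stored procedure for Apache Spark written in Scala.
-- Connection, Cloud Storage URI, and main class are placeholders.
CREATE OR REPLACE PROCEDURE my_dataset.scala_wordcount()
WITH CONNECTION `my-project.us.my-spark-connection`
OPTIONS (
  engine="SPARK",
  runtime_version="1.1",
  main_class="com.example.WordCount",           -- entry point inside the jar
  jar_uris=["gs://my-bucket/wordcount-assembly.jar"]
)
LANGUAGE SCALA;

-- The procedure is then run like any other stored procedure:
CALL my_dataset.scala_wordcount();
```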
