Google launches Java and Scala Procedures for BigQuery
Using stored procedures for Apache Spark with Java or Scala
Some time ago, Google made BigQuery capable of using Apache Spark. Spark is an excellent framework for running queries over vast data sets from various sources at high speed, which it achieves through a distributed architecture and cluster computing. Stored procedures for Apache Spark written in Python were already supported[1].
Once created, these procedures can be invoked with plain SQL, much like ordinary SQL stored procedures. A stored procedure in BigQuery is a collection of statements that can be called from other queries or stored procedures; a procedure can accept input arguments and return values as output.
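Invoking such a procedure is then a single SQL statement. A minimal sketch, assuming a hypothetical dataset `my_dataset` and procedure `spark_proc` (names and arguments are placeholders, not from the announcement):

```sql
-- Hypothetical names; the argument list depends on the procedure's signature.
CALL my_dataset.spark_proc();
```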
Now, Google has announced that you can also create stored procedures for Apache Spark using Java or Scala. In addition, the PySpark editor in the Google Cloud console now lets you add options when creating Python stored procedures for Apache Spark[2].
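For Java or Scala, the procedure points at a compiled JAR on Cloud Storage rather than inline code. The following is only a sketch, assuming a hypothetical Spark connection `my-project.us.my-connection`, main class `com.example.WordCount`, and JAR at `gs://my-bucket/wordcount.jar`; consult the BigQuery documentation for the exact DDL options your runtime supports:

```sql
-- All identifiers below are placeholders, not real resources.
CREATE OR REPLACE PROCEDURE my_dataset.scala_wordcount()
WITH CONNECTION `my-project.us.my-connection`   -- BigQuery connection configured for Spark
OPTIONS (
  engine = 'SPARK',
  main_class = 'com.example.WordCount',         -- entry point inside the JAR
  jar_uris = ['gs://my-bucket/wordcount.jar']   -- compiled Scala/Java code on Cloud Storage
)
LANGUAGE SCALA;
```

Running the procedure afterwards works the same way as for the Python variant, via a SQL `CALL`.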