How to use BigQuery in ArcGIS

By: Minbin Jiang

Introduction

Google BigQuery is a data warehouse renowned for being serverless, highly scalable, and fast in operation. It offers its own data visualization and geospatial analysis tools and solutions, but a GIS developer is often less familiar with those tools and more at home on the ArcGIS platform.
ArcGIS provides many ways to do the job easily and consistently across platforms and devices; the prerequisite is that the feed-in data must be one of the ArcGIS Geoservices types. This article therefore describes how to use Koop to convert data from BigQuery into an ArcGIS service and consume it via ArcGIS Pro and the ArcGIS JavaScript API as an example.

What’s Koop

There might be many ways to turn BigQuery data into an ArcGIS service, but here we are going to look at one open source library named Koop, which was started by Esri itself.

As per Koop's official website, Koop is a JavaScript toolkit built on Node.js that translates GeoJSON into the GeoServices specification supported by ArcGIS products. Below is Koop's integration diagram, from which we can treat Koop as a middleware that converts feed-in data into GeoServices, as long as the feed-in data can be converted into GeoJSON format.

Koop

The attractive feature of Koop is its plugin mechanism: we can dynamically add a Provider to connect to the source data and translate it into GeoJSON, and pair the Provider with various Output plugins for Esri GeoServices, OGC, etc.

Koop already has some officially supported providers and outputs, which can be found on the website; unfortunately, BigQuery is not an official one yet. So we are going to develop a custom Koop provider that fetches BigQuery data and converts it into GeoJSON.

Preparation

  • Load data into BigQuery

  • Get service account key

  • Install the latest Koop CLI

    npm install -g @koopjs/cli@latest

After installing the CLI, we can use Koop commands to create Koop applications and plugins.

  • Create a Koop application

koop new app demo-app

The Koop core is created along with the app. We could also just use Koop as middleware in any sort of Node.js application, such as Express.
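
For context, here is a minimal sketch of what such an app boils down to, assuming the @koopjs/koop-core package and its register/server API; the CLI-generated files may look different in detail.

```javascript
// Minimal sketch: create a Koop instance, register a provider, and serve it.
// Assumes @koopjs/koop-core; the CLI-generated app wires this up for us.
const Koop = require('@koopjs/koop-core');
const provider = require('./provider-google-bigquery'); // the provider we create in the next section

const koop = new Koop();
koop.register(provider);   // plug the provider into the Koop core
koop.server.listen(8080);  // koop.server is an Express app, so it could also be
                           // mounted inside an existing Express application
```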

Create the provider

Once we have the Koop core, we can plug connectors in and out, namely providers and outputs.

In this case, we first create a provider to connect to BigQuery and fetch its data.

Create a provider named provider-google-bigquery with the CLI, from inside the demo-app folder:

koop new provider provider-google-bigquery

A folder with the given name is then created in the root folder of the app, containing the provider's files and structure.

Among these files, config/default.json is the configuration file for this provider, where we can put environment-specific variables, and model.js is the main file where we put our logic.
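
For illustration only, here is a hedged sketch of reading a value from config/default.json inside model.js, assuming the scaffold wires that file through the standard node config package (which the config/default.json layout suggests); the bigquery.dataset key is made up for this example.

```javascript
// Hypothetical config usage: the key names below are illustrative only.
// config/default.json might contain: { "bigquery": { "dataset": "bat_farms" } }
const config = require('config');

const dataset = config.has('bigquery.dataset')
  ? config.get('bigquery.dataset')
  : 'bat_farms'; // fall back to the dataset used later in this article
```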

After creating the provider, we need to register it with the Koop core:

koop add provider provider-google-bigquery

On success, in demo-app/plugin.js we can see that the provider has been added to the list of plugins:

const plugins = [providerGoogleBigquery];


Now we can start the server via

> koop serve

or just with the standard npm start command:
> npm start

In the console, we can see the generated API structure; remarkably, it follows the same REST structure as an ArcGIS MapServer and FeatureServer.

![](https://lh3.googleusercontent.com/pw/AM-JKLXNrYPmcfkvOlXzgKi2EDibd7JUYHsVMKjMr438Tq5z1dIgsy-Nj30ic_MaBnnX_EQP22pI4No8HuyO_IPX8QGnkm-uivgd6M6zyCaUBa4S1oNQaC5-yw2ZwKJw4DTa-uA9Wr83fv4AZ-7Q8ywxQ732tw=w814-h676-no?authuser=0)

Using Postman, we can test the API. Unfortunately, the version of the output service is 10.51, which is not the latest.

![](https://lh3.googleusercontent.com/pw/AM-JKLXHqKP9dVjGdmBZjeqMqbeeGJ_LI2g1S_A7jO0qDLMN0RwDr5W0ktZ9w_Y8tUCiImlMUnybQRPatf7WpX8bShDl-vLEMUGeRSp_FPLRdbq5RV5iFnRE-uh8JpLixuBVm4PA3eZdT_GsYlU6vIvbR6poJQ=w843-h748-no?authuser=0)

Up to now, the custom provider has been successfully plugged into the Koop core and connected with the default GeoServices output. Next, we are going to add the logic that retrieves data from BigQuery.

First, we need to install the BigQuery SDK for Node.js via
> npm install --save @google-cloud/bigquery@latest

Add a reference in **model.js**:
const { BigQuery } = require('@google-cloud/bigquery');
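
The BigQuery client also needs credentials. A minimal, hedged sketch of wiring in the service account key from the Preparation step follows; the project id and key path are placeholders, not values from this article.

```javascript
// Hypothetical credential wiring (placeholder project id and key path).
// Alternatively, set GOOGLE_APPLICATION_CREDENTIALS to the key file path
// and call new BigQuery() with no arguments, as getData does below.
const { BigQuery } = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'your-gcp-project-id',
  keyFilename: '/path/to/service-account-key.json'
});
```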

**model.js** is the template for extracting the source data and translating it into GeoJSON; the processing code lives inside the **getData** function:

```javascript
Model.prototype.getData = function (req, callback) {
}
```

The process can be roughly divided into the following steps; a sketch combining them inside getData follows the list:

  1. Get the parameters from the request URL
    Use the code below to capture the parameters of the request URL:
const { params: { id } } = req
  2. Make a request to BigQuery
async function query(id) {
  const bigquery = new BigQuery();
  const query = `SELECT * except(geometry), ST_X(geometry) as longitude, ST_Y(geometry) as latitude
      FROM \`bat-geo-dev.bat_farms.${id}\``;

  const options = {
    query: query
    // Location must match that of the dataset(s) referenced in the query.
  };

  // Run the query as a job
  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  // Wait for the query to finish
  const [rows] = await job.getQueryResults();

  // Print the number of returned rows
  console.log(`Rows:${rows.length}`);
  // rows.forEach(row => console.log(row));

  return rows;
}
  3. Convert the returned data into GeoJSON
function convertBigqueryResultsToGeoJSON(json) {
  const { bigqueryResults: records } = json;
  const features = records.map(convertBigQueryResultToGeoJSON);
  return {
    type: 'FeatureCollection',
    features
  };
}

function convertBigQueryResultToGeoJSON(record) {
  const { longitude, latitude, ...properties } = record;
  return {
    type: "Feature",
    properties,
    geometry: {
      type: "Point",
      coordinates: [
        Number(longitude),
        Number(latitude)
      ]
    }
  };
}
  4. Add metadata to the GeoJSON

This is optional, but highly recommended as it makes the returned data more descriptive.


// 4. add metadata to geojson
geojson.metadata = {
  geometryType: 'Point',
  description: 'google bigquery data'
}
geojson.metadata.idField = 'OBJECTID'
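
To tie the steps together, below is a rough sketch of how the pieces above could be assembled inside getData; the error handling and the exact call shapes are assumptions, not the article's final code.

```javascript
// Rough assembly of steps 1-4 inside getData (error handling is assumed).
Model.prototype.getData = async function (req, callback) {
  try {
    // 1. get the table id from the request URL
    const { params: { id } } = req;

    // 2. query BigQuery for the rows of that table
    const rows = await query(id);

    // 3. convert the rows into a GeoJSON FeatureCollection
    const geojson = convertBigqueryResultsToGeoJSON({ bigqueryResults: rows });

    // 4. add metadata to the GeoJSON
    geojson.metadata = {
      geometryType: 'Point',
      description: 'google bigquery data',
      idField: 'OBJECTID'
    };

    callback(null, geojson);
  } catch (error) {
    callback(error);
  }
};
```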

In Postman we can test it out with the exact syntax of the ArcGIS REST API.
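
For example, assuming a table named farms (the same table used for the vector tile layer later on) and the route prefix printed in the console, a layer query would look roughly like:

http://localhost:8080/koop-provider-google-bigquery/rest/services/farms/FeatureServer/0/query?where=1%3D1&outFields=*&f=json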

We can then use the powerful ArcGIS Pro to view the data layer and perform spatial analysis.

BigQuery into Vector Tiles

So far we have created a custom provider and used Koop's default output (GeoServices). BigQuery is naturally built for big data, so a Feature Layer might not be the best visualization option once the data exceeds a certain size.
It would therefore be really helpful to use the Feature Service for querying and data operations, and Vector Tiles for visualization.

Fortunately, Koop officially supports outputting GeoJSON in its vector tile specification, or rather the Mapbox specification, which Esri adopted.
Here are the steps:

  • Install the output

npm install --save @koopjs/output-vector-tiles

  • Register the output

koop add output @koopjs/output-vector-tiles

Now when we start the application, we will find that each registered provider exposes an extra VectorTileServer output route.

ArcGIS API to consume the service

We can use the ArcGIS JavaScript API to render and visualize the vector tiles:


// ArcGIS API for JavaScript 3.x AMD modules
require([
  "esri/map",
  "esri/layers/VectorTileLayer",
  "dojo/domReady!"
], function (Map, VectorTileLayer) {

  var map = new Map("map", {
    basemap: "topo",
    zoom: 13,
    center: [-118.1445, 34.1478]
  });

  var pointlayer = new VectorTileLayer("http://localhost:8080/koop-provider-google-bigquery/rest/services/farms/VectorTileServer");
  map.addLayer(pointlayer);
});

Summary

This article gave an example of how to use Koop to extend the reach of existing data. It showed me that this approach would be really helpful when building an enterprise GIS system that needs to integrate source data in various formats.