Starting a Cloud Dataflow job from Cloud Functions

How do I run a Cloud Dataflow job from a Google Cloud Function? I would like to use Google Cloud Functions as a mechanism to enable cross-service composition.

1 answer

I have included a very simple WordCount example below. Note that you will need to include a copy of the java binary in your Cloud Function deployment, as it is not present in the default environment. Similarly, you will need to package your pipeline jar with the Cloud Function deployment.

module.exports = {
  wordcount: function (context, data) {
    const spawn = require('child_process').spawn;
    // Launch the bundled JRE against the pipeline jar shipped with the function.
    const child = spawn(
            'jre1.8.0_73/bin/java',
            ['-cp',
             'MY_JAR.jar',
             'com.google.cloud.dataflow.examples.WordCount',
             '--jobName=fromACloudFunction',
             '--project=MY_PROJECT',
             '--runner=BlockingDataflowPipelineRunner',
             '--stagingLocation=gs://STAGING_LOCATION',
             '--inputFile=gs://dataflow-samples/shakespeare/*',
             '--output=gs://OUTPUT_LOCATION'
            ],
            { cwd: __dirname });

    child.stdout.on('data', function(data) {
      console.log('stdout: ' + data);
    });
    child.stderr.on('data', function(data) {
      console.log('stderr: ' + data);
    });
    // Report completion only after the java process exits, so the function
    // is not torn down while the job is still being submitted.
    child.on('close', function(code) {
      console.log('exit code: ' + code);
      if (code === 0) {
        context.success();
      } else {
        context.failure('java exited with code ' + code);
      }
    });
  }
}



Source: https://habr.com/ru/post/1628882/

