Deploying multiple Google Cloud Functions from the same repo

The documentation for Google Cloud Functions is a bit vague. I understand how to deploy a single function contained in index.js, even one in a specific directory, but how do I deploy several Cloud Functions that live in the same repository?

AWS Lambda lets you point at a specific file and handler function:

  /my/path/my-file.myHandler

Lambda also lets you deploy a zip file containing only the files needed at runtime, omitting optional transitive npm dependencies and their resources. For some libraries (such as the Oracle DB driver), including node_modules/** significantly increases deployment time and can exceed storage limits (this is already possible on AWS Lambda).
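A rough sketch of that Lambda packaging step, with hypothetical file and function names, might look like:

  $ cd /my/path
  $ zip -r function.zip my-file.js lib/      # pack only what the handler needs, leaving out node_modules
  $ aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip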

The best I have found when deploying a Google Cloud Function is:

$ gcloud alpha functions deploy my-function \
    --trigger-http \
    --source-url https://github.com/user-name/my-repo.git \
    --source-branch master \
    --source-path lib/foo/bar \
    --entry-point myHandler

... but as I understand it, this deploys lib/foo/bar/index.js, which contains function myHandler(req, res) {} ... along with all of its dependencies bundled into one package? That doesn't quite make sense to me; as I said, the documentation is a bit vague.

1 answer

The current deployment tool takes a simplistic approach: it zips up the directory and uploads it. This means you (currently) have to move or delete node_modules before running the command if you don't want it included in the deployment package. Note that, like Lambda, GCF resolves dependencies automatically.
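A minimal sketch of that workaround, assuming the function source is in the current directory (depending on your gcloud version, additional flags such as a staging bucket may also be required):

  $ mv node_modules /tmp/node_modules      # keep dependencies out of the uploaded zip
  $ gcloud alpha functions deploy my-function --trigger-http --entry-point myHandler
  $ mv /tmp/node_modules node_modules      # restore them for local development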

Regarding deployment, see: gcloud alpha functions deploy --help

In particular:

--entry-point=ENTRY_POINT
         The name of the function (as defined in source code) that will be
         executed.
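So a single source directory can back several functions: deploy the same path more than once, each time with a different --entry-point. A sketch, assuming lib/foo/bar/index.js exports both myHandler and otherHandler (the second name is hypothetical):

  $ gcloud alpha functions deploy my-function \
      --trigger-http \
      --source-url https://github.com/user-name/my-repo.git \
      --source-branch master \
      --source-path lib/foo/bar \
      --entry-point myHandler

  $ gcloud alpha functions deploy my-other-function \
      --trigger-http \
      --source-url https://github.com/user-name/my-repo.git \
      --source-branch master \
      --source-path lib/foo/bar \
      --entry-point otherHandler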

With the --source-url options you can also deploy straight from a repository, sans upload; Google fetches the script from the repo itself.


Source: https://habr.com/ru/post/1664037/
