
Google Cloud Platform – Understanding Functions as a Service (FaaS)


Serverless is much more than a buzzword, and in this article you'll see how Google Cloud Platform makes it work for its users. From its earliest days, Google Cloud has offered products built around a simple principle: you give Google your code, and Google does the rest. There are no servers to manage and no clusters to set up, provision, or upgrade. The infrastructure auto-scales to meet demand, and you are billed only for the resources you consume. For developers building web apps, mobile APIs, or back-end services, this means you can focus on the business logic of your application. GCP offers a number of products to help you accomplish exactly what you need.

Functions as a Service (FaaS) is a popular approach to serverless computing. Developers write a piece of code (a function) that is triggered by an event or an incoming HTTP request. The function can talk to a database and to other services before it emits another event or simply sends a response back.
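For example, a minimal HTTP-triggered function in the Cloud Functions Python runtime could look like the sketch below (the function name and greeting are just placeholders; the file is deployed together with a requirements.txt listing any dependencies):

# main.py
def handle_request(request):
    """HTTP-triggered Cloud Function; `request` is a Flask Request object."""
    name = request.args.get("name", "World")
    return f"Hello, {name}!", 200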

Google Cloud Functions supports multiple programming languages, such as Python, Node.js, Go, and Java, with more to come, and these can be used with a large set of libraries and frameworks. Simply deploy your function along with its list of dependencies, which will be resolved directly in the cloud, and configure the event that will trigger its execution.

This can be an HTTP event, but also a file upload to a Cloud Storage bucket, data changing in a database, a message posted to a Pub/Sub topic, or several other events. Functions can be assigned deploy-time environment variables, can be deployed to one or more regions from a single project, and support up to several gigabytes of memory. They can also be configured with specific security constraints, such as least-privilege access with a per-function identity, and you can even use the Compute metadata server to fetch identity tokens at runtime. Cloud Functions are billed by the number of invocations, the compute time, and the outgoing network traffic (egress), with the first 2 million invocations free every month.
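As an illustration, a background function that reacts to a file upload could look roughly like this (first-generation Python event signature; bucket and function names are placeholders):

def on_upload(event, context):
    """Triggered when an object is finalized in a Cloud Storage bucket."""
    # `event` carries the object metadata; `context` carries the event ID and type.
    print(f"New file {event['name']} uploaded to bucket {event['bucket']}")

Deploying the function with a Cloud Storage trigger wires the bucket's object-finalize events to it.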

Cloud Functions are probably the easiest way to tap into the many powerful GCP services, from machine learning APIs to storage and processing solutions, and developers have used them to implement everything from glue code to fully fledged microservices-based applications. Now, as a developer, you may want even more freedom in the languages and frameworks you use. And, most importantly, you may rather hand over a carefully crafted Docker container image instead of source code. Cloud Run offers a true serverless experience if a stateless HTTP container image is what you'd like to use. Simply build the image, upload it to Container Registry, and create a Cloud Run service from that container. The below command can be used to do so:

$ gcloud run deploy --image gcr.io/project/image

Or, you can deploy the same image from the Cloud Console.

At this stage, your app is already deployed and running in the cloud, and you can forget about provisioning or managing servers; Cloud Run does that for you. It automatically and quickly scales up and down based on incoming traffic, and even scales down to zero. You pay only for the resources your app uses, billed to the nearest 100 milliseconds of compute time. Because Cloud Run is built on Knative, you can also use it with your own Kubernetes Engine cluster: with Cloud Run on GKE, you can build and deploy apps to your own Kubernetes cluster with the same easy experience and benefits. Whether running on GKE or not, Cloud Run supports deploying multiple services in a single GCP project, either in multiple regions or in specific namespaces when running in a GKE cluster. Each service exposes a unique endpoint. Each deployment of a service creates a revision, and requests are automatically routed as soon as possible to the latest healthy revision. Finally, each revision receiving requests is automatically scaled to the number of container instances needed to handle them.
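Inside the image, the application only needs to be a stateless HTTP server that listens on the port Cloud Run provides. A rough Python sketch, assuming Flask is installed in the image (routes and messages are placeholders):

# app.py -- a stateless HTTP server that respects Cloud Run's PORT contract
import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from Cloud Run!"

if __name__ == "__main__":
    # Cloud Run injects the port to listen on via the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))

A simple Dockerfile based on a Python base image, installing Flask and running app.py, is enough to produce the container image deployed above.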

Note that a given container instance can receive multiple requests at the same time. With the concurrency setting, you can set the maximum number of requests that can be sent in parallel to a given container instance. 

Cloud Run essentially gives you the flexibility of modern container-based development with all the benefits of a fully serverless environment, auto-scaling to your application’s needs.

If you are building an entire web application with multiple services and would still like to deploy source code while preserving all the serverless benefits, you should check out Google App Engine. App Engine is where you'll find Google Cloud Platform's serverless roots. It is a managed platform that lets you pick the language you're most comfortable with. Simply run the below command and App Engine will take care of uploading your code and running it on Google's infrastructure.

$ gcloud app deploy
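For reference, a minimal App Engine standard service in Python is little more than the sketch below, plus an app.yaml declaring the runtime and a requirements.txt listing Flask (file names follow App Engine's defaults; the route is a placeholder):

# main.py -- served by App Engine's default entrypoint
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    return "Hello from App Engine!"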

The second-generation App Engine runtimes, available for Java, Node.js, PHP, Go, and Python, offer a truly idiomatic experience: you can use any language library and any framework, with read/write file system access and with isolation provided by gVisor, a powerful open-source sandboxing technology. App Engine applications can be built from multiple services, each of which can use a different language and can be scaled independently. Each service can have multiple versions, all active at the same time, and it is trivial to set up staged roll-outs or A/B testing across these versions with traffic splitting.

App Engine takes out-of-the-box tooling one step further with full application performance management tools, such as live debugging of production apps, tracing of requests flowing across your system, and even CPU and heap profiling of your app.

Google App Engine is a mature serverless platform. Modern, more advanced applications can be written with a combination of Cloud Functions, App Engine, and Cloud Run. Both Cloud Pub/Sub and Cloud Tasks are popular and elegant solutions for integrating these different parts, or even simply for chaining multiple functions together.

Cloud Pub/Sub is a simple, reliable, and scalable event system, offering many-to-many asynchronous messaging that nicely decouples senders and receivers. It supports at-least-once delivery and is global by default, with no infrastructure to manage. Just publish and consume messages, at up to hundreds of millions of messages per second.
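Publishing to a topic from any of the services above takes only a few lines with the Pub/Sub client library. A rough Python sketch (project, topic, and payload are placeholders):

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "orders")

# Payloads must be bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b"order created", order_id="42")
print(f"Published message {future.result()}")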

Cloud Tasks, on the other hand, offers a dispatch system to manage the execution of large numbers of distributed tasks. It is ideal for one-to-one asynchronous messaging and comes with rate limit controls. 
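Creating a task that calls an HTTP endpoint, such as a Cloud Run service, might look roughly like this in Python (project, location, queue, and target URL are placeholders):

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

# The task carries an HTTP request that Cloud Tasks will dispatch,
# subject to the queue's rate and retry configuration.
task = tasks_v2.Task(
    http_request=tasks_v2.HttpRequest(
        http_method=tasks_v2.HttpMethod.POST,
        url="https://my-service-abc123-uc.a.run.app/process",
        headers={"Content-Type": "application/json"},
        body=b'{"order_id": 42}',
    )
)

response = client.create_task(parent=parent, task=task)
print(f"Created task {response.name}")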

Cloud Scheduler is a fully managed cron job service, enabling you to schedule tasks that are invoked through HTTPS endpoints, Cloud Pub/Sub topics, or App Engine applications.
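A job can be created from the console, with gcloud, or programmatically; as a sketch with the Cloud Scheduler Python client library (assuming the google-cloud-scheduler package; project, location, schedule, and target URL are placeholders):

from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = "projects/my-project/locations/us-central1"

job = scheduler_v1.Job(
    name=f"{parent}/jobs/nightly-report",
    schedule="0 2 * * *",  # standard cron syntax: every day at 02:00
    time_zone="Etc/UTC",
    http_target=scheduler_v1.HttpTarget(
        uri="https://my-service-abc123-uc.a.run.app/report",
        http_method=scheduler_v1.HttpMethod.POST,
    ),
)

response = client.create_job(parent=parent, job=job)
print(f"Created job {response.name}")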

All of the solutions briefly discussed here are fully managed and monitored, with logging and error reporting built in.


