
Google Cloud Platform – The Hello World of Cloud Computing


The Hello World of the cloud is to provision, start, and connect to a virtual machine. In this article, we will go through the Hello World of cloud computing on Google Cloud Platform.

Google Compute Engine enables you to create single VM instances from scratch by specifying a region, a machine type, an OS image, and other optional parameters. 
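For readers who prefer doing this programmatically instead of through the console, here is a hedged sketch using the google-cloud-compute Python client library; the project ID, zone, machine type, image family, and disk size are placeholder choices for illustration, not values prescribed by this article.

# A sketch of creating a VM with google-cloud-compute (pip install google-cloud-compute).
# Project ID, zone, machine type, image family, and disk size are placeholders.
from google.cloud import compute_v1

def create_instance(project_id: str, zone: str, instance_name: str) -> None:
    instance_client = compute_v1.InstancesClient()

    # Boot disk built from a public Debian image family.
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12",
            disk_size_gb=10,
        ),
    )

    # Attach the instance to the default VPC network.
    network_interface = compute_v1.NetworkInterface(network="global/networks/default")

    instance = compute_v1.Instance(
        name=instance_name,
        machine_type=f"zones/{zone}/machineTypes/e2-micro",
        disks=[boot_disk],
        network_interfaces=[network_interface],
    )

    # Insert the instance and block until the operation finishes.
    operation = instance_client.insert(
        project=project_id, zone=zone, instance_resource=instance
    )
    operation.result()

# Example call with placeholder values:
# create_instance("my-project", "us-central1-a", "hello-world-vm")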

One set of options that GCP likes to point out is the ability to attach GPUs or TPUs (Tensor Processing Units) to your instance.

It also supports a plethora of OS images, including numerous Linux distributions and Microsoft Windows, plus the option to use a custom image if you see fit.

Once you click Create, give the system a little time to boot your instance, and then you’ll be able to SSH into it right from a browser.

You could also create a new VM from a previously saved template or from one of the numerous ready-to-go solutions on the Marketplace.

It also has other advanced features, including fine-grained security access control, HTTPS connectivity, live migration of running applications, preemptible VMs, and more. These machines run on Google's network, which means they'll have minimal latency and great throughput for everything that requires accessing remote services. Now, one of the reasons Compute Engine is so nice is that it gives you complete control. But with that control comes a great deal of management overhead. However, Google Cloud Platform also has offerings on the other side of the spectrum, where you just focus on code and Google handles the rest.

Now let's talk about Google Cloud Functions. To better understand how this part of the cloud computing system on GCP works, let's take an example.

The serverless flavor of a cloud computing Hello World has to be built on Cloud Functions. Here is a simple snippet of code that listens for image file uploads to a storage bucket and creates a thumbnail for each one on the fly.
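A minimal sketch of what such a function might look like on the Python runtime is shown below; the thumbnail size, the thumbnails/ output prefix, and the Pillow and google-cloud-storage dependencies are assumptions made for illustration.

# A storage-triggered Cloud Function sketch (Python runtime).
# Thumbnail size, output prefix, and dependencies are illustrative assumptions.
import os
import tempfile

from google.cloud import storage
from PIL import Image

def generate_thumbnail(event, context):
    """Runs whenever an object is finalized in the triggering bucket."""
    bucket_name = event["bucket"]
    file_name = event["name"]

    # Avoid re-processing thumbnails that this function itself uploaded.
    if file_name.startswith("thumbnails/"):
        return

    bucket = storage.Client().bucket(bucket_name)

    # Download the original image to a temporary file.
    _, tmp_path = tempfile.mkstemp()
    bucket.blob(file_name).download_to_filename(tmp_path)

    # Resize in place, then upload the result under a thumbnails/ prefix.
    with Image.open(tmp_path) as img:
        img.thumbnail((128, 128))
        img.save(tmp_path, format=img.format)

    bucket.blob(f"thumbnails/{file_name}").upload_from_filename(tmp_path)
    os.remove(tmp_path)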

 To trigger this code to run, simply upload an image to your bucket and the function will do the rest. 

Now, Cloud Functions supports a lot of triggers, such as changes in a database, messages from Pub/Sub, or even Compute Engine instances changing state. Cloud Functions can also be invoked using a standard HTTP request, meaning a user visits a URL and the function does the work.
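For example, an HTTP-triggered function can be as small as the following sketch using the Functions Framework for Python; the function name and query parameter are illustrative.

# A tiny HTTP-triggered function (Functions Framework for Python).
import functions_framework

@functions_framework.http
def hello_http(request):
    # `request` is a Flask Request object supplied by the framework.
    name = request.args.get("name", "World")
    return f"Hello, {name}!"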

These functions can be deployed in any region from a single project, and they're authenticated with the rest of the GCP services, making integration with other products and APIs from the platform quite natural and easy. For Cloud Functions to work, you don't have to provision a cluster or deal with server configs. Google does that for you. And since most of the work you do with Cloud Functions is short-lived and event-driven, you only pay for your code while it's running. Now, one challenge users have with Cloud Functions is that while they're easy to use, they make large-scale app development a little trickier. So if you need something larger than Cloud Functions but still want serverless benefits, you should opt for App Engine, which has the same scale-on-demand but a lot more configuration options for developers, such as services, versioning, and traffic splitting.
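As a rough sketch of how little code an App Engine service needs, here is a minimal Python service using Flask; the route and local port are illustrative, and deployment also requires an app.yaml declaring the runtime, pushed with gcloud app deploy.

# main.py — a minimal App Engine (standard environment) service sketch.
# Route and port are illustrative; app.yaml (not shown) declares the runtime.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello, App Engine!"

if __name__ == "__main__":
    # Local testing only; in production App Engine runs the app for you.
    app.run(host="127.0.0.1", port=8080, debug=True)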

Another essential cloud component is Containers. The Hello World in the context of containers is to deploy a Docker image to an orchestrator. Since open-sourcing Kubernetes, Google has gotten a lot of feedback about what works and about the challenges involved in deploying containers. This led to the development of Google Kubernetes Engine, or GKE, a fully-managed version of Kubernetes.

You can deploy your containerized applications to GKE which offers a production infrastructure guaranteeing uptime, while providing rich dashboard metrics and automated operations from auto-scaling to node repairs, to Kubernetes version upgrades. To deploy a containerized workload, simply describe the compute, memory, and storage resources your application containers require, and Kubernetes Engine will provision and manage the underlying cloud resources automatically. With GKE, you can also attach persistent storage. You can even run stateful workloads such as databases. You can also make use of GPUs and TPUs.
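As an illustration of describing those compute and memory requirements, here is a hedged sketch that creates a small Deployment with the official Kubernetes Python client; the container image, replica count, and resource requests/limits are placeholder values, not recommendations.

# A Deployment sketch using the Kubernetes Python client (pip install kubernetes).
# Image, replicas, and resource requests/limits are placeholders.
from kubernetes import client, config

def deploy_hello_app() -> None:
    # Assumes kubeconfig was set up, e.g. via
    # `gcloud container clusters get-credentials <cluster> --zone <zone>`.
    config.load_kube_config()

    container = client.V1Container(
        name="hello-app",
        image="us-docker.pkg.dev/google-samples/containers/gke/hello-app:1.0",
        ports=[client.V1ContainerPort(container_port=8080)],
        resources=client.V1ResourceRequirements(
            requests={"cpu": "250m", "memory": "128Mi"},
            limits={"cpu": "500m", "memory": "256Mi"},
        ),
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="hello-app"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels={"app": "hello-app"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "hello-app"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    # Kubernetes Engine provisions and manages the nodes that satisfy this spec.
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )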

But best of all, your Kubernetes workloads are portable across all Kubernetes implementations, from your development environment to GKE, to other cloud or on-premises installations. In summary, GCP supports all types of computing workloads. It will run existing Linux and Windows applications on Compute Engine, containerized workloads on Kubernetes Engine, and functions or apps on Cloud Functions and App Engine. 


Last Updated : 04 Jan, 2021