
Top 10 Google Cloud Platform (GCP) Security Best Practices

Posted by MyBlockchainExperts

What you don’t know may hurt you, even in the cloud.

Chances are that anyone using cloud services in Google Cloud Platform is using the basic services such as VPC firewalls, Cloud IAM, and access control lists.

Google Cloud Platform is more than just those services, and I wanted to identify some less commonly implemented areas that I run into while consulting or working with students. These are my top ten areas of best practices; some are straightforward for most architects, and some are really meant for developers to implement as part of their application testing, deployments, and production rollouts.

So let’s get started.

10. Enabling flow logs

Capturing records of the traffic moving through your VPC network interfaces still seems to be a secret. I am surprised by how few implementations I have seen where this option is even known.

Cloud administrators need to enable flow logs on the network subnets that host VM instances. The main reason to use flow logs is to identify, troubleshoot, and analyze specific traffic when it is not reaching an instance. Flow logs are an essential way to identify traffic patterns, overspending, and application issues, and overall they provide transparency.

The logs can be viewed via Stackdriver Logging and, for deeper analysis, should be exported to BigQuery. This breadth of network logging is critical for a solid cloud security posture and for performing security analytics. Flow logs can be used with security analysis tools to investigate patterns that could indicate an Advanced Persistent Threat or more run-of-the-mill threats. Use it or you could lose more.
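As a sketch of what that analysis can look like, the snippet below builds a Stackdriver Logging filter for a single subnet’s flow logs and a BigQuery query over an exported flow-log table. The subnet, project, and table names are made-up placeholders, and the JSON field names assume the standard VPC flow-log export schema.

```python
# Hypothetical sketch: a Logging filter for VPC flow logs on one subnet,
# and a BigQuery "top talkers" query over a flow-log export table.

def flow_log_filter(subnet_name: str) -> str:
    """Return a Stackdriver Logging filter selecting VPC flow logs
    for a single subnet (placeholder subnet name)."""
    return (
        'resource.type="gce_subnetwork" '
        'AND logName:"compute.googleapis.com%2Fvpc_flows" '
        f'AND resource.labels.subnetwork_name="{subnet_name}"'
    )

def top_talkers_query(table: str, limit: int = 10) -> str:
    """Return a BigQuery SQL string ranking source IPs by bytes sent,
    assuming flow logs were exported to `table` via a log sink."""
    return f"""
        SELECT jsonPayload.connection.src_ip AS src_ip,
               SUM(CAST(jsonPayload.bytes_sent AS INT64)) AS total_bytes
        FROM `{table}`
        GROUP BY src_ip
        ORDER BY total_bytes DESC
        LIMIT {limit}
    """

print(flow_log_filter("prod-subnet-us-east1"))
print(top_talkers_query("my-project.flow_logs.vpc_flows"))
```

A query like this quickly surfaces unexpected traffic sources, which is exactly the kind of pattern an APT investigation starts from.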

9. Utilizing the FULL Google Stackdriver suite of features effectively

Figure 1 – Stackdriver Trace

Stackdriver is a hybrid monitoring, logging, and diagnostics suite for your applications on Google Cloud Platform; Monitoring and Logging are supported on AWS as well. Stackdriver monitors the cloud service layers in a single SaaS solution. Google acquired Stackdriver and rebranded it as Google Stackdriver.

The Stackdriver suite has the following features.

  • Monitoring – Provides you immediate and actionable insight into your GCP environment. Monitoring identifies trends, helps you fix problems faster, and can even reduce monitoring noise.
  • Logging – Use it to maintain detailed records of your cloud resources. It provides visibility into the health of cloud resources and applications through metrics such as VM usage, CPU usage, disk I/O, memory, network traffic, and much more. (The related monitoring agent is based on collectd, an open source daemon that collects system and application performance metrics.)
  • Error Reporting – Analyzes, aggregates, and digests the errors in your cloud applications, and notifies you when new errors are detected. Effectively, Error Reporting brings the processed data directly to you, providing insight so you can fix root causes faster. It is easy to use, and you can enable email or mobile alerts via the mobile applications. Export these errors to BigQuery and you really get significant insight.
  • Debugger – Inspect applications without having to stop them. It supports Java, Python, and Go, and even snapshots and logpoints. This is a developer’s dream that can provide immediate insight into application issues. Developers need only provide the application sources to start the debugger.
  • Trace – Essentially a latency detector. Use it to determine where in your code the latent processes, packets, and bits really are. Trace can identify the root cause of your latent applications and provides reports around this.
  • Profiler – The newest addition to the Stackdriver suite. Profiler allows developers to analyze applications running in GCP, on other cloud platforms, or on premises. Profiler effectively analyzes the performance of CPU- or memory-intensive functions executed across an application (it looks for CPU- or memory-bound issues). Profiler presents the call hierarchy and resource consumption of the relevant functions in an interactive “flame graph” that developers can act on immediately.

Another Stackdriver area to stay on top of is your logging itself, by creating log sinks and managing their permissions.

To create or modify a sink, you must have the IAM role Owner or Logging/Logs Configuration Writer on the sink’s parent resource. To view existing sinks, you must have the IAM role Viewer or Logging/Logs Viewer on the sink’s parent resource.
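To make the sink idea concrete, here is a hedged sketch of the kind of sink resource body the Logging API expects, routing Admin Activity audit logs to BigQuery. The project, dataset, and sink names are illustrative; in practice you would create this with `gcloud logging sinks create` or the Logging API while holding the Logs Configuration Writer role.

```python
# Hypothetical sketch of a log sink definition (names are placeholders).

def make_sink(name: str, project: str, dataset: str) -> dict:
    """Build a sink resource body like the one the Logging API expects."""
    return {
        "name": name,
        # All matching entries are routed to this BigQuery dataset.
        "destination": (
            f"bigquery.googleapis.com/projects/{project}/datasets/{dataset}"
        ),
        # Keep only Admin Activity audit log entries.
        "filter": 'logName:"cloudaudit.googleapis.com%2Factivity"',
    }

sink = make_sink("audit-to-bq", "my-project", "audit_logs")
print(sink["destination"])
```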

Then, with Kubernetes Engine, we need to be aware of which logs are available. Your GCP project has several logs that are relevant to a GKE cluster, including the Admin Activity log, the Data Access log, and the Events log. Below are example resource types.

Figure 2. Stackdriver Kubernetes Engine Resource Types

I could literally spend a day walking through Stackdriver-related capabilities and functions… Let’s move on for now.

8. Always run Cloud Web Security Scanner

Utilizing Cloud Web Security Scanner (formerly Cloud Security Scanner) – in a true DevOps environment, it’s always better to test applications before you deploy them. This tool can help you find security problems in your app so you can head off potential vulnerabilities such as:

Figure 3 Cloud Web Security Scanner



  • Cross-site scripting (XSS)
  • Flash injection
  • Mixed-content
  • Clear text passwords
  • Usage of insecure JavaScript libraries

Web Security Scanner currently supports the App Engine standard and flexible environments, Compute Engine instances, and GKE resources.

It is important to note that Web Security Scanner does not replace a manual security review process, nor does it guarantee that your application is free from security flaws. Think of this solution as the check before the check.

7. Using Kubernetes Containers? – Implement Binary Authorization.

Binary Authorization lets you restrict which images can be deployed to a Kubernetes Engine cluster by making sure they pass through the appropriate checkpoints in your deployment workflow. Think of Binary Authorization as a security guard validating that you really do have access to the containers. You can enable cluster rules or deployment rules, as well as define “attestors”.

Figure 4 Binary Authorization

Binary Authorization should be used as part of your overall container deployment strategy.
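As a sketch, here is the shape of a Binary Authorization policy that blocks any image lacking a signature from a named attestor. The project and attestor names are made up; a real policy of this shape would typically be imported with `gcloud container binauthz policy import`.

```python
# Hedged sketch of a Binary Authorization policy body
# (project/attestor names are placeholders).

def binauthz_policy(project: str, attestor: str) -> dict:
    return {
        "admissionWhitelistPatterns": [
            # Allow Google-provided system images without attestation.
            {"namePattern": "gcr.io/google_containers/*"},
        ],
        "defaultAdmissionRule": {
            # Block any image that lacks a signature from our attestor.
            "evaluationMode": "REQUIRE_ATTESTATION",
            "enforcementMode": "ENFORCED_BLOCK_AND_AUDIT_LOG",
            "requireAttestationsBy": [
                f"projects/{project}/attestors/{attestor}",
            ],
        },
    }

policy = binauthz_policy("my-project", "build-verifier")
print(policy["defaultAdmissionRule"]["evaluationMode"])
```

The attestor here plays the “security guard” role described above: only images it has signed off on get through.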


6. Whitelisting with Cloud IAP

Cloud IAP works by verifying a user’s identity and the context of the request to determine whether the user should be allowed to access an application or a VM. Think of Cloud IAP as both an efficiency gain, since it provides faster sign-in, and a security feature that blocks unauthorized VM access.

The main selling point of IAP is controlling access to your enterprise applications from unauthorized external networks while integrating with your existing IAM policies. Whitelisting effectively means that you are choosing the winners on the list.

SSH/TCP access can be used with App Engine, Kubernetes Engine, and Compute Engine. HTTP-based access can be used with the HTTPS load balancer.

Figure 5 Cloud IAP

IAP provides two distinct levels of security authorization goodness.

  1. Resource Authorization — Uses OAuth2 flows, which generate a signed access token. As expected, IAP uses this token to validate identity for application-level access.
  2. App Validation — Works at the user-identity level by using signed headers that are generated by IAP. Consider this a secondary level of protection, since it would catch an attempt to bypass IAP.
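To illustrate what those signed headers carry, the sketch below decodes the payload of the JWT that IAP places in the `x-goog-iap-jwt-assertion` header. This is illustration only: a real app must also verify the ES256 signature against Google’s published public keys and check the audience claim, which this stdlib-only sketch does not do. The token and claims here are fabricated.

```python
# Illustrative sketch only: decode a JWT payload WITHOUT verifying it.
# Real IAP validation must verify the signature and audience.
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Split a JWT and base64url-decode its payload segment."""
    payload_b64 = token.split(".")[1]
    # JWT segments drop base64 padding; add it back before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a fake token for demonstration (header.payload.signature).
claims = {"email": "alice@example.com", "aud": "/projects/123/apps/my-app"}
fake = ".".join([
    base64.urlsafe_b64encode(b'{"alg":"ES256"}').decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("="),
    "sig",
])
print(decode_jwt_payload(fake)["email"])  # alice@example.com
```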

5. Use OAuth 2.0 as an integrated authorization plan

Chances are, if you’re reading this article, you’re familiar with OAuth 2.0. OAuth 2.0 (RFC 6749) is a widely used authorization framework whose main purpose is enabling applications to access resources in both cloud and on-premises services.

OAuth 2.0 allows for arbitrary clients which are effectively “trusted” parties to access “trusted” resources.
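The first step of the authorization code flow can be sketched as building the consent URL the user is redirected to. The Google authorization endpoint below is real, but the client ID and redirect URI are placeholders; real values come from your credentials configuration in the GCP console.

```python
# Hedged sketch: step one of the OAuth 2.0 authorization code flow
# (client_id and redirect_uri are placeholders).
from urllib.parse import urlencode

def auth_url(client_id: str, redirect_uri: str, scope: str) -> str:
    """Build the URL the user visits to grant consent."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # authorization code flow
        "scope": scope,
        "access_type": "offline",  # also request a refresh token
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)

url = auth_url("my-client-id", "https://example.com/callback",
               "https://www.googleapis.com/auth/cloud-platform")
print(url)
```

After the user consents, Google redirects back with a `code` parameter that the client exchanges for the access token — the “trusted party” handshake described above.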

Figure 6. High-Level Steps in OAuth 2.0

High Level Steps in GCP

Implementing OAuth with GCP opens up your world in Google Cloud Platform, serving as a baseline for enabling or complementing other services such as Identity-Aware Proxy and Cloud Endpoints.

OAuth 2.0 is a complex subject, and to implement it effectively, especially in a hybrid approach, you would be wise to review documentation or training in this area. Google Cloud has provided a very useful, visually driven tutorial: https://cloud.google.com/community/tutorials/understanding-oauth2-and-deploy-a-basic-auth-srv-to-cloud-functions

4. Data Loss Prevention (DLP)

Google Cloud has an awesome Data Loss Prevention (DLP) capability that, when properly planned and implemented, reduces risk around “sensitive data”.

One of the easiest ways to ruin public trust in any company is via a data breach or data exposure. Data exposure generally means that specific types of data, such as Personally Identifiable Information (PII), are exposed to hackers or the internet.

“Google Cloud DLP API enables our security solutions to scan and classify documents and images from multiple cloud data stores and email sources. This allows us to offer our customers critical security features, such as classification and redaction, which are important for managing data and mitigating risk. Google’s intelligent DLP service enables us to differentiate our offerings and grow our business by delivering high quality results to our customers.” —  Sateesh Narahari, VP of Products, Managed Methods

Basically, you can classify and mask sensitive elements (PII, credit card numbers, passports, etc.) in both structured and unstructured data models.
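Here is a hedged sketch of a DLP inspect request body. The info types shown are real built-in DLP detectors, but the sample text is made up and a real request would be sent to the DLP API under your project.

```python
# Hypothetical sketch of a DLP API inspect request body.

def inspect_request(text: str) -> dict:
    return {
        "item": {"value": text},
        "inspectConfig": {
            # Built-in DLP detectors for common sensitive elements.
            "infoTypes": [
                {"name": "US_SOCIAL_SECURITY_NUMBER"},
                {"name": "CREDIT_CARD_NUMBER"},
                {"name": "EMAIL_ADDRESS"},
            ],
            "minLikelihood": "POSSIBLE",
            "includeQuote": True,
        },
    }

req = inspect_request("My SSN is 123-45-6789")
print([t["name"] for t in req["inspectConfig"]["infoTypes"]])
```

The same info-type list is what backs a search template like the one in the figure below.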

The figure below shows a simple US Social Security number search template.

Figure 7. Data Loss Prevention Template

For more on Data Loss Prevention, see https://cloud.google.com/blog/products/gcp/new-ways-to-manage-sensitive-data-with-the-data-loss-prevention-api

3. Implement Column Level Encryption for databases.

Yep, this one catches people off guard. Why? It’s generally a secret to everyone outside of the IT security and database realm. This is effectively another way to ensure that PII data is not stored or transmitted in plain text in a database.

Column-level encryption was intentionally designed to let users identify specific information/attributes/properties to be encrypted, as opposed to encrypting an entire database.

The main benefits of using column-level encryption include ensuring privacy, reducing overhead on data transmission, using unique encryption keys, and more.

So if your database is 220 GB and the PII data is only 20 GB, then encrypt just the 20 GB of PII data. Some data is more sensitive than other data, such as personal weight, religion, family information, etc. This really personal information can identify a person when put together, so secure it with encryption in transit and, of course, at rest.

It’s easy: just identify your data by “categorizing” it and encrypt the categories that hold the personally sensitive data. The best option is to use a database that supports this natively.

Figure 8. Cloud SQL Database Engines

GCP does not yet support column-level encryption natively. In Cloud SQL, for example with the Postgres database engine, you need to set up the encryption manually with Cloud KMS.
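One manual approach on Cloud SQL Postgres is the pgcrypto extension, sketched below as SQL strings. The table, column, and key handling are illustrative; in production the symmetric key should be managed and wrapped with Cloud KMS rather than passed around in application code.

```python
# Hedged sketch: column-level encryption with pgcrypto on Postgres.
# Table/column names are placeholders; keys belong in Cloud KMS.

DDL = """
CREATE EXTENSION IF NOT EXISTS pgcrypto;
CREATE TABLE customers (
    id         SERIAL PRIMARY KEY,
    name       TEXT NOT NULL,      -- non-sensitive, stored in the clear
    ssn_cipher BYTEA NOT NULL      -- sensitive column, encrypted only
);
"""

def insert_row(name: str) -> str:
    # pgp_sym_encrypt encrypts just the SSN column, not the whole table.
    return (
        "INSERT INTO customers (name, ssn_cipher) "
        f"VALUES ('{name}', pgp_sym_encrypt(%s, %s));"
    )

def select_row() -> str:
    # pgp_sym_decrypt recovers the plaintext for authorized readers.
    return "SELECT name, pgp_sym_decrypt(ssn_cipher, %s) FROM customers;"

print(insert_row("Alice"))
```

This is the 20 GB-of-220 GB idea in practice: only the sensitive column pays the encryption cost.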

2. Use Organizations and open a breadth of additional security features

Cloud Platform is extremely flexible, secure, and of course useful even without Organizations. However, to enable the full spectrum of services in Google Cloud Platform, an “Organization” is really needed, and here comes G Suite.

An Organization node is the top-level node of the GCP resource hierarchy. The Organization resource represents an organization such as yourcompany.com and provides central visibility and control over all resources further down the hierarchy.

Some of the security-related services that can be enabled with an Organization are:

  • Cloud IAM Folders
  • Security Command Center
  • Threat Detection
  • Context-Aware Access
  • Access Context Manager
  • VPC Service Controls

Another option is to consider Cloud Identity, an Identity-as-a-Service offering from Google.

Below is a link comparing Cloud Identity with G Suite: https://gsuiteupdates.googleblog.com/2017/06/enterprise-identity-made-easy-in-g-suite-with-cloud-identity.html

Are you looking at getting certified as a Google Cloud professional? Consider reviewing this article to find out more.

1. Reference Google Cloud Platform Enterprise Best Practices

Yes, that’s right: just reference the enterprise best practices routinely. Sounds simple, but you would be surprised how many folks who use GCP do not routinely reference the best practices, whitepapers, or even support issues.

Google Cloud has done a solid job of maintaining their best practices documentation. It’s the bible of Google Cloud best practices and effectively provides you direct insight into how Google manages security as well.

The Google Cloud Platform Enterprise Best Practices are here.

Figure 9. Best Practices Marine Drill Sergeant

https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations

Carry on my cloud friends and please do let me know any feedback or suggestions.

Joe Holbrook, the Cloud Tech Guy
