
WHAT IS KNATIVE?


About Knative

Knative is a platform built on top of Kubernetes that enables you to build, deploy, and manage modern serverless workloads. Kubernetes provides the infrastructure that manages containerized applications, while Knative runs on top of it to improve the deployment experience for developers.

In addition to handling scaling and managing the lifecycle of deployments, Kubernetes and containerized applications ease many of the challenges that came with managing virtual server environments. These benefits have given momentum to serverless computing, a cloud execution model in which the cloud provider runs the server environment and manages the allocation of machine resources.

Different applications require different deployment environments, which can get in the way of scalability and efficiency. There was a need to standardize this process and bridge the gap between application code and the containers that package it for deployment. Hence, the development of Knative.

Knative was originally developed by Google and open-sourced; the project has since benefited from support and contributions from Pivotal, IBM, Red Hat, and SAP. It allows developers to write code without worrying about where that code will run. When using it, developers do not need to understand underlying Kubernetes infrastructure such as pods and deployments.

Knative can be used to deploy and manage applications across multiple platforms. To achieve this, Knative ships with a set of middleware components that enable building, deploying, and scaling container-based applications.

Components of Knative

Build

The Build component manages the building of source code into containers. It runs a sequence of steps that define how sources are turned into a container image, speeds up builds by caching artifacts, and lets you leverage Dockerfiles or build templates.
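
To make the idea of sequential build steps concrete, here is a minimal, purely illustrative Python sketch. It is not the Knative Build API; it simply models each step as a container image plus arguments run in order, and the repository URL, images, and step names are assumptions chosen for illustration.

    # build_steps.py - an illustrative model (not the actual Knative Build API)
    # of the "sequence of steps" idea: each step is a container image plus
    # arguments, executed in order to turn source code into a pushed image.
    from dataclasses import dataclass


    @dataclass
    class BuildStep:
        name: str    # human-readable label for the step
        image: str   # container image that performs the step
        args: list   # arguments passed to that image


    # Hypothetical pipeline: fetch source, then build and push an image.
    steps = [
        BuildStep("fetch-source", "alpine/git",
                  ["clone", "https://github.com/example/app.git", "/workspace"]),
        BuildStep("build-image", "gcr.io/kaniko-project/executor",
                  ["--context=/workspace", "--destination=gcr.io/example/app:latest"]),
    ]

    for step in steps:
        # A real build system would launch each image as a container;
        # here we only print the plan to illustrate the ordering.
        print(f"{step.name}: run {step.image} {' '.join(step.args)}")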

Serving

The Serving component ensures your application is accessible to the world through HTTP requests. It handles hosting, seamless auto-scaling, and lifecycle management of an application. The Knative Serving component can scale to zero when no traffic is received for a set timespan. When a new request comes in, it automatically launches a new pod to serve the request.
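
As a concrete illustration, below is a minimal sketch of the kind of HTTP application a Knative Service might run. It assumes the platform injects the listening port through the PORT environment variable (commonly defaulting to 8080); the handler and response are illustrative.

    # app.py - a minimal HTTP service sketch suitable for Knative Serving.
    # Assumption: the listening port arrives via the PORT environment variable.
    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer


    class HelloHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Respond to any GET request with a plain-text greeting.
            body = b"Hello from a Knative service\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)


    if __name__ == "__main__":
        port = int(os.environ.get("PORT", "8080"))
        # Scale-to-zero and scale-out are handled by the platform,
        # not by the application code itself.
        HTTPServer(("", port), HelloHandler).serve_forever()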

Eventing

The Eventing component handles the subscription, delivery, and management of events. It connects event producers and event consumers through a Knative broker that acts as a conduit. Producers and consumers are independent of each other: a producer publishes event messages to the broker, and a consumer binds to the broker through a trigger. This decoupled model allows for a highly scalable event infrastructure. As a developer, you can write code packaged in containers that responds to specific event triggers, and you are free to build custom event pipelines to connect with your existing systems.
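
The sketch below illustrates the consumer side, assuming the broker delivers events over HTTP as CloudEvents in binary mode, with attributes such as type and source carried in Ce-* headers and the payload in the request body; the handler name and port default are illustrative.

    # consumer.py - a sketch of an event consumer that a trigger could point at.
    # Assumption: events arrive as HTTP POSTs in CloudEvents binary mode.
    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer


    class EventHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            payload = self.rfile.read(length)
            # CloudEvents attributes set by the producer / broker.
            event_type = self.headers.get("Ce-Type", "unknown")
            event_source = self.headers.get("Ce-Source", "unknown")
            print(f"received {event_type} from {event_source}: {payload!r}")
            # A 2xx response tells the broker the event was handled.
            self.send_response(202)
            self.end_headers()


    if __name__ == "__main__":
        port = int(os.environ.get("PORT", "8080"))
        HTTPServer(("", port), EventHandler).serve_forever()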

How Knative fits into Google Cloud

In line with Knative and serverless computing, Google Cloud provides a set of tools that enable developers to focus on writing great code.

Knative focuses on the tasks involved in building and running applications on cloud-native platforms. These tasks include managing source-to-container builds, routing and managing traffic during deployment, binding services to event ecosystems, and auto-scaling workloads.

Cloud Run – Cloud Run is Google's serverless execution environment for running stateless, HTTP-driven containers. It is built on the Knative Serving API and gives developers a straightforward entry point into deploying serverless applications.

Cloud Run for Anthos – developers can deploy their application code using Knative on Google Cloud Anthos. Anthos is an application management platform that provides a consistent development and operations experience across environments, whether in the cloud (Anthos on Google Cloud), on-premises (Anthos on-prem), or on other cloud providers.

Google Cloud backs the Knative open source project by continually improving it and by building platforms that leverage its features to deliver highly scalable applications.

Knative benefits

  • This platform allows developers to write code using any language or framework and deploy it across various platforms while minimizing switching costs.
  • It manages the server infrastructure for you, removing the overhead of server management.
  • You only pay for execution time; when your functions are not executing, no cost is incurred.
  • The stateless nature of Knative services allows for scalability – they can scale up when traffic increases, and down to zero at no additional cost.
  • Knative supports popular frameworks such as Django, Laravel, Ruby on Rails, and Spring, as well as operational workflows such as GitOps, DockerOps, ManualOps, and more.
  • If you already have a deployment infrastructure for your project, Knative is designed to plug easily into an existing build and CI/CD workflow. 
  • The experience is the same, whether in the cloud, on-premises, or in a third party data center.