Kubernetes in OpenStack

In this blog:

  • The Rapid Adoption of Kubernetes
  • Containers on OpenStack
  • Edge Computing Within Kubernetes
  • Security and On-Premises Considerations

Kubernetes With Unparalleled Flexibility and Control

Learn About Kubernetes on OpenStack >>

In February of 2022, the CNCF reported record-breaking adoption of Kubernetes. Based on the numbers the CNCF pulled, 96% of organizations are either using Kubernetes, evaluating it, or currently adopting it. The 2021 Cloud Native Survey report received 3,829 responses, more than 38 times the number of respondents from the same report in 2016.

It’s pretty clear that organizations are adopting Kubernetes en masse. In this blog post, you’ll learn why the Open Infrastructure Foundation and OpenStack are keeping a sharp eye on Kubernetes, and why it matters in today’s microservice-driven world.

Records Are Being Broken

As you read in the opening of this blog post, 96% of organizations are doing something with Kubernetes. That could mean full production-level workloads running the entire tech stack, or a few engineers orchestrating a Hello World app with Kubernetes for fun. The point is, Kubernetes has arrived, and the concept it brings with it, API-driven infrastructure, isn’t going away any time soon.

Without throwing too many buzzwords into this blog post, there is a real distinction between Day One Ops and Day Two Ops. Day One Ops is about where and how the app will run: installing and deploying the platform. Day Two Ops is everything after that: actually running the application, scaling it out, and keeping it healthy. Kubernetes lets developers and engineers focus their time on Day Two Ops. Containers also give engineers the ability to test the app directly on their local computer using something like Docker Desktop or Minikube, so they can move faster than before, even in the testing phase.
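One reason the local-to-production workflow is so smooth is that the same declarative manifest works on Minikube, Docker Desktop, or a cluster running on OpenStack. As a minimal sketch (the app name, image, and port below are illustrative, not from any specific deployment), here is a Kubernetes Deployment manifest built in Python and printed as JSON, which `kubectl apply -f -` accepts directly:

```python
import json

def deployment_manifest(name, image, replicas=1, port=80):
    """Build a minimal Kubernetes Deployment manifest as a dict.

    The same manifest works unchanged on Minikube, Docker Desktop,
    or a Kubernetes cluster running on top of OpenStack.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [
                        {
                            "name": name,
                            "image": image,
                            "ports": [{"containerPort": port}],
                        }
                    ]
                },
            },
        },
    }

# JSON is valid YAML, so this output can be piped to `kubectl apply -f -`.
manifest = deployment_manifest("hello-world", "nginx:1.25", replicas=2)
print(json.dumps(manifest, indent=2))
```

The point of the sketch is that nothing in it is environment-specific: moving from a laptop to an OpenStack-hosted cluster changes the `kubectl` context, not the manifest.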

By combining OpenStack and Kubernetes, you get the best of both worlds. Day One Ops gets OpenStack up and running; Day Two Ops covers the rest of the workflow, since OpenStack gives you the true “cloud” feel. Teams can collaborate on creating and deploying the infrastructure and on deploying the applications, all under one roof.

Records are being broken by Kubernetes adoption, and for good reason: it’s making the lives of engineers (developers, infrastructure, DevOps, etc.) much easier.

Containers Are A Thing

As mentioned in the previous section, engineers love containers. Sure, there’s a big learning curve in the beginning, around things like:

  • Dependencies for an application
  • How to package the application
  • What ports are needed to run the application
  • CPU and memory requirements

But once all of those needs are understood, deploying a container is far simpler than deploying a binary on a Linux server. Containers are much faster to deploy, much easier to scale, and far easier to maintain.
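The checklist above maps almost one-to-one onto the container section of a Kubernetes pod spec: the image packages the app and its dependencies, the port is declared up front, and CPU/memory needs become requests and limits. A minimal sketch (the names, registry URL, and default resource values are illustrative assumptions, not recommendations):

```python
def container_spec(name, image, port, cpu="250m", memory="128Mi"):
    """Build the container section of a Kubernetes pod spec.

    Makes the packaging checklist explicit:
      - dependencies travel inside the image,
      - the listening port is declared,
      - CPU/memory needs are stated as requests and limits.
    Default values here are placeholders; size them for your app.
    """
    return {
        "name": name,
        "image": image,
        "ports": [{"containerPort": port}],
        "resources": {
            # Setting requests == limits gives the pod the
            # "Guaranteed" QoS class; adjust as needed.
            "requests": {"cpu": cpu, "memory": memory},
            "limits": {"cpu": cpu, "memory": memory},
        },
    }

# registry.example.com is a hypothetical registry for illustration.
spec = container_spec("api", "registry.example.com/api:1.0", port=8080)
print(spec["resources"]["requests"])
```

Once these four answers are encoded in the spec, the scheduler, not the engineer, worries about where the workload fits.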

Running containers in OpenStack is no different: they run just as easily with Kubernetes in OpenStack as they do on an engineer’s localhost.

Gartner predicts that by 2023, more than 70% of organizations will be running more than two containerized applications in production.

Edge Computing

By definition, edge computing is the practice of moving computation and data storage closer to the data source, which reduces bandwidth usage and improves response times.

Kubernetes is actually a pretty solid platform for edge workloads.

For example, let’s say you’re hosting applications on standard, already-deployed servers and decide you want to implement edge computing. If you have multiple locations or data centers, you have to deploy a virtualized server as close to the originating data source as possible. Depending on how many data centers and locations you have, that can get pretty cumbersome.

With edge computing within Kubernetes, the workload sits right next to the data source: it’s deployed as a pod on the same Kubernetes cluster that the edge node runs in. When the pod and the edge node are inside the same cluster, you can’t get much closer to the source than that.

To run Kubernetes on the edge, you can use edge nodes, which are essentially Kubernetes worker nodes with a single purpose: edge-based deployments. KubeEdge, a project within the CNCF, is currently the most popular way to do this.
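Targeting those edge nodes from a pod spec comes down to a `nodeSelector` on a node label. As a sketch, assuming the edge nodes carry the `node-role.kubernetes.io/edge` label (KubeEdge applies a label of this form to the nodes it manages, but verify against your KubeEdge version), a pod can be pinned to the edge like this:

```python
def edge_pod_manifest(name, image):
    """Pod manifest pinned to edge worker nodes via a nodeSelector.

    Assumption: edge nodes carry the label
    node-role.kubernetes.io/edge="" (the label KubeEdge applies to
    the nodes it manages; confirm for your KubeEdge version).
    """
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            # Only nodes with this label are eligible to run the pod.
            "nodeSelector": {"node-role.kubernetes.io/edge": ""},
            "containers": [{"name": name, "image": image}],
        },
    }

# Hypothetical data-ingest workload that should live next to its source.
pod = edge_pod_manifest("sensor-ingest", "registry.example.com/ingest:1.0")
print(pod["spec"]["nodeSelector"])
```

In practice you would manage edge workloads with a Deployment or DaemonSet rather than bare pods, but the node-targeting mechanism is the same.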

Security And On-Premises Considerations

There are plenty of reasons an organization might choose a public cloud for security purposes. The reality is that if you’re a smaller company, it’s probably easier to check all of the compliance checkboxes in a public cloud. However, there are also plenty of reasons organizations want to host servers and platforms outside of a public cloud provider.

One of the biggest reasons organizations run OpenStack instead of a public cloud is that they want the “feel” of a public cloud with the ability to host it themselves. The second biggest reason is security and compliance. While smaller organizations may only need to check a compliance checkbox for customers, larger organizations sometimes have legally binding contracts requiring them to know the “when and where” of all data being hosted and consumed.

For example, let’s say you sign a contract with a customer that states you must be able to know where data is at any given time: what server it’s running on and what data center or region it’s hosted in. Chances are that with a public cloud provider you won’t be able to retrieve that information, and even if you can, you won’t be able to pinpoint the exact time, place, and source because of how aggressively the public cloud scales. If you do figure out where the data was at, say, 1:00 PM, it could have been somewhere completely different at 12:00 PM.




About Our Guest Writer

Michael Levan is a consultant, researcher, and content creator with a career that has spanned countless business sectors, and includes working with companies such as Microsoft, IBM, and Pluralsight. An engineer at heart, he is passionate about enabling excellence in software development, SRE, DevOps, and actively mentors developers to help them realize their full potential. You can learn more about Michael on his website at: https://michaellevan.net/

Interested in Learning More?

OpenMetal and OpenStack are the perfect fit for Kubernetes. But don’t take our word for it:

More from OpenMetal…

Kubernetes Workloads

Ready to run Kubernetes workloads on OpenStack? This page is our library of all the Kubernetes documentation, tutorials, and blogs created by our team and affiliates.

OpenMetal’s Cloud Cores support Kubernetes integration and give users the freedom to choose their deployment and management systems… Learn More

Unleashing the Potential of Cloud-Based Applications with OpenShift

Prefer to use OpenShift to run Kubernetes workloads on OpenStack? Explore how to streamline cloud-based application management with OpenShift. Learn more about its features and uses. Bonus OpenShift tutorial by LearnLinuxTv …Read More

Comparing Public, Private and Alternative Clouds – Which is Right for Your Organization?

With public and private clouds as the traditional options, innovative alternative clouds have emerged and are making waves. Deciding which cloud to use for your organization requires careful consideration of factors such as your unique business needs, budget, security … Read More

Test Drive

For eligible organizations, individuals, and Open Source Partners, Private Cloud Cores are free to trial. Apply today to qualify.


Subscribe

Join our community! Subscribe to our newsletter to get the latest company news, product releases, updates from partners, and more.
