Multi-cluster Management with GitOps

In this blog post we are going to introduce multi-cluster management patterns with GitOps and show how you can implement these patterns on OpenShift.
If you’re interested in diving into an interactive tutorial, try this link.

In the introductory blog post on GitOps we described some of the use cases that we can solve with GitOps on OpenShift. In
today’s blog post we are going to describe how we can leverage GitOps patterns to perform tasks on multiple clusters.

We are going to explore the following use cases:

  • Deploy an application to multiple clusters
  • Customize the application by cluster
  • Perform a canary deployment

In this blog post we are not going to cover advanced GitOps workflows; instead, we are going to show you basic capabilities around
the topic. More advanced posts on GitOps workflows will follow.

Environment

  • Two OpenShift 4.1 clusters, one for the preproduction (context: pre) environment and one for the production (context: pro) environment.
  • ArgoCD used as the GitOps tool
  • Demo files here

Deploy an Application to Multiple Clusters

In this first example, we are going to deploy our base application to both clusters.

As we are using ArgoCD as our GitOps tool, an ArgoCD server is already deployed in our environment, along with the argocd CLI tool.

Our application definition can be found here

  1. Ensure we have access to both clusters
    $ oc --context pre get nodes
    NAME                                             STATUS   ROLES    AGE   VERSION
    ip-10-0-128-17.ap-southeast-1.compute.internal   Ready    master   19h   v1.13.4+ab8449285
    ip-10-0-136-41.ap-southeast-1.compute.internal   Ready    worker   19h   v1.13.4+ab8449285
    ip-10-0-151-90.ap-southeast-1.compute.internal   Ready    worker   19h   v1.13.4+ab8449285
    
    $ oc --context pro get nodes
    NAME                                              STATUS   ROLES    AGE   VERSION
    ip-10-0-140-239.ap-southeast-1.compute.internal   Ready    master   19h   v1.13.4+ab8449285
    ip-10-0-142-57.ap-southeast-1.compute.internal    Ready    worker   19h   v1.13.4+ab8449285
    ip-10-0-170-168.ap-southeast-1.compute.internal   Ready    worker   19h   v1.13.4+ab8449285
    
  2. Ensure we have our clusters registered in ArgoCD
    $ argocd cluster list
    SERVER                                      NAME  STATUS      MESSAGE
    https://api.openshift.pre.example.com:6443  pre   Successful  
    https://api.openshift.pro.example.com:6443  pro   Successful  
    https://kubernetes.default.svc                    Successful  
    
  3. Add our GitOps repository to ArgoCD
    $ argocd repo add https://github.com/mvazquezc/gitops-demo.git
    
    repository 'https://github.com/mvazquezc/gitops-demo.git' added
    
  4. Deploy our application to preproduction and production clusters
    # Create the application on Preproduction cluster
    $ argocd app create --project default --name pre-reversewords --repo https://github.com/mvazquezc/gitops-demo.git --path reversewords_app/base --dest-server https://api.openshift.pre.example.com:6443 --dest-namespace reverse-words --revision pre
    
    application 'pre-reversewords' created
    
    # Create the application on Production cluster
    $ argocd app create --project default --name pro-reversewords --repo https://github.com/mvazquezc/gitops-demo.git --path reversewords_app/base --dest-server https://api.openshift.pro.example.com:6443 --dest-namespace reverse-words --revision pro
    
    application 'pro-reversewords' created
    

    4.1 The above commands create two new ArgoCD Applications, pre-reversewords and pro-reversewords, that will be deployed on the preproduction and production clusters in the reverse-words namespace, using the code from the pre/pro branches located under the path reversewords_app/base.

  5. As we haven’t defined a sync policy, we need to force ArgoCD to sync the Git repo content on our pre and pro clusters

    $ argocd app sync pre-reversewords
    $ argocd app sync pro-reversewords
    
  6. After a few seconds we will see our application deployed on the pre and pro clusters
    # Get application status on preproduction cluster
    $ argocd app get pre-reversewords
    
    Name:               pre-reversewords
    Project:            default
    Server:             https://api.openshift.pre.example.com:6443
    Namespace:          reverse-words
    URL:                https://argocd.apps.example.com/applications/pre-reversewords
    Repo:               https://github.com/mvazquezc/gitops-demo.git
    Target:             pre
    Path:               reversewords_app/base
    Sync Policy:        <none>
    Sync Status:        Synced to pre (306ce10)
    Health Status:      Healthy
    
    GROUP  KIND        NAMESPACE      NAME           STATUS  HEALTH
          Namespace                  reverse-words  Synced  
          Service     reverse-words  reverse-words  Synced  Healthy
    apps   Deployment  reverse-words  reverse-words  Synced  Healthy
    
    # Get application status on production cluster
    $ argocd app get pro-reversewords
    
    Name:               pro-reversewords
    Project:            default
    Server:             https://api.openshift.pro.example.com:6443
    Namespace:          reverse-words
    URL:                https://argocd.apps.example.com/applications/pro-reversewords
    Repo:               https://github.com/mvazquezc/gitops-demo.git
    Target:             pro
    Path:               reversewords_app/base
    Sync Policy:        <none>
    Sync Status:        Synced to pro (98bbfb1)
    Health Status:      Healthy
    
    GROUP  KIND        NAMESPACE      NAME           STATUS  HEALTH
          Namespace                  reverse-words  Synced  
          Service     reverse-words  reverse-words  Synced  Healthy
    apps   Deployment  reverse-words  reverse-words  Synced  Healthy
    
  7. Our application defines a service for accessing its API; let’s access it and get the release name on both clusters
    # Get the preproduction cluster LB hostname
    $ PRE_LB_HOSTNAME=$(oc --context pre -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
    # Get the production cluster LB hostname
    $ PRO_LB_HOSTNAME=$(oc --context pro -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
    # Access the preproduction LB and get the release name
    $ curl http://${PRE_LB_HOSTNAME}:8080
    
    Reverse Words Release: Base release. App version: v0.0.2
    # Access the production LB and get the release name
    $ curl http://${PRO_LB_HOSTNAME}:8080
    
    Reverse Words Release: Base release. App version: v0.0.2
    

As you have seen, we have been able to deploy to multiple clusters from a single tool (ArgoCD). In the next section we are going to explore how we can override some configurations depending on the destination cluster by using the Kustomize support embedded in ArgoCD.

Customize the Application by Cluster

In this second example, we are going to modify the application’s behavior depending on which cluster it is deployed to.

We want the application to have a release name of preproduction or production depending on which environment it gets deployed to.

ArgoCD leverages Kustomize under the hood to deal with configuration overrides across environments.
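To make that concrete, here is a rough sketch of how a base and an overlay could be wired together with Kustomize. The file names and contents below are illustrative assumptions, not an exact copy of the demo repository:

```yaml
# reversewords_app/base/kustomization.yaml (illustrative sketch)
resources:
- namespace.yaml
- deployment.yaml
- service.yaml

# reversewords_app/overlays/pre/kustomization.yaml (illustrative sketch)
bases:
- ../../base
patchesStrategicMerge:
- deployment.yaml   # small patch overriding the RELEASE env var
```

When ArgoCD renders an overlay path such as reversewords_app/overlays/pre, Kustomize merges the overlay’s patch on top of the base manifests, so each cluster gets the shared base plus its own overrides.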

The way we organize our application in Git is as follows:

  • The Git Repository has two branches, pre which has manifests for preproduction env, and pro for production env.

Application overrides can be found in their respective folders and branches:

  1. We placed the application overrides in the Git repository; there is only one override, which configures a release name different from the default based on the cluster the application gets deployed to
  2. Deploy our Kustomized application to preproduction and production clusters
    # Create the application on Preproduction cluster
    argocd app create --project default --name pre-kustomize-reversewords --repo https://github.com/mvazquezc/gitops-demo.git --path reversewords_app/overlays/pre --dest-server https://api.openshift.pre.example.com:6443 --dest-namespace reverse-words --revision pre --sync-policy automated
    
    application 'pre-kustomize-reversewords' created
    
    # Create the application on Production cluster
    argocd app create --project default --name pro-kustomize-reversewords --repo https://github.com/mvazquezc/gitops-demo.git --path reversewords_app/overlays/pro --dest-server https://api.openshift.pro.example.com:6443 --dest-namespace reverse-words --revision pro --sync-policy automated
    
    application 'pro-kustomize-reversewords' created
    

    2.1 The above commands create two new ArgoCD Applications, pre-kustomize-reversewords and pro-kustomize-reversewords, that will be deployed on the preproduction and production clusters in the reverse-words namespace, using the code from the pre and pro branches respectively. Each application gets its code from a different folder under our overlays folder; that way the application is customized depending on which environment it gets deployed to. Note that only the modified values are stored in the overlay folder; the base application is still deployed from the base folder, so we don’t end up with duplicate application files.

  3. As we have defined an automated sync policy, we don’t need to force the sync; ArgoCD will start syncing our application once it gets created. On top of that, if changes are made to the application repository, ArgoCD will re-deploy them for us.

  4. After a few seconds we will see our application deployed on both clusters

    # Get application status on preproduction cluster
    $ argocd app get pre-kustomize-reversewords
    
    Name:               pre-kustomize-reversewords
    Project:            default
    Server:             https://api.openshift.pre.example.com:6443
    Namespace:          reverse-words
    URL:                https://argocd.apps.example.com/applications/pre-kustomize-reversewords
    Repo:               https://github.com/mvazquezc/gitops-demo.git
    Target:             pre
    Path:               reversewords_app/overlays/pre
    Sync Policy:        Automated
    Sync Status:        Synced to pre (306ce10)
    Health Status:      Healthy
    
    GROUP  KIND        NAMESPACE      NAME           STATUS  HEALTH
          Namespace                  reverse-words  Synced  
          Service     reverse-words  reverse-words  Synced  Healthy
    apps   Deployment  reverse-words  reverse-words  Synced  Healthy
    
    # Get application status on production cluster
    $ argocd app get pro-kustomize-reversewords
    
    Name:               pro-kustomize-reversewords
    Project:            default
    Server:             https://api.openshift.pro.example.com:6443
    Namespace:          reverse-words
    URL:                https://argocd.apps.example.com/applications/pro-kustomize-reversewords
    Repo:               https://github.com/mvazquezc/gitops-demo.git
    Target:             pro
    Path:               reversewords_app/overlays/pro
    Sync Policy:        Automated
    Sync Status:        Synced to pro (98bbfb1)
    Health Status:      Healthy
    
    GROUP  KIND        NAMESPACE      NAME           STATUS  HEALTH
          Namespace                  reverse-words  Synced  
          Service     reverse-words  reverse-words  Synced  Healthy
    apps   Deployment  reverse-words  reverse-words  Synced  Healthy
    
  5. Our application defines a service for accessing its API; let’s access it and get the release name on both clusters
    # Get the preproduction cluster LB hostname
    $ PRE_LB_HOSTNAME=$(oc --context pre -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
    # Get the production cluster LB hostname
    $ PRO_LB_HOSTNAME=$(oc --context pro -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
    # Access the preproduction LB and get the release name
    $ curl http://${PRE_LB_HOSTNAME}:8080
    
    Reverse Words Release: Preproduction release. App version: v0.0.2
    # Access the production LB and get the release name
    $ curl http://${PRO_LB_HOSTNAME}:8080
    
    Reverse Words Release: Production release. App version: v0.0.2
    

As you have seen, we have been able to deploy to multiple clusters and use custom configurations depending on which cluster we deploy the application to. In the next section we are going to explore how we can use GitOps to perform a basic canary deployment.

Perform a Canary Deployment

A common practice is to deploy a new version of an application to a small subset of the available clusters and, once the application has been proven to work as expected, promote it to the rest of the clusters.

We are going to use the Kustomized apps that we created before; let’s verify which versions we are running:

# Get the preproduction cluster LB hostname
$ PRE_LB_HOSTNAME=$(oc --context pre -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
# Get the production cluster LB hostname
$ PRO_LB_HOSTNAME=$(oc --context pro -n reverse-words get svc reverse-words -o jsonpath='{.status.loadBalancer.ingress[*].hostname}')
# Access the preproduction LB and get the release name
$ curl http://${PRE_LB_HOSTNAME}:8080

Reverse Words Release: Preproduction release. App version: v0.0.2
# Access the production LB and get the release name
$ curl http://${PRO_LB_HOSTNAME}:8080

Reverse Words Release: Production release. App version: v0.0.2

As you can see, the currently deployed version is v0.0.2; let’s perform a canary deployment of v0.0.3.

  1. We need to update the container image that will be used on the preproduction cluster; we are going to modify the Deployment overlay as follows:
    # reversewords_app/overlays/pre/deployment.yaml in git branch pre
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: reverse-words
      labels:
        app: reverse-words
    spec:
      template:
        spec:
          containers:
          - name: reverse-words
            image: quay.io/mavazque/reversewords:v0.0.3
            env:
            - name: RELEASE
              value: "Preproduction release"
            - $patch: replace
    
  2. We push our changes to the Git repository
    git add reversewords_app/overlays/pre/deployment.yaml
    git commit -m "Updated preproduction image version from v0.0.2 to v0.0.3"
    git push origin pre
    
  3. ArgoCD will detect the update in our code and deploy the new changes; now we should see version v0.0.3 deployed on pre and version v0.0.2 still deployed on pro.
    # Access the preproduction LB and get the release name
    $ curl http://${PRE_LB_HOSTNAME}:8080
    
    Reverse Words Release: Preproduction release. App version: v0.0.3
    # Access the production LB and get the release name
    $ curl http://${PRO_LB_HOSTNAME}:8080
    
    Reverse Words Release: Production release. App version: v0.0.2
    
  4. Let’s verify that our application is working as expected
    $ curl http://${PRE_LB_HOSTNAME}:8080 -X POST -d '{"word":"PALC"}'
    
    {"reverse_word":"CLAP"}
    
  5. The application is working fine; now it’s time to update production to v0.0.3 as well. Let’s update the overlay:
    # reversewords_app/overlays/pro/deployment.yaml in git branch pro
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: reverse-words
      labels:
        app: reverse-words
    spec:
      template:
        spec:
          containers:
          - name: reverse-words
            image: quay.io/mavazque/reversewords:v0.0.3
            env:
            - name: RELEASE
              value: "Production release"
            - $patch: replace
    
  6. Send the changes to Git
    git add reversewords_app/overlays/pro/deployment.yaml
    git commit -m "Updated production image version from v0.0.2 to v0.0.3"
    git push origin pro
    
  7. Get versions in use
    # Access the preproduction LB and get the release name
    $ curl http://${PRE_LB_HOSTNAME}:8080
    
    Reverse Words Release: Preproduction release. App version: v0.0.3
    # Access the production LB and get the release name
    $ curl http://${PRO_LB_HOSTNAME}:8080
    
    Reverse Words Release: Production release. App version: v0.0.3
    
  8. We should now update the base deployment so that newer deployments use version v0.0.3.
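That last step is just a version bump in the base manifest. Assuming the base Deployment pins the image tag directly, the relevant fragment could look something like this (illustrative sketch, not the exact demo file):

```yaml
# reversewords_app/base/deployment.yaml (fragment, illustrative)
spec:
  template:
    spec:
      containers:
      - name: reverse-words
        # default image for new environments, bumped from v0.0.2
        image: quay.io/mavazque/reversewords:v0.0.3
```

Once this lands in both branches, the overlays no longer need to override the image and could go back to patching only the RELEASE variable.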

Final Thoughts

  • We have updated our application by modifying the application overlays in Git. This is a very basic scenario; advanced scenarios may include CI tests, multiple approvals, etc.

  • We have pushed our code to the pre/pro branches directly, which is not a good practice; in a real-life scenario a more advanced workflow should be used. We will discuss GitOps workflows in future blog posts.

  • We have used the ArgoCD CLI, but ArgoCD also has a WebUI where you can perform almost the same operations as with the CLI; on top of that, you can visualize your applications and their components.

Next Steps

In future blog posts we will talk about multiple topics related to GitOps such as:

  • GitOps Workflows in Production
  • Disaster Recovery with GitOps
  • Moving to GitOps