Create Rubrik SLAs from a CSV File

If you own a Brik, you are familiar with creating SLAs: it is one of the first things you do after getting a Brik deployed.

In the world of Rubrik, everything is built around Service Level Agreements. Rubrik's unique approach of using SLAs for Backup and Recovery tasks dramatically simplifies the Backup Admin's daily work. Traditionally, a Backup Admin would create multiple backup jobs for full, incremental, hourly, daily, weekly, monthly, and yearly backups, plus additional archival and replication jobs. With Rubrik's SLA approach, all of this can be done within a single SLA, dramatically reducing the operational overhead associated with backup jobs.

Sometimes, a single SLA or even a handful of SLAs is not enough. In that case, creating them through the Rubrik interface becomes time-consuming. Luckily, Rubrik has an API-First Architecture, which means everything you can do in the GUI can also be done via the APIs.

To make Rubrik’s APIs even easier to digest, we have a built-in API Explorer. You can access it at https://&lt;Rubrik_cluster_address&gt;/docs/v1/playground
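To give a feel for what the script does under the hood, here is a minimal sketch of assembling an SLA-creation request. The cluster address is a placeholder, and the endpoint path and payload fields are assumptions based on the v1 API; verify the exact schema in your own cluster's API Explorer.

```python
import json

# Hypothetical cluster address -- replace with your own Brik's address.
RUBRIK_CLUSTER = "rubrik.example.com"

def build_sla_request(name, hourly_frequency=None, daily_frequency=None):
    """Assemble the URL and JSON body for creating an SLA Domain.

    Endpoint path and field names are illustrative assumptions;
    check them against the API Explorer on your cluster.
    """
    url = f"https://{RUBRIK_CLUSTER}/api/v1/sla_domain"
    frequencies = []
    if hourly_frequency is not None:
        frequencies.append({"timeUnit": "Hourly", "frequency": hourly_frequency})
    if daily_frequency is not None:
        frequencies.append({"timeUnit": "Daily", "frequency": daily_frequency})
    body = {"name": name, "frequencies": frequencies}
    return url, json.dumps(body)

url, body = build_sla_request("Gold", hourly_frequency=4, daily_frequency=1)
print(url)
print(body)
```

From here, the body would be POSTed with your favorite HTTP client using your cluster credentials; the Explorer's "Try it out" button shows the equivalent call interactively.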

Creating Rubrik SLAs from a CSV File

To get started, head over to my GitHub repo and clone or download the files for CSV_TO_SLA.

Before you start modifying the template, install the Python requirements with:
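Assuming the repo follows the usual convention of shipping a requirements.txt, the install step looks like this:

```shell
# Install the script's Python dependencies (conventional requirements.txt assumed)
pip install -r requirements.txt
```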

After you have installed the Python requirements, open up template.csv and start defining your SLAs, including any archival and replication requirements.

Note: You can find your Archival Locations and Replication Targets under the Configuration menu in the Rubrik UI. Make sure to use the exact names of the Archival Location and Replication Target. Also, the file has to be saved in CSV format.
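As a sketch of what the script might do with the template, here is how rows like these could be parsed into per-SLA dictionaries. The column names below are hypothetical, chosen just for illustration; the template.csv in the repo defines the authoritative headers.

```python
import csv

# Hypothetical headers and sample rows -- the repo's template.csv is authoritative.
SAMPLE = """name,hourly,daily,archival_location,replication_target
Gold,4,30,AWS-S3-Archive,DR-Cluster
Bronze,24,7,,
"""

def parse_sla_rows(lines):
    """Turn CSV rows into per-SLA dicts, skipping blank optional fields."""
    slas = []
    for row in csv.DictReader(lines):
        sla = {
            "name": row["name"],
            "hourly": int(row["hourly"]),
            "daily": int(row["daily"]),
        }
        # Archival and replication are optional; only keep them when set.
        if row["archival_location"]:
            sla["archival_location"] = row["archival_location"]
        if row["replication_target"]:
            sla["replication_target"] = row["replication_target"]
        slas.append(sla)
    return slas

print(parse_sla_rows(SAMPLE.splitlines()))
```

The empty trailing fields in the Bronze row show why the blank-field check matters: an SLA without archival or replication simply omits those keys.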

Once you’re ready to create the SLAs, run the script as shown below:
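The script name below is illustrative; use the actual .py file from the CSV_TO_SLA repo and point it at your edited template:

```shell
# Script name is a placeholder -- use the one from the CSV_TO_SLA repo
python create_sla.py template.csv
```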

Upon completion of the script, head over to the Rubrik GUI and check out your newly created SLA Domains.
