Automating SSM Parameter store backup using Python.

Admin

  • 4 min read

AWS Systems Manager Parameter Store is a well-known key-value store where developers keep configuration parameters securely. Parameters can be stored as plain strings or as secure strings, and they are accessible via the API or the CLI. You can reference a parameter's name in scripts, SSM documents, and CodeBuild buildspec files.
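As a quick illustration of what "accessible via API" means, a single parameter can be read back with boto3's `get_parameter` call. This is a minimal sketch, not part of the backup script below; the client is passed in as an argument so the helper can be exercised against a stub, and the parameter name `/RDS/CLUSTER/DB_PASSWORD` is only an example.

```python
def get_parameter_value(ssm_client, name):
    """Fetch one parameter's decrypted value from the Parameter Store."""
    response = ssm_client.get_parameter(Name=name, WithDecryption=True)
    return response["Parameter"]["Value"]

# Typical usage (requires AWS credentials):
#   import boto3
#   ssm = boto3.client("ssm")
#   password = get_parameter_value(ssm, "/RDS/CLUSTER/DB_PASSWORD")
```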

Requirements

  1. An AWS CLI profile configured.
  2. Access to the Parameter Store and S3 in your AWS account.
  3. python3 and boto3 installed on your machine.

Scenario

We often manage our AWS RDS database credentials in the Parameter Store, and occasionally these values have to be changed at the database level. After changing them in the database, we also need to update them in the Parameter Store. Because this is a manual process, there is always a chance of human error. For critical systems and major applications we therefore take a backup of the parameters, so that if anything goes wrong we can revert using the backup we have.

Solution

As a solution for this scenario, we can write the parameters' existing values to a txt file. To automate this I wrote the Python script given below.

import boto3

session = boto3.Session(region_name="<Region>", profile_name="{profile_name}")
client = session.client("ssm")
s3 = session.client("s3")

# If the parameter store value is /RDS/CLUSTER/CHAT_DBNAME,
# enter /RDS/CLUSTER/ as "path" and CHAT as "prefix".
path = input("Please enter the common filter : ")
prefix = input("Please enter the prefix after the filter : ")

paginator = client.get_paginator("get_parameters_by_path")
file_name = f"ssm-backup-{prefix}.txt"
file = open(file_name, "w")

response = paginator.paginate(
    Path=path,
    Recursive=True
)

for page in response:
    for entry in page["Parameters"]:
        name = entry["Name"]
        value = entry["Value"]
        if name.startswith(path + prefix):
            file.write(f"{name} = {value} \n")
file.close()

consent = input("Do you want to upload the backup file to S3? (yes/no) ").strip().lower()

if consent == "yes":
    bucket = input("Do you have a bucket to upload this file to? (yes/no) ").strip().lower()
    if bucket == "yes":
        bucket_name = input("Please enter the name of the existing bucket : ")
        s3.upload_file(file_name, bucket_name, file_name)
        print(f"The file {file_name} has been uploaded to the bucket {bucket_name}")
    elif bucket == "no":
        bucket_name = input("Please enter the name of the bucket you need to create : ")
        try:
            s3.create_bucket(
                Bucket=bucket_name,
                CreateBucketConfiguration={'LocationConstraint': '<Region>'}
            )
            print(f"Bucket with name {bucket_name} has been created successfully")
            s3.upload_file(file_name, bucket_name, file_name)
            print(f"The file {file_name} has been uploaded to {bucket_name}")
        except Exception as e:
            print(f"Failed to create a bucket with name {bucket_name}: {str(e)}")
    else:
        print("Please enter a valid input (yes/no)")
elif consent == "no":
    pass  # keep the backup file locally only
else:
    print("Please enter a valid input (yes/no)")

This Python script uses Boto3's SSM client to take a backup based on the path and prefix.
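The name matching the script performs inside its loop can be isolated as a small pure function, which makes the behaviour easy to check. This is a sketch for illustration only; `filter_parameters` is not part of the original script.

```python
def filter_parameters(parameters, path, prefix):
    """Keep only the parameters whose full name starts with path + prefix."""
    wanted = path + prefix
    return [(p["Name"], p["Value"]) for p in parameters if p["Name"].startswith(wanted)]

params = [
    {"Name": "/RDS/CLUSTER/DB_NAME", "Value": "appdb"},
    {"Name": "/RDS/CLUSTER/DB_USER", "Value": "admin"},
    {"Name": "/RDS/CLUSTER/CACHE_HOST", "Value": "redis"},
]
# Keeps only the two DB_* entries:
print(filter_parameters(params, "/RDS/CLUSTER/", "DB"))
```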

For example, if we need to back up all the parameters with the prefix /RDS/CLUSTER/DB, we give the common filter (path) as /RDS/CLUSTER/ and the prefix as DB. The script then backs up values such as /RDS/CLUSTER/DB_NAME and /RDS/CLUSTER/DB_USER, and uploads the file to S3 if the user chooses to.

This will be written to a file named ssm-backup-DB.txt in the KEY = VALUE format.
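Because each backup line follows that fixed KEY = VALUE shape, it can be parsed back mechanically. A hypothetical parsing helper, useful later when restoring:

```python
def parse_backup_line(line):
    """Split one 'NAME = VALUE' backup line into a (name, value) pair."""
    name, _, value = line.partition(" = ")
    return name.strip(), value.strip()
```

Note that the script writes a trailing space before the newline, which `strip()` absorbs.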

[Screenshots: the script prompting for the common filter and prefix, and the contents of the generated ssm-backup-DB.txt file]

As you can see in the screenshots above, the script asks the user for the common filter and then for the prefix. After execution, a file named ssm-backup-DB.txt is created, and its contents are visible below the prompts. If you enter the prefix as CONFIG, the file name will be ssm-backup-CONFIG.txt. The file has also been uploaded to the specified S3 bucket.

If you don't have an S3 bucket, choose the no option and the script will prompt you to create a new one.
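That create-or-reuse decision, which the script handles interactively, can also be expressed as a single helper. This is a sketch with an injected client; in real boto3, `head_bucket` raises a `ClientError` when the bucket is missing or inaccessible, which is what the `except` relies on.

```python
def ensure_bucket(s3_client, bucket_name, region):
    """Return bucket_name, creating the bucket first if it does not already exist."""
    try:
        s3_client.head_bucket(Bucket=bucket_name)  # raises if the bucket is missing
    except Exception:
        s3_client.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
    return bucket_name
```

One real-API caveat: in us-east-1, `create_bucket` must be called without a `LocationConstraint`, so adjust accordingly for that region.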

[Screenshots: the script creating a new S3 bucket and uploading the backup file to it]

You can see from the above screenshot that a new bucket has been created and the file has been uploaded to the S3 bucket.

[Screenshot: the test parameters created in the Parameter Store]

These are the test parameters that I created for testing the script.

Now we have successfully generated backups of our parameter store. But what if something goes wrong and we need to restore these backup files?

You can follow this URL (https://medium.com/supportsages/restoring-parameter-store-backup-automatically-using-python-fe5b3e0d0811) to get this done.
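In outline, the restore direction described in that post is just the inverse of the backup: read each backed-up line and write it back with `put_parameter`. A minimal sketch, with the client injected so it can be tried against a stub; the `Type` and `Overwrite` values here are assumptions (use `SecureString` for secret values):

```python
def restore_parameters(ssm_client, lines):
    """Re-create each 'NAME = VALUE' backup line in the Parameter Store."""
    restored = []
    for line in lines:
        if not line.strip():
            continue  # skip blank lines in the backup file
        name, _, value = line.partition(" = ")
        ssm_client.put_parameter(
            Name=name.strip(),
            Value=value.strip(),
            Type="String",   # assumption: adjust to SecureString where needed
            Overwrite=True,
        )
        restored.append(name.strip())
    return restored
```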

That’s all, thank you.

Ready to fortify your AWS systems against unforeseen data mishaps? Access our step-by-step guide and Python script for effortless restoration of Parameter Store backups. Gain expert insights and master automated restoration for your AWS Parameter Store backups in just a few simple steps! Discover more at SupportSages to safeguard your AWS Parameter Store backups effortlessly!

  • AWS
  • DevOps
