
Python for DevOps

Why Python in DevOps?

• Easy to learn: simple, readable syntax that is quick to pick up alongside day-to-day ops work.

• Cross-platform: the same script runs on Linux, macOS, and Windows.

• Vast libraries: mature packages for cloud APIs, SSH, HTTP, and monitoring.

• Active community: extensive documentation, examples, and third-party tools.

Key Use Cases of Python in DevOps

• CI/CD Automation: Automate pipelines with Jenkins or GitLab CI.

• Infrastructure as Code (IaC): Manage infrastructure from Python with Terraform and the AWS Boto3 SDK.

• System Monitoring: Collect metrics and raise alerts with Nagios, Prometheus, or Grafana.

• API Integrations: Interact with cloud service APIs (AWS, Azure).

Core Python Concepts for DevOps

1. Data Types

o int, float for numbers

o str for text

o list, tuple, set for collections

o dict for key-value pairs
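
For example, a small sketch (the host names and values here are invented for illustration) showing how these types typically hold DevOps data:

# Numbers: thresholds and retry counts
cpu_limit = 80.0      # float
retries = 3           # int

# Text
hostname = "web-01"   # str

# Collections
servers = ["web-01", "web-02", "db-01"]   # list
ports = (22, 80, 443)                     # tuple
regions = {"us-east-1", "eu-west-1"}      # set

# Key-value pairs
config = {"instance_type": "t2.micro", "region": "us-east-1"}   # dict

print(f"{hostname}: {config}")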

2. Functions & Modules

o Define reusable code blocks with def

o Split large code into modules for better organization
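
A minimal sketch (the function name and service are hypothetical): a reusable block of code defined with def. In a larger project this helper would live in its own module, e.g. utils.py, and be imported with from utils import restart_command.

# A reusable helper defined with def
def restart_command(service):
    """Build the shell command that restarts a service."""
    return f"sudo systemctl restart {service}"

print(restart_command("nginx"))   # prints: sudo systemctl restart nginx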

3. File Operations

with open("file.txt", "w") as f:
    f.write("Hello, DevOps!")

4. Error Handling

try:
    result = 10 / 0
except ZeroDivisionError:
    print("Can't divide by zero!")

5. Automation with Boto3

import boto3

# Create an S3 client
s3 = boto3.client('s3')

# Create a new S3 bucket
bucket_name = 'my-new-bucket'
s3.create_bucket(Bucket=bucket_name)

print(f'S3 bucket "{bucket_name}" created successfully!')

6. Log Management

Use Python to parse, analyze, and manage logs for real-time monitoring.
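
For instance, a minimal sketch (the log path and the "ERROR" marker are assumptions) that scans an application log and reports how many error lines it contains:

error_count = 0

# Scan the log file line by line and collect error entries
with open("/var/log/app.log") as log_file:
    for line in log_file:
        if "ERROR" in line:
            error_count += 1
            print(line.strip())

print(f"Total ERROR lines found: {error_count}")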

Real-World Examples of Python in DevOps

1. CI/CD Automation

Scenario: Automating Jenkins Job Creation

Solution:

import requests
from requests.auth import HTTPBasicAuth

# Jenkins connection details
jenkins_url = 'http://localhost:8080'
job_name = 'example-job'
jenkins_user = 'admin'
jenkins_token = 'your_api_token'

# Minimal config.xml for a freestyle job with a single shell build step
job_config = '''
<project>
  <builders>
    <hudson.tasks.Shell>
      <command>echo "Hello, DevOps!"</command>
    </hudson.tasks.Shell>
  </builders>
</project>
'''

# Create the job via the Jenkins REST API
response = requests.post(
    f'{jenkins_url}/createItem?name={job_name}',
    data=job_config,
    headers={'Content-Type': 'application/xml'},
    auth=HTTPBasicAuth(jenkins_user, jenkins_token)
)

if response.status_code == 200:
    print(f'Job "{job_name}" created successfully!')
else:
    print(f'Failed to create job: {response.status_code}')

2. Infrastructure as Code (IaC)

Scenario: Managing AWS S3 Buckets

Solution:

import boto3

# Initialize S3 client
s3 = boto3.client('s3')

# Create a new S3 bucket
# Note: outside us-east-1 you must also pass
# CreateBucketConfiguration={'LocationConstraint': '<your-region>'}
bucket_name = 'my-new-bucket'
s3.create_bucket(Bucket=bucket_name)

print(f'S3 bucket "{bucket_name}" created successfully!')

3. System Monitoring

Scenario: Sending Alerts with Prometheus

Solution:

from prometheus_client import start_http_server, Summary
import time

# Create a metric to track processing time
REQUEST_TIME = Summary('request_processing_seconds', 'Time spent processing request')

# Decorate the function so each call is timed and recorded
@REQUEST_TIME.time()
def process_request():
    # Simulate request processing
    time.sleep(2)

if __name__ == '__main__':
    # Start the Prometheus metrics server
    start_http_server(8000)

    # Simulate processing requests
    while True:
        process_request()
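
Once the script is running, it exposes its metrics at http://localhost:8000/metrics for a Prometheus server to scrape; the alerts themselves are then defined as Prometheus alerting rules (routed through Alertmanager) on top of these metrics, rather than in the Python code.
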
4. API Integrations

Scenario: Interacting with AWS EC2

Solution:

import boto3

# Initialize EC2 client
ec2 = boto3.client('ec2')

# Launch a new EC2 instance
response = ec2.run_instances(
    ImageId='ami-0abcdef1234567890',
    MinCount=1,
    MaxCount=1,
    InstanceType='t2.micro',
    KeyName='my-key-pair'
)

instance_id = response['Instances'][0]['InstanceId']
print(f'EC2 instance launched with ID: {instance_id}')

5. Backup and Restore Databases

Scenario: Backup MySQL Database Daily

Solution:

import subprocess
from datetime import datetime

# Backup configuration
db_name = 'your_database'
user = 'your_username'
password = 'your_password'
backup_dir = '/path/to/backup'
timestamp = datetime.now().strftime('%Y%m%d%H%M%S')
backup_file = f'{backup_dir}/{db_name}_{timestamp}.sql'

# Command to create the backup
# Note: passing the password on the command line is convenient but visible
# in the process list; prefer an option file (~/.my.cnf) in production
backup_command = f'mysqldump -u {user} -p{password} {db_name} > {backup_file}'

# Run the backup command
subprocess.run(backup_command, shell=True, check=True)

print(f'Database backup created at {backup_file}')
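
To run this backup daily, as the scenario calls for, schedule the script with cron; for example (the script path here is an assumption):

0 2 * * * /usr/bin/python3 /path/to/backup_mysql.py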

6. Automating Server Configuration

Scenario: Automatically Install and Configure Nginx on a Server

Solution:

import paramiko

# Server connection details
hostname = 'your_server_ip'
port = 22
username = 'your_username'
password = 'your_password'

# Commands to run
commands = [
    'sudo apt update',
    'sudo apt install -y nginx',
    'sudo systemctl start nginx',
    'sudo systemctl enable nginx'
]

# Create SSH client
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, port, username, password)

# Run each command and report its output
for command in commands:
    stdin, stdout, stderr = ssh.exec_command(command)
    print(stdout.read().decode())
    error_output = stderr.read().decode()  # read stderr once; a second read would return nothing
    if error_output:
        print(f"Error: {error_output}")

ssh.close()

7. Deploying Applications with Docker

Scenario: Deploy a Dockerized Application to a Remote Server

Solution:

import paramiko

# Server connection details
hostname = 'your_server_ip'
port = 22
username = 'your_username'
password = 'your_password'

# Docker commands: pull the new image, then replace the running container
docker_commands = [
    'sudo docker pull your_image:latest',
    'sudo docker stop your_container || true',
    'sudo docker rm your_container || true',
    'sudo docker run -d --name your_container your_image:latest'
]

# Create SSH client
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, port, username, password)

# Run each command and report its output
for command in docker_commands:
    stdin, stdout, stderr = ssh.exec_command(command)
    print(stdout.read().decode())
    error_output = stderr.read().decode()  # read stderr once; a second read would return nothing
    if error_output:
        print(f"Error: {error_output}")

ssh.close()

8. Handling Alerts and Notifications

Scenario: Send an Email Notification on Critical Errors

Solution:

import smtplib
from email.mime.text import MIMEText

# Email configuration
smtp_server = 'smtp.example.com'
smtp_port = 587
sender_email = '[email protected]'
receiver_email = '[email protected]'
password = 'your_password'

# Email content
subject = 'Critical Error Notification'
body = 'A critical error occurred. Please check the logs for details.'
msg = MIMEText(body)
msg['Subject'] = subject
msg['From'] = sender_email
msg['To'] = receiver_email

# Send the email over a TLS-encrypted connection
with smtplib.SMTP(smtp_server, smtp_port) as server:
    server.starttls()
    server.login(sender_email, password)
    server.sendmail(sender_email, receiver_email, msg.as_string())

print('Notification email sent successfully')

9. Creating Custom Command-Line Tools

Scenario: A Python CLI Tool to Monitor System Resource Usage

Solution:

import argparse
import psutil

# Function to get resource usage
def get_resource_usage():
    cpu_usage = psutil.cpu_percent()
    memory_info = psutil.virtual_memory()
    return cpu_usage, memory_info.percent

# CLI setup
parser = argparse.ArgumentParser(description='Monitor system resource usage.')
parser.add_argument('--cpu', action='store_true', help='Show CPU usage')
parser.add_argument('--memory', action='store_true', help='Show memory usage')
args = parser.parse_args()

cpu_usage, memory_usage = get_resource_usage()

if args.cpu:
    print(f'CPU Usage: {cpu_usage}%')
if args.memory:
    print(f'Memory Usage: {memory_usage}%')
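
Assuming the script is saved as monitor.py (the file name is an assumption) and psutil is installed (pip install psutil), it can be run as:

python monitor.py --cpu --memory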

If you like the content, follow me on LinkedIn: https://www.linkedin.com/in/ashok-sana

Follow my WhatsApp & Telegram community:

https://chat.whatsapp.com/BzX1aruZIH645l29LxvgO3

https://t.me/ExplorewithAshok

Happy learning!
