Makkena Koteswararao Resume
CAREER HIGHLIGHTS:
AWS DevOps Engineer with over 8 years of extensive IT experience and expertise in DevOps, Cloud Engineering, and UNIX/Linux administration.
Exposed to all aspects of the Software Development Life Cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis, under methodologies such as Agile, Scrum, and Waterfall.
Extensive experience in Amazon Web Services (AWS) Cloud services such as EC2, VPC, S3, IAM, EBS, RDS, ELB, Route53, OpsWorks, DynamoDB, Auto Scaling, CloudFront, CloudTrail, CloudWatch, CloudFormation, Elastic Beanstalk, AWS SNS, AWS SQS, AWS SES, AWS SWF, and AWS Direct Connect.
Automated infrastructure provisioning for Kafka clusters using Terraform, creating multiple EC2 instances per cluster component and attaching ephemeral or EBS volumes according to instance type, across multiple Availability Zones and regions in AWS.
Implemented AWS X-Ray to visually detect node and edge latency distribution directly from the service map. Tools such as Splunk and Sumo Logic can be used for log analysis, but for distributed tracing within AWS, X-Ray provides much better features, including the service map and in-depth trace analysis, with minimal configuration and maintenance.
Used user interface technologies such as HTML, JavaScript, TypeScript, jQuery, Angular 2, ReactJS, and JSON to develop application GUIs.
Utilized AWS Elastic Beanstalk for deploying and scaling web applications and services written in Java, PHP, Node.js, Python, and Ruby, as well as Docker, on familiar servers such as Apache.
Knowledge of High Availability (HA) and Disaster Recovery (DR) options in AWS.
Hands on experience in Architecting Legacy Data Migration projects such as Teradata to AWS Redshift migration
and from on-premises to AWS Cloud.
Designed, built, and deployed a multitude of applications utilizing much of the AWS stack (including EC2, Route 53, S3, RDS, HSM, DynamoDB, SQS, IAM, and EMR), focusing on high availability, fault tolerance, and auto scaling.
Strong hands-on experience with microservices frameworks such as Spring IO and Spring Boot, deploying them on cloud infrastructure such as AWS.
Expertise in configuration and automation using Chef, Chef with Jenkins, Puppet, Ansible and Docker
Experience in configuring Docker containers for branching and deploying them using Elastic Beanstalk.
Experience in designing, installing and implementing Ansible configuration management system for managing
Web applications, Environments configuration Files, Users, Mount points and Packages.
Extensively worked on Jenkins and Hudson, installing, configuring, and maintaining them for Continuous Integration (CI) and end-to-end automation of all builds and deployments, including implementing CI/CD for databases using Jenkins.
Configured SSH, SMTP, build tools, and source control repositories in Jenkins; installed multiple Jenkins plugins; hands-on experience in deployment automation using Shell/Ruby scripting.
Experience in setting up Baselines, Branching, Merging and Automation Processes using Shell, Ruby, and
PowerShell scripts.
Extensive experience developing a large greenfield application using AWS Cognito, Lambda, API Gateway, a Node.js backend, Postgres, and a React/Redux front end.
Worked on designing PoCs for implementing various ETL processes.
Proficient in Informatica administration work including installing and configuring Informatica PowerCenter and
repository servers on Windows and UNIX platforms, backup and recovery, folder and user account maintenance.
Experience in using build utilities like Maven, Ant and Gradle for building of jar, war, and ear files.
Performed several types of testing like smoke, functional, system integration, white box, black box, gray box,
positive, negative and regression testing
Worked in container-based technologies like Docker, Kubernetes and OpenShift.
Expertise in AWS Lambda functions and API Gateway, submitting data via API Gateway that is accessible to a Lambda function.
Managed configuration of Web App and Deploy to AWS cloud server through Chef.
Created instances in AWS as well as worked on migration to AWS from data center.
Developed AWS Cloud Formation templates and set up Auto scaling for EC2 instances.
Skilled in cloud provisioning tools such as Terraform and CloudFormation.
Responsible for distributed applications across hybrid AWS and physical data centers.
Wrote AWS Lambda functions in Python that invoke scripts to perform various transformations and analytics on large data sets in EMR clusters.
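A minimal boto3 sketch of this pattern is shown below; the EMR cluster ID, script location, and arguments are illustrative placeholders, not values from the actual project.

```python
# Hypothetical sketch: a Lambda handler that submits a Spark transformation step
# to an existing EMR cluster. Cluster ID and S3 paths are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

def lambda_handler(event, context):
    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
        Steps=[{
            "Name": "nightly-transform",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "spark-submit",
                    "s3://example-bucket/scripts/transform.py",  # placeholder script
                    "--input", "s3://example-bucket/raw/",
                    "--output", "s3://example-bucket/curated/",
                ],
            },
        }],
    )
    # Return the step IDs so downstream automation can poll for completion.
    return {"StepIds": response["StepIds"]}
```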
Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
Experience in setting up and managing the ELK (Elasticsearch, Logstash, Kibana) Stack to collect, search, and analyze log files across servers, monitor logs, and create geo-mapping visualizations in Kibana, integrated with AWS CloudWatch and Lambda.
Strong Experience in implementing Data warehouse solutions in Confidential Redshift; Worked on various
projects to migrate data from on premise databases to Confidential Redshift, RDS and S3.
Experience on Cloud Databases and Data warehouses (SQL Azure and Confidential Redshift/RDS).
Good knowledge on logical and physical Data Modeling using normalizing Techniques.
Experience in automation and provisioning services on AWS
Experience building and optimizing AWS data pipelines, architectures and data sets.
Good working experience with Hadoop tools related to data warehousing, such as Hive and Pig; involved in extracting data from these tools onto the cluster using Sqoop.
Experience in working with Teradata and preparing data for batch processing using distributed computing.
Used principles of normalization to improve performance. Involved in writing ETL code in PL/SQL to meet requirements for extraction, transformation, cleansing, and loading of data from source to target data structures.
Mentored junior developers and kept them updated on current cutting-edge technologies such as Hadoop and Spark.
Developed real-time tracking of class schedules using Node.js (Socket.IO and the Express.js framework).
TECHNICAL SKILLS:
OPERATING SYSTEMS: Linux (Red Hat 4/5/6), UNIX, Ubuntu, Windows 7/8/10, iOS
VERSIONING TOOLS: Subversion (SVN), ClearCase, GitHub, CodeCommit
CI TOOLS: Bamboo, Jenkins
PROGRAMMING LANGUAGES: C, C++, Python, SQL, .NET, C#
FRAMEWORKS: ReactJS, AngularJS (1.x), NodeJS
CD TOOLS: AWS CodeDeploy, AWS CodePipeline, AWS Data Pipeline
CODE QUALITY: Checkmarx, SonarQube, Nexus IQ
BUILD TOOLS: Ant, Maven, Gradle
BUG TRACKING TOOLS: JIRA, Rally, Remedy
SCRIPTING LANGUAGES: Shell, Python, TypeScript
INFRASTRUCTURE CREATION: CloudFormation, Terraform
WEB/APPLICATION SERVERS: Apache Tomcat, JBoss, WebSphere, Nginx
DATABASES: Oracle 7.x/8i/9i/10g/11g, Data Warehouse, Talend, RDBMS
BIG DATA ECOSYSTEMS: S3, Redshift Spectrum, Athena, Glue, AWS Redshift
WEB SERVICES: SOAP, REST, JavaScript, CSS, AngularJS, HTML
MONITORING TOOLS: Amazon CloudWatch, Nagios, Splunk, Nexus
CONFIGURATION MANAGEMENT TOOLS: Chef, Ansible, Puppet
VIRTUALIZATION TECHNOLOGIES: vSphere, VMware Workstation, Oracle VirtualBox, Hyper-V
CONTAINER TOOLS: Docker, Kubernetes, ECS
TESTING TOOLS: Selenium, JUnit
NETWORKING/PROTOCOLS: FTP, HTTP, HTTPS, HTML, W3C, TCP, DNS, NIS, LDAP, SAMBA
REPOSITORIES: Nexus, Git, Artifactory
AWS SERVICES: Lambda, SNS, SQS, DynamoDB, Kinesis, Redshift, Athena, CloudWatch, CloudTrail, EC2, ECS, VPC, IAM, Config, AWS X-Ray
Education:
Bachelor's degree in Electrical and Electronics Engineering, 3.58 GPA - Acharya Nagarjuna University, India, 2012.
Master of Science in Information Assurance, 3.62 GPA - Wilmington University, Delaware, USA, 2016.
Achievements:
• Certified AWS DevOps Engineer Professional (Amazon Web Services cloud certification)
• Certified AWS Developer Associate (Amazon Web Services cloud certification)
WORK EXPERIENCE:
AWS CLOUD DEVELOPER/ENGINEER
VANGUARD | MALVERN, PA JUNE 2019 - PRESENT
Responsibilities:
We help developers automatically build and deploy software into production multiple times a day, safely, while maintaining compliance in a highly regulated financial industry. We use tools like Atlassian Bamboo, Bitbucket, Confluence, JIRA, Jenkins, Sonatype Nexus and Nexus IQ, SonarQube, Grunt, and Maven to get the job done.
Created Function-as-a-Service (FaaS) solutions, a category of cloud computing services that provides a platform allowing customers to develop, run, and manage application functionality without the complexity of building and maintaining the underlying infrastructure.
Implemented a 'serverless' architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket.
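A minimal sketch of the S3-triggered piece of that architecture, assuming a hypothetical DynamoDB table named deployment-artifacts (all names below are illustrative):

```python
# Sketch: an S3-triggered Lambda that records uploaded object metadata in DynamoDB.
import urllib.parse
import boto3

table = boto3.resource("dynamodb").Table("deployment-artifacts")  # placeholder table

def lambda_handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Persist a small audit item for each uploaded artifact.
        table.put_item(Item={
            "artifact_key": key,
            "bucket": bucket,
            "size_bytes": record["s3"]["object"].get("size", 0),
            "event_time": record["eventTime"],
        })
    return {"processed": len(records)}
```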
Designed the data models used in data-intensive AWS Lambda applications aimed at complex analysis, creating analytical reports for end-to-end traceability, lineage, and definition of key business elements from Aurora.
Wrote code that optimizes the performance of AWS services used by application teams and provided code-level application security for clients (IAM roles, credentials, encryption, etc.).
Using SonarQube for continuous inspection of code quality and to perform automatic reviews of code to detect
bugs. Managing AWS infrastructure and automation with CLI and API.
Created AWS Lambda functions using Python for deployment management in AWS; designed, investigated, and implemented public-facing websites on Amazon Web Services and integrated them with other application infrastructure.
Created different AWS Lambda functions and API Gateways to submit data via API Gateway that is accessible to a Lambda function.
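A minimal sketch of such a function behind an API Gateway proxy integration; the table name and response shape are assumptions for illustration:

```python
# Sketch: Lambda proxy-integration handler that accepts a JSON POST body
# from API Gateway and stores it in DynamoDB.
import json
import uuid
import boto3

table = boto3.resource("dynamodb").Table("submissions")  # placeholder table

def lambda_handler(event, context):
    payload = json.loads(event.get("body") or "{}")
    item = {"id": str(uuid.uuid4()), **payload}
    table.put_item(Item=item)
    # Proxy integrations expect statusCode/headers/body in the response.
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item["id"]}),
    }
```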
Responsible for building CloudFormation templates for SNS, SQS, Elasticsearch, DynamoDB, Lambda, EC2, VPC, RDS, S3, IAM, and CloudWatch service implementations, integrated with Service Catalog.
Performed regular monitoring activities on Unix/Linux servers, such as log verification, server CPU usage, memory checks, load checks, and disk space verification, to ensure application availability and performance using CloudWatch and AWS X-Ray.
Implemented the AWS X-Ray service inside Vanguard, allowing development teams to visually detect node and edge latency distribution directly from the service map.
Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
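The shape of such a Glue job is sketched below in PySpark; the S3 path, Glue connection name, and target table are placeholders rather than the project's real values:

```python
# Sketch: Glue job that reads Parquet campaign files from S3 and loads them into Redshift.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw campaign files from S3 as a DynamicFrame (placeholder path).
campaigns = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/campaigns/"]},
    format="parquet",
)

# Write into Redshift through a pre-configured Glue connection (placeholder names).
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=campaigns,
    catalog_connection="redshift-connection",
    connection_options={"dbtable": "analytics.campaigns", "database": "dev"},
    redshift_tmp_dir="s3://example-bucket/tmp/",
)

job.commit()
```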
Automated Datadog dashboards for the stack through Terraform scripts.
Developed file cleaners using Python libraries.
Utilized Python libraries such as Boto3 and NumPy for AWS.
Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
Created external tables with partitions using Hive, AWS Athena, and Redshift.
Developed PySpark code for AWS Glue jobs and for EMR.
Cloud development and automation using Node.js, Python (Boto3), AWS Lambda, AWS CDK (Cloud Development
Kit) and AWS SAM (Serverless Application Model).
Used the AWS CDK to define cloud resources in a familiar programming language such as Python, Java, or C#/.NET.
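As an illustration of that CDK workflow, a minimal Python stack might look like the sketch below (CDK v2 assumed; the stack and bucket are purely illustrative):

```python
# Sketch: defining cloud resources with the AWS CDK (v2) in Python.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class ArtifactStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioned, encrypted bucket for pipeline artifacts (placeholder resource).
        s3.Bucket(
            self,
            "ArtifactBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=cdk.RemovalPolicy.RETAIN,
        )

app = cdk.App()
ArtifactStack(app, "ArtifactStack")
app.synth()  # emits the CloudFormation template consumed by `cdk deploy`
```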
Good understanding of other AWS services such as S3, EC2, IAM, and RDS; experience with orchestration and data pipeline services such as AWS Step Functions, Data Pipeline, and Glue.
Provided a streamlined developer experience for delivering small serverless applications that solve business problems; the platform is Lambda-based and composed of a pipeline and a runtime.
Found and resolved complex build and deployment issues while providing 24x7 on-call support, with strong knowledge of troubleshooting and debugging application team issues.
Experience in writing SAM template to deploy serverless applications on AWS cloud.
Design, develop and implement next generation cloud infrastructure at Vanguard.
Hands-on experience on working with AWS services like Lambda function, Athena, DynamoDB, Step functions,
SNS, SQS, S3, IAM etc.
Building a REST API in Node.js with AWS Lambda, API Gateway, DynamoDB, and Serverless Framework.
Created indexes and managed forwarders and indexers; worked with Splunk Field Extractor (IFX), search head clustering, indexer clustering, and Splunk upgrades.
Installed and configured Splunk clustered search heads and indexers, deployment servers, and deployers.
Designed and implemented Splunk-based best-practice solutions.
Designed and Developed ETL jobs to extract data from Salesforce replica and load it in data mart in Redshift.
Responsible for Designing Logical and Physical data modelling for various data sources on Confidential Redshift.
Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.
Integrated Lambda with SQS and DynamoDB using Step Functions to iterate through a list of messages and update their status in a DynamoDB table.
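A simplified sketch of the SQS-to-DynamoDB step, assuming a hypothetical message-status table keyed by message_id:

```python
# Sketch: Lambda fed by an SQS event source that marks each message as processed
# in DynamoDB. Table and attribute names are placeholders.
import json
import boto3

table = boto3.resource("dynamodb").Table("message-status")  # placeholder table

def lambda_handler(event, context):
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])
        table.update_item(
            Key={"message_id": body["message_id"]},
            UpdateExpression="SET #s = :s",
            ExpressionAttributeNames={"#s": "status"},
            ExpressionAttributeValues={":s": "PROCESSED"},
        )
    return {"updated": len(records)}
```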
Designed AWS CloudFormation templates to create VPCs, subnets, and NAT gateways to ensure successful deployment of web applications and database templates.
Created S3 buckets and managed their policies; utilized S3 and Glacier for storage and backup on AWS.
Experience managing IAM users: creating new users, granting limited access as needed, and assigning roles and policies to specific users.
Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, Lambda, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, NodeJS, Jira, JBoss, Bamboo, Docker, WebLogic, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk, SonarQube.
AWS CLOUD ETL DEVELOPER
MCGRAW HILL EDUCATION | EAST WINDSOR, NJ
JAN 2018 - MAY 2019
Responsibilities:
Worked on server infrastructure development on AWS Cloud, with extensive usage of Virtual Private Cloud (VPC), CloudFormation, Lambda, CloudFront, CloudWatch, IAM, EBS, Security Groups, Auto Scaling, DynamoDB, Route53, and CloudTrail.
Designed and built multi-terabyte, end-to-end data warehouse infrastructure from the ground up on Confidential Redshift for large-scale data, handling millions of records every day.
Supported an AWS Cloud environment with 2000+ AWS instances; configured Elastic IPs and elastic storage deployed across multiple Availability Zones for high availability.
Set up log analysis by shipping AWS logs to Elasticsearch and Kibana, and managed searches, dashboards, custom mappings, and data automation.
Wrote Python scripts to process semi-structured data in formats such as JSON.
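A small sketch of that kind of script, flattening nested JSON records (one per line) into CSV; file names and field layout are illustrative:

```python
# Sketch: flatten semi-structured JSON-lines input into a flat CSV file.
import csv
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into dot-delimited column names."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

with open("events.jsonl") as src, open("events.csv", "w", newline="") as dst:
    rows = [flatten(json.loads(line)) for line in src if line.strip()]
    columns = sorted({col for row in rows for col in row})
    writer = csv.DictWriter(dst, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
```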
Used ETL component Sqoop to extract the data from MySQL and load data into HDFS.
Good hands-on experience with the Kafka Python API, developing producers and consumers that write Avro schemas.
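A minimal producer sketch using the kafka-python client; the broker address and topic are placeholders, and the JSON serializer stands in for the schema-registry-aware Avro serializer used in practice:

```python
# Sketch: Kafka producer in Python (kafka-python). JSON encoding is used here for
# brevity; an Avro serializer backed by a schema registry would replace it.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("campaign-events", {"campaign_id": 42, "action": "open"})
producer.flush()  # block until the broker acknowledges the message
producer.close()
```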
Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
Created external tables with partitions using Hive, AWS Athena, and Redshift.
Designed external and managed tables in Hive and processed data into HDFS using Sqoop.
Created user-defined functions (UDFs) in Redshift.
Migrated Adobe Marketing Campaign data from Oracle into HDFS using Hive, Pig, and Sqoop.
Managed Docker orchestration and Docker containerization using Kubernetes
Used Kubernetes to orchestrate the deployment, scaling and management of Docker Containers.
Automated builds using Maven and scheduled automated nightly builds using Jenkins. Built Jenkins pipeline to
drive all microservices builds out to the Docker registry and then deployed to Kubernetes.
Extensively worked on Hudson, Jenkins for continuous integration and for End to End automation for all build
and deployments.
Loaded data into Amazon Redshift and used AWS CloudWatch to collect and monitor AWS RDS instances within Confidential.
Resolved update, merge and password authentication issues in Bamboo and JIRA.
Developed and maintained Python, Shell, and PowerShell scripts for build and release tasks and for automating routine tasks.
Used Jenkins pipelines to drive all micro services builds out to the Docker registry and then deployed to
Kubernetes, Created Pods and managed using Kubernetes.
Designed and implemented large scale business critical systems using Object oriented Design and Programming
concepts using Python and Django.
Experienced in working with asynchronous frameworks such as NodeJS and Twisted, and in designing automation frameworks using Python and Shell scripting.
Used Ansible Playbooks to setup and configure Continuous Delivery Pipeline and Tomcat servers. Deployed
Micro Services, including provisioning AWS environments using Ansible Playbooks.
Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.
Prepared projects, dashboards, reports and questions for all JIRA related services.
Built a POC to explore AWS Glue capabilities for data cataloging and data integration.
Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, Redshift, CloudFormation, CloudWatch, ELK Stack), Jenkins, Ansible, Python, Shell Scripting, PowerShell, NodeJS, Microservices, Jira, JBoss, Bamboo, Kubernetes, Docker, WebLogic, Maven, WebSphere, Unix/Linux, Nagios, Splunk, AWS Glue.
AWS CLOUD ENGINEER
KROGER | CINCINNATI, OH OCT 2016 - NOV 2017
Responsibilities:
• Experience working in Agile Environment.
Well versed with Rally and Jira.
Working in DevOps model to define, develop, maintain and support products.
Created custom Modules in Puppet to support the applications.
Able to handle data using the Hive Web Interface (HWI) in the Cloudera Hadoop distribution UI.
Importing the complete data from RDBMS to HDFS cluster using Sqoop.
Well versed with testing the custom modules locally using Test Kitchen and Vagrant.
Created development and test environments for different applications by provisioning Kubernetes clusters on AWS using Docker, Ansible, and Terraform.
Integrated Jenkins to do auto build when code is pushed to GIT.
Achieved the Continuous Integration and Continuous deployment (CI/CD) process using GIT, Jenkins, Puppet and
Custom Repositories.
Designed and developed ETL/ELT processes to handle data migration from multiple business units and sources
including Oracle, Postgres, Informix, MSSQL, Access and others.
Developed and executed a migration strategy to move Data Warehouse from an Oracle platform to AWS
Redshift.
Used BI Tools such as ThoughtSpot and SAP Tools to create and maintain Reports
SAP Data Services Integrator ETL developer with strong ability to write procedures to ETL data into a Data
Warehouse from a variety of data sources including flat files and database links (Postgres, MySQL, Oracle).
Worked with and managed Big Data tools like Cassandra and Spark.
Wrote Terraform scripts from scratch for building Dev, Staging, Prod, and DR environments.
Configured servers to send the server and application data to Splunk.
Hands on experience with generating reports using Splunk.
Built and managed servers, firewall rules, storage and authentication to servers on OpenStack and AWS.
Well versed with AWS products such as EC2, S3, EBS, IAM, CloudWatch, CloudTrail, VPC, and Route53.
Good knowledge of High-Availability, Fault Tolerance, Scalability, Database Concepts, System and Software
Architecture, Security and IT Infrastructure.
Led onshore and offshore service delivery functions to ensure end-to-end ownership of incidents and service requests.
Mentored junior developers and kept them updated on current cutting-edge technologies such as Hadoop and Spark.
As per business requirements, used Talend to integrate data on the cloud and make it accessible to the offshore medical team.
Worked with Informatica 9.5.1 and Informatica 9.6.1 Big Data Edition and scheduled the jobs.
After transformation, the data is moved to a Spark cluster, where it goes live to the application using Spark Streaming and Kafka.
Able to handle whole data using HWI (Hive Web Interface) using Cloudera Hadoop distribution UI.
Deployed the Big Data Hadoop application using Talend on AWS (Amazon Web Services) and on Microsoft Azure.
Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, GIT, Jira, JBoss, Terraform, Redshift, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk, SonarQube.
Responsibilities:
• Set up an AWS Lambda function that runs every 15 minutes to check for repository changes and publishes a notification to an Amazon SNS topic (see the sketch at the end of this list).
• Develop push-button automation for app teams for deployments in multiple environments like Dev, QA, and
Production.
• Perform troubleshooting and monitoring of the Linux server on AWS using Zabbix, Nagios and Splunk
• Management and administration of AWS services: CLI, EC2, VPC, S3, ELB, Glacier, Route 53, CloudTrail, IAM, and Trusted Advisor.
• Created automated pipelines in AWS CodePipeline to deploy Docker containers in AWS ECS using services like CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
• Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon, worked with teams to lockdown
security groups
• Responsible for monitoring AWS resources using CloudWatch and application resources using Nagios.
• Integrated services like Bitbucket, AWS CodePipeline, and AWS Elastic Beanstalk to create a deployment pipeline.
• Created S3 buckets in the AWS environment to store files, some of which are required to serve static content for a web application.
• Configured S3 buckets with various life cycle policies to archive the infrequently accessed data to storage classes
based on requirement.
• Possess good knowledge in creating and launching EC2 instances using AMI’s of Linux, Ubuntu, RHEL, and
Windows and wrote shell scripts to bootstrap instance.
• Used IAM for creating roles, users, and groups and implemented MFA to provide additional security to the AWS account and its resources; used AWS ECS and EKS for Docker image storage and deployment.
• Used Bamboo pipelines to drive all micro services builds out to the Docker registry and then deployed
to Kubernetes, Created Pods and managed using Kubernetes.
• Designed an ELK system to monitor and search enterprise alerts. Installed, configured, and managed the ELK Stack for log management within EC2 / Elastic Load Balancer for Elasticsearch.
• Created development and test environments for different applications by provisioning Kubernetes clusters on AWS using Docker, Ansible, and Terraform.
• Worked on deployment automation of all the micro services to pull image from the private Docker registry and
deploy to Docker Swarm Cluster using Ansible.
• Installed a local Docker registry for upload and download of Docker images, in addition to pulling from Docker Hub.
• Implemented domain name service (DNS) through route 53 to have highly available and scalable applications.
• Maintained the monitoring and alerting of production and corporate servers using Cloud Watch service.
• Worked on scalable distributed data system using Hadoop ecosystem in AWS EMR.
• Migrated on premise database structure to Confidential Redshift data warehouse.
• Wrote various data normalization jobs for new data ingested into Redshift.
• The data is ingested into this application by using Hadoop technologies like PIG and HIVE.
• Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
• Used JSON schema to define table and column mappings from S3 data to Redshift.
• On demand, secure EMR launcher with custom spark submit steps using S3 Event, SNS, KMS and Lambda function.
• Created EBS volumes for storing application files for use with EC2 instances whenever they are mounted to them.
• Experienced in creating RDS instances to serve data through servers for responding to requests.
• Automated regular tasks using Python code and leveraged Lambda function wherever required.
• Knowledge on Containerization Management and setup tool Kubernetes and ECS.
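Sketch of the scheduled repository-check Lambda referenced at the top of this list; the topic ARN and the change-detection helper are placeholders for the real logic:

```python
# Sketch: Lambda triggered every 15 minutes (e.g. by a scheduled rule) that
# publishes to SNS when a repository change is detected.
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:repo-changes"  # placeholder ARN

def repository_changed() -> bool:
    """Placeholder for the actual repository polling logic."""
    return True

def lambda_handler(event, context):
    changed = repository_changed()
    if changed:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="Repository change detected",
            Message="A change was detected during the scheduled 15-minute poll.",
        )
    return {"notified": changed}
```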
Environment: AWS (EC2, S3, EBS, ELB, RDS, SNS, SQS, VPC, CloudFormation, CloudWatch, ELK Stack), Bitbucket, Ansible, Python, Shell Scripting, PowerShell, ETL, AWS Glue, Jira, JBoss, Bamboo, Docker, WebLogic, Maven, WebSphere, Unix/Linux, AWS X-Ray, DynamoDB, Kinesis, CodeDeploy, CodePipeline, CodeBuild, CodeCommit, Splunk.
• Installed and configured WebLogic Application Server 8.x/9.x/10.x/11.x using graphical, console, and silent modes, and configured the WebLogic domain.
• Determined and suggested hardware and software specific to the System and customized it.
• Configured Node Manager for running managed servers.
• Installed and configured JBoss 5.1/6.0 and Apache Tomcat 6.0 on different environments such as Dev, Test, QA, and Production.
• Experience in designing, installing and implementing Ansible configuration management system for managing Web applications, Environments configuration Files, Users, Mount points and Packages.
• Installed and configured Apache HTTP Server 2.0, Tomcat 7.0, IIS, and Sun ONE web servers in various environments; installed and configured plug-ins for Apache HTTP Server and Sun ONE Web Server to proxy requests to the WebLogic server.
• Developed and maintained continuous integration and deployment systems using Jenkins, ANT, Maven, Nexus, Ansible, and Rundeck.
• Experienced in creating Ansible playbooks, tasks, roles, templates.
• Completely automated the process of building OAuth, OpenID and SAML stacks with Ansible and Jenkins.
• Deployment and troubleshooting of JAR, WAR, and EAR files in both standalone and clustered environments in JBoss 5.1/6.0, WebLogic 8.x/9.x/10.x, and Apache Tomcat 6.0.
• Performed migration and upgrade tasks such as upgrading WebLogic Server 9.x/10.x to WebLogic 11.x, updating JDKs, and installing service packs and patches for WebLogic Server.
• Configured F5 load balancers with web servers; used F5 to improve the capacity, performance, and reliability of the applications.
• Developed and run UNIX shell scripts and implemented auto deployment process.
• Solved server hang issues such as deadlocks and application- and database-level locks by taking thread dumps and analyzing them to find the root cause of the hang.
• Set up Wily for monitoring, notification, root cause analysis and data reporting.
• Performed JVM heap size and EJB performance monitoring using Wily Introscope, and load testing using Mercury LoadRunner and JMeter, with thread and heap analysis using Samurai thread dumps.
• Used Subversion (SVN) to maintain present and historical source code versions and documentation.
• Used TDA and Heap Analyzer for detecting blocked and locked threads.
• Used HP OpenView for managing applications, network conditions and status across the platform.
• Implemented standard backup procedures for both application and Web Logic server.