
Mithilesh

Senior DevOps Engineer

Summary
  • 8 years of experience in information technology, executing DevOps strategy across Linux and Windows server environments and adopting AWS-based cloud strategies.
  • Good understanding of Software Development Life Cycle (SDLC) like Agile and Waterfall Methodologies.
  • Experience in Java and Python.
  • Experience with essential DevOps tools such as Ansible, Docker, Kubernetes, Terraform, Git, Jenkins, Maven and AWS.
  • Experience in designing, installing and implementing Ansible configuration management, and in writing Ansible playbooks to deploy applications.
  • Expertise in AWS resources such as EC2, S3, EBS, VPC, ELB, AMI, SNS, RDS, IAM, Route 53, Auto Scaling, CloudFormation, CloudWatch and Security Groups.
  • Experience in optimizing EBS volumes and EC2 instances; created multiple VPCs and set up CloudWatch alarms and notifications for EC2 instances.
  • Extensive experience in setting up CI/CD pipelines using tools such as Jenkins, TeamCity, Maven, Nexus, Slack and VSTS.
  • Expertise in automating builds and deployment process using Bash, Python and Shell scripts.
  • Expertise in Jenkins administration: plugin management, securing and scaling Jenkins, integrating code analysis, resolving performance issues, and wiring test phases into complete CI/CD pipelines.
  • Hands-on experience with SCM tools such as Git and Bitbucket for branching, tagging and maintaining versions across UNIX/Linux and Windows environments.
  • Hands on with Kubernetes cluster setup, deployment automation and service setup.
  • Experienced in troubleshooting, configuring and deploying enterprise web application servers such as Apache Tomcat and Nginx.
  • Experience monitoring multi-platform servers with Nagios, including configuring checks and alerts.
  • A highly motivated, energetic team player with strong analytical skills, good leadership qualities, and excellent oral and written communication and interpersonal skills.
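The CloudWatch alarm work mentioned above amounts to a small amount of boto3 glue; a minimal sketch, where the alarm name, the 80% threshold and the SNS topic are illustrative assumptions rather than details from the actual projects:

```python
def cpu_alarm_params(instance_id, sns_topic_arn, threshold=80.0):
    """Build kwargs for CloudWatch put_metric_alarm: notify via SNS when
    average EC2 CPU stays above `threshold`% for two 5-minute periods.
    Alarm name and threshold are illustrative, not from the real setup."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,
        "EvaluationPeriods": 2,
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [sns_topic_arn],
    }

# With AWS credentials configured:
# boto3.client("cloudwatch").put_metric_alarm(**cpu_alarm_params(instance_id, topic_arn))
```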
Technical Skills

AWS Services: Lambda, Glue, Athena, Aurora, Redshift, DMS, Step Functions, SFTP, Databricks, DataSync, Route 53, CloudFront, API Gateway, QuickSight, PySpark, WAF
DevOps Tools: Ansible, Terraform, Docker, Kubernetes, SonarQube, Jenkins
Scripting/Languages: Python, Shell, Bash, Powershell, Java
Cloud: AWS (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, Lambda, CloudWatch, CloudFormation)
Tools: Nagios, Jira
Version Controls: Git, BitBucket, SVN
Servers: Apache Tomcat, WebSphere, Nginx
Operating Systems: Windows, Linux, Unix

Work Experience

Role: DevOps Engineer/Data Engineer
Duration: May 2021 – Present
Project: BHP

Responsibilities:

  • Hands-on experience with GitLab for branching, tagging and maintaining versions across UNIX/Linux and Windows environments.
  • Automated infrastructure provisioning on AWS using Terraform and Ansible.
  • Hands-on experience with GitLab pipelines.
  • Worked on Java and Python applications.
  • Expertise in automating builds and deployment process using Bash, Python and Shell scripts.
  • Built out server automation with Continuous Integration/Continuous Deployment tools such as Jenkins and Maven for deployment and build management.
  • Deployed Build artifacts using Ansible Playbooks into Apache instances which were integrated using Python and Shell scripts.
  • Maintained configuration files for each application for build purpose and installed on different environments.
  • Involved in Scrum meetings, product backlog and other scrum activities in collaboration with the team.
  • Worked as an ETL developer designing solutions.
  • Worked with AWS data services such as Athena, Redshift, S3, Glue, EMR and Step Functions.
  • Created end-to-end data solutions on AWS with Databricks.
  • Implemented data analytics governance using Databricks Unity Catalog.
  • Built application backends using Lambda functions with API Gateway and DynamoDB.
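The Lambda/API Gateway/DynamoDB setup described in the last bullet can be sketched roughly as follows; the table name, key schema and payload shape are illustrative assumptions, and the injectable `table` parameter exists purely so the handler can be exercised without AWS:

```python
import json

def handler(event, context, table=None):
    """Sketch of a Lambda handler behind API Gateway that writes an item
    to DynamoDB. Table name ("app-items") and payload shape are
    illustrative assumptions, not details of the actual project."""
    if table is None:
        import boto3  # available in the AWS Lambda runtime
        table = boto3.resource("dynamodb").Table("app-items")  # hypothetical table
    body = json.loads(event.get("body") or "{}")
    # Persist the request, keyed on the caller-supplied id.
    table.put_item(Item={"id": body["id"], "data": body})
    return {"statusCode": 200, "body": json.dumps({"saved": body["id"]})}
```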

 

Role: DevOps Engineer
Duration: Sept 2017 – May 2021
Project: EDF Energy

Environment: Jenkins, Maven, AWS, Ansible, Kubernetes, Docker, Git, CI/CD, Terraform, Java, Python, Bash, PowerShell, Linux, Windows.

Responsibilities:

  • Involved in user interactions, requirement analysis and design for the interfaces.
  • Involved in DevOps migration/automation processes for build and deploy systems.
  • Implemented DevOps pipelines with automated builds, continuous integration and continuous delivery using Jenkins.
  • Maintained server configuration management (CM process), system orchestration and application deployment using Ansible.
  • Created Dockerfiles and automated Docker image creation using Jenkins and Docker.
  • Automated infrastructure provisioning on AWS using Terraform and Ansible.
  • Deployed Build artifacts using Ansible Playbooks into Apache instances which were integrated using Python and Shell scripts.
  • Architected, planned, developed and maintained infrastructure as code with Terraform, delivered through CI/CD deployments.
  • Set up load balancers in front of Auto Scaling groups in AWS to create a dynamically scalable production environment able to handle large swings in load.
  • Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and objects within each bucket.
  • Built out server automation with Continuous Integration/Continuous Deployment tools such as Jenkins and Maven for deployment and build management.
  • Created scripts for system administration and AWS using Bash, Python and PowerShell.
  • Worked on integrating Git into the Continuous Integration (CI) environment along with Jenkins.
  • Used Git as the source code repository and managed Git repositories for branching, merging and tagging.
  • Managed local deployments in Kubernetes, creating local cluster, deploying application containers and services.
  • Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test and deploy.
  • Designed and implemented fully automated server build, management, monitoring and deployment solutions spanning multiple platforms, tools and technologies.
  • Worked on firewall and WAF configuration.
  • Improved the accessibility of security through automation, continuous integration pipelines, and other means.
  • Worked as ETL developer in designing solutions.
  • Maintained all the Linux environments for deployments and Implemented Configuration Management, Change Management policies and procedures.
  • Maintained configuration files for each application for build purpose and installed on different environments.
  • Designed and developed efficient pipelines to ingest, store and process data from multiple sources.
  • Implemented advanced data quality and governance processes to reduce data errors.
  • Worked with Structured Streaming, Delta Lake and MLflow.
  • Involved in Scrum meetings, product backlog and other scrum activities in collaboration with the team.
  • Set up and built AWS infrastructure resources such as VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling and RDS using CloudFormation JSON templates.
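A CloudFormation JSON template of the kind referenced in the last bullet can be assembled in Python and dumped to JSON; a minimal sketch of a VPC plus a security group, where the resource names and CIDR ranges are illustrative, not taken from the actual project:

```python
import json

def vpc_template():
    """Sketch of a CloudFormation JSON template: a VPC and a security
    group allowing inbound HTTPS. Names and CIDRs are illustrative."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppVpc": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": "10.0.0.0/16"},
            },
            "WebSg": {
                "Type": "AWS::EC2::SecurityGroup",
                "Properties": {
                    "GroupDescription": "Allow HTTPS in",
                    "VpcId": {"Ref": "AppVpc"},  # reference the VPC above
                    "SecurityGroupIngress": [
                        {"IpProtocol": "tcp", "FromPort": 443,
                         "ToPort": 443, "CidrIp": "0.0.0.0/0"}
                    ],
                },
            },
        },
    }

# Emit the template for `aws cloudformation deploy --template-file ...`
print(json.dumps(vpc_template(), indent=2))
```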