
MANIDEEP PATNAM
DATA ENGINEER
Hyderabad | +91 9381586906 | [email protected] | manideep-patnam

SUMMARY
Detail-oriented GCP Data Engineer with 2 years of experience in cloud migrations and data pipeline design. Adept at
building ETL/ELT pipelines and automation tooling, with a focus on improving performance and cost efficiency. Proficient in migrating
workflows, constructing robust data pipelines, and converting SQL stored procedures to modern data processing frameworks.
Strong problem solver who excels in team-based environments.

TECHNICAL SKILLS
Python: PySpark, Pandas, FastAPI
SQL/NoSQL: MySQL, Oracle, MongoDB
Google Cloud: Cloud Storage, BigQuery, Cloud Composer, Dataflow, Dataproc, Pub/Sub, Cloud Functions
Others: Git, Bash

WORK EXPERIENCE
Techolution India, Data Engineer • 2022 - Present
As a GCP Data Engineer, I have led multiple cloud migration projects. For Pearson, I developed DAGs in Google Cloud Composer to replicate
AutoSys workflows, extended Apache Airflow with custom functionality, and improved monitoring and alerting. For Globus Medical, I used
DataFusion and Datastream for complex data transformations and real-time replication into BigQuery. Internally, I built a Flask web
application for resource management, designed a scalable MongoDB schema, and deployed the service on Google Cloud Run. Currently, I am
re-engineering SQL stored procedures as Azure Synapse pipeline jobs for Harper Collins, ensuring integration with other Azure services and
optimizing performance.
AutoSys to Cloud Composer Migration (Pearson) Dec 2021 - Mar 2023
Developed Directed Acyclic Graphs (DAGs) in Google Cloud Composer to replicate the workflow of AutoSys.
Implemented custom functionalities in Apache Airflow to enhance workflow management.
Engineered a custom agent at the server level to facilitate status communication with Google Cloud Composer.
Achieved performance parity and introduced additional alerting mechanisms for improved monitoring and responsiveness.
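
A minimal sketch of the kind of Composer DAG such a migration involves is shown below, assuming a two-step AutoSys job chain; the DAG id, task names, scripts, schedule, and alert recipient are illustrative placeholders, not the actual Pearson configuration.

# Minimal Cloud Composer (Airflow) DAG sketch replicating an AutoSys-style
# job chain; names, schedule, and alert address are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "email": ["dataops@example.com"],   # placeholder alert recipient
    "email_on_failure": True,           # extra alerting on top of AutoSys parity
}

with DAG(
    dag_id="autosys_box_daily_load",     # mirrors an AutoSys "box" job
    start_date=datetime(2022, 1, 1),
    schedule_interval="0 2 * * *",       # equivalent of AutoSys start_times
    catchup=False,
    default_args=default_args,
) as dag:
    extract = BashOperator(
        task_id="extract_feed",
        bash_command="python /opt/jobs/extract_feed.py",   # hypothetical script
    )
    load = BashOperator(
        task_id="load_to_bigquery",
        bash_command="python /opt/jobs/load_to_bq.py",     # hypothetical script
    )

    # An AutoSys dependency (condition: success(extract_feed)) becomes a
    # simple upstream/downstream edge in Airflow.
    extract >> load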

Oracle to BigQuery Migration (Globus Medical) Mar 2023 - Nov 2023

Utilized Google Cloud DataFusion to orchestrate complex data transformations and integration workflows.
Leveraged Google Cloud Datastream for real-time data replication and change data capture (CDC) from various sources to BigQuery.
Ensured seamless migration of business logic and data processing workflows to cloud-native BigQuery jobs.
Implemented performance tuning techniques to enhance the efficiency of data pipelines.
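
As an illustration of the BigQuery side of this kind of CDC flow, the sketch below applies Datastream-staged change records to a target table with a MERGE via the Python client; the dataset, table, and column names (including the change-type column) are assumptions, not the actual Globus Medical schema.

# Sketch of applying Datastream-staged change records to a BigQuery target
# table with a MERGE; all dataset/table/column names are assumptions.
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE `analytics.orders` AS tgt
USING `staging.orders_cdc` AS src
ON tgt.order_id = src.order_id
WHEN MATCHED AND src.change_type = 'DELETE' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET tgt.status = src.status, tgt.updated_at = src.updated_at
WHEN NOT MATCHED AND src.change_type != 'DELETE' THEN
  INSERT (order_id, status, updated_at)
  VALUES (src.order_id, src.status, src.updated_at)
"""

# Run the merge as a standard BigQuery job and wait for it to finish.
client.query(merge_sql).result()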

HVPD Resource Alignment (Internal) Nov 2023 - Feb 2024

Built a web application using Flask for managing internal resources, ensuring efficient resource allocation and tracking.
Designed and implemented a MongoDB schema to store and manage data, ensuring scalability and performance.
Containerized the Flask application using Docker and deployed it on Google Cloud Run for scalable and managed serverless execution.
Implemented caching mechanisms and optimized database queries to improve response times and application performance.
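
A minimal sketch of a Flask + MongoDB resource API of this kind, runnable on Cloud Run, is shown below; the environment variable, database, collection, and field names are hypothetical rather than the actual HVPD schema.

# Minimal Flask + MongoDB sketch of a resource-tracking API; env var,
# database, collection, and field names are hypothetical.
import os

from flask import Flask, jsonify, request
from pymongo import MongoClient, ASCENDING

app = Flask(__name__)
client = MongoClient(os.environ.get("MONGO_URI", "mongodb://localhost:27017"))
resources = client["hvpd"]["resources"]

# An index on the lookup field keeps allocation queries fast as data grows.
resources.create_index([("employee_id", ASCENDING)], unique=True)

@app.route("/resources", methods=["POST"])
def add_resource():
    # Store the posted resource document as-is.
    resources.insert_one(request.get_json())
    return jsonify({"status": "created"}), 201

@app.route("/resources/<employee_id>", methods=["GET"])
def get_resource(employee_id):
    doc = resources.find_one({"employee_id": employee_id}, {"_id": 0})
    return (jsonify(doc), 200) if doc else (jsonify({"error": "not found"}), 404)

if __name__ == "__main__":
    # Cloud Run injects PORT; default to 8080 for local runs.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))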

SQL Jobs to Azure Synapse Jobs (Harper Collins) Feb 2024 - Present
Re-engineered and optimized stored procedures as Azure Synapse pipeline jobs, ensuring compatibility and performance in the cloud environment.
Ensured seamless integration with other Azure services such as Azure Data Lake Storage, Azure SQL Database, and Azure Blob Storage.
Applied best practices for performance tuning, including partitioning, indexing, and query optimization, to enhance pipeline efficiency.
Implemented data validation and error-handling mechanisms to ensure data integrity during the migration process.
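
As one example of such a validation step, the sketch below compares row counts between a source SQL Server table and its Synapse target over ODBC; the connection strings and table name are placeholders, not the actual Harper Collins environment.

# Post-migration validation sketch: compare row counts between the source
# SQL table and the Synapse target; connection strings and table names
# are placeholders.
import pyodbc

SOURCE_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=source-host;DATABASE=sales;UID=user;PWD=pass"
TARGET_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=synapse-host;DATABASE=dw;UID=user;PWD=pass"

def row_count(conn_str: str, table: str) -> int:
    """Return COUNT(*) for a table over an ODBC connection."""
    conn = pyodbc.connect(conn_str)
    try:
        return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchval()
    finally:
        conn.close()

src_rows = row_count(SOURCE_CONN, "dbo.orders")
tgt_rows = row_count(TARGET_CONN, "dbo.orders")

if src_rows != tgt_rows:
    # Surface the mismatch so the pipeline run can be flagged for review.
    raise ValueError(f"Row count mismatch: source={src_rows}, target={tgt_rows}")
print(f"Validation passed: {src_rows} rows in both source and target")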

EDUCATION
Bachelor's in Electronics and Communication Engineering
IIIT Guwahati • 2018–2022

CERTIFICATIONS
GCP - Professional Data Engineer
Jun 2023 - Jun 2025 | Link
