K. Ganga Anusha: Snowflake Developer/Data Engineer

K. Ganga Anusha is a Snowflake Developer/Data Engineer with over 4 years of experience in building data warehouses and ETL pipelines using Snowflake and Python. She has expertise in SQL, cloud services, and various RDBMS, along with strong skills in SDLC and Agile methodologies. Currently, she works at Humana Health Care, focusing on data management and processing within the Snowflake environment.

K. Ganga Anusha
Snowflake Developer/Data Engineer
+918333965090
[email protected]

PROFESSIONAL SUMMARY:

• 4+ years of total experience as a Snowflake developer building Enterprise Data
Warehouses, Data Marts, and Operational Data Stores across major industry sectors using Snowflake.
• Experience in architecting, designing, and operationalizing large-scale data and
analytics solutions on the Snowflake Cloud Data Warehouse.
• Developed ETL pipelines in and out of the data warehouse using Python and Snowflake.
• Experience in writing SQL queries against Snowflake.
• Experience in working with data pipelines, AWS cloud and Snowflake cloud data warehouse.
• Strong Focus on Software Development Life Cycle (SDLC) with design, concept,
architecture, planning, modeling, coding, development, testing, implementation of Business
Intelligence solutions using Data Warehouse/Data Mart Design.
• Expertise in working with relational databases such as PostgreSQL, SQL Server
2008/2015/2017, DB2, and MySQL.
• Strong experience working with Business Analysts and Modelers to understand
requirements.
• Strong experience in Enterprise Data Warehouse environment with Slowly Changing
Dimensions (SCDs).
• Expertise in OLTP/OLAP system study and E-R modeling, and in building Dimensional
Models using Star schema and Snowflake schema techniques, including identifying facts and
dimensions.
• Good communication skills and a strong team player; able to work in groups
as well as independently with minimum supervision.
• Expert in Agile and Waterfall methodologies; experienced with Agile tools such as Rally and Jira.
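The Slowly Changing Dimension (SCD) Type 2 handling mentioned above can be sketched in plain Python. This is a minimal, illustrative version only: column names ('key', 'attr', 'valid_from', 'valid_to', 'is_current') are assumptions, and in a real warehouse the same logic would typically run as a MERGE statement rather than in application code.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today=None):
    """Apply an SCD Type 2 update to an in-memory dimension.

    dim_rows: list of dicts with keys 'key', 'attr',
        'valid_from', 'valid_to', 'is_current'.
    incoming: mapping of business key -> latest attribute value.
    Existing rows are mutated in place when expired.
    """
    today = today or date.today().isoformat()
    out = list(dim_rows)
    current = {r["key"]: r for r in out if r["is_current"]}
    for key, attr in incoming.items():
        row = current.get(key)
        if row is not None and row["attr"] == attr:
            continue  # unchanged: keep the current version as-is
        if row is not None:
            # expire the old version instead of overwriting it
            row["is_current"] = False
            row["valid_to"] = today
        # insert a new current version (new keys land here too)
        out.append({"key": key, "attr": attr, "valid_from": today,
                    "valid_to": None, "is_current": True})
    return out
```

A changed attribute therefore produces two rows per key: the expired historical version and the new current one, which is the defining property of SCD Type 2.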

TECHNICAL SKILLS:

ETL Tools           DBT, AWS Glue, AWS EMR
Cloud Services      Snowflake, AWS S3, EC2
RDBMS               PostgreSQL, SQL Server 2005/2008, MySQL
Scheduling Tools    Airflow
Operating Systems   Windows
Languages           Python, SQL, PL/SQL
PROFESSIONAL EXPERIENCE:

➢ Worked as a Senior Process Consultant (Snowflake Developer) at Sagility India
Private Limited from 28 April 2020 to 12 Nov 2024.

Humana Health Care, Bangalore, KA                         28 April 2020 – 12 Nov 2024

Role: Snowflake Data Engineer

Project Description:

Humana Health Care is among the world's foremost companies in the field of health
insurance. The project covers financial recovery processing of claims investigations,
with responsibility for delivering excellent results and achieving quality standards, and
for ensuring process goals are met in a timely manner through the efficient and effective
management of personnel and resources.

Responsibilities:

• Created database objects in Snowflake and exported all the SQL Server data to CSV
files using the SQL Server Management Studio export wizard.
• Uploaded CSV files from a local instance to an AWS S3 bucket and loaded them into Snowflake.
• Converted SQL Server Stored procedures and views to work with Snowflake.
• Responsible for designing, development and maintenance of Snowflake database objects
such as tables, views, stored procedures, and SQL scripts.
• Handled large and complex sets of XML, JSON, and CSV from various sources and
databases.
• Implemented Snowpipe for continuous data loads; staged files and loaded them into the
Snowflake database using the COPY command.
• Queried data in staging files before ingestion using external tables, loading
structured and semi-structured data.
• Created internal and external stages and transformed data during load.
• Cloned production data for code modifications and testing, and used Time Travel to
recover data.
• Set up resource monitors and multi-cluster virtual warehouses in
Snowflake.
• Developed Python scripts to extract, load, and transform data.
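The stage-and-COPY load path described above usually boils down to a COPY INTO statement run against a named stage. A small helper that builds such a statement is sketched below; the table, stage, and file-format names are purely illustrative, not taken from the project.

```python
def copy_into_sql(table, stage, file_name, file_format="csv_fmt"):
    """Build a Snowflake COPY INTO statement for one staged file.

    All object names (table, stage, file format) are placeholders;
    the statement would be executed through a Snowflake session.
    """
    return (
        f"COPY INTO {table}\n"
        f"  FROM @{stage}/{file_name}\n"
        f"  FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"  ON_ERROR = 'ABORT_STATEMENT';"
    )
```

Snowpipe continuous loading wraps essentially the same COPY statement in a CREATE PIPE definition, so the statement text is the common building block for both one-off and continuous loads.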
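The shape of such a Python extract-transform-load script can be sketched with the standard library alone. The claim data is invented for illustration, and sqlite3 stands in for the Snowflake connection, since the control flow (extract, transform, load via a DB-API cursor) is the same.

```python
import csv
import io
import sqlite3

# Invented sample data standing in for a SQL Server CSV export.
RAW = """claim_id,amount,status
101,250.00,OPEN
102,9000.50,CLOSED
103,13.25,OPEN
"""

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast types and keep only open claims for recovery processing."""
    return [(int(r["claim_id"]), float(r["amount"]))
            for r in rows if r["status"] == "OPEN"]

def load(conn, rows):
    """Insert the transformed rows through a DB-API connection."""
    conn.execute("CREATE TABLE IF NOT EXISTS open_claims "
                 "(claim_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO open_claims VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse session
load(conn, transform(extract(RAW)))
```

Against Snowflake itself, the same load step would typically use the connector's cursor or a staged-file COPY rather than row-by-row inserts, which matters for large volumes.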

Environment: Snowflake, MS SQL Server, AWS S3, Python, XML, JSON, CSV,
Snowpipe, Windows.

Education / Qualifications

Bachelor's Degree in Computer Science, 2017, Yogi Vemana University, Kadapa.
