8+ years of experience developing ETL pipelines using Informatica BDM and PowerCenter.
Connect and model complex distributed data sets to build repositories, such as data warehouses and data lakes, using appropriate technologies.
Manage data-related work across small to large data sets, whether structured, unstructured, or streaming: extraction, transformation, curation, modelling, building data pipelines, selecting the right tools, and writing SQL/Java/Scala code.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Create data tools for analytics and data science team members.
Work with data and analytics experts to strive for greater functionality in data systems.
Oversee and direct, in conjunction with the respective Systems Analysts and Programmers, the analysis of business requirements and the development of functional and program specifications, relational database design, programming, testing, and implementation.