Develop & manage data pipelines, ETL components & jobs for data migration, data warehousing & reporting needs using services on the AWS platform
The daily routine will include:
Build an understanding of the existing data infrastructure & the various source datasets through discussions with IT & business stakeholders.
Recommend a suitable data warehousing & ETL architecture based on dataset size, velocity, reporting & other business needs.
Develop reliable, fail-safe ETL jobs for both one-time & incremental data migration. You will have the freedom to choose components suited to the type, size & velocity of the data.
Deploy & manage ETL jobs and the data warehousing platform.
Generate & automate ad-hoc reports as needed.
Skills & Expertise:
3+ years of overall experience in ETL development & data warehousing roles.
1+ year of experience with cloud platforms like AWS.
Knowledge of business intelligence tools like Tableau, QlikView & Power BI is a plus.
You should be able to work independently with minimal supervision and be an authority in your field of expertise.
Communicate clearly with different stakeholders, both business & technical.
Programming Skills: SQL (required), plus either Python or Java
Platform: Amazon EMR, AWS Glue, Amazon Redshift, Amazon DynamoDB, AWS Data Pipeline
Engineering Background: Preferably a B.Tech or Master's in Computer Science, but not a showstopper for the right candidate.