Design and build ETL jobs to support SVB’s enterprise data warehouse.
Write Extract-Transform-Load (ETL) jobs using standard tools such as Spark, Hadoop, and AWS Glue to calculate business metrics.
Partner with business teams to understand business requirements, assess the impact on existing systems, and design and implement new data provisioning pipelines for the Finance and External Reporting domains.
You will also have the opportunity to apply your skills in AWS Cloud and Big Data technologies as you design, implement, and build our enterprise data platform (EDP).
Design and develop data models for SQL and NoSQL database systems.
Monitor and troubleshoot operational and data issues in the data pipelines.
Drive architectural plans and implementation for future data storage, reporting, and analytic solutions
Bachelor's degree in Computer Science, Mathematics, Statistics, Finance, or a related technical field, or equivalent work experience.
7+ years of relevant work experience in analytics, data engineering, business intelligence, or a related field, and 5+ years of professional experience.
2+ years of experience implementing big data processing technologies: AWS / Azure / GCP, Hadoop, Apache Spark, and Python; an understanding of Redshift and Snowflake is a plus.
Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
Detailed knowledge of databases such as Oracle, DB2, and SQL Server; data warehouse concepts and technical architecture; infrastructure components; and ETL and reporting/analytics tools and environments.
Hands-on experience with major ETL tools such as Informatica IICS, SAP BODS, and/or cloud-based ETL tools.
Hands-on experience with scheduling tools such as Redwood, Control-M, or Tidal. A good understanding of and experience with reporting tools such as Tableau and BOXI is expected.
Hands-on experience with cloud technologies (AWS / Google Cloud / Azure), including data ingestion tools (both real-time and batch), CI/CD processes, cloud architecture, and big data implementation.
AWS certification and working knowledge of Glue, Lambda, S3, Athena, and Redshift are a plus.
Graduate degree in Computer Science, Mathematics, Statistics, Finance, or a related technical field.
Strong ability to effectively communicate with both business and technical teams
Demonstrated experience delivering actionable insights for a consumer business.
Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.).
Basic experience with cloud technologies.
Experience in the banking domain is a plus.