The Data Engineer is responsible for aggregating data from various systems into a central data lake and warehouse to be used for statistical analysis, reporting, and dashboards.
This position will utilize modern cloud data warehouse technology and work with both unstructured and structured data. This is not a traditional data warehouse position and will not involve traditional schema-based relational database design.
The successful candidate will have an understanding of cloud data warehouse technology, as well as how to organize and store data for use with advanced analytics.
Maintain, support, and enhance the business intelligence data backend, including data warehouses and data lakes.
Perform needed assessments.
Implement data transformations and data structures for data warehouse and lake / repository.
Manage cloud and / or on-premises solutions for data transfer and storage.
Establish data structures for all enterprise data stored in business intelligence systems.
Be the subject matter expert for the Confluent Kafka platform and lead the Kafka implementation in critical initiatives.
Provide guidance to junior team members on best practices and implementation standards.
Manage and guide the enterprise on the adoption and implementation of Kafka as a critical integration platform.
Collaborate and work with data analysts in various functions to ensure that data meets their reporting and analysis needs.
Establish interfaces between the data warehouse and reporting tools, such as Power BI.
Assist data analysts with connecting reporting and analytics software to data warehouses, lakes, and other data sources.
Manage access and permissions to data.
Provide technical guidance for design and implementation of data governance systems and policy.
Work with the data governance team to manage an enterprise-wide data governance framework, focused on improving data quality and protecting sensitive data through modifications to organizational behavior, policies and standards, principles, governance metrics, processes, related tools, and data architecture.
Monitor data quality, identify data quality issues, oversee remediation plans and the implementation of data controls, and manage data quality remediation strategies.
Define data quality strategy and participate in a data quality working group.
Oversee and ensure that new systems implemented at the enterprise level follow data quality guidelines.
Keep abreast of new business intelligence technologies and make periodic recommendations for overall improvements.
Bachelor’s degree plus 1-3 years of experience managing data warehouse and / or business intelligence systems. An advanced degree or certifications in a related field are a plus.
Knowledge, Skills & Abilities :
Demonstrated experience with setting up data structures (tables and views) for use with modern analytics software.
Expertise in leading and developing Kafka initiatives. Prior experience publishing and consuming Kafka topics, and building resiliency and monitoring for real-time data streams.
Knowledge of KSQL and Kafka Streams is a significant plus.
Expertise with Snowflake Data Warehouse, Amazon Web Services, SQL-based database systems, and / or other enterprise data warehouse solutions.
Experience working with integration tools such as APIs, Web Services, JDBC / ODBC connectors, and other integration technologies.
Solid experience working with programming languages used in ETL and / or ELT environments, such as SQL and Python.