Description:
If you desire to be part of something special, to be part of a winning team, to be part of a fun team - winning is fun. We are looking for a Machine Learning QA Build Release Engineer based in Pune, India.
At Eaton, making our work exciting, engaging, and meaningful; ensuring safety, health, and wellness; and being a model of inclusion & diversity are already embedded in who we are - it’s in our values, part of our vision, and our clearly defined aspirational goals.
This exciting role offers the opportunity to:
ESSENTIAL FUNCTIONS:
Ensure quality issues and defects are appropriately identified, documented, tracked and resolved.
Work with ML Ops to develop pipelines that measure and report QA metrics (code coverage, security scores, etc.) across all application and Infrastructure-as-Code pipelines.
Work with the ML Ops team and contribute toward architecting and improving all stages of the CI/CD pipeline.
Augment tests within CI/CD pipelines, ensuring that software is built, tested, and deployed into our production environment on an iterative, continuous-improvement basis.
Maintain and track code quality and code security scores (vulnerability scanning tests).
Conduct internal audits of all open-source code, using application security testing (AST) tools to ensure it passes all licensing and security vulnerability requirements.
Own the testing system to ensure all data permutations and scenarios are considered when testing code, and work with data analysts to preemptively grow the test base.
Demonstrate and document solutions using flowcharts, diagrams, code comments, and code snippets.
Develop and execute Agile work plans for iterative and incremental project delivery.
Explore and recommend new tools and processes that can be leveraged to support the ML pipelines from a quality and security perspective.
Integrate multiple sources of data through efficient data connectors and other workflows.
Collaborate broadly across multiple functions (data science, engineering, product management, IT, etc.) to make key data readily available and easily consumable.
Requirements:
Requires a minimum of a bachelor’s degree in computer science or software engineering.
2+ years of progressive experience in delivering unit test solutions in a Big Data platform production environment.
Demonstrated knowledge of and ability to develop unit tests for Scala applications (using SBT) and/or Python applications.
Solid understanding of QA testing for distributed Spark applications and multicore applications.
Experience with CI/CD systems such as Azure DevOps or Jenkins.
Broad server management expertise covering build, deployment, code coverage, and unit testing frameworks.
Excellent communication skills (verbal, presentation, documentation), including experience working with geographically dispersed teams to produce solutions that satisfy functional and non-functional requirements.
Knowledge of Big Data tools such as Redis, Cassandra and Spark.
Understanding of virtualization and container technologies, e.g., Docker, Docker Swarm, Kubernetes.
Experience with OS installation, log management, remote access, and shell scripting.
Experience with Software Composition Analysis (SCA) tools like Blackduck.
Experience building and deploying cloud-based solutions on Azure, AWS or similar.
Experience authoring and maintaining APIs.
Experience building end-to-end solutions utilizing back-end development and/or database-driven functionality.
Excellent verbal and written communication skills including the ability to effectively explain technical concepts.
Experience delivering software through Agile development methodologies and concepts.
Keeps abreast of emerging software development/engineering tools, trends, and methodologies.
Good judgment, time management, and decision-making skills.