Bosch is at the forefront of innovation in the connected world, impacting millions of lives every day through its products in the mobility, industrial technology, consumer goods, and energy and smart building sectors.
We rely on data for every aspect of our product lifecycle and operations, and we collect a great deal of it every day. Our team enables a wide variety of Bosch product teams to take advantage of advances in AI through centralized services, from streaming data out of connected services to building and deploying state-of-the-art AI solutions.
You will be part of a global team to make this happen!
We are looking for a talented professional who is passionate about building fault-tolerant data services and analytics tools targeted towards cutting-edge applications.
Your work will be used by hundreds of Bosch engineers and will have global impact by improving the quality and value of Bosch products.
Job Requirements:
Troubleshoot, diagnose, and fix software problems in production.
Develop and improve software monitoring and alerting solutions.
Develop tools to automate deployments, maintenance, and operations.
Set up, manage, and maintain on-prem and cloud software infrastructure.
Set up and optimize existing continuous integration pipelines, and coordinate central monitoring and alerting.
Drive the release cycle of our internal products and platforms. Use automation to ensure high-quality code and accelerate the pace of development.
Collaborate with data scientists and engineers to deploy new machine learning and deep learning models into complex, mission-critical production systems.
Select the right tool(s) for the job and make them work in production.
Continuously research and stay current with emerging technologies in DevOps engineering.
Be available for on-call support on a rotational basis.
Be a team player with excellent communication skills and the ability to prioritize multiple tasks in a fast-paced environment.
Technical Skills Expected:
Experience with Agile, CI/CD frameworks, and the Azure cloud platform.
Strong programming skills in a variety of languages (e.g., Python, Bash, Java, Scala).
Experience administering DevOps tools (Docker, Ansible, Kubernetes).
Experience administering a big data stack is a plus.
Solid experience building CI/CD pipelines with Jenkins and Git.
Strong experience as a systems integrator on Linux (SUSE, Ubuntu) systems, including shell scripting.
Hands-on experience supporting code deployments into distributed systems.
Good understanding of technologies such as Hadoop, Spark, Kafka, and Impala.
Experience with log analysis tools (e.g., the ELK Stack), monitoring, and performance testing.
Bachelor's or Master's degree in computer science, engineering, or a related technical field.
6-8 years of relevant experience in DevOps or software development.