As a Systems Engineer, your main responsibilities will be technical design, planning, implementation, performance tuning, and recovery procedures for critical enterprise systems.
You will also serve as a technical authority in system administration for complex SaaS, on-premises, and cloud-based systems.
Your responsibilities will also extend to designing the practices, tools, and processes that enable high-speed delivery of evolving products.
This position combines proficiency in operating systems, big data, databases, and SaaS systems with the ability to communicate clearly and collaborate with Technical Product Managers to turn broad requirements into deliverable work items.
The individual will also work with cross-functional teams to address system, interface, data, and vendor issues.
Design, configure and document cloud-based infrastructures using AWS Virtual Private Cloud.
Create, manage and configure EC2 instances in AWS.
Configure, secure and monitor hosted production SaaS environments from 3rd party partners.
Define, document and manage network configuration within AWS VPCs and between VPCs and data center networks, including firewall, DNS and ACL configurations.
Lead the design of, and review developer work on, DevOps tools and practices.
Automate tasks using Python, PowerShell, and Bash scripts, as well as enterprise scheduling systems such as Control-M or Stonebranch.
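As an illustration of the kind of task automation this role involves, a minimal Python sketch (hypothetical function name and inputs, not a production script) might select log files that have aged past a retention window before archiving them:

```python
def files_to_archive(file_ages, retention_days=30):
    """Return the names of files older than the retention window.

    file_ages: dict mapping filename -> age in days (illustrative input;
    a real job would derive ages from os.stat() modification times).
    retention_days: keep files at or under this age.
    """
    return sorted(name for name, age in file_ages.items() if age > retention_days)


# Example: only the 40-day-old log exceeds a 30-day retention window.
print(files_to_archive({"app.log": 40, "web.log": 10}, retention_days=30))
```

In practice a scheduler such as Control-M or Stonebranch (or cron) would invoke a script like this on a fixed cadence.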
Support Airflow DAGs in the data lake that use the Spark framework and other big data technologies.
Administer databases, including backups, monitoring, and data rotation.
Work with RDBMS and NoSQL systems, and lead stateful data migrations between different data platforms.
Bachelor's or Master's degree in information science, computer science, or business, or equivalent work experience.
3-5 years of experience with Microsoft Windows operating systems and SQL Server database platforms.
3-5 years of experience with Amazon Web Services, most importantly VPC, S3, EC2, and EMR. Experience setting up new VPCs and integrating them with existing networks is highly desirable.
Experience maintaining data lake / big data systems built on the Spark framework and Hadoop technologies.
Experience with Active Directory and LDAP setup, maintenance, and policies.
Workday certification preferred but not required.
Strong verbal and written communication skills.
Understanding of Agile project methodologies, including Scrum and Kanban, required.