High-level JD
Help deliver GCP development projects, including design, development/coding, testing & deployment into production, e.g.:
Example 1: Build an event-driven pipeline: a message is published by an HSBC system > model prediction > results saved in BigQuery.
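The event-driven flow above can be sketched as a single message handler. This is a minimal, hypothetical illustration: `predict_risk` stands in for a real model call (e.g. a deployed endpoint), and the returned dict is the row a real pipeline would stream-insert into BigQuery; none of these names come from an actual HSBC system.

```python
import json

def predict_risk(features):
    # Hypothetical stand-in for a real model call (e.g. a hosted prediction endpoint).
    return 0.9 if features.get("missed_payments", 0) > 2 else 0.1

def handle_message(raw_message):
    """Turn one published message into a row ready for a BigQuery insert."""
    event = json.loads(raw_message)
    score = predict_risk(event["features"])
    # In a real pipeline this row would be written to BigQuery via a streaming insert.
    return {"customer_id": event["customer_id"], "risk_score": score}

row = handle_message('{"customer_id": "c-1", "features": {"missed_payments": 3}}')
print(row)  # {'customer_id': 'c-1', 'risk_score': 0.9}
```

In practice the handler would be triggered by a message subscription rather than called directly, but the parse > predict > persist shape is the same.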
Example 2: Build a batch process: a large file is generated in an HSBC Hadoop cluster > a batch job transfers the file to Google Cloud Storage > data is copied into BigQuery.
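The batch flow above can be sketched with two stages: a transfer step and a load step. This sketch is purely illustrative: a local directory stands in for the GCS bucket and a CSV parse stands in for the BigQuery load job; the real steps would use the cloud storage client (or `gsutil`) and a BigQuery load job.

```python
import csv
import shutil
import tempfile
from pathlib import Path

def transfer_to_gcs(local_file: Path, bucket_dir: Path) -> Path:
    # Stand-in for the GCS upload (gsutil cp / storage client); here a local copy.
    bucket_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy(local_file, bucket_dir))

def load_to_bigquery(staged_file: Path) -> list:
    # Stand-in for a BigQuery load job; here we just parse the staged CSV rows.
    with staged_file.open() as f:
        return list(csv.DictReader(f))

# Demo with a temporary "Hadoop export" file and a local "bucket" directory.
work = Path(tempfile.mkdtemp())
export = work / "export.csv"
export.write_text("id,amount\n1,100\n2,250\n")
rows = load_to_bigquery(transfer_to_gcs(export, work / "bucket"))
print(rows)  # [{'id': '1', 'amount': '100'}, {'id': '2', 'amount': '250'}]
```

The two-stage shape (stage to object storage, then load) matters because BigQuery load jobs read from GCS, not directly from an on-prem cluster.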
Example 3: Write a complex Dataflow/Beam job in Java or Python to merge customer & reference data > update the customer profile in Bigtable > re-calculate the customer risk score.
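The per-record logic of a merge-and-rescore job like the one above can be sketched in plain Python. Both functions, the field names, and the scoring rule are hypothetical; in Beam the merge would typically be expressed as a join (e.g. `CoGroupByKey` or a side input) and the rescore as a `ParDo`, with the result written to Bigtable.

```python
def merge_customer_with_reference(customer: dict, reference: dict) -> dict:
    # Hypothetical merge step: reference data keyed by segment enriches the
    # customer profile, mirroring what a Beam join would produce per element.
    return {**customer, **reference.get(customer["segment"], {})}

def recalculate_risk_score(profile: dict) -> dict:
    # Hypothetical scoring rule; a real job would apply the bank's actual model.
    base = profile.get("base_risk", 0.5)
    profile["risk_score"] = round(base * profile.get("segment_weight", 1.0), 2)
    return profile

reference = {"retail": {"segment_weight": 0.8, "base_risk": 0.4}}
customer = {"customer_id": "c-1", "segment": "retail"}
profile = recalculate_risk_score(merge_customer_with_reference(customer, reference))
print(profile["risk_score"])  # 0.32
```

Writing the element-level logic as plain functions first also makes a Beam job easier to unit-test before wiring it into a pipeline.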
Example 4: Build a secure analytics environment in GCP for Data Scientists (they want to use tools such as Python, R, Jupyter notebooks, ML libraries, and BigQuery).
Development role on a large cloud implementation/migration project, preferably GCP (optional).
Technical Skills - must-have
Some knowledge of cloud architecture & services (preferably GCP, but AWS is fine).
Several years of Java / Python experience.
Linux, Bash/shell scripting.
DevOps principles & tools (e.g. CI/CD, Bamboo/Jenkins, GitHub/Bitbucket, JIRA, Confluence).
Nice-to-have (any two of the following):
GCP or AWS certified.
Apache Beam and/or Google Dataflow.
Containers (Docker, Kubernetes).
Experience of streaming data using frameworks/tools (e.g. Spark).
Automation toolsets (Terraform, Ansible).
Databases (SQL, NoSQL).
Big Data (Hadoop ecosystem, Hortonworks, Spark).