| Client | Startup with the mission to enable companies to run the world’s most intelligent procurement – human and artificial. |
|---|---|
| Category | Data Science |
| Uplink fee | 10% for 12 months |
| Job type | Direct |
Your tasks:
- Develop an intelligent data platform for procurement in close collaboration with the Lead Architect Data Platform and our founding CTO
- Define and build a data architecture that scales across multiple large enterprise companies running on AWS
- Create a Fort Knox-like data security and monitoring setup with intrusion detection and continuous security testing
- Build scalable, serverless data ingestion pipelines for structured and unstructured data
- Combine data engineering with backend development to transform ingested data into our procurement data graph
Requirements:
- You love data, coding, and testing – from the top-level architecture design down to the last line of code
- You are an AWS services expert for data lakes and data engineering (e.g. ETL/ELT pipelines, Kinesis, S3, Glue, Databricks, Spark on AWS)
- You build serverless, event-driven services in Python, Scala, and/or JavaScript
- You know how to run an AWS enterprise stack – deep knowledge of security (authentication and authorisation architecture) and data protection setups
This job is already closed and applications are therefore no longer possible.