Client | B2B Search Engine |
---|---|
Category | Backend |
Published | |
Uplink fee | 10% for 12 months |
Job type | Recruiter |
Our client, a Berlin-based startup building a B2B search engine, needs a Data Pipeline Engineer to support a data pipeline implemented as Python microservices connected via gRPC.
They are developing their microservices in-house (9 in total) but need help with a) defining an expandable database structure to hold metadata and b) stitching the whole pipeline together.
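One common way to keep such a metadata store expandable is to fix a small core schema per company and push source-specific fields into a free-form sub-document. This is a hedged sketch only; the client's actual MongoDB schema and field names are not specified in the listing, so everything below (`make_company_doc`, `attach_metadata`, the `metadata.<source>` nesting) is hypothetical:

```python
from datetime import datetime, timezone

def make_company_doc(company_id: str, name: str) -> dict:
    """Minimal company profile with room for arbitrary metadata.

    Fixed fields stay at the top level; anything source-specific
    lives under ``metadata``, keyed by source, so adding a new
    extractor never requires a schema migration.
    """
    return {
        "_id": company_id,  # natural key, e.g. a registry number
        "name": name,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "metadata": {},     # free-form sub-document, one key per source
    }

def attach_metadata(doc: dict, source: str, fields: dict) -> dict:
    """Merge metadata from one source without touching other sources."""
    doc["metadata"].setdefault(source, {}).update(fields)
    return doc

doc = make_company_doc("de-hrb-12345", "Example GmbH")
attach_metadata(doc, "news_crawler", {"mentions": 7})
attach_metadata(doc, "website", {"employees": "11-50"})
```

In MongoDB this maps onto one document per company, and the per-source nesting keeps writes from different pipeline services independent of each other.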
The product is a market intelligence tool: its main aim is to collect unstructured public data about companies and connect it to their profiles.
Some more details include:
- The pipeline should be implemented in Python, ideally the services as well. Optionally, services could be implemented in TypeScript.
- Services should communicate via gRPC and be deployed as Docker containers.
- The pipeline should use Kafka/Faust for messaging.
- They use Poetry for managing packages and dependencies and protodep for proto file generation.
- They use an internal Python package for common functions, including the gRPC connection.
- The current infrastructure uses MongoDB and Azure Blob Storage for storage and Redis for caching.
- Deployment and CI are managed via GitHub Actions, targeting Kubernetes.
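To illustrate what "stitching the pipeline together" could look like, here is a hedged sketch, not the client's code: in their setup each stage would be a gRPC microservice consuming from a Kafka topic via a Faust agent, while below plain functions stand in for the services and a dict stands in for MongoDB. All names (`CompanyEvent`, `extract`, `enrich`, `load`) are made up for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class CompanyEvent:
    """Message passed between stages (stands in for a protobuf message)."""
    company_id: str
    payload: dict = field(default_factory=dict)

def extract(event: CompanyEvent) -> CompanyEvent:
    # Stage 1: pull raw public data about the company (stubbed here).
    event.payload["raw"] = f"raw text about {event.company_id}"
    return event

def enrich(event: CompanyEvent) -> CompanyEvent:
    # Stage 2: derive structured fields from the raw data.
    event.payload["tokens"] = event.payload["raw"].split()
    return event

def load(event: CompanyEvent, store: dict) -> None:
    # Stage 3: attach the result to the company profile (MongoDB stand-in).
    store[event.company_id] = event.payload

PIPELINE = (extract, enrich)

def run_pipeline(company_id: str, store: dict) -> None:
    event = CompanyEvent(company_id)
    for stage in PIPELINE:  # with Kafka/Faust, each stage is a topic + agent
        event = stage(event)
    load(event, store)

store: dict = {}
run_pipeline("de-hrb-12345", store)
```

In the real system the `for` loop disappears: each stage publishes its output event to the next Kafka topic, and Faust agents in the respective services pick it up, which is what makes the pipeline expandable service by service.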
Must have skills |
---|---|
Start date |
Length |
Engagement |
Remote |
Language requirements |
Budget |
Onsite locations |
This job is already closed and applications are therefore no longer possible.