Full-time employment, fully remote or on-site in Central Jakarta (Menteng), your choice. You will be responsible for developing and maintaining a cloud-based data processing pipeline capable of ingesting petabytes of data.
Responsibilities
Develop and maintain a cloud-based data processing pipeline capable of ingesting petabytes of data
Design a data analytics engine that facilitates the configuration and monitoring of time-series technical and business metrics to surface deep insights
Collaborate with the professional services team to analyze and fulfill customer requests and customizations
Requirements
Bachelor's degree (or above) in Computer Science, Statistics, Mathematics, or a similar quantitative field
5+ years of enterprise software development experience
Solid computer science fundamentals in data structures and algorithms
Excellent object-oriented design and programming skills in Python, Go, Scala, or Java
Proficiency with a few well-known ETL stacks (e.g., DataFlow, MySQL, Spark, Kafka); a representative pipeline sketch follows this list
Experience with data services in the public cloud (AWS, GCP, Azure)
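For orientation only, here is a minimal sketch of the kind of streaming ETL job the stack above implies, assuming Spark Structured Streaming with a Kafka source and Parquet output to cloud object storage. The broker address, topic, schema, and bucket names are illustrative placeholders, not part of the actual product.

```python
# Minimal streaming ETL sketch: Kafka -> Spark Structured Streaming -> Parquet on object storage.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("metrics-ingest").getOrCreate()

# Assumed shape of each Kafka message value (JSON-encoded metric sample).
schema = StructType([
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("ts", TimestampType()),
])

# Read the raw event stream from Kafka (broker and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "metrics")
    .load()
)

# Parse the JSON payload and keep only the typed metric columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("m"))
       .select("m.*")
)

# Land partitioned Parquet in cloud object storage (bucket path is a placeholder).
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/metrics/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/metrics/")
    .partitionBy("metric")
    .start()
)
query.awaitTermination()
```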
Deliverables
A complete implementation of the product per the requirements and architecture design
Sufficiently thorough documentation of the key building blocks and deployment aspects
A fully agreed-upon, comprehensive test plan covering all major and minor product features