KitRUM: Scribd - Middle Big Data Engineer
KitRUM is a one-stop custom software development company headquartered in sunny Florida with development centers in Ukraine and Poland.
Basic requirements (education, skills):
- 3+ years of experience in data engineering creating or managing end-to-end data pipelines on large complex datasets;
- Experience developing scalable software with big data technologies (e.g. Hadoop, Spark, Hive, Flink, Samza, Storm, Elasticsearch, Druid, Cassandra, etc.);
- Expertise in Scala, Java, or Python;
- Fluency with at least one dialect of SQL;
- Level of English: Upper-Intermediate;
- Experience with streaming platforms, typically built around Kafka;
- Strong grasp of AWS data platform services and their strengths/weaknesses;
- Strong experience using Jira, Slack, JetBrains IDEs, Git, GitLab, GitHub, Docker, Jenkins, Terraform;
- Experience using Databricks.
What you will be doing (functional duties):
- Manage data quality and integrity;
- Assist with building tools and technology to ensure that downstream customers can trust the data they're consuming;
- Work cross-functionally with the Data Science and Content Engineering teams to troubleshoot, process, or optimize business-critical pipelines;
- Work with Core Platform to implement better processing jobs for scaling the consumption of streaming data sets.
What we offer (social package, benefits, bonuses):
- Competitive compensation matched to your technical skills;
- Long-term projects (12+ months) with great customers;
- 5-day working week, 8-hour working day, flexible schedule;
- Democratic management style & friendly environment;
- WFH mode;
- Annual paid vacation: 20 business days, plus unpaid vacation;
- Paid sick leave: 6 business days per year;
- Ukrainian official holidays;
- Corporate Perks (external training, English courses, corporate events/team buildings);
- Cozy office in the center of the city;
- Coffee, cookies and other goodies;
- Professional and personal growth.