As a Senior Big Data Developer, you will develop and maintain our team’s product. This includes developing complex stream pipelines for data analysis and extraction, developing and maintaining microservices, keeping accurate and complete project documentation, performing quality testing and data assurance, and working with project managers and team members to satisfy product needs. The person in this role must be adaptable, able to pick up whatever new skills and expertise the project at hand requires.
Core Skill Preferences:
Java/Scala, Kafka Streams, AWS Cloud, Docker, Kubernetes, RESTful APIs
Responsibilities:
Gather product and technical requirements
Design and implement RESTful APIs to enable real-time data consumption
Design and implement distributed stream-processing pipelines using frameworks such as Kafka Streams, Kinesis, or Apache Spark
Requirements:
Minimum 5 years of experience with Java development
Experience working with Kafka Streams, Akka, Flink, or Spark Streaming
Experience working on AWS Cloud
Experience designing RESTful API-based web services
Experience with NoSQL databases, such as MongoDB
BSc in Computer Science or equivalent practical experience
Experience with Docker and Kubernetes preferred
Experience working with reactive programming
Experience implementing machine learning
Knowledge of build tools such as SBT or Maven and source control with Git
Experience working with agile software development methodologies such as Scrum