Responsibilities:
Build, maintain, and scale high-performance, fault-tolerant distributed Big Data systems.
Design architectural solutions for complex data problems driven by large scale and rapid growth.
Improve the reliability and efficiency of data processing.
Code using Python.
Required Skills:
BS or MS in Computer Science or a related field, or equivalent background from a technological military unit.
5+ years of experience building large-scale applications in a SaaS environment.
Experience with one of the following big data frameworks: Storm / Spark / Flink / Samza.
Experience with one of the following programming languages: Python / Java / C# (.NET) / Ruby / C++.
Knowledge of Unix/Linux systems and shell scripting.
Sharp troubleshooting skills and can-do approach.
Willingness to take initiative and contribute beyond your own responsibilities.
Ability to work independently and as part of a team.
Advantages:
Experience with high-throughput messaging systems (RabbitMQ, Kafka).
Experience with NoSQL databases (MongoDB, Redis, Elasticsearch, Cassandra, Aerospike, Redshift).
Experience with Amazon Web Services (AWS).