Architect and build a highly scalable data pipeline for diversified and complex data flows
Guide our data engineers / BI Developers by setting technical direction and providing standards, architectural governance, design patterns, and practices
Influence our BI solution roadmap strategy and align it with the architecture vision
Track and evaluate relevant new technologies in the market and drive their adoption in our pipelines through research and POC activities
Proactively identify gaps in data consumption and define processes to close them
Work closely with tech teams on the design and implementation of data solutions.
Requirements
At least 8 years of relevant experience as a BI Developer / Data Engineer
Experience with cloud platforms (GCP/AWS/Azure)
Experience with cloud data lake storage and data formats (Parquet, ORC, etc.)
Experience with big data solutions such as BigQuery/Snowflake/Redshift
Experience with high-scale, high-volume relational databases and SQL
Significant knowledge of big data technologies and familiarity with the broader big data ecosystem
DevOps skills such as Docker, Kubernetes, CloudFormation, etc. – advantage
BSc Computer Science/Data Management or equivalent
Experience with workflow orchestration and data processing pipelines such as Airflow – big advantage
In-depth understanding of database management systems and ETL (Extract, Transform, Load) frameworks
Experience in the online industry.
Quick learner, team player, and an independent, motivated individual
Able to multitask, prioritize, and manage time efficiently