Software Engineer, Data

Lime

What You’ll Do:

  • Design and implement our data models for optimal storage and performant retrieval to meet critical product and business requirements
  • Own at least one data content area and build data pipelines for analytical and algorithm needs
  • Ensure the uptime of all data systems (i.e., our central data warehouse, data lake, ETL, and real-time processing) meets business requirements
  • Democratize data within the organization by formulating and enforcing best practices across teams
  • Embrace your role as a founding member of our data infrastructure team

You Have:

  • 1+ years of relevant industry experience
  • Degree in Computer Science or an equivalent field
  • Experience working with big data technologies, e.g., Snowflake, Airflow, Kinesis, Kafka, Spark, Flink
  • Knowledge of or experience with large-scale data warehousing architecture and data modeling
  • Working knowledge of relational databases (SQL)
  • Demonstrated ability to analyze large data sets, identify gaps and inconsistencies in the ETL pipeline, and deliver solutions that improve pipeline reliability and data quality