At PINTU, we are building the #1 crypto investment platform focused on new investors in Indonesia and Southeast Asia. We know that 99% of new investors are underserved because existing solutions cater to the 1% who are pros and early adopters, so we built an app that helps them learn about, invest in, and sell cryptocurrencies with one click.
We’re looking for a Data Engineer to join our Engineering team to maintain PINTU’s data pipelines and their observability. This role is the subject-matter expert for PINTU’s data pipeline management.
What You’ll Be Doing
You will manage PINTU’s data pipelines. Using your technical knowledge and passion for data management, you will team up with our data team to build seamless pipelines that accelerate company-wide, data-driven decision-making.
In this role, you will:
- Design and build infrastructure that allows big data to be accessed and analyzed
- Design, develop, and maintain data pipelines (external data source ingestion jobs, ETL/ELT jobs, etc.)
- Continuously seek ways to make existing data processing more cost- and time-efficient
- Ensure good data governance and quality by building monitoring systems that track data quality in the data warehouse
- Liaise with coworkers and relevant stakeholders to clarify the requirements for each task
- Keep up to date with blockchain standards and technological advancements that will improve the quality of your work
Requirements:
- Fluent in Python and advanced SQL
- At least 2 years of relevant experience as a data engineer
- Fluent in data warehouse environments (e.g., Google BigQuery, AWS Redshift, Snowflake)
- Familiar with data transformation or processing frameworks (e.g., dbt, Spark, Hive)
- Familiar with data processing technologies (e.g., Google Cloud Functions, Google Dataflow, Google Dataproc)
- Fluent with data orchestration tools (e.g., Airflow, Prefect, Dagster)
- Familiar with data storage services (e.g., Google Cloud Storage, AWS S3)
- Understand cloud data warehousing concepts, with experience in data modeling and in measuring and improving data quality
- Preferably understand basic containerization and microservice concepts (e.g., Docker, Kubernetes)
- Able to build and maintain good relationships with stakeholders
- Able to translate business requirements into data warehouse modeling specifications
- Able to demonstrate creative problem-solving skills
- A team player who loves to collaborate with others and can work independently when needed