Data Engineering Intern
Help our team build reliable ETL pipelines and improve data quality for the Pipevia platform.
About the role
You will work with senior engineers to build connectors, transformation logic, and data tests. You'll learn how a modern data stack moves data reliably from sources to a warehouse.
What you’ll do
- Implement small features in connectors and transforms
- Write SQL to clean and validate datasets
- Add tests and assertions to improve data quality (see the sketch after this list)
- Document findings and share learnings with the team
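To make the second and third bullets concrete, here is a minimal sketch of the kind of data-quality check this work involves, written in Python with the built-in sqlite3 module. The orders table and its columns are hypothetical, chosen only for illustration; real checks would run against Pipevia's actual datasets.

    import sqlite3

    # Hypothetical example: the "orders" table and its columns are
    # illustrative, not part of any real Pipevia schema.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
        INSERT INTO orders VALUES (1, 10, 25.0), (2, 11, 40.0), (3, 12, 9.5);
    """)

    # Assertion 1: the key column must be unique.
    dupes = conn.execute(
        "SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert not dupes, f"duplicate order_id values: {dupes}"

    # Assertion 2: every order must reference a customer.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0, f"{nulls} orders have a NULL customer_id"

In practice, checks like these are expressed as tests in the pipeline so they run on every load rather than as one-off scripts.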
Requirements
- Basic SQL and one scripting language (Python preferred)
- Understanding of tables, joins, and incremental loads (an incremental-load sketch follows this list)
- Clear communication and willingness to learn
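If "incremental load" is unfamiliar, here is a minimal sketch of the idea, again in Python with sqlite3. The source_events and target_events tables are hypothetical: instead of re-copying the whole table on every run, the load copies only rows newer than a high-water mark.

    import sqlite3

    # Hypothetical tables: "source_events" (upstream system) and
    # "target_events" (warehouse copy). Schema is illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source_events (id INTEGER, loaded_at TEXT);
        CREATE TABLE target_events (id INTEGER, loaded_at TEXT);
        INSERT INTO source_events VALUES
            (1, '2024-01-01'), (2, '2024-01-02'), (3, '2024-01-03');
        INSERT INTO target_events VALUES (1, '2024-01-01');
    """)

    # High-water mark: the newest timestamp already in the target.
    (high_water,) = conn.execute(
        "SELECT COALESCE(MAX(loaded_at), '') FROM target_events"
    ).fetchone()

    # Copy only rows newer than the high-water mark.
    conn.execute(
        "INSERT INTO target_events "
        "SELECT id, loaded_at FROM source_events WHERE loaded_at > ?",
        (high_water,),
    )
    print(conn.execute("SELECT COUNT(*) FROM target_events").fetchone()[0])  # 3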
Nice to have
- A course project using Snowflake, BigQuery, or Redshift
- dbt basics and data testing experience
- Familiarity with Git and command line
Ready to apply?
Send your resume and links to any projects you'd like us to see.
