Role: Data Engineer
Location: Poland
Type of work: Remote
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python, ADF, and Databricks
- Implement ETL processes to extract, transform, and load data from various sources into Snowflake
- Ensure data is processed efficiently and is made available for analytics and reporting
Requirements:
• 8+ years of experience in data engineering, with a focus on Python, ADF, Snowflake, Databricks, and ETL processes.
• Strong experience with data modeling, data warehousing, and database design.
• Proficiency in SQL and experience with cloud-based data storage and processing.
• Strong problem-solving skills and the ability to work in a fast-paced environment.
• Excellent communication skills, with the ability to work directly with customers and understand their needs.
• Experience with Agile methodologies and working in a collaborative team environment.
• Certification in Snowflake, Azure, or other relevant technologies is an added advantage.
• Bachelor's degree in Computer Science Engineering, Information Systems, or an equivalent field.