
Senior Data Engineer (with Snowflake)

At Exadel

Sofia, Bulgaria

We are seeking a highly skilled Snowflake Data Engineer to design, develop, and manage our cloud-based data infrastructure using Snowflake. The ideal candidate will have deep experience in data warehousing, ETL/ELT development, and cloud data platform architecture, with a strong focus on performance, scalability, and reliability.

Work at Exadel - Who We Are

We don’t just follow trends—we help define them. For 25+ years, Exadel has transformed global enterprises. Now, we’re leading the charge in AI-driven solutions that scale with impact. And it’s our people who make it happen—driven, collaborative, and always learning.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field
  • 3+ years of experience in data engineering or data platform development
  • Hands-on experience with Snowflake, including performance tuning, data modeling, and advanced SQL
  • Proficiency in SQL and scripting languages such as Python or Scala
  • Competency in ETL/ELT frameworks and orchestration tools (e.g., dbt, Airflow, Talend)
  • Knowledge of cloud platforms such as AWS, Azure, or GCP
  • Understanding of data warehousing concepts, star/snowflake schema design, and normalization/denormalization techniques
  • Experience working in agile development environments and with tools like Jira and Confluence

Nice to have

  • Snowflake certification (e.g., SnowPro Core/Advanced)
  • Experience with data governance and privacy frameworks (GDPR, HIPAA, etc.)
  • Exposure to machine learning pipelines and streaming data (Kafka, Kinesis)
  • Familiarity with CI/CD pipelines, Git, and infrastructure-as-code tools like Terraform

English level

Upper-Intermediate

Responsibilities

  • Design, implement, and optimize data pipelines in Snowflake to support business intelligence, analytics, and data science needs
  • Develop and manage scalable ELT/ETL workflows using tools like dbt, Apache Airflow, Matillion, or similar
  • Model and maintain data warehouses/data lakes, ensuring best practices in dimensional modeling and data partitioning
  • Create and maintain secure and governed data environments, enforcing data access policies and roles
  • Monitor data pipelines for performance, reliability, and cost-efficiency
  • Collaborate with data analysts, data scientists, and other engineering teams to understand data needs and deliver solutions
  • Integrate Snowflake with external data sources and tools such as AWS S3, Azure Data Lake, Kafka, or third-party APIs
  • Troubleshoot and resolve data issues, ensuring data quality and consistency across platforms
  • Automate and maintain CI/CD pipelines for data infrastructure deployments
  • Stay up to date with Snowflake features and industry best practices

Client-provided location(s): Sofia, Bulgaria; Georgia; Hungary; Lithuania; Poland; Romania
Job ID: 5549971004
Employment Type: Other