
Data Engineer II

Meijer

Grand Rapids, MI

As a family company, we serve people and communities. When you work at Meijer, you’re provided with career and community opportunities centered around leadership, personal growth and development. Consider joining our family – take care of your career and your community!


Meijer Rewards

  • Weekly pay

  • Scheduling flexibility

  • Paid parental leave 

  • Paid education assistance

  • Team member discount

  • Development programs for advancement and career growth


Please review the job profile below and apply today!


We are seeking a skilled Data Engineer II to design, develop, and maintain data pipelines and solutions on Azure Cloud. The ideal candidate will have expertise in Azure Data Factory, Databricks, Synapse, Cosmos DB, and Azure Microservices to support real-time and batch processing needs.

This position follows a hybrid schedule: Monday through Wednesday in the Grand Rapids, MI office, with Thursday and Friday remote.

What You'll Be Doing:

  • Design, develop, and optimize ETL pipelines using Azure Data Factory (ADF). 
  • Implement big data processing solutions using Azure Databricks and Apache Spark. 
  • Develop and manage data models and data warehouses on Azure Synapse Analytics. 
  • Work with Cosmos DB to build and optimize NoSQL database solutions. 
  • Utilize Azure Microservices (Azure Functions, Kubernetes, Logic Apps) for scalable data processing. 
  • Implement data security, governance, and compliance best practices. 
  • Troubleshoot performance issues and optimize queries for efficient data processing. 
  • Collaborate with Data Scientists, Analysts, and Business Teams to provide high-quality data solutions. 

What You Bring with You (Qualifications):

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. 
  • 5-8 years of experience in data engineering, focusing on Azure-based solutions. 
  • Strong experience with Azure Data Factory (ADF), Databricks, and Synapse Analytics. 
  • Proficiency in SQL, Python, or Scala for data transformation and scripting. 
  • Hands-on experience with Azure Cosmos DB and other Azure data storage solutions. 
  • Understanding of Azure Microservices and containerized services (Kubernetes, Docker). 
  • Knowledge of data governance, security, and compliance best practices. 
  • Strong problem-solving and analytical skills. 

Client-provided location(s): Grand Rapids, MI, USA
Job ID: 13754_R000579357
Employment Type: Full Time