At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you're passionate about developing your career, while helping others along the way, come join the Broadridge team.
Key Responsibilities:
- Analyzes and solves problems using technical experience, judgment, and precedents
- Provides informal guidance to new team members
- Explains complex information to others in straightforward situations
- Design & Develop Scalable Data Pipelines: Leverage AWS technologies to design, develop, and manage end-to-end data pipelines using services such as ETL, Kafka, DMS, Glue, Lambda, and Step Functions.
- Orchestrate Workflows: Use Apache Airflow to build, deploy, and manage automated workflows, ensuring smooth and efficient data processing and orchestration.
- Snowflake Data Warehouse: Design, implement, and maintain Snowflake data warehouses, ensuring optimal performance, scalability, and seamless data availability.
- Infrastructure Automation: Utilize Terraform and CloudFormation to automate cloud infrastructure provisioning, ensuring efficiency, scalability, and adherence to security best practices.
- Logical & Physical Data Models: Design and implement high-performance logical and physical data models using Star and Snowflake schemas that meet both technical and business requirements.
- Data Modeling Tools: Utilize Erwin or similar modeling tools to create, maintain, and optimize data models, ensuring they align with evolving business needs.
- Continuous Optimization: Actively monitor and improve data models to ensure they deliver the best performance, scalability, and security.
- Cross-Functional Collaboration: Work closely with data scientists, analysts, and business stakeholders to gather requirements and deliver tailored data solutions that meet business objectives.
- Data Security Expertise: Provide guidance on data security best practices and ensure team members follow secure coding and data handling procedures.
- Innovation & Learning: Stay abreast of emerging trends in data engineering, cloud computing, and data security to recommend and implement innovative solutions.
- Optimization & Automation: Proactively identify opportunities to optimize system performance, enhance data security, and automate manual workflows.
Required Skills and Experience:
- Snowflake Data Warehousing: Hands-on experience with Snowflake, including performance tuning, role-based access controls, dynamic data masking, data sharing, encryption, and row/column-level security.
- Data Modeling: Expertise in physical and logical data modeling, specifically with Star and Snowflake schemas using tools like Erwin or similar.
- AWS Services Proficiency: In-depth knowledge of AWS services and related tooling, including ETL, DMS, Glue, Step Functions, Lambda, CloudFormation, S3, IAM, and EKS, as well as Airflow and Terraform.
- Programming & Scripting: Strong working knowledge of Python, R, Scala, PySpark, and SQL (including stored procedures).
- DevOps & CI/CD: Solid understanding of CI/CD pipelines, DevOps principles, and infrastructure-as-code practices using tools like Terraform, JFrog, Jenkins, and CloudFormation.
- Analytical & Troubleshooting Skills: Proven ability to solve complex data engineering issues and optimize data workflows.
- Excellent Communication: Strong interpersonal and communication skills, with the ability to work across teams and with stakeholders to drive data-centric projects.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7-8 years of experience designing and implementing large-scale Data Lake/Warehouse integrations with diverse data storage solutions.
- Certifications:
- AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect (preferred).
- Snowflake Advanced Architect and/or Snowflake Core Certification (required).