Data Modeler
Infosys is looking for a Data Modeler. You will be a polyglot developer focusing on core programming in multiple programming languages while ensuring performance, quality, scalability, and extensibility. You will drive innovation in your chosen domain and work with teams to build enterprise-level software applications using a predictable, agile DevSecOps model. You will join a small team that uses new technology to solve challenges across both front-end and back-end architecture, ultimately delivering great experiences for global users.
Required Qualifications:
Candidate must be located within traveling distance of Richardson, TX or Raleigh, NC or be willing to relocate to the area.
Bachelor's degree or foreign equivalent required from an accredited institution. Three years of progressive experience in the specialty may also be considered in lieu of each year of education.
A minimum of 3 years of core development experience in the following stacks:
Deep expertise in Scala or Python for Spark application development
Proficiency in designing conceptual, logical, and physical data models.
Strong SQL skills and experience with Snowflake-specific features (e.g., Snowpipe, Streams, Tasks, Time Travel).
Familiarity with data modeling tools (e.g., ER/Studio, ERwin, dbt, or similar).
Understanding of data warehousing principles, dimensional modeling, and normalization techniques.
Experience integrating structured and semi-structured data (e.g., JSON, XML) in Snowflake.
Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
Preferred Qualifications:
Experience in data warehousing technologies, ETL/ELT implementations
Sound knowledge of software engineering design patterns and practices
Strong understanding of functional programming.
Experience with Ranger, Atlas, Tez, Hive LLAP, Neo4j, NiFi, Airflow, or other DAG-based tools
Knowledge and experience with cloud and containerization technologies: Azure, Kubernetes, OpenShift, and Docker
Experience with data visualization tools such as Tableau, Kibana, etc.
Experience with design and implementation of ETL/ELT frameworks for complex warehouses/marts
Knowledge of large data sets and experience with performance tuning and troubleshooting
Building end-to-end data integration and data warehousing solutions for analytics teams.

Planning and coordination skills
Experience in, and desire to work in, a global delivery environment.
Ability to work in a team in a diverse, multi-stakeholder environment.
The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to communicate effectively by telephone, email, and face-to-face.
Perks and Benefits
Health and Wellness
- Health Insurance
- Life Insurance
- HSA
- Short-Term Disability
Parental Benefits
- Birth Parent or Maternity Leave
- Non-Birth Parent or Paternity Leave
- On-site/Nearby Childcare
Work Flexibility
Office Life and Perks
- Commuter Benefits Program
Vacation and Time Off
- Paid Vacation
- Paid Holidays
- Personal/Sick Days
- Sabbatical
Financial and Retirement
- 401(K)
- Relocation Assistance
Professional Development
- Learning and Development Stipend
Diversity and Inclusion
- Employee Resource Groups (ERG)