
Research Scientist Intern (TikTok-Privacy Innovation Lab-GPU Systems & Model Optimization) - 2026 Start (PhD)

San Jose, CA

Responsibilities

At TikTok, we treat privacy as our top priority in our product design and implementation. Privacy is not just about regulatory compliance; it is also a more trusted way to enable technology innovation by respecting users' privacy choices.

About the Team
The Privacy Innovation (PI) Lab was established to explore the next frontier of privacy technology and theory in the digital world. We provide key insights and technical solutions on privacy-related innovation across all of TikTok's products. We also collaborate with technical and academic communities worldwide to build an open ecosystem that promotes a privacy-friendly digital experience.

We are looking for talented individuals to join us for an internship in 2026. PhD internships at our company provide students with the opportunity to actively contribute to our products and research, and to the organization's future plans and emerging technologies. Our dynamic internship experience blends hands-on learning, enriching community-building and development events, and collaboration with industry experts.

Applications will be reviewed on a rolling basis. We encourage you to apply early. Please state your availability clearly in your resume (Start date, End date).

About the Role
We are building next-generation generative foundation models, with a strong focus on diffusion-based and unified generation-understanding architectures, deployed in privacy-sensitive, production environments.
This role sits at the intersection of:
- Large-scale model training systems
- GPU-first architecture and kernel-level optimization
- Diffusion / DiT / unified multimodal foundation models
- Privacy-preserving and compliant training pipelines

You will work on end-to-end training architecture design, from model-parallel execution and GPU efficiency to robust, fault-tolerant, privacy-aware training infrastructure.

Responsibilities:
You will work directly on core operators and system-level performance optimization for large-scale models, including but not limited to:
1. Design and implement high-performance GPU kernels for core components such as Transformer / Attention / MoE / Diffusion
2. Perform end-to-end optimization for large model training workloads
3. Conduct in-depth analysis of GPU execution bottlenecks, including compute, memory, and scheduling
4. Use and extend Triton / CUDA / CUTLASS, and integrate optimized kernels with PyTorch / XLA / custom runtimes
5. Collaborate closely with model research teams to translate new model architectures into efficient, production-ready implementations
6. Reproduce, benchmark, and improve state-of-the-art system optimization techniques, validating gains in real training and inference settings
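To give a concrete sense of the "Attention computation patterns" this role optimizes at the kernel level, here is a minimal NumPy reference implementation of scaled dot-product attention. This is an illustrative sketch only, not code from the team; production versions of this pattern are fused into GPU kernels (e.g. via Triton or CUDA) precisely because the naive form below materializes the full score matrix in memory.

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention: softmax(q @ k.T / sqrt(d)) @ v
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # (n_q, n_k) similarity matrix
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows now sum to 1
    return weights @ v                            # (n_q, d_v) weighted values

rng = np.random.default_rng(0)
q, k, v = rng.standard_normal((3, 4, 8))  # three (4, 8) arrays: queries, keys, values
out = attention(q, k, v)
print(out.shape)  # (4, 8)
```

The memory-bound intermediate (`scores`) and the row-wise softmax reduction are exactly the bottlenecks that fused-attention kernels eliminate by tiling the computation to keep partial results in on-chip memory.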

Qualifications

Minimum Qualifications
1. Currently pursuing a PhD in computer science, computer engineering, or a related technical discipline
2. Solid understanding of GPU architecture and execution models
3. Proficiency in CUDA C++ or Triton, with the ability to independently write and optimize kernels
4. Strong familiarity with Transformer / Attention computation patterns and performance bottlenecks
5. Ability to read, reproduce, and reason about systems papers or open-source implementations

Preferred Qualifications
1. Hands-on experience with large-scale model training
2. Familiarity with PyTorch internals (e.g., Autograd, dispatcher, ATen)
3. Experience with kernel profiling and performance tuning (e.g., Nsight, nvprof, nsys)
4. Publications, open-source contributions, or performance benchmark results

By submitting an application for this role, you accept and agree to our global applicant privacy policy, which may be accessed here: https://careers.tiktok.com/legal/privacy

Client-provided location(s): San Jose, CA
Job ID: TikTok-7602699537740892469
Employment Type: INTERN
Posted: 2026-02-06T19:59:59

Perks and Benefits

  • Health and Wellness

    • Health Insurance
    • Dental Insurance
    • Vision Insurance
    • HSA
    • Life Insurance
    • Fitness Subsidies
    • Short-Term Disability
    • Long-Term Disability
    • On-Site Gym
    • Mental Health Benefits
    • Virtual Fitness Classes
  • Parental Benefits

    • Fertility Benefits
    • Adoption Assistance Program
    • Family Support Resources
  • Work Flexibility

    • Flexible Work Hours
    • Hybrid Work Opportunities
  • Office Life and Perks

    • Casual Dress
    • Snacks
    • Pet-friendly Office
    • Happy Hours
    • Some Meals Provided
    • Company Outings
    • On-Site Cafeteria
    • Holiday Events
  • Vacation and Time Off

    • Paid Vacation
    • Paid Holidays
    • Personal/Sick Days
    • Leave of Absence
  • Financial and Retirement

    • 401(K) With Company Matching
    • Performance Bonus
    • Company Equity
  • Professional Development

    • Promote From Within
    • Access to Online Courses
    • Leadership Training Program
    • Associate or Rotational Training Program
    • Mentor Program
  • Diversity and Inclusion

    • Diversity, Equity, and Inclusion Program
    • Employee Resource Groups (ERG)
