Research Internship (F/M/NB) - Efficient Neural Representation of Large-Scale Environments - La Forge

Sep 29, 2024
Bordeaux, France
Internship
Full time
Office work

A ray casting operation involves shooting a ray from a point into a 3D scene to detect intersections with objects. It is a fundamental task in game engines, forming the core of key systems such as rendering, collision detection, and even AI behavior. Given its widespread use across multiple subsystems, having a robust and efficient ray casting implementation is essential for real-time performance.
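
As an illustration of what a single ray cast boils down to, here is a minimal ray/sphere intersection query in Python/NumPy (the sphere is chosen only because it is one of the simplest primitives; the function and scene below are purely hypothetical):

```python
import numpy as np

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance t along a normalized ray to the nearest hit point
    on the sphere, or None if the ray misses it."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c                  # discriminant of the quadratic in t
    if disc < 0.0:
        return None                   # the ray misses the sphere entirely
    t = -b - np.sqrt(disc)            # nearest of the two intersection points
    return t if t >= 0.0 else None

# Shoot a ray from the origin along +z toward a unit sphere centered at (0, 0, 5).
t = ray_sphere_intersect(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                         np.array([0.0, 0.0, 5.0]), 1.0)
print(t)  # 4.0
```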

Ray casting is often accelerated with a bounding volume hierarchy (BVH) [1, 2], a data structure that lets the traversal skip empty space and quickly find intersections. However, BVH traversal is an irregular algorithm: its behavior depends heavily on the complexity and size of the scene, as well as on the specific query (starting point and direction). This results in divergent memory accesses and branch execution, making it less efficient on GPUs. Moreover, BVHs can have a significant memory footprint, especially for large, open worlds.
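
To make this irregularity concrete, the sketch below shows a stack-based traversal over a hypothetical, heavily simplified node layout (not an actual engine BVH): whether a subtree is visited, and how many primitives end up tested, depends on both the scene and the ray, so neighboring rays on a GPU can follow very different control-flow paths.

```python
from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class Node:
    box_min: np.ndarray
    box_max: np.ndarray
    left: Optional[int] = None      # child indices into the node array
    right: Optional[int] = None
    prim_id: Optional[int] = None   # set only on leaf nodes

def hits_box(origin, inv_dir, box_min, box_max) -> bool:
    # Slab test against an axis-aligned bounding box.
    t1 = (box_min - origin) * inv_dir
    t2 = (box_max - origin) * inv_dir
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_near <= t_far and t_far >= 0.0

def traverse(nodes: List[Node], origin, direction) -> List[int]:
    """Return the ids of leaf primitives whose bounding box the ray hits."""
    inv_dir = 1.0 / np.where(np.abs(direction) > 1e-12, direction, 1e-12)
    hit_ids, stack = [], [0]                  # start at the root (index 0)
    while stack:
        node = nodes[stack.pop()]
        if not hits_box(origin, inv_dir, node.box_min, node.box_max):
            continue                          # skip the whole subtree (empty space)
        if node.prim_id is not None:
            hit_ids.append(node.prim_id)      # leaf: record the candidate primitive
        else:
            stack.append(node.left)           # data-dependent, divergent branching
            stack.append(node.right)
    return hit_ids

# A trivial hierarchy: a root box containing two primitive boxes.
nodes = [
    Node(np.array([0.0, 0.0, 0.0]), np.array([10.0, 1.0, 1.0]), left=1, right=2),
    Node(np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 1.0]), prim_id=0),
    Node(np.array([9.0, 0.0, 0.0]), np.array([10.0, 1.0, 1.0]), prim_id=1),
]
print(traverse(nodes, np.array([-1.0, 0.5, 0.5]), np.array([1.0, 0.0, 0.0])))  # [1, 0]
```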

Neural methods have shown impressive potential for data compression and representation. More importantly, neural network (NN) inference, especially with fully connected architectures, is a regular algorithm: it relies on dense matrix multiplications with predictable memory access patterns, which are GPU-friendly. Recent research has explored replacing the BVH with neural networks, but most of these methods focus on high-quality, dense objects [3, 4] or restrict the network to outputting only visibility information [5].
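
As a purely illustrative sketch of such a visibility-only query (the architecture, layer sizes, and input encoding below are placeholders, not a proposed design), note that every ray runs through the same fixed sequence of dense layers regardless of the scene, which is exactly the regularity that maps well onto GPUs:

```python
import torch
import torch.nn as nn

class RayVisibilityNet(nn.Module):
    """Toy fully connected network mapping an encoded ray (origin + direction)
    to a visibility probability; names and sizes are illustrative only."""
    def __init__(self, in_dim: int = 6, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, rays: torch.Tensor) -> torch.Tensor:
        # rays: (batch, 6) = origin (3) + direction (3); the whole batch goes
        # through the same dense matrix multiplications, with no per-ray branching.
        return torch.sigmoid(self.mlp(rays))

net = RayVisibilityNet()
visibility = net(torch.randn(4096, 6))   # a batch of 4096 random rays
print(visibility.shape)                  # torch.Size([4096, 1])
```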

The goal of this internship is to design an efficient neural representation capable of learning a large-scale scene and outputting high-dimensional information beyond simple visibility (e.g., distance, material semantics), providing a more comprehensive solution for ray casting in complex environments.
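
One possible (and deliberately naive) way to structure such a multi-output query is a shared trunk with separate prediction heads; the PyTorch sketch below only illustrates the kind of interface the internship targets, not a prescribed architecture, and all names and sizes in it are hypothetical.

```python
import torch
import torch.nn as nn

class NeuralRayQuery(nn.Module):
    """Toy multi-output ray query: a shared trunk with heads for visibility,
    hit distance, and a material-semantics distribution."""
    def __init__(self, in_dim: int = 6, hidden: int = 256, num_materials: int = 16):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.visibility = nn.Linear(hidden, 1)             # does the ray hit anything?
        self.distance = nn.Linear(hidden, 1)               # how far along the ray?
        self.material = nn.Linear(hidden, num_materials)   # what kind of surface?

    def forward(self, rays: torch.Tensor):
        h = self.trunk(rays)
        return (torch.sigmoid(self.visibility(h)),
                torch.relu(self.distance(h)),
                torch.softmax(self.material(h), dim=-1))

vis, dist, mat = NeuralRayQuery()(torch.randn(1024, 6))
print(vis.shape, dist.shape, mat.shape)   # (1024, 1) (1024, 1) (1024, 16)
```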

References:

[1] Meister D. et al. “A survey on bounding volume hierarchies for ray tracing”. Computer Graphics Forum (2021).

[2] Meister D. et al. “Performance comparison of bounding volume hierarchies for GPU ray tracing”. Journal of Computer Graphics Techniques (JCGT) (2022).

[3] Weier P. et al. “N-BVH: Neural ray queries with bounding volume hierarchies”. ACM SIGGRAPH (2024).

[4] Fujieda S. et al. “Neural Intersection Function”. arXiv preprint arXiv:2306.07191 (2023).

[5] Zhi Y. et al. “Efficient Visibility Approximation for Game AI using Neural Omnidirectional Distance Fields”. Proceedings of the ACM on Computer Graphics and Interactive Techniques (2024).

  • Currently a second-year master’s student or a third-year engineering student.
  • Solid foundation in Machine Learning, linear algebra, and signal processing.
  • Knowledge of computer graphics fundamentals, including ray tracing, is a plus.
  • Proficiency in Python and familiarity with deep learning frameworks (e.g., PyTorch, TensorFlow).
  • Familiarity with C++ is a plus.
  • Proficient in English, both written and spoken, with the ability to clearly communicate technical concepts and collaborate effectively with an international team.

This job is open for an internship (6-month contract).

Supervision:

Antoine Houdard – antoine.houdard@ubisoft.com
Georges Nader – georges.nader@ubisoft.com

Remote: hybrid model

Process:

  • Interview with our recruiter
  • One or more technical and project interviews with the manager and their team

If your application is not selected, you will be notified.

We are working to enrich players’ lives through unique and memorable gaming experiences and by improving the positive impacts of our games. To get there, we are creating a safer, more inclusive work environment, giving back to the communities where Ubisoft operates by working with local non-profit partners, and working to reduce the environmental impact of our business. Learn more about our Social Impact initiatives.

Check out this guide to help you with your application, and learn about our actions to encourage more diversity and inclusion.