Read the interview with Simone Porcu about his winning proposal: QoE evaluations for multi-cloud streaming (QEST)

Simone Porcu, University of Cagliari / SJM TECH

Simone Porcu is an Assistant Professor at the University of Cagliari and a member of the Networks for Humans Laboratory, a research group specialising in multimedia Quality of Experience (QoE) and immersive technologies. SJM TECH is a small-to-medium enterprise with extensive experience in 3D modelling, AR/VR development, and cultural heritage applications. Both SJM TECH and the Networks for Humans Laboratory share a common research focus: enhancing human communication by advancing technologies for transmission and interaction. Their partnership paves the way for the development of innovative solutions in immersive streaming and next-generation point cloud-based experiences, combining academic research with real-world application.

Can you give a brief overview of your winning proposal?
What are its key objectives and innovative aspects?

The QEST project aims to explore the limits of immersive streaming by examining how point cloud compression impacts user experience on two of the most advanced head-mounted displays currently available: the Apple Vision Pro and the Meta Quest 3. To do this, we first extend the SPIRIT testing platform to support these next-generation devices, enabling subjective evaluations of point clouds. In parallel, we introduce an adaptive streaming algorithm designed to handle multiple point clouds in real time within the same 3D scene. By adjusting the Level of Detail for each point cloud according to the user’s field of view, network bandwidth, and device capabilities, the algorithm ensures a seamless experience while staying within technical constraints. QEST will also release a public dataset of 3D tourist sites. Altogether, by combining algorithmic innovation with platform expansion, QEST strengthens the SPIRIT infrastructure’s scalability and compatibility.
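To illustrate the kind of decision logic such an adaptive algorithm involves, here is a minimal sketch of per-object Level-of-Detail (LoD) selection for several point clouds sharing one bandwidth budget, with objects inside the user's field of view upgraded first. All names, the greedy strategy, and the bitrate figures are illustrative assumptions, not the QEST implementation.

```python
from dataclasses import dataclass

@dataclass
class PointCloud:
    name: str
    in_fov: bool               # is the object inside the user's field of view?
    lod_bitrates: list[float]  # bitrate (Mbps) required per LoD, low -> high

def select_lods(clouds: list[PointCloud], budget_mbps: float) -> dict[str, int]:
    """Greedily raise each cloud's LoD while total bitrate stays within budget."""
    # Start every cloud at its lowest LoD.
    lod = {c.name: 0 for c in clouds}
    used = sum(c.lod_bitrates[0] for c in clouds)
    upgraded = True
    while upgraded:
        upgraded = False
        # Offer upgrades to clouds inside the field of view first.
        for c in sorted(clouds, key=lambda c: not c.in_fov):
            i = lod[c.name]
            if i + 1 < len(c.lod_bitrates):
                extra = c.lod_bitrates[i + 1] - c.lod_bitrates[i]
                if used + extra <= budget_mbps:
                    lod[c.name] = i + 1
                    used += extra
                    upgraded = True
    return lod

# Example: with a 6 Mbps budget both objects reach the middle LoD; with
# 8 Mbps the in-view "statue" is upgraded to the highest LoD first.
clouds = [
    PointCloud("statue", True, [1.0, 2.5, 5.0]),
    PointCloud("fountain", False, [1.0, 2.5, 5.0]),
]
print(select_lods(clouds, 6.0))
print(select_lods(clouds, 8.0))
```

A real system would replace the boolean visibility flag with a continuous utility (e.g. projected screen area) and re-run the selection as the viewport and measured throughput change.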

What motivated you to apply for the SPIRIT Open Call?

Since the beginning of my academic journey, I’ve focused on QoE for multimedia — from video streaming to today’s immersive technologies. My goal has always been to understand how users perceive and interact with content across devices and networks. The SPIRIT Open Call aligned perfectly with this trajectory. Its focus on immersive telepresence, 5G, and XR infrastructure provided the ideal opportunity to take my research a step further. It allowed us to validate our work on adaptive streaming in real-world scenarios while collaborating with excellent researchers and developers in the XR field.

How do you envision this project making an impact?

QEST will drive innovation in immersive media from two key perspectives. First, by extending the SPIRIT platform to support the latest HMDs, we unlock new possibilities for QoE testing on devices not yet studied in depth. This provides practical insights for XR developers and manufacturers. Second, our adaptive algorithm enables real-time streaming of multiple point clouds in a 3D scene, selecting the Level of Detail that maximises user QoE while staying within bandwidth and device limits. Together, these contributions pave the way for scalable, high-quality XR experiences and support future research and standardisation in immersive streaming.