
Periodic Reporting for period 1 - Hyper360 (Enriching 360 media with 3D storytelling and personalisation elements)

Teaser

The media sector is important for the economy, since it provides high-value-added jobs and growth potential. The European media can build upon a history of excellent creativity, deeply rooted in centuries-old European culture. However, it faces challenges from both traditional...

Summary

The media sector is important for the economy, since it provides high-value-added jobs and growth potential. The European media can build upon a history of excellent creativity, deeply rooted in centuries-old European culture. However, it faces challenges from both traditional competitors and emerging ones.
New technologies can help the European media sector compete worldwide, while unleashing new forms of expressive creativity.
Hyper360's result, a complete solution for the capture, production, enhancement, delivery and consumption of free-viewpoint video (FVV) for OTT media, can do just that. Envisioning the convergence of omnidirectional (360°) and 3D content for increasingly effective immersive experiences, Hyper360 will add new storytelling features to current 360° videos.
As a result, the European media sector will have a better base to compete in the global arena.
An example of this is the possibility of providing clickable dynamic placeholder objects in conjunction with user profiling and preference gathering. By leveraging these features, broadcasters using the Hyper360 toolset will be able to offer more effective advertisements, which in turn will allow them to obtain higher advertising revenues.

Work performed

During the first reporting period the consortium defined the scenarios, collected broadcaster requirements, drafted the overall architecture and deployed the research infrastructure (source code repositories, automated unit tests).
It also decided to adopt an incremental research approach, which provides enough flexibility to improve and refine the major architectural choices (the main components and the communication among them) as they are made.
So far, project results are on schedule.
The Hyper360 toolset is being realized as a software suite supporting three main phases, namely capture (or recording), post-production (or annotation) and personalised delivery.
- Capture phase
The capturing tool OmniCap will support professional 360° video recording with camera setups ranging from simple dual-lens integrated devices to sophisticated multi-camera 360° rigs. A first operational prototype has been achieved, showing satisfactory features and performance; it is already connected to the online Quality Check component, which analyses the recorded footage and identifies defects and problems in near real time.
In addition, the cost-effective CapTion multi-view 3D capturing tool allows for the capture of multi-view RGB-D data by leveraging a new, easy-to-use multi-sensor calibration process. Recent sensors are employed in a modular design that allows the integration of future sensors. The work performed so far delivered a fully functional capturing tool that was successfully set up at one of the content partners' premises.
- Post-production phase
The post-production phase will be supported by two main tools: OmniConnect, and CapTion.
OmniConnect is used to add interactive elements to 360° videos. A first prototype is operational and supports the overlay of text elements, HTML and audio/video containers, providing the most relevant features with satisfactory performance. It allows different configurations (sets of overlay elements) to be added to the same 360° video file, keeping post-production flexible and able to define different experiences for different viewers.
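As an illustration of this separation between footage and enrichment data, the sketch below models overlay configurations as standalone data that a player queries at playback time. The `Overlay` fields, the spherical yaw/pitch placement and all names are illustrative assumptions, not the project's actual format.

```python
# Hypothetical sketch of OmniConnect-style overlay configurations: the
# enrichment data lives separately from the 360° video file, so several
# configurations can target the same footage. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class Overlay:
    kind: str        # "text", "html", or "av" (audio/video container)
    yaw: float       # horizontal placement on the viewing sphere, degrees
    pitch: float     # vertical placement, degrees
    start: float     # appearance time, seconds into the video
    end: float       # disappearance time, seconds
    payload: str     # text body, HTML snippet, or media URL

def active_overlays(config, t):
    """Return the overlays a player should render at playback time t."""
    return [o for o in config if o.start <= t < o.end]

# Two different "experiences" for the same video file.
tourist = [Overlay("text", 30.0, -5.0, 10.0, 20.0, "Welcome to the valley")]
advert  = [Overlay("html", 30.0, -5.0, 10.0, 20.0, "<a href='#'>Book now</a>")]

print(len(active_overlays(tourist, 12.0)))  # 1 overlay visible at t=12s
```

Because each configuration is just data, swapping experiences per viewer means loading a different overlay set while the video file stays untouched.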
The CapTion tool will be used to add 3D storytelling characters (Mentors). Mentors will be narrative elements able to tell a story and guide the viewer during the immersive experience. We developed a first operational prototype, producing 3D free-viewpoint content from the multi-view data of human performances acquired by our multi-view capturing tool. In addition, a ray-casting engine was used to develop the fusion engine that will drive and, in the future, guide the merging of the two heterogeneous media (360° video and 3D free-viewpoint video). With respect to guidance, preliminary research has already produced state-of-the-art results in 360° scene understanding, as well as a new dataset that has been made publicly available for research. The final engine will use AI to guide the fusion process and achieve higher photorealism when merging 360° and 3D video.
- Presentation phase
The presentation phase will be supported by three main results:
1) OmniPlay will be the multiplatform Hyper360 player technology, offering the full feature set on several platforms: not only playback of enriched 360° media, but also the transmission of user behaviour data back to the server to evaluate user preferences. The work performed so far led to a first operational prototype of OmniPlay on iOS.
2) personalisation services, delivered by the Profiling and Recommendation Engines. The Profiling Engine will support implicit understanding of user preferences (interests vs. disinterests) based on users' viewing behaviour, while the Recommendation Engine will carry out inferences based on complex domain- and user-specific knowledge, as well as the situational context, to provide personalised content delivery. The work performed so far resulted in a baseline version for semantic interpretation of content and, by extension, of user preferences, as well as a first version of a tractable, cross-platform Recommendation Engine.
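How implicit profiling from viewing behaviour might work can be sketched as follows; the event shape and the dwell-time scoring rule are assumptions made for illustration, not the Profiling Engine's actual design.

```python
# Illustrative sketch of implicit preference profiling: aggregate how long
# a viewer dwelt on content of each topic, then normalise into interest
# scores. Event format and scoring rule are assumptions.
from collections import defaultdict

def build_profile(events):
    """Aggregate dwell time per topic into normalised scores in [0, 1]."""
    dwell = defaultdict(float)
    for e in events:                      # e.g. gaze/click events from the player
        dwell[e["topic"]] += e["seconds"]
    total = sum(dwell.values()) or 1.0
    return {topic: s / total for topic, s in dwell.items()}

events = [
    {"topic": "mountaineering", "seconds": 42.0},
    {"topic": "local-cuisine", "seconds": 6.0},
    {"topic": "mountaineering", "seconds": 12.0},
]
profile = build_profile(events)
# mountaineering dominates the profile (54 of 60 seconds watched)
```

A recommendation engine can then rank new content by how well its topics match these scores.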
3) OmniCloud, the de

Final results

The 360° video domain is a dynamic and rapidly evolving sector in which technical progress happens at a sustained rate.
In relation to the state of the art, Hyper360 has already delivered an open, low-cost and easy-to-use multi-view RGB-D capturing system, as well as a novel, fast performance-capture method based on multi-view RGB-D data. In addition, Hyper360's research results currently represent the state of the art in 360° depth estimation; they were achieved using a novel dataset generated within the project.
Moreover, Hyper360 has extended and delivered a state-of-the-art, lightweight yet expressive fuzzy reasoning service, used for real-time content-to-user preference matchmaking in the context of content recommendation, as well as a user-pertinent, cross-cutting ontology for the networked media domain.
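The kind of fuzzy content-to-user matchmaking described above can be illustrated with a minimal sketch: content items carry graded topic memberships, user preferences are graded too, and a match score aggregates them. The min/max aggregation operators are standard fuzzy-logic choices assumed here, not necessarily those of the actual service.

```python
# Minimal sketch of fuzzy content-to-user matchmaking. Topic membership
# degrees and preference degrees are values in [0, 1]; conjunction is
# modelled with min, aggregation over topics with max (assumptions).

def match(content_topics, user_prefs):
    """Score a content item against a user profile, result in [0, 1]."""
    degrees = [min(c, user_prefs.get(topic, 0.0))
               for topic, c in content_topics.items()]
    return max(degrees, default=0.0)

user = {"hiking": 0.9, "history": 0.3}
clip_a = {"hiking": 0.8, "wildlife": 0.5}   # mostly a hiking clip
clip_b = {"history": 0.7}                   # a history clip

ranked = sorted([("clip_a", match(clip_a, user)),
                 ("clip_b", match(clip_b, user))],
                key=lambda kv: kv[1], reverse=True)
# the hiking clip ranks first for this user
```

Graded memberships are what make the matchmaking "fuzzy": a clip can be partly about hiking and partly about wildlife, rather than belonging to exactly one category.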
Hyper360 will strive to keep its results at the worldwide state of the art, and go beyond it.
To do so the consortium plans to leverage key features of the Hyper360 concept and design:
- Hyper360 will realise a complete and integrated software suite,
- Information layered by Hyper360 tools on 360° video will supplement the original video as a separate set of data, thereby better preserving flexibility in the post-production annotation and personalisation process,
- Hyper360 will adopt open standards whenever possible
- Hyper360 will deliver advanced functionalities, such as support for moving (dynamic) hotspots linked to moving objects shot in the original video, and support for recommendation and personalisation of content based on past viewer preferences and behaviour as well as the current situational context.
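One common way to realise moving (dynamic) hotspots, assumed here purely for illustration, is to store sparse keyframes of the tracked object's spherical position and interpolate between them at playback time:

```python
# Hedged sketch of a "moving hotspot": a clickable region that follows an
# object in the footage via keyframe interpolation. Keyframe format and
# linear interpolation are assumptions; yaw wrap-around is not handled.

def hotspot_position(keyframes, t):
    """Linearly interpolate (yaw, pitch) between keyframes (t, yaw, pitch)."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1:]
    for (t0, y0, p0), (t1, y1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (y0 + a * (y1 - y0), p0 + a * (p1 - p0))
    return keyframes[-1][1:]

# An object tracked across 4 seconds of footage.
track = [(0.0, 10.0, 0.0), (2.0, 30.0, -5.0), (4.0, 50.0, -10.0)]
print(hotspot_position(track, 1.0))   # halfway between the first two keyframes
```

Keeping only sparse keyframes per hotspot keeps the annotation data small, in line with storing enrichment separately from the video itself.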

Website & more info

More info: http://www.hyper360.eu/.