Research Profile: Jason Kimball


Jason Kimball
jkimball@ucsd.edu

Two 4K videos from Baraka

Ph.D. Research Projects

Low Bandwidth Desktop and Video Streaming for Collaborative Tiled Display Environments [1]
High-resolution display environments built on networked, multi-tile displays have emerged as an enabling tool for collaborative, distributed visualization work. They provide a means to present, compare, and correlate data in a broad range of formats, coming from a multitude of different sources. Visualization of these distributed data resources may be performed locally, using clustered processing and display resources for rendering, or the content may be rendered remotely and streamed on demand and in real time. The latter is particularly important when multiple users want to concurrently share content from their personal devices to further augment the shared workspace. This paper presents a high-quality video streaming technique allowing remotely generated content to be acquired and streamed to multi-tile display environments from a range of sources and over a heterogeneous wide area network.

The presented technique uses video compression to reduce the entropy, and therefore the required bandwidth, of the video stream. Compressed video delivery poses a series of challenges for display on tiled video walls, which are addressed in this paper. These include delivery to the display wall from a variety of devices and localities with synchronized playback, seamless mobility as users move and resize the video streams across the tiled display wall, and the low-latency video encoding, decoding, and display necessary for interactive applications. The presented technique is able to deliver 1080p-resolution, multimedia-rich content with bandwidth requirements below 10 Mbps and latency low enough for continuous interactivity. A case study is provided, comparing uncompressed and compressed streaming techniques, with performance evaluations for bandwidth use, total latency, maximum frame rate, and visual quality.
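
As a rough, back-of-the-envelope illustration of the bandwidth gap the paper addresses, the following sketch compares the raw data rate of an uncompressed 1080p stream with the sub-10 Mbps compressed target; the 30 fps rate and 24-bit RGB pixel format are assumptions for illustration, not figures from the paper.

# Back-of-the-envelope comparison of uncompressed vs. compressed 1080p streaming.
# Assumes 30 fps and 24-bit RGB pixels; the actual capture format may differ.

WIDTH, HEIGHT = 1920, 1080
FPS = 30
BITS_PER_PIXEL = 24          # 8 bits per RGB channel, no chroma subsampling

uncompressed_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
compressed_bps = 10e6        # the paper's < 10 Mbps target

print(f"Uncompressed: {uncompressed_bps / 1e6:.0f} Mbps")   # ~1493 Mbps
print(f"Compressed:   {compressed_bps / 1e6:.0f} Mbps")
print(f"Reduction:    {uncompressed_bps / compressed_bps:.0f}x")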

Collaborative video session

Exploration with Live Stereoscopic 3D Video in Mixed Reality Environments [2]
This paper describes an integrated system for the real-time acquisition, streaming, and display of stereoscopic 3D video in mixed reality environments. These mixed reality environments combine real-time stereoscopic video with rendered 3D environments composed of sampled and modeled 3D data from actual landscapes or structures. While live video itself is an important part of exploration, reconnaissance, and documentation, the ability to overlay this video on existing models can significantly add to the context for analysis and decision making. We describe the components of a high-resolution stereoscopic video streaming system integrated with a virtual reality visualization system viewed on high-resolution stereoscopic 3D display walls. The 3D display wall provides a field of view for visualization that can exceed the field of view of the human visual system, allowing for a more immersive and natural experience and greatly extending the visual canvas provided by a stereoscopic video stream. We demonstrate a live stereoscopic video feed streamed to a visualization wall and mixed in real time with a virtual model of that location, and present several different usage scenarios exploring how this new visualization technique would be beneficial.
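
As one minimal sketch of how a synchronized left/right camera pair could be carried through a single video encoder, the snippet below packs the two eye views into one side-by-side frame; the frame layout and the use of numpy are illustrative assumptions, not a description of the system's actual pipeline.

import numpy as np

def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack left/right eye frames (H x W x 3) into one side-by-side frame.

    A single packed frame can then be fed to an ordinary video encoder and
    split again at the display side, keeping the two eyes synchronized.
    """
    assert left.shape == right.shape
    return np.concatenate([left, right], axis=1)   # H x 2W x 3

# Example with dummy 1080p frames
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.zeros((1080, 1920, 3), dtype=np.uint8)
packed = pack_side_by_side(left, right)
print(packed.shape)   # (1080, 3840, 3)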

Stereoscopic live video feed overlayed on a rendered model of the environment.

Ridiculously Scalable Video Playback [3]
This paper introduces a distributed approach for playback of video content at resolutions of 4K (digital cinema) and well beyond. This approach is designed for scalable, high-resolution, multi-tile display environments, which are controlled by a cluster of machines, with each node driving one or multiple displays. A preparatory tiling pass separates the original video into a user-definable n-by-m array of equally sized video tiles, each of which is individually compressed. By only reading and rendering the video tiles that correspond to a given node's viewpoint, the computational load of video playback can be distributed over multiple machines, resulting in a highly scalable video playback system. This approach exploits the computational parallelism of the display cluster while using only minimal network resources to maintain software-level synchronization of the video playback. While network constraints limit the maximum resolution of other high-resolution video playback approaches, this algorithm is able to scale to video at resolutions of tens of millions of pixels and beyond. Furthermore, the system allows for flexible control of the video characteristics, allowing content to be interactively reorganized while maintaining smooth playback. This approach scales well for concurrent playback of multiple videos and does not require any specialized video decoding hardware to achieve ultra-high-resolution video playback. (http://youtu.be/kYpPC63el2w)
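
The tile-selection step at the heart of this approach can be sketched as a simple rectangle intersection: given the n-by-m grid of pre-encoded video tiles and one node's viewport in video coordinates, return only the tiles that node must read and decode. The function and its grid parameters below are a hypothetical illustration, not the RSVP implementation.

def tiles_for_viewport(video_w, video_h, n_cols, n_rows,
                       vp_x, vp_y, vp_w, vp_h):
    """Return the (col, row) indices of pre-encoded video tiles that
    overlap one display node's viewport, so only those tiles are
    read and decoded on that node."""
    tile_w = video_w / n_cols
    tile_h = video_h / n_rows
    first_col = max(0, int(vp_x // tile_w))
    last_col  = min(n_cols - 1, int((vp_x + vp_w - 1) // tile_w))
    first_row = max(0, int(vp_y // tile_h))
    last_row  = min(n_rows - 1, int((vp_y + vp_h - 1) // tile_h))
    return [(c, r) for r in range(first_row, last_row + 1)
                   for c in range(first_col, last_col + 1)]

# A node showing the lower-right quarter of a 4K video split into a 4x4 tile grid
print(tiles_for_viewport(3840, 2160, 4, 4, 1920, 1080, 1920, 1080))
# -> [(2, 2), (3, 2), (2, 3), (3, 3)]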

3 High resolution videos on an 8 screen display


Tera-Scale Atomistic Visualization [4]
This research explores visualization algorithms for real-time exploration of massive, time-varying particle datasets produced by atomistic simulations. Challenges include developing level-of-detail algorithms for unstructured point data and representing sub-pixel features such as occlusion and intersections. The massive size of these datasets (gigabytes per timestep) makes simply loading and rendering an image a challenging task. We are developing new algorithms for the efficient exploration of tera-scale datasets which contain millions to billions of objects as well as thousands of timesteps.

This paper describes the rendering techniques used to produce the 3D volume hierarchy as well as the adaptive volume renderer used for interactive visualization. Three case studies are discussed as examples of interactive visualization. Two are from molecular dynamics simulations: one containing approximately 22 million particles per timestep over 100 timesteps and one containing a total of 230 million particles over 9 timesteps. The third example is a single timestep containing 128 million particles from a cosmology simulation.
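
As a simplified illustration of the volumetric level-of-detail idea, the sketch below bins particle positions into a coarse density volume so that millions of sub-pixel particles can be represented by a single small grid when viewed from far away; the numpy-based binning and the 64^3 resolution are assumptions, not the paper's actual hierarchy construction.

import numpy as np

def particles_to_density_volume(positions, bounds_min, bounds_max, res=64):
    """Bin particle positions (N x 3) into a res^3 density volume.

    A coarse volume like this can stand in for millions of sub-pixel
    particles at far viewing distances, which is the idea behind a
    volumetric level-of-detail representation.
    """
    bounds_min = np.asarray(bounds_min, dtype=np.float64)
    extent = np.asarray(bounds_max, dtype=np.float64) - bounds_min
    idx = ((positions - bounds_min) / extent * res).astype(int)
    idx = np.clip(idx, 0, res - 1)
    volume = np.zeros((res, res, res), dtype=np.float32)
    np.add.at(volume, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
    return volume

# One million random particles binned into a 64^3 volume
pts = np.random.rand(1_000_000, 3)
vol = particles_to_density_volume(pts, [0, 0, 0], [1, 1, 1])
print(vol.shape, vol.sum())   # (64, 64, 64) 1000000.0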

This project is in collaboration with Mark Duchaineau of Lawrence Livermore National Laboratory. (Read More)

Far and near views of two different dislocation dynamics datasets, both containing approximately 20 million data points, rendered with the volumetric-impostor-based level-of-detail algorithm.

Media-Rich Streaming for Remote Simulation and Training [5]
In this paper we present an implementation intended for the training of control console operators, which allows remote visualization of and interaction with a simulation at HD resolution over a 1.5 Mbps T-1 data line. This allows users to train while on active deployment by using a dummy console that receives and displays the video stream while sending user input events back to the simulation computer. The dummy console costs significantly less than an actual simulation console and can be deployed virtually anywhere in the world with a satisfactory internet connection. Furthermore, we demonstrate the option of mixing virtual and augmented reality with the streamed simulation content. This allows for training when console workspaces are either unavailable or cannot be taken offline from their active use.
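
The return path described above, where the dummy console sends user input events back to the simulation computer, can be sketched as a tiny event protocol; the JSON message format, UDP transport, and host address below are hypothetical choices for illustration only.

import json
import socket
import time

SIM_HOST = ("192.0.2.10", 5005)   # hypothetical simulation host address

def send_input_event(sock, device, action, **fields):
    """Serialize one console input event and send it to the simulation host.

    Each event is only a few dozen bytes, so the upstream path stays far
    below the 1.5 Mbps line even under heavy interaction.
    """
    event = {"t": time.time(), "device": device, "action": action, **fields}
    sock.sendto(json.dumps(event).encode("utf-8"), SIM_HOST)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_input_event(sock, "mouse", "move", x=512, y=384)
send_input_event(sock, "key", "press", code="F2")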

Using virtual reality to overlay simulation video streams on simulated consoles.

HD Teleconferencing and Video Mobility
Middleware for the acquisition, streaming, and presentation of HD video and audio sources. This project ties together video capture hardware with a real-time texture compression library and a multicast streaming protocol to deliver multiple HD-resolution AV streams over gigabit networks, with support for mobility, which is useful for tiled displays. Input sources include HD video cameras, laptop/desktop machines, and even video game consoles. (Read More)
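
A minimal sketch of the multicast delivery idea follows: a capture node splits each already-compressed frame into numbered datagrams and sends them to a multicast group that any number of display nodes can join; the group address, port, and chunk size are assumptions, not the middleware's actual protocol.

import socket
import struct

MCAST_GROUP = "239.255.42.42"   # hypothetical multicast group
MCAST_PORT = 6000
CHUNK = 1400                    # keep datagrams under a typical MTU

def send_frame(sock, frame_bytes: bytes, frame_id: int):
    """Split one compressed frame into numbered datagrams and multicast them,
    so every display node subscribed to the group receives the same stream."""
    for seq, off in enumerate(range(0, len(frame_bytes), CHUNK)):
        header = struct.pack("!II", frame_id, seq)
        sock.sendto(header + frame_bytes[off:off + CHUNK],
                    (MCAST_GROUP, MCAST_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
send_frame(sock, b"\x00" * 5000, frame_id=0)   # dummy compressed frame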

A 720p60 HD video stream displayed on HIPerSpace that is responsive enough to play video games.

Dynamic IBR techniques for fixed cost stereoscopic support [9]
This project presents a GPU-based implementation of an image-based rendering method for reducing the cost of stereoscopic rendering to the cost of rendering a single monoscopic image plus a smaller fixed cost. Our approach is to use the color and depth information from the rendered image of one eye to produce a reconstructed depth sprite, which is rendered for the other eye. A GPU hardware-accelerated technique for producing and rendering this depth sprite at rates above 60 Hz is presented. Our technique enables the real-time stereoscopic display of complex and data-intensive objects which are currently constrained to monoscopic rendering technology. (Read More)
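
The reprojection idea can be illustrated with a simplified CPU-side sketch: each pixel of the rendered eye is shifted horizontally by a disparity derived from its depth to approximate the second eye's view. The actual technique builds and renders a depth sprite on the GPU; the disparity formula and the naive hole handling below are illustrative assumptions.

import numpy as np

def reproject_to_other_eye(color, depth, eye_sep=0.06, focal=1000.0):
    """Shift each pixel horizontally by a depth-dependent disparity to
    approximate the view from the second eye.

    color: H x W x 3 image rendered for one eye
    depth: H x W depth buffer (larger = farther)
    Holes left by disoccluded regions are simply left black here.
    """
    h, w, _ = color.shape
    disparity = (eye_sep * focal / np.maximum(depth, 1e-3)).astype(int)
    out = np.zeros_like(color)
    xs = np.arange(w)
    for y in range(h):
        new_x = np.clip(xs + disparity[y], 0, w - 1)
        out[y, new_x] = color[y, xs]
    return out

# Dummy one-eye render: random colors at a constant depth
color = np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8)
depth = np.full((240, 320), 500.0)
right_eye = reproject_to_other_eye(color, depth)
print(right_eye.shape)   # (240, 320, 3)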

3D anaglyph rendered using fixed cost algorithm



Collaborative Visualization Environment [10] [11]
Imaging techniques such as MRI, fMRI, CT, and PET have provided physicians and researchers with a means to acquire high-quality biomedical images as the foundation for the diagnosis and treatment of diseases. This research presents a framework for collaborative visualization of biomedical datasets, supporting heterogeneous computational platforms and network configurations. The system provides the user with data visualization, annotation, and the middleware to exchange the resulting visuals between all participants in real time. A resulting 2D visual provides a user-specifiable high-resolution image slice, while a resulting 3D visual provides insight into the entire dataset. To address the costly rendering of large-scale volumetric data, the visualization engine can distribute tasks over multiple render nodes. (Read More)
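
As a minimal sketch of the 2D visual described above, the snippet below extracts a user-specified, axis-aligned slice from a volumetric dataset so that the resulting image, rather than the full volume, can be annotated and exchanged; numpy and the axis convention are assumptions for illustration.

import numpy as np

def extract_slice(volume: np.ndarray, axis: int, index: int) -> np.ndarray:
    """Return one axis-aligned 2D slice of a 3D volume (e.g. a CT scan).

    In a collaborative session the resulting image, rather than the full
    volume, is what gets annotated and exchanged between participants.
    """
    return np.take(volume, index, axis=axis)

# Dummy 256^3 volume standing in for a CT dataset
ct = np.random.rand(256, 256, 256).astype(np.float32)
axial = extract_slice(ct, axis=2, index=128)   # one axial slice
print(axial.shape)   # (256, 256)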

A picture of the CVE program with a human head CT scan.

Publications

  • [1] Kimball, J., Wypych, T., Kuester, F. Low Bandwidth Desktop and Video Streaming for Collaborative Tiled Display Environments, Future Generation Computer Systems. In press.
  • [2] Kimball, J., Wypych, T., Kuester, F. Exploration with Live Stereoscopic 3D Video in Mixed Reality Environments, in Aerospace Conference, 2014 IEEE, pp. 1-7.
  • [3] Kimball, J., Ponto, K., Wypych, T., Kuester, F. RSVP: Ridiculously Scalable Video Playback on Clustered Tiled Displays, in International Symposium on Multimedia, 2013 IEEE, pp. 1-7.
  • [4] Kimball, J., Duchaineau, M., Kuester, F. Interactive Visualization of Large Scale Atomistic and Cosmological Particle Simulations, in Aerospace Conference, 2013 IEEE, pp. 1-7.
  • [5] Kimball, J., Wypych, T., Hoepner, S., Kuester, F. Media-Rich Streaming for Remote Simulation and Training, in Aerospace Conference, 2012 IEEE, pp. 1-7.
  • [6] Seracini, M., Kuester, F., De Vita, M., Olsen, M.J., Ponto, K., Kimball, J., Corazzini, S., and Bonini, C. (2010). Alla riscoperta di Palazzo Medici Riccardi, Campagna di indagini diagnostiche per lo studio e la caratterizzazione dell'evoluzione architettonica del monumento [In English: "Rediscovering Palazzo Medici Riccardi. Diagnostic Investigation to Study and Characterize the Monument's Architectural Evolution"], pp. 241-249, 2010.
  • [7] Olsen, M.J., Ponto, K., Kimball, J., Seracini, M., and Kuester, F. (2010). 2D open-source editing techniques for 3D laser scans. Computer Applications and Quantitative Methods in Archaeology - CAA'2010.
  • [8] Ponto, K. and Wypych, T. and Doerr, K. and Yamaoka, S. and Kimball, J. and Kuester, F. (2009). VideoBlaster: A Distributed, Low-Network Bandwidth Method for Multimedia Playback on Tiled Display Systems. 11th IEEE International Symposium on Multimedia. 201-206.
  • [9] Kimball, J., Petrovic, V., and Kuester, F. (2006). Dynamic IBR techniques for fixed cost stereoscopic support. In Proceedings of IEEE VR 2006.
  • [10] He, Z., Kimball, J., and Kuester, F. (2005). Distributed and collaborative biomedical data exploration. Lecture Notes in Computer Science, 3804:271-278.
  • [11] Kuester, F., He, Z., Kimball, J., Quintos, M., and Tresens, M. A. (2005). Collaborative biomedical data exploration in distributed virtual environments. In J. D. W. et al., editor, Medicine Meets Virtual Reality 13, Studies in Health Technology and Informatics. IOS Press.

