Projects

At the Research Computing Center’s Visualization Lab, we foster collaborative, cutting-edge research in the physical sciences, the social sciences, and the humanities. Our projects are diverse: computational models of cell behavior, interactive stellar and interstellar maps, medical tools for enhanced cancer detection and diagnosis, VR models of the human heart that aid in cardiac surgery, and the geospatial mapping of historical cultural data, among many others. To bring this work to life, we partner with researchers from diverse fields across the University of Chicago’s campus, as well as with researchers at Fermilab, Argonne National Laboratory, and the Adler Planetarium.

Project categories: 3D Visualization, Scientific Visualization, Information Visualization, Medical Imaging, Stereoscopic Visualization, Virtual Reality

3D Visualization

3D visualization is the creation of three-dimensional computational models from both two-dimensional and three-dimensional datasets. We’ve leveraged this form of visualization to model binary data, histopathology images, MRI/CT data, electron microscopy data, single-plane illumination microscopy data, and ultrasound data, among others. Our work in this field has supported researchers as they explore the adaptations of nonhuman animals.
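
As a general illustration of this workflow, the sketch below shows one common way to turn a volumetric dataset (such as an MRI/CT or microscopy stack) into a 3D surface model using the marching cubes algorithm. The file name and iso-surface threshold are assumptions for illustration, not details of any specific lab project.

```python
import numpy as np
from skimage import measure

# Load a volumetric dataset (hypothetical file name); shape is (z, y, x).
volume = np.load("scan_volume.npy")

# Choose an iso-surface threshold; this simple heuristic is an assumption.
level = volume.mean() + volume.std()

# Marching cubes converts the iso-surface into a triangle mesh (vertices + faces),
# which can then be rendered or 3D-printed.
verts, faces, normals, values = measure.marching_cubes(volume, level=level)
print(f"Reconstructed mesh: {len(verts)} vertices, {len(faces)} triangles")
```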

3D Reconstruction of Segmented Leukocyte Cells in Cephalopod Skin

The optical properties of cephalopod skin provide the basis for cephalopods’ sophisticated camouflage capabilities. These properties are due to two cell types, leukocytes and iridocytes, each of which contains biological nanoparticles that diffuse and reflect light in unusual ways. In collaboration with the University-affiliated Marine Biological Laboratory, we created a 3D reconstruction of cephalopod leukocyte cells in order to computationally model their optical behaviors.

Scientific Visualization

The University of Chicago’s labs collect and generate large datasets, ranging from massive star envelopes and high-resolution planetary imaging to simulated cell behavior and the simulated chemical interactions of tens of thousands of substances. Using the Research Computing Center’s Midway supercomputing clusters, we process these data and visualize the resulting information in forms that support ongoing research across the sciences and the humanities.

Visualization of Villi Cells

Intestinal villi are finger-like projections that line the interior of the intestines and thereby increase its absorptive surface area. In collaboration with the University’s Department of Surgery, we are creating agent-based simulations of villi cells in order to study inflammatory diseases of the intestines. Because villi grow out of an underlying substrate of ‘crypt’ cells, our simulations represent this substrate as a planar surface on which three different species of simulated villi cells grow. The goal of this project is to develop a rendering method for villi cell visualizations.
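
The sketch below is a minimal, hypothetical illustration of this kind of agent-based model: three cell species seeded on a planar substrate grid colonize neighboring sites over repeated steps. All rules and parameters here are assumptions for illustration; they do not reproduce the Department of Surgery’s actual simulation.

```python
import random

GRID = 50                      # planar substrate resolution (assumed)
SPECIES = (1, 2, 3)            # three simulated villi cell species
grid = [[0] * GRID for _ in range(GRID)]

# Seed a few cells of each species at random substrate sites.
for species in SPECIES:
    for _ in range(5):
        grid[random.randrange(GRID)][random.randrange(GRID)] = species

def step(grid):
    """One growth step: each occupied site may colonize one empty neighbor."""
    new = [row[:] for row in grid]
    for y in range(GRID):
        for x in range(GRID):
            if grid[y][x] == 0:
                continue
            dy, dx = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            ny, nx = (y + dy) % GRID, (x + dx) % GRID
            if new[ny][nx] == 0:
                new[ny][nx] = grid[y][x]
    return new

for _ in range(100):
    grid = step(grid)
```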

Information Visualization

Information visualization is the display of abstract data in ways that tell a story and/or reconstruct the context from which the data was drawn. It can be leveraged to produce compelling visual narratives that convey humanistic as well as scientific data. In collaboration with the Neubauer Collegium, we are supporting researchers from sociology, the history of science and medicine, political science, and Romance languages and literatures as they reconstruct historical events, trace textual transmission, and represent contemporary issues in journalism.

Visualization for Understanding and Exploration (VUE)

Visualization for Understanding and Exploration (VUE) is a research project that explores how data visualization can be leveraged to conduct and convey research in the humanities and the humanistic social sciences. VUE is supported by the Neubauer Collegium for Culture and Society. Through the Vis Lab, VUE provides ongoing technical support to humanities researchers whose projects visualize work involving text analysis and geographical mapping. More information on VUE can be found here.

Medical Imaging

Medical imaging refers to the visualization of the interior of the human body through noninvasive means. It encompasses a suite of technologies, including MRI, X-ray, ultrasound, endoscopy, and thermography. Collectively, these technologies are used to monitor changes in the body over time, diagnose illness, and select treatment strategies. In collaboration with doctors from the University of Chicago Medical Center, the University’s MRI Research Center, and leading technology companies, we are developing tools to improve prostate cancer detection and to train the next generation of medical professionals.

Deep Learning Tools for Automated Prostate Cancer Detection

Prostate cancer (PCa) is the most common non-cutaneous malignancy and one of the most common causes of cancer-related death among men in the United States. In collaboration with Lenovo, Intel, and the MRI Research Center at the University of Chicago, we’ve developed tools for the automatic segmentation of the prostate gland as a whole as well as of its discrete sectors. We’ve additionally developed tools for the automatic detection of significant tumor findings.
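
For readers unfamiliar with the approach, the sketch below shows the general shape of a convolutional segmentation model in PyTorch: an encoder extracts features from an MRI slice, and a 1x1 convolution produces per-pixel class scores. Every layer size and shape here is an illustrative assumption; this is not the architecture of the tools developed with Lenovo and Intel.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy fully convolutional network for slice-wise segmentation (illustrative only)."""
    def __init__(self, in_channels=1, num_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # 1x1 convolution maps features to per-pixel class scores
        # (e.g., background vs. prostate gland).
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        return self.head(self.encoder(x))

model = TinySegNet()
mri_slice = torch.randn(1, 1, 256, 256)   # one single-channel MRI slice (random stand-in)
logits = model(mri_slice)                  # shape (1, num_classes, 256, 256)
mask = logits.argmax(dim=1)                # predicted per-pixel segmentation mask
```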

Learn Radiology: Prostate

A web application for training radiologists in cancer detection

Prostate cancer accounts for 20% of all new cancer cases among US men, with approximately 174,650 men newly diagnosed each year. In 2019 alone it accounted for approximately 31,620 deaths, placing it behind only lung, pancreatic, and colorectal cancers in cancer-related mortality. In collaboration with researchers in the University’s Department of Radiology, we are developing Learn Radiology: Prostate to better train radiology residents and clinicians to recognize prostate pathology and to address prostate-related mortality and morbidity. This web application uses multi-parametric MRI (dynamic contrast-enhanced, T2-weighted, diffusion-weighted, and T1-weighted imaging) and whole-mount histology datasets. Learn Radiology: Prostate can be found here.

PCampReview Tool

In collaboration with radiologists in the MRI Research Center at the University of Chicago, the RCC has developed Multiparametric Image Review (mpReview), a 3D Slicer module intended to support the review and annotation of multi-parametric image data. The driving use case for the development of this module is the review and segmentation of regions of interest in prostate cancer multi-parametric MRI.

Brain Atlas

Brain Atlas is an interactive digital tool that allows users to explore the major neuroanatomical structures of the brain. Through its online interface and its database of MRI scans, the project supports neurobiological teaching and learning, as well as the testing and correlation of lesions found on patients’ scans. The development of Brain Atlas was headed by Professor Hekmat-Panah, MD, a neurosurgeon and neurologist with 35 years of experience.

Stereoscopic Visualization

Stereoscopic visualizations are produced by projecting two nearly identical images, one for each eye’s viewpoint, onto a single screen, thereby producing an illusion of three-dimensionality. This form of visualization supports intuitive comprehension of physical material gathered from scans and of data from simulated interactions of physical substances and phenomena. Through stereoscopic visualization of their data, we support University of Chicago researchers as they train the next generation of surgeons, radiologists, and chemists.
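
The sketch below illustrates the underlying principle: the same scene is projected from two virtual cameras offset horizontally by roughly the distance between the eyes, and the two resulting images form the stereo pair. The projection model, point cloud, and offsets are all assumptions for illustration.

```python
import numpy as np

def project(points, eye_offset, focal=1.0):
    """Perspective-project 3D points (N, 3) from a camera shifted by eye_offset along x."""
    shifted = points - np.array([eye_offset, 0.0, 0.0])
    return focal * shifted[:, :2] / shifted[:, 2:3]

# A toy point cloud placed in front of the cameras (depth z between 2 and 3).
points = np.random.rand(1000, 3) + np.array([0.0, 0.0, 2.0])

ipd = 0.065                        # assumed interocular distance in meters
left_image = project(points, -ipd / 2)
right_image = project(points, +ipd / 2)

# Presenting left_image to the left eye and right_image to the right eye
# (e.g., via polarized or shutter glasses) produces the illusion of depth.
```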

Chemistry Lab 

Once a year, molecular engineering students use the stereoscopic projector to visualize protein simulations that they have developed using the RCC’s Midway supercomputing cluster. Students have modeled 2m4j, a molecule found in the brains of Alzheimer’s patients, and 2zoi, a molecule that is important for the study of photochemistry.

Virtual Reality

Virtual reality (VR) refers to a simulated environment in which users may interact with realistic virtual objects. The experiential verisimilitude of VR makes it useful for a wide array of fields and practices, including entertainment applications such as video games; occupational training of pilots, surgeons, astronauts, and others; and the recreation of real environments, such as the Louvre or the surface of Mars, that are inaccessible to most users. At the Visualization Lab, we’ve used VR to model the human heart, map human migration patterns, and explore data generated by CERN’s Large Hadron Collider.

Visualization of Genes Mirroring Geography

Developed as a collaboration between John Novembre, Professor of Human Genetics, and computational scientists at the Research Computing Center, this project used principal component analysis (PCA) to analyze genetic variation across Europe. The results of this analysis were then represented through a 3D visualization produced by the RCC’s Data Visualization Lab.
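
The general analysis pattern can be sketched as follows: PCA reduces a genotype matrix to a few components, which are then plotted in 3D. The random matrix below is a stand-in for real genetic data; it is not the project’s dataset or code.

```python
import numpy as np
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

# Toy genotype matrix: 500 individuals x 2000 markers (0/1/2 allele counts).
genotypes = np.random.randint(0, 3, size=(500, 2000)).astype(float)

# Reduce to the first three principal components.
pcs = PCA(n_components=3).fit_transform(genotypes)

# 3D scatter of the components; with real data, individuals cluster by geographic origin.
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(pcs[:, 0], pcs[:, 1], pcs[:, 2], s=5)
ax.set_xlabel("PC1"); ax.set_ylabel("PC2"); ax.set_zlabel("PC3")
plt.show()
```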

Bento Box

Bento Box is a virtual reality visualization environment that simulates the blood flow around a cardiac lead in the right atrium of the heart. Designed to support cardiologists as they train medical students and prepare for heart valve replacement surgery, it allows for the generation and sub-volume selection of multiple time-varying data instances. Bento Box is a collaboration between Dr. H. Birali Runesha, Assistant Vice President for Research Computing and Director of the Research Computing Center, and researchers at the University of Minnesota.
