A scene in Star Wars. Life-like avatars. The Cave Automatic Virtual Environment. They all germinated in UIC’s Electronic Visualization Lab, a place where the clock is always 15 years fast.
By Stuart Luman
On a sunny August afternoon, undergraduates are perusing the tables arranged around the East Campus, learning about clubs and groups to join, while others enjoy the warm day reading in the shade of trees. At the same time, 25 Electronic Visualization Laboratory graduate and undergraduate students have gathered in the lab’s Cyber-Commons on the second floor of UIC’s Engineering Research Facility to discuss such hard-core geek topics as tele-immersion, autostereoscopic 3-D imaging and large-scale computer display technologies.
The Cyber-Commons is a 1,000-square-foot, fluorescent-lit conference space within the lab’s warren of cinder-block offices and hallways. Students sit at two long grey tables, focused almost entirely on their laptops. Nine of them busily tweak PowerPoint slides for a presentation to peers on their latest research. Each of their computers also shares a portion of a giant 16-foot-long display at the front of the room, made up of eight tiled flat-panel screens.
Their computers communicate wirelessly with the wall via a system known as SAGE, or Scalable Adaptive Graphics Environment, which was designed at EVL. The software currently runs display walls at more than 70 major institutions around the globe, including NASA, Argonne National Laboratory, Beijing University, the Russian Academy of Sciences and King Abdullah University of Science and Technology in Saudi Arabia.
To an outsider, the technology is extraordinary. Each student’s mouse pointer skips and races past the others on the giant screen. With a click, an individual can take over the digital workspace or collaborate with others who have their own displays thousands of miles away, from sharing documents to hosting a live videoconference in multiple windows, at speeds that dwarf the fastest home connections.
For the students, however, the current setup is a step down. Earlier in the summer, they had the opportunity to play on a 20-foot-wide display wall (since disassembled). Equipped with multi-touch technology, it allowed multiple students to interact simultaneously with data, paint on the wall or play video games. Associate Lab Director Maxine Brown says it was the largest such screen in the world. EVL researchers are currently planning its replacement, which will be unveiled by year’s end and will incorporate 3-D technology.
Arthur Nishimoto ’10 las, a second-year master’s student in computer science, sits at the front of the room. He peers intently at his computer through small-framed metal glasses. Although reserved in person, Nishimoto recently became a minor Internet celebrity thanks to a YouTube video viewed by more than 600,000 people, which showed him and his fellow students demonstrating Fleet Commander, a Star Wars-themed game he created.
The game allows multiple students to move their fingers over the screen to launch and command Imperial and Rebel Alliance starships, and has a level of natural interaction that is reminiscent of science fiction films such as Minority Report. Representatives from LucasArts were so enamored with the demo that they offered Nishimoto an internship for next summer.
“We’re very proud that our students are so in demand,” says Brown, almost cooing with pride as she speaks. She lists a handful of the institutions where EVL students have gone over the years, including NASA, Argonne, Intel, the Field Museum and the Adler Planetarium, and the TV shows and movies they’ve worked on, including Max Headroom, Star Wars, Shrek and The Lord of the Rings. In Nishimoto’s case, Brown was also concerned that he’d drop out of school for a full-time job.
This isn’t the first time that EVL (and EVLers, as they’re known) has received such attention. Jason Leigh, phd ’98 eng, the lab’s director, was recently profiled on PBS’s Nova ScienceNOW and the Discovery Channel’s Popular Science’s Future Of.
What caught the producers’ attention was Leigh and his colleagues’ efforts to create life-like computer avatars. The project, fittingly called LifeLike, seeks to digitally capture a person’s likeness, facial expressions and speech patterns and to create virtual reproductions that can interact with people far into the future. It’s all part of an effort by Leigh—and those at EVL—to push the boundaries of technology. “It’s one step closer to living in my Star Trek future,” says the 46-year-old Leigh, who has spent the last 20 years at EVL, first as a graduate student and then a professor, before becoming its director in 2005. “When I was a kid, I wanted to live in the science fiction world that Captain Kirk was living in. All we’re trying to do is to make it real.”
Star Wars and GRASS
Since its founding in 1973 as the Circle Graphics Habitat by artist and physicist Daniel Sandin and computer scientist Tom DeFanti, EVL has been a bridge between technology and the arts. A joint program of UIC’s College of Engineering and the School of Art and Design, it was one of the nation’s first such labs. EVL students can graduate with an MFA in electronic visualization (perhaps, the oldest degree of its kind in the country) or an MS or Ph.D. in computer science.
Although not as well known or well funded as institutions such as MIT’s Media Lab, EVL has a long history of radically changing the way people visualize and interact with data and technology. “EVL may be the most important center for visualization innovation in the United States,” says Larry Smarr, director of the California Institute for Telecommunications and Information Technology at the University of California, San Diego, and founding director of the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, where he first started working closely with EVL faculty and students. “It’s looked at globally as a center for innovation in visualization and the interface between computer science and art.”
EVL pioneered ways to use and program computers to create and manipulate video, the high-tech medium of the time. Those early EVLers were as much hippies as technologists, and that culture is still a core part of the lab’s ethos and identity. In the 1970s, DeFanti created a language he dubbed GRASS, for Graphics Symbiosis System, a breakthrough computer language intended specifically for artists. Computer animator Larry Cuba used GRASS to create the famous “Attacking the Death Star will not be easy” sequence in the original 1977 Star Wars. Sandin built his eponymous Sandin Image Processor, an inexpensive analog computer that used patch cables and dials to distort and manipulate video in real time, and then gave the design away for free. Spiral 5 PTL and Wandawega Waters, both examples of his early video art, are in the permanent collection of the Museum of Modern Art in New York City.
Hanging on the lab’s walls today are photos from that era: long-haired computer scientists and artists sitting in front of now-ancient machinery that projected fantastic swirls and colors onto displays, set to synthesized music. Sandin, DeFanti and their students and colleagues held public computer art shows of their work to reach audiences that had never seen anything like it before. “Both Tom and I were completely convinced that our work was part of a transformative process that was changing culture, changing politics, changing education and changing our lives,” says Sandin of those early years.
As technology has advanced, EVL’s focus has shifted from video art to virtual reality and advanced computer displays. In 1992, EVL unveiled the CAVE—Cave Automatic Virtual Environment. Visitors wore special glasses to experience in 3-D a fully immersive 10-foot-cube space—pictures were rear-projected onto three walls and the floor—which did away with the awkwardness of wearing a heavy, head-mounted display system.
Featured in early issues of Wired and other tech magazines, the first application of the CAVE immersed viewers in the Big Bang. “They were literally floating in [the sky] and watching the filaments of the universe form,” says Brown, who has been an integral part of EVL since DeFanti hired her in 1986. “It’s the closest thing to a holodeck, where you are in a room, the walls disappear and you see 3-D.”
Researchers at corporate, university and government labs across the nation and throughout the world soon became interested in building their own CAVEs. Early uses included creating virtual tornados for atmospheric scientists (they could stand in the middle of the windstorm’s eye); modeling molecules, so researchers could manipulate and move intricate collections of atoms; and performing brain simulations to demonstrate the connections of neurons and electronic pulses as they fired. General Motors built its own CAVE to quickly visualize new vehicle designs.
Twenty years later, such advanced computer display applications are still rare. EVL Associate Professor Andy Johnson, who first visited the lab at that time, was so awed by what he saw that he switched the focus of his Ph.D. research and later joined the lab’s faculty. “It was so far beyond what a normal human being had access to,” he says.
EVL has since created other visualization display technologies that have gone on to find real uses in the world. One such application is CoreWall, which allows geophysicists and geologists to view and annotate high-resolution scans of drilled core samples to help them better understand how the Earth has changed over millennia.
Before CoreWall, scientists relied on photographic methods to record and study hundreds of feet of earthen cores, a lengthy and difficult process. “Basically this was a discipline that hadn’t entered the digital age yet,” relates Johnson, who advised on the project.
Artificially intelligent sentries
That said, the lab remains focused on research that might not pay off for at least 15 years and that corporations can’t or won’t indulge in. One example is advanced networking, which EVL can undertake because it benefits from an incredibly fast fiber-optic network that can move data at 30 gigabits per second, thousands of times faster than a typical home broadband connection. At the same time, the lab is building larger and larger tiled flat-panel displays, which users can touch, interact with and link to via SAGE, allowing groups to collaborate closely and to work together from anywhere on the planet, eliminating distance barriers.
All of our projects have “always [shared] the same vision; the CAVE was part of that and the tile displays were part of that,” says Johnson, sitting at a table while eyeing the lab’s present-day, display-less walls, ceiling and floor. “We completely believe in a future where all walls are going to be displays, tables are going to be displays, the floor at some point, the ceiling, eventually all these rooms will be surfaces that we interact with.”
According to Leigh, such developments are just the beginning. He cites futurist writers’ predictions of a coming “singularity”—a point in time when computer intelligence eclipses human control or understanding—as something that research at EVL might hasten.
For instance, the lab is currently working on using its life-like avatars as artificially intelligent sentries that can communicate with and identify people as naturally as a human receptionist. Human augmentics, another far-reaching EVL project, seeks to employ wireless technologies to gather real-time health data for constant medical monitoring that keeps people healthier; to expand the capabilities and characteristics of humans; to explore the ethical implications of such technologies; and to rehabilitate those who have lost skills or abilities due to injury or disease. In the short term, however, EVL is planning to build a next-generation CAVE, which will incorporate 3-D flat-panel tiled screens that extend from floor to ceiling and curve to completely surround a viewer. Although earlier CAVEs confined visitors to a small walking space (due to the size and quality of projector technology at the time), the new CAVE will be 30 feet wide and allow for much larger exploration. Leigh also envisions screen resolutions so high that the human eye won’t be able to distinguish any lines or pixels, creating highly believable virtual spaces. “When people see it,” he promises, “it will be a religious experience.”