I am part of a small group of students at Northwestern that develops and deploys the Data Visualization System (DVS). The DVS tracks the progress and medical information of thousands of runners, keeping race officials stationed at forward command tents informed about the status and well-being of participants.
On race day, it's used by medical professionals, public safety officials, federal government agencies, and race organizers, who monitor race progress and safety on their phones, tablets, and laptops, and on large displays in forward command.
The system is currently in use across four events, keeping over 100,000 runners from around the world safe every year.
During the past two years, I've helped transition the DVS to a responsive, mobile-friendly web app with new features and an improved design.
My biggest contribution was redesigning and rewriting the dynamic course map, which displays runner density, medical information, GPS-tracked runners, weather, emergencies, and more. I built the map with the Mapbox GL JS API.
Through user testing and collaboration with Northwestern data visualization researchers at the Visual Cognition Lab, I optimized the design so officials can rapidly absorb the data it presents.
I also spend much of my time on the project facilitating communication between our team and race officials, ensuring the system we build meets their needs. I'm on site on race days, deploying and operating the live system.
Before the DVS, race officials tracked critical runner and medical information on a whiteboard, a solution that was messy, hard to update, and accessible to only a few people at a time. The first DVS team created a digital replacement, which was refined continuously over the years. The team I joined worked hard to make it as accessible, functional, and beautiful as possible.
In April 2018, our system took first place in the INFORMS Innovative Applications in Analytics Award competition at the 2018 INFORMS Business Analytics Conference in Baltimore, where we competed against finalists including IBM, Macy's, BNSF, and Schneider.
Our work will be published in an upcoming issue of the operations research journal Interfaces.
Citation:
Mehmet Başdere, Gabriel Caniglia, Charles Collar, Christian Rozolis, George Chiampas, Michael Nishi, and Karen Smilowitz. "SAFE: A Comprehensive Data Visualization System." Interfaces, forthcoming 2019.
This project was a collaboration with Dr. Özge Samanci, a new media artist and Northwestern professor in the Radio, Television, and Film Department. We started it in early 2017, based on her idea for an interactive piece that would symbolize humanity's connection to nature: an installation demonstrating that human thinking has a direct impact on the environment.
I was in charge of all programming and implementation for the installation, which I built with the Unity game engine and C#. The installation uses the NeuroSky MindWave, a consumer neuroheadset we chose for its low cost and ease of use. A single participant wears the headset, which measures their general level of focus: the more focused they are, the stormier the ocean becomes. The project was funded by the Undergraduate Research Assistant Program.
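To give a concrete sense of the focus-to-storm mapping, here's a minimal Unity C# sketch of the idea rather than the installation's actual code. It assumes the headset SDK delivers NeuroSky's 0-100 "attention" reading through some callback, and the shader property names are hypothetical placeholders for whatever the real ocean shader exposes.

```csharp
using UnityEngine;

// Minimal sketch: ease the sea state toward the participant's focus level.
// OnAttentionReceived is meant to be wired to the headset SDK's attention
// event; "_WaveHeight" and "_Choppiness" are hypothetical shader properties.
public class OceanStormController : MonoBehaviour
{
    [SerializeField] private Material oceanMaterial;  // real-time ocean shader
    [SerializeField] private float smoothing = 0.25f; // storminess units/second

    private float targetStorminess; // 0 = calm, 1 = full storm
    private float storminess;

    public void OnAttentionReceived(int attention)
    {
        // NeuroSky's eSense attention value ranges from 0 to 100.
        targetStorminess = Mathf.Clamp01(attention / 100f);
    }

    private void Update()
    {
        // Raw EEG-derived readings are noisy, so ease toward the latest
        // target instead of letting the ocean snap between states.
        storminess = Mathf.MoveTowards(storminess, targetStorminess,
                                       smoothing * Time.deltaTime);

        oceanMaterial.SetFloat("_WaveHeight", Mathf.Lerp(0.2f, 3.0f, storminess));
        oceanMaterial.SetFloat("_Choppiness", Mathf.Lerp(0.1f, 1.0f, storminess));
    }
}
```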
Dr. Samanci and I worked in close collaboration, meeting often to refine the installation's design and direction. I spent much of my time working to make the ocean and sky as photorealistic as possible given the constraints of real-time rendering.
In August 2017, we filmed the demo video for the final project, which you can watch at the top of the page. In it, you'll see how the installation works and the wonderful ways a person's thoughts can turn an ocean from calm to stormy. Since we filmed the video, the project has become quite a success, traveling to selective exhibitions around the world.
Dr. Samanci and I wrote an article about "You Are the Ocean" for our exhibition at the Creativity & Cognition conference, sponsored by ACM SIGCHI. Read the article to learn more about the installation's artistic motivation, technical implementation, and participant interaction.
We also wrote a shorter article for Leonardo, a journal at the intersection of the arts, science, and technology, published by The MIT Press. It debuted at SIGGRAPH 2018, the world's largest computer graphics conference.
Citations:
Özge Samanci and Gabriel Caniglia. "You Are the Ocean." Leonardo, Volume 51, Issue 4. MIT Press, 2018.
Özge Samanci and Gabriel Caniglia. "You Are the Ocean: Interactive Installation." In Proceedings of the 2019 Conference on Creativity and Cognition (C&C '19), 414–421. ACM, New York, NY, USA.
I began working at The Garage when it first opened, in the fall of 2015. As a technical consultant, I help manage the technology in the space and teach students, faculty, and visitors how to use the equipment.
A problem quickly became apparent: The Garage had no central location for its growing collection of 3D printers, CNC mills, and other maker equipment. During the summer of 2016, I had the unique opportunity, as a full-time intern, to convert one of the classrooms into a Makerspace. The goal was to create a dedicated learning and working environment for students interested in AR, VR, and hardware.
About two-thirds of the space is the Prototyping Lab, dedicated to The Garage's large collection of maker equipment. Along with The Garage's IT Manager, I designed the layout of the space, which houses 3D printers, CNC mills, laser cutters, and woodworking and electronics equipment.
The Garage also had a growing collection of AR and VR headsets, but no dedicated place to use them. I designed the AR/VR Lab to fill the remaining third of the space. It contains a dedicated setup of powerful gaming computers to run the latest VR headsets, as well as green-screen walls and flooring for photography, videography, and "mixed reality" recordings of people in VR experiences.
I still work at The Garage during the school year, continuously maintaining the Makerspace and updating it with the latest technology. The space is used every day by the student-led startups currently incubating at The Garage, especially those working on hardware or AR/VR development. It's also open to the entire Northwestern community for demonstrations of devices such as the Magic Leap One, HTC Vive Pro, and Microsoft HoloLens.
I also maintain the pages of The Garage website that detail the equipment in the Prototyping Lab and AR/VR Lab, and I've written user guides for all of it, accessible by clicking the equipment icons on the site.
Oscillations is a startup working on new ways to create and consume digital entertainment. It consists of a global network of dancers, acrobats, and other performers (Oscillations calls them movement artists) who collaborate with researchers and technologists exploring the latest advancements in neuroscience and immersive technology to create compelling new experiences.
Oscillations has a research partnership with Northwestern's Knight Lab for Media Innovation. I became involved with the company when I took a class at the Knight Lab where we consulted on emerging technologies for Oscillations. The company is also actively working on VR music videos and other productions.
At the Knight Lab, I researched audience engagement metrics for immersive media, cost-effective motion capture techniques, and spatial audio, all areas Oscillations wanted to learn more about.
Our work in the class culminated in a panel presentation we did alongside Oscillations and a group of professors working in interdisciplinary areas of media, technology, and art. Check out our final presentation on the Knight Lab website.
I wanted to do more after the class ended, so I joined Oscillations as a UX Intern over the summer. There, I delved deeper into one of the areas I had formerly researched: spatial audio, the practice of situating sounds in a 3D environment. Over the summer I developed a novel audio spatialization technique for the company's first VR music video project.
The project involved assigning individual instruments from the song to individual dancers in the video, creating an immersive experience in which the dancers seemed to produce the music with their movements. I worked on the implementation using the Unity game engine and the Facebook Spatial Workstation plugins for the Reaper DAW.
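To illustrate the stem-per-dancer idea, here's a minimal, hypothetical Unity C# sketch rather than the production pipeline: each instrument stem becomes a fully spatialized AudioSource attached to a dancer, and all stems are scheduled against the same DSP clock so the mix stays in sync as the sources move through the scene.

```csharp
using UnityEngine;

// Sketch of one-stem-per-dancer spatialization. Assumes a spatializer
// plugin (e.g. the Facebook/Oculus spatializer) is selected in Unity's
// audio settings; stem and dancer arrays are paired by index.
public class StemSpatializer : MonoBehaviour
{
    [SerializeField] private AudioClip[] instrumentStems; // one stem per dancer
    [SerializeField] private Transform[] dancers;         // tracked dancer rigs

    private void Start()
    {
        // Schedule every stem against the same DSP start time so they stay
        // sample-locked once each one is moving independently in 3D.
        double startTime = AudioSettings.dspTime + 0.5;

        for (int i = 0; i < instrumentStems.Length && i < dancers.Length; i++)
        {
            var source = dancers[i].gameObject.AddComponent<AudioSource>();
            source.clip = instrumentStems[i];
            source.spatialBlend = 1f; // fully 3D; no stereo bleed
            source.spatialize = true; // route through the spatializer plugin
            source.PlayScheduled(startTime);
        }
    }
}
```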
Unfortunately, the projects I worked on have not yet been released to the public, so I don't have much to share currently. However, the interactive VR music video has been submitted to many film festivals, so expect updates soon and an eventual public release.
Algorithms and Society was an interdisciplinary seminar class taught by Professor Brent Hecht. Each week, we covered many of the alarming impacts algorithms have on society, from the sharing economy to the filter bubble. As part of a quarter-long group project, we had to address one of the problems discussed in class. My group attempted to tackle a big one: fake news.
Building on existing research into algorithmic fake news detection, we explored what the user interface of a fake news detection tool should look like. We targeted the Facebook News Feed because of its well-known issues with fake news during the 2016 U.S. presidential election. We devised an interface motivated by relevant HCI and cognitive psychology research, ran user testing experiments on our prototypes, and analyzed our findings in a final paper.
In this Knight Lab Studio class, I joined three other students to explore the fledgling space of VR data visualization. We spent the quarter researching best practices in 2D visualization and virtual reality and designing low-fidelity prototypes with tools like Tilt Brush. We then compiled our findings into an interactive website, complete with a comprehensive list of existing resources for data visualization in VR.
We found that the added dimensionality, immersion, and interaction afforded by VR make it an ideal platform for a new generation of data visualizations that can grant the user unprecedented perspective. At the same time, many data visualizations do not translate well into VR, and care must be taken to create an optimal experience.
Check out our complete findings on the Knight Lab website, where there are also a ton of other VR prototypes to play around with, like the one above!
Catnip began as a design project between me and two other students. We went through countless iterations and user testing trials to create a solution that would benefit as many students as possible. Initially known as Project Catnip (Go 'Cats!), by the end of the quarter we called it NU 101.
The main features of the app are a priority-based calendar, an automated calendar entry system, and school-based messaging. Our trials showed that students benefited from a calendar that displayed not just the time allocated to tasks but also the priority given to them. The priority calendar was also easy to use because it could scrape existing calendars and Northwestern sites.
The messaging feature allowed students to connect with peers in their classes or student organizations. We also devised a rollout strategy that would integrate with the freshman orientation program to encourage high adoption rates for the app.
The idea was popular enough to be adopted by upper-level design students as a two-quarter project to further refine and implement our ideas. While our final prototype was purely visual and nonfunctional, this group of four students was tasked with making it functional.
My original group and I acted as clients for the new group, offering continuous feedback and guidance to keep them aligned with the original goals of our project. By the end of their project period, the app achieved some of those goals, such as scraping from Canvas, our class management site, and CAESAR, our general Northwestern portal. They also created new app renderings, such as the one above.
One rainy day during the fall of 2016, I walked by the dumpsters outside Northwestern's fraternities and saw this beautiful Mortal Kombat II arcade cabinet just sitting there. I couldn't imagine why someone would throw it out, until I got closer and realized it was missing its entire control panel, severed wires dangling from the opening. I got a group together to rescue it from the rain. We talked about restoring it and making it the centerpiece of our dorm, but classes and a lack of funds for new parts got in the way.
In early 2017, I asked the executive director of The Garage if I could bring it there to work on it. Not only did she agree, but she also offered to buy the necessary parts for its restoration. I began to repair it, with help from Ben Williams, Akash Borde, Ryan Miller, and Beth Mallon at different points throughout the process.
We started by simply replacing the frayed power cable, which turned out to be all it needed to work again, though it wasn't of much use without joysticks and buttons. Throughout the spring of 2017, we continued the restoration, assembling a new control panel and rewiring all the parts. By the end of the spring, it worked flawlessly, as if the panel had never been missing.
But I wanted it to do more. I ordered a Raspberry Pi and a conversion board to hook the buttons, joysticks, and original CRT up to the Pi, and after some software fiddling, we had a versatile arcade cabinet that, thanks to emulators, could play everything from Pac-Man to Galaga.
However, manually adding and configuring games was a pain, so we transitioned to using the Pandora's Box, a board pre-loaded with over 600 arcade games. We also added a smart power switch that could turn off the cabinet after a certain period of inactivity in order to conserve its original (and increasingly rare) CRT. The last addition was an adapter that allowed for quick switching between the original Mortal Kombat II board and the Pandora's Box.
Today, it lives in the AR/VR Lab of The Garage, offering the quintessential arcade cabinet experience that every co-working space should have.