One of the biggest barriers to making video games more accessible is the controller. While most gamers are capable of manipulating the complex array of sticks and buttons at an almost subconscious level, the abstract nature of the controller is a huge hurdle for more casual players.
A simpler, more intuitive controller would go a long way towards helping more people enjoy playing games. Nothing is more intuitive than controlling games with your thoughts, but can you build a controller that's accurate enough to let you play games with your brain?
That was the primary question posed by the 2013 NeuroGaming Conference and Expo this week in San Francisco, where exhibitors, speakers, and attendees from five continents gathered to showcase technology with the potential to eventually transform video games as we know them.
Tomorrow's technology, today
While the conference primarily focused on applied neuroscience, there was a parallel theme of "How do we make games more realistic and immersive?" Exhibitors ran the gamut from electroencephalography (EEG) to eye-tracking and haptic feedback.
Walking the expo floor, I was confronted by tangles of wires attached to people's arms, sensory headbands straight out of a sci-fi film, and haptic controllers that take the idea of a "rumble pack" to whole new levels. These aren't necessarily new ideas, though they're more refined than ever.
One of the highlights of the Expo was a game called WrestleBrainia 3000, a prototype hacked together over the course of two months by a team of students at the University of Washington. The game works through electromyography, or EMG, which measures the electrical activity of your muscles with sensors attached to your skin.
WrestleBrainia is an arm-wrestling game involving two robotic arms. Players control the robots by attaching sensors to their own (human) arms and flexing as much and as hard as possible. The person who flexes harder wins. It's fun, works consistently and gives you clear feedback about how much your body is working, demonstrating the potential of the technology as a self-improvement device.
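The core comparison is simple to picture. As a rough illustration (not the students' actual code), an EMG-driven contest like this might rectify each player's raw signal, average it to get an "effort" score, and declare whoever scores higher the winner:

```python
# Toy sketch of an EMG-style "who flexes harder" comparison.
# Assumes each player's sensor yields a stream of signed voltage
# samples; none of this reflects WrestleBrainia's actual code.

def muscle_effort(samples):
    """Rectify the raw EMG samples and average them.

    EMG voltages swing positive and negative, so taking the
    absolute value (rectification) before averaging gives a
    rough measure of overall muscle activity.
    """
    if not samples:
        return 0.0
    return sum(abs(s) for s in samples) / len(samples)

def wrestle(player_a_samples, player_b_samples):
    """Return the winner: whichever player's muscles are more active."""
    effort_a = muscle_effort(player_a_samples)
    effort_b = muscle_effort(player_b_samples)
    if effort_a > effort_b:
        return "A"
    if effort_b > effort_a:
        return "B"
    return "draw"

# Player A flexes hard (large voltage swings); player B barely flexes.
print(wrestle([0.9, -1.1, 1.0, -0.8], [0.1, -0.2, 0.1, -0.1]))  # "A"
```

In a real system the scores would drive the robot arms continuously rather than picking a one-shot winner, but the rectify-and-average step is the standard first pass at turning a noisy EMG trace into a usable control signal.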
I was also highly impressed by the prototype Tactical Haptics Reactive Grip controller, created by Professor William Provancher of the University of Utah. The handle, hacked together from a heavily modified Razer Hydra motion controller, sports four moving bars that shift against your hand in response to on-screen actions.
When you fire a gun using the Tactical Haptics controller, the bars "kick" up and down to simulate the feeling of a gun going off. When you swing a sword into an on-screen dummy, you can feel the moment you hit the target even though your arm isn't actually encountering any resistance. The technology seems ready for consumers if it gets mass-produced, and Provancher claims it's not difficult to integrate into existing games.
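To make the idea concrete, here's a purely hypothetical sketch of how a game event might map to displacements of four sliding grip bars. The event names, the four-bar model, and the decay constants are all assumptions for illustration; this is not Tactical Haptics' API:

```python
import math

# Hypothetical mapping from game events to the sliding bars of a
# Reactive Grip-style controller. Invented for illustration only.

def bar_offsets(event, t):
    """Return displacement (in mm) for each of four grip bars at time t (s).

    A gunshot produces a sharp 'kick' shared by all bars that decays
    over tens of milliseconds; a sword impact produces a brief shear,
    with bars on opposite sides of the handle moving opposite ways.
    """
    if event == "gunshot":
        kick = 3.0 * math.exp(-t / 0.03)
        return [kick, kick, kick, kick]
    if event == "sword_hit":
        shear = 1.5 * math.exp(-t / 0.05)
        return [shear, -shear, shear, -shear]
    return [0.0, 0.0, 0.0, 0.0]

print(bar_offsets("gunshot", 0.0))  # [3.0, 3.0, 3.0, 3.0] at the moment of firing
```

The appeal of this approach for integration is that a game already knows when these events fire; the haptic layer just has to translate each one into a short bar-motion profile.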
Discussing our future
The NeuroGaming Conference also featured a number of talks from people in both the neuroscience and game development industries. Conference organizer Zack Lynch and his team brought together big names in every niche, from Palmer Luckey of Oculus VR to NeuroSky CEO Stanley Yang and game designer Heather Kelley.
Panel topics reflected the dual nature of the conference, covering both the technical and design sides of neurogaming as a medium. For every panel featuring people on the hardware side--for instance, Anders Grunnet-Jepsen discussing Intel's perceptual computing--there was an equally enthralling panel on how to use this technology to expand game design.
Noah Falstein, now of Google but previously of Lucasfilm Games (which became the defunct LucasArts) and co-creator of classic arcade game Sinistar, highlighted this duality while discussing how mutual respect is necessary between scientists and game designers moving forward. If game designers concentrate too much on fun to the detriment of the underlying science, nothing improves. If scientists try to make games without understanding good design, nobody wants to play. It's a mutually beneficial partnership.
On another standout panel, industry figures discussed the always-controversial subject of emotion in gaming. Susan O'Connor, who helped write BioShock and Far Cry 2, seemed excited about biofeedback allowing developers to differentiate between "the experience we tell the player we're giving him versus the experience he really has" by providing data about what the player is feeling in real time.
As a gamer and sometimes-developer, I look forward to seeing whether the tech on display at the 2013 NeuroGaming Conference actually helps make it easier to tell emotionally resonant tales. If not, I guess we can at least look forward to firing more realistic guns and making robots fight each other with our muscles.