Soon Your Desk Will Be a Computer Too

Desktopography projects an interactive AR interface onto your desktop that you can tap, swipe, and control just like a screen.

In the early 1990s, Xerox PARC researchers showed off a futuristic concept they called the Digital Desk. It looked like any other metal workstation, aside from the unusual setup that hovered overhead. Two video cameras hung from a rig above the desk, capturing every movement of the person sitting at it. Next to the cameras, a projector cast the glowing screen of a computer onto the furniture’s surface.

Using Xerox’s desk, people could do crazy things like highlight paragraphs of text in a printed book and drag the words into an electronic document. Filing expenses was as easy as touching a stylus to a receipt and dragging the numbers into a digital spreadsheet. Suddenly, the line between the physical world and the digital one blurred. People no longer needed a keyboard, mouse, and screen to harness a computer’s power; all they had to do was sit down, and the computer would appear in front of them.

Despite its novelty—or maybe because of it—the Digital Desk never took off. Technology moved in the opposite direction, toward the glassy, self-contained boxes of smartphones, tablets, and laptops. But researchers never gave up on the vision, and now, more than 25 years later, these half-digital, half-physical workspaces might actually make sense.

“I really want to break interaction out of the small screens we use today and bring it out onto the world around us,” says Robert Xiao, a Carnegie Mellon University computer scientist whose most recent project, Desktopography, brings the Digital Desk concept into the modern day.


Like the Digital Desk, Desktopography projects digital applications—like your calendar, map, or Google Docs—onto a desk where people can pinch, swipe, and tap. But Desktopography works better than Xerox could ever have dreamed, thanks to decades’ worth of technological advancements. Using a depth camera and a pocket projector, Xiao built a small unit that people can screw directly into a standard lightbulb socket.

The depth camera creates a constantly updated 3-D map of the desktop, noting when objects move and when hands enter the scene. This information is then passed along to the rig’s brains, which Xiao’s team programmed to distinguish between fingers and, say, a dry-erase marker. This distinction is important, since Desktopography works like an oversized touchscreen. “You want the interface to escape from physical objects, not escape from your hands,” says Chris Harrison, director of CMU’s Human-Computer Interaction Institute.
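
How does a depth camera tell a resting fingertip from everything else on the desk? One common trick, and a reasonable guess at the flavor of math involved, is to compare each depth frame against a model of the empty desk: anything sitting just a few millimeters above the surface looks like a touch. Here’s a minimal Python sketch of that idea; the thresholds and the background-subtraction approach are illustrative assumptions, not the lab’s actual pipeline.

import numpy as np

# Illustrative thresholds: below TOUCH_MIN_MM is sensor noise on the bare
# desk; above TOUCH_MAX_MM is a hovering hand or a resting object.
TOUCH_MIN_MM = 3
TOUCH_MAX_MM = 15

def touch_mask(frame_mm, desk_mm):
    """Return a boolean mask of pixels where something (ideally a
    fingertip) sits just above the desk surface.

    frame_mm: current depth frame, millimeters from an overhead camera.
    desk_mm:  background model of the empty desk, same shape.
    """
    # The desk is farther from an overhead camera than anything resting
    # on it, so height above the desk is desk minus frame.
    height = desk_mm - frame_mm
    return (height > TOUCH_MIN_MM) & (height < TOUCH_MAX_MM)

# Fake data standing in for one 480x640 depth frame.
desk = np.full((480, 640), 1500.0)   # desk sits 1.5 m below the camera
frame = desk.copy()
frame[200:210, 300:308] -= 8.0       # a fingertip pressed on the desk
print(touch_mask(frame, desk).sum(), "candidate touch pixels")  # 80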

That gets to the biggest problem with projecting digital applications onto a physical desk: workspaces tend to be messy. Xiao’s tool uses algorithms to identify things like books, papers, and coffee mugs, and then plans the best possible location to project your calendar or Excel sheet. Desktopography gives preference to flat, clear backgrounds, but on a cluttered desk, it’ll project onto the next best available spot. If you move a newspaper or a tape recorder, the algorithm can automatically reorganize and resize the applications on your desk to accommodate more or less free space. “It’ll find the best available fit,” says Harrison. “It might be on top of a book, but it’s better than putting it between two objects or underneath a mug.”
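
For the curious, here’s a toy version of that placement idea in Python: mark which cells of the desk are cluttered, score every rectangle an app could occupy by how clear it is, and take the best one. The grid, window size, and free-space scoring rule are assumptions for illustration, not the published algorithm.

import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def best_spot(occupied, h, w):
    """Return (row, col, free_fraction) of the clearest h-by-w window."""
    # Every h-by-w window of the desk grid, stacked into one array.
    windows = sliding_window_view(occupied, (h, w))
    # Score each window by its fraction of unoccupied cells.
    free = 1.0 - windows.mean(axis=(2, 3))
    row, col = np.unravel_index(np.argmax(free), free.shape)
    return int(row), int(col), float(free[row, col])

desk = np.zeros((12, 20))    # 1 = clutter, 0 = clear desk
desk[2:9, 3:11] = 1          # a book
desk[5:8, 14:17] = 1         # a coffee mug
print(best_spot(desk, 4, 6)) # (0, 11, 1.0): a fully clear 4x6 window

A real system would also have to weigh surface flatness and distance to the user, per the description above, but the core operation is the same: search the desk for the least-bad rectangle.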

Desktopography works a lot like the touchscreen on your phone or tablet. Xiao designed a few new interactions, like tapping with five fingers to surface an application launcher, or lifting a hand to exit an app. But for the most part, Desktopography applications still rely on tapping, pinching, and swiping. Smartly, the researchers designed a feature that makes digital apps snap to the hard edges of laptops or phones, which could let projected interfaces act as augmentations of physical objects like keyboards. “We want to put the digital and physical in the same environment so we can eventually look at merging these things together in a very intelligent way,” Xiao says.
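
That snapping behavior is easy to picture in code. The hedged Python sketch below aligns a dragged window flush with the side of a detected object once it drifts within a small threshold; the one-axis rectangle math and the 20-pixel threshold are illustrative guesses, not Desktopography’s implementation.

SNAP_PX = 20  # how close an edge must get before it snaps

def snap_x(app_left, app_width, obj_left, obj_width):
    """Return the app's new left edge, snapped to the object if close."""
    obj_right = obj_left + obj_width
    # Snap the app's left edge to the object's right edge, so the window
    # sits beside the laptop like an extension of it.
    if abs(app_left - obj_right) < SNAP_PX:
        return obj_right
    # Or snap the app's right edge to the object's left edge.
    if abs((app_left + app_width) - obj_left) < SNAP_PX:
        return obj_left - app_width
    return app_left  # not near anything; leave it where it was dropped

# A 300-px-wide calendar dropped 12 px shy of a laptop's right edge.
print(snap_x(app_left=652, app_width=300, obj_left=200, obj_width=440))
# -> 640: the window snaps flush against the laptop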

The CMU lab plans to integrate the camera and projection technology into a regular LED light bulb, which would make ubiquitous computing more accessible for the average consumer. Today it costs around $1,000 to build a one-off research unit, but Harrison believes mass manufacturing could eventually get a unit down to around $50. “That’s an expensive light bulb,” he says. “But it’s a cheap tablet.”