For any fan of goofy gadgets reared on Dick Tracy comics or Inspector Gadget, Samsung’s hotly anticipated launch of its so-called smartwatch, the Galaxy Gear, on Wednesday must mark the culmination of all those childhood dreams. Large and awkward-looking, the Galaxy Gear will not only tell you the time but also obviate the need to pull out your smartphone or tablet—Samsung, of course—at every ping or annoying chirp. It notifies you of incoming emails and texts and coordinates with your larger-screen device if you decide a message deserves your attention. As icing on the neon-coloured cake, it also makes voice-activated phone calls and takes dictation. So all those fantasies of speaking into your wrist to dial a pizza? Your watch has you covered.
So far, so dorky. Then why is the tech world in a tizzy over the launch? Because, with Apple’s iWatch supposedly on its way and Google Glass in beta testing, wearable computing—that is, advanced computers built into everyday accessories to perform functions that could range from the very specific to the extremely broad—is supposed to be the next big thing. Smartphones and tablets are now commonplace enough that companies are feverishly pursuing the next stage of innovation, creating a new category of products to tempt consumers in the process. Change is the lifeblood of the technology industry, and if you don’t innovate, well, you become Nokia. Consumers seem convinced: a Kickstarter campaign to fund a Google Glass alternative fetched double the project’s goal, and a smartwatch called Pebble raised more than $10 million.

Wearable computers broaden the scope of human interaction with technology. Samsung’s watch, for instance, could mark the beginning of “quantified self” applications, in which the computer you wear monitors biometric data. Wearable computers then become not so much cool (or awkward) gadgets as conduits of an augmented self, further eroding the space between man and machine.