Posted by Lewis Painter 26 May 2015
Why Intel’s vision of the future is a future I want to live in
Saying that technology has come on in leaps and bounds over the last fifty years is an understatement. Where once computers filled rooms and were managed by men in white coats, today we see complete computer systems the size of a USB memory stick, ready to be plugged into our TVs. And it's not just TVs any more – we have smartphones, tablets, smartwatches and virtual reality gaming just around the corner.
I recently went along to Intel’s Future Showcase where the company showcased some of its most interesting and ground-breaking technology, including prototypes of devices that won’t hit the market for years – and it looks amazing. Some of the technology showcased is similar to what we see in films set 100 years in the future, including data transfer by touch and drones that can autonomously weave between trees in a forest at 15mph, to name just a few.
I was shown the Intel NUC, which some people may be familiar with. It's a system that fits in your hand and offers a full PC experience with, Intel claims, full PC performance. Where in the past you'd have had to plug the NUC into a TV to use it, I was shown a new iteration with one crucial difference – a 10in integrated touchscreen.
This means that you can have not only your files but your complete computer system with you at any time, making it ideal for those who work and travel. Take DJs, for example – DJs often lug their laptops and chargers to gigs, but now there's no need, as the NUC can hold their complete music library and is the size of a handful of CDs. How times change, eh?
One of the most impressive prototypes that I saw at the event was the Intel Home Gateway, which could play a huge role in the widespread adoption of smart tech in the home. The Home Gateway is based on the Intel Galileo development board and lets users create their own home automation system by providing a hub that can talk to any smart device on the market. If that claim holds, the hub could provide a link between popular smart tech like Philips Hue and LIFX smart bulbs, allowing them to work together in unison.
Interested? You should be. The system is based on NFC tags that can track where you are in your home to within a rather impressive 10cm. This means the lights can automatically turn on and off as you move from room to room. As you leave the front room, the TV and lights should turn off and the hallway lights should turn on. Then, once you reach your bedroom, you can set the system to turn on the lights and switch your TV to a specific channel, for example. It's completely customisable, and the actions are entirely down to the user.
It's compatible with multiple users too, each with their own set of rules. Say, for example, you like the front room to have a mixture of red and blue lighting. You could set up the system so that every time you walk into the front room, red and blue lights come on – but what about your brother, who doesn't like red and blue lights? Rules can also be set for when two specific people are in a room, so the lighting will change to a different colour when he enters and switch back when he leaves. How cool is that?
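Intel hasn't published how the Home Gateway's rules engine actually works, but the behaviour described above – per-room, per-person rules, with multi-person rules taking priority – could be sketched roughly like this (all names and the rule format are hypothetical, purely for illustration):

```python
# Hypothetical sketch of the rule resolution described above -- NOT Intel's
# actual API. Each rule maps a set of occupants in a room to a lighting scene;
# a more specific rule (one naming more occupants) beats a less specific one.

def resolve_scene(room, occupants, rules):
    """Return the lighting scene for `room` given who is currently in it."""
    best = None
    for rule in rules:
        if rule["room"] != room:
            continue
        if rule["occupants"] <= occupants:  # rule's people are all present
            if best is None or len(rule["occupants"]) > len(best["occupants"]):
                best = rule                 # most specific matching rule wins
    return best["scene"] if best else "off"

rules = [
    {"room": "front room", "occupants": {"me"},            "scene": "red+blue"},
    {"room": "front room", "occupants": {"me", "brother"}, "scene": "warm white"},
]

print(resolve_scene("front room", {"me"}, rules))             # red+blue
print(resolve_scene("front room", {"me", "brother"}, rules))  # warm white
print(resolve_scene("front room", {"brother"}, rules))        # off
```

With just those two rules, the lights go red and blue when you're in the front room alone, switch to a different scene when your brother joins you, and switch back once he leaves – exactly the behaviour Intel demonstrated.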
Not as cool as my favourite piece of future tech though. I was shown a prototype of “Human Body Communications”. When I enquired as to the purpose of the two copper sensors attached to a laptop in front of me, I was informed that it’d be used for data transfer – using your fingers. The idea is that when you place your fingers on the plates, a harmless electromagnetic signal, which includes data, will pass over the surface of your skin. This signal is absorbed and stored by a non-powered wearable such as a ring or a bracelet.
So what next? It's best to give the same example as I was given: printing. By placing my two fingers on the metal plate, I was told that I'd theoretically be able to transfer a Word document I'd been working on to myself. Once the signal has passed over my skin and been absorbed by my wearable tech, I could then walk up to a printer, place my fingers on it and transfer the signal back from myself to the printer. The end result? The printer should print the document seconds after I touch it.
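Intel hasn't disclosed a protocol for this, but the flow I was shown – touch a source to load data onto the wearable, touch a sink to deliver it – can be modelled as a simple one-shot relay (everything here is a hypothetical illustration, not Intel's implementation):

```python
# Hedged sketch of the touch-to-transfer flow described above. Touching a
# source device stages a payload on the wearable; touching a sink device
# (the printer) drains it and triggers the action. One-shot, store-and-forward.

class Wearable:
    def __init__(self):
        self.payload = None              # data carried by the ring/bracelet

    def touch_source(self, name, data):
        """Fingers on the source plates: signal passes over the skin."""
        self.payload = (name, data)

    def touch_sink(self, printer_log):
        """Fingers on the printer: signal passes back and is acted on."""
        if self.payload is not None:
            name, data = self.payload
            printer_log.append(f"printed {name}: {data}")
            self.payload = None          # transfer is consumed on delivery

ring = Wearable()
printer_log = []
ring.touch_source("report.docx", "Q2 figures")
ring.touch_sink(printer_log)
print(printer_log)   # ['printed report.docx: Q2 figures']
```

The interesting design point is that the wearable is unpowered in Intel's demo – it only stores and releases the signal – which is why the model above does nothing until a powered device initiates contact.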
As I mentioned, Human Body Communications is still an early prototype that can only handle copying and pasting emoji at the moment, but it gives you some idea of the road humanity is headed down. Will a Minority Report-esque future be upon us sooner than we think? If Intel has anything to do with it, possibly.