The Future of Man and Machine Interactions
Published 12/17/2017

Until the mid-1970s, interacting with computers was done primarily through punched cards and tape prepared by trained programmers. As technologies improved, computer interaction evolved into the mouse and keyboard as we know them today. In recent years, it has evolved even further with the introduction of touch-screen interfaces. Today, you can interact with computers (and even your "smart home") using voice technologies such as Amazon's Alexa, Apple's Siri, Google's Assistant, and Microsoft's Cortana. In fact, the last year or two has seen an explosion in the adoption of "smart speakers" equipped with virtual assistants that sit in households around the world, play music on demand, and answer questions like "What's the weather?" and "How's traffic?" One can only imagine how we will interact with our devices and the things around us in the next 10, 20, or even 50+ years, but let's take a guess at what that will look like anyway.

In 2010, Microsoft got us thinking about the way we interact with our digital worlds when it introduced the Kinect. Since then, the Kinect has been used for everything from teaching people how to dance to controlling drones. I have seen it used to teach first aid and CPR in corporate offices, and I have personally used one to turn the lights in my home on and off by waving my hands up or down.
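For the curious, the light-switch trick requires surprisingly little logic once a skeleton tracker hands you joint positions. Below is a minimal Python sketch of the idea; the (hand_y, head_y) frames and the set_lights callback are placeholders for whatever your tracking SDK and smart-home API actually provide.

```python
def watch_gestures(frames, set_lights):
    """Toggle the lights on the rising edge of a 'hand above head' gesture.

    frames yields (hand_y, head_y) joint heights in meters from your
    skeleton tracker (e.g., the Kinect SDK at roughly 30 samples per second).
    set_lights(on) sends the on/off command to your smart-home controller.
    """
    lights_on = False
    was_raised = False
    for hand_y, head_y in frames:
        raised = hand_y > head_y
        if raised and not was_raised:  # fire once per raise, not once per frame
            lights_on = not lights_on
            set_lights(lights_on)
        was_raised = raised

if __name__ == "__main__":
    # Simulated skeleton data: the hand rises above the head once, then drops.
    demo_frames = [(0.5, 1.5), (1.8, 1.5), (1.8, 1.5), (0.5, 1.5)]
    watch_gestures(demo_frames, lambda on: print("lights on" if on else "lights off"))
```

The rising-edge check is the important part: skeletal data arrives many times per second, so without it a single raised hand would toggle the lights dozens of times.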

Similar to the Kinect, companies such as Leap Motion are developing gesture-based computer interfaces that also let you interact with your world by waving your hands, but on a much smaller (and more precise) scale. These technologies allow you to "reach into virtual reality with your bare hands" (as the Leap Motion website advertises). Whereas the Kinect is great for tracking the movements of your body as a whole, the Leap Motion is capable of identifying individual fingers. When combined with AR (augmented reality) and VR (virtual reality) technologies such as the Oculus Rift, not only can you interact with your computer and other devices by flapping your arms in the air, but you can also completely immerse yourself in worlds once only imagined in comic books and movies. Star Trek Holodeck, anyone?
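To give a feel for how fine-grained that tracking is, here's a short sketch against the legacy Leap Motion v2 Python bindings (import Leap). I'm assuming the v2 attribute names here; newer SDKs expose the same data through different APIs.

```python
import time
import Leap  # legacy Leap Motion v2 Python bindings (must be installed)

def print_fingertips(duration_s=5.0, poll_hz=10):
    """Poll the controller and print every tracked fingertip position (mm)."""
    controller = Leap.Controller()
    end = time.time() + duration_s
    while time.time() < end:
        frame = controller.frame()  # most recent tracking frame
        for hand in frame.hands:
            for finger in hand.fingers:
                tip = finger.tip_position  # Leap.Vector, in millimeters
                print("fingertip at (%.1f, %.1f, %.1f)" % (tip.x, tip.y, tip.z))
        time.sleep(1.0 / poll_hz)

if __name__ == "__main__":
    print_fingertips()
```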

Emotiv and others are developing EEG devices that can monitor a person's brain activity in real time and translate it into computer interactions. Think about that for a moment (but don't think too hard while wearing Emotiv's EPOC headset, as it might cause your computer to go haywire). Instead of dragging a mouse across a desk, banging your fists on a keyboard, waving your hands in the air, or swiping your nasty fingers across an already bacteria-filled screen, you can browse the web, create documents, and even play games using nothing but the thoughts in your head.
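Under the hood, these headsets don't hand your application raw thoughts; they emit classified "mental command" labels with confidence scores. Here's a minimal sketch of the receiving end, where the labels and the simulated classifier output are hypothetical stand-ins for whatever your EEG SDK actually streams.

```python
def dispatch(command, confidence, threshold=0.7):
    """Translate one classified mental command into a UI action, if confident."""
    if confidence < threshold:
        return  # ignore weak classifications instead of acting on noise
    actions = {
        "push": lambda: print("scroll down"),
        "pull": lambda: print("scroll up"),
        "lift": lambda: print("open a new tab"),
    }
    action = actions.get(command)
    if action:
        action()

if __name__ == "__main__":
    # Simulated classifier output: (label, confidence) pairs.
    for cmd, conf in [("push", 0.91), ("lift", 0.42), ("pull", 0.85)]:
        dispatch(cmd, conf)
```

The confidence threshold is what keeps a stray daydream from clicking things on your behalf.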

As things continue to evolve and improve, it's inevitable that we'll begin seeing advanced computer-interface technologies introduced into the workplace. For example, Amazon and others are currently working on Echo and Echo-like products that sit in conference rooms and can join the conversation when asked. Imagine sitting in a meeting and asking, "Does anyone know how many sales we had during Q3 last year in the southeast region?" only to have a tiny box on the table speak up with an answer. Taking that a step further, imagine having the ability to raise or lower your hand to control the lights, open or close the blinds, and change the room's temperature. With even more imagination, picture yourself doing those same things simply by thinking about them.
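The talking box is less magical than it sounds: the hard part is the speech recognition and slot parsing, and the answer itself is often just an aggregate query. As a rough sketch, assume the voice platform has already parsed the question into quarter, year, and region slots (the table and rows below are fabricated purely for illustration).

```python
import sqlite3

def answer_sales_query(db, quarter, year, region):
    """Answer 'how many sales did we have in Q<quarter> <year> in <region>?'"""
    row = db.execute(
        "SELECT COUNT(*) FROM sales WHERE quarter = ? AND year = ? AND region = ?",
        (quarter, year, region),
    ).fetchone()
    return "We had %d sales in Q%d %d in the %s region." % (row[0], quarter, year, region)

# Tiny in-memory demo with made-up rows, for illustration only.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (quarter INTEGER, year INTEGER, region TEXT)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)",
               [(3, 2016, "southeast"), (3, 2016, "southeast"), (2, 2016, "northwest")])
print(answer_sales_query(db, 3, 2016, "southeast"))
```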

To some, all of this might seem a bit scary, but that's the future we have to look forward to, and I, for one, welcome it!