Talking with smart things

When we do business with companies, systems handle a growing share of the interaction. This is the point where ‘things’ join the Internet of Things. So are smart things about to start listening to us?

My refrigerator’s alarm goes off if the door is open too long. But it has no idea whether I’ve been staring into it for ages trying to figure out what groceries I need or simply didn’t close the door properly. The cliché of the connected refrigerator is that it automatically orders supplies before they run out. But who stocks those items in the refrigerator? And does it keep track of my agenda, such as when I’m going on vacation or having guests over for dinner this week? The refrigerator hasn’t exactly evolved over the past few decades. It doesn’t listen to people, and it understands them even less.

Smart things

Smart things are available in three flavors.

1. Systems that focus on transactions, without intelligence. Examples include the ATM, the parking meter and the transit smart card. They don’t connect to the web to select and present information, which means they do nothing with current information: the transit smart card reader processes your check-in even if trains aren’t running due to a strike. They are non-interactive.

2. Software that can handle questions with help from the internet. Each question can be seen as a transaction (call Pete, order a pizza, what’s the weather like in LA): the answer comes down to a combination of data, as sketched below. The most basic example is your smartphone assistant (Siri, Cortana, Google Now, Hound), which answers your question by directing you to a Google search result or a contact from your contact list. The best-known virtual assistants are Siri and Google Now. Siri (Apple) has also been listening to Dutch for the past year, but only responds to the words “Hey Siri” while the Apple device is charging or when you press the home button. Google Now always responds to the command “OK Google”. Neither assistant can do very much; their strong point is performing searches. For more complex search commands, speaking is much faster than typing, which looks promising for online shopping.
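To make the “question as a transaction” idea concrete, here is a minimal sketch of how such an assistant might route a spoken request to a data lookup and compose an answer. Everything in it (parse_intent, the handlers, the keyword matching) is a hypothetical simplification, not how Siri or Google Now actually work.

```python
# Hypothetical sketch: a spoken question treated as a single transaction.
# parse_intent and the handlers below are illustrative stand-ins, not a real assistant API.

def parse_intent(utterance: str) -> tuple[str, dict]:
    """Very naive keyword-based intent detection."""
    text = utterance.lower()
    if text.startswith("call "):
        return "call_contact", {"name": text.removeprefix("call ").strip()}
    if "weather" in text:
        return "weather", {"city": text.rsplit("in ", 1)[-1].strip("? ")}
    if "order" in text and "pizza" in text:
        return "order_pizza", {}
    return "web_search", {"query": utterance}

def handle(intent: str, slots: dict) -> str:
    """Each intent maps to one transaction: fetch the data, return an answer."""
    if intent == "call_contact":
        return f"Calling {slots['name']}..."                     # would hit the contact list
    if intent == "weather":
        return f"Looking up the forecast for {slots['city']}."   # would call a weather service
    if intent == "order_pizza":
        return "Placing your usual pizza order."                 # would call a webshop
    return f"Here is what I found for '{slots['query']}'."       # falls back to search

print(handle(*parse_intent("What's the weather like in LA?")))
```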

Bots are doing their utmost

Speech is the most natural form of communication; talking is much easier for people and systems than listening. The personal assistant Amazon Echo is a ‘smart’ hands-free speaker that connects to Alexa, an assistant that listens along with you. Alexa, who is faceless, goes further than the Amazon Dash Button, which lets you reorder laundry detergent with the press of a button on your washing machine – a physical shortcut to the right location in a webshop. Alexa interacts with information and, if necessary, with the environment: she sums up the headlines for you, dims the lights if you ask her and sets a timer when you put a pizza in the oven. But she doesn’t learn your habits: you’ll need to repeat or program personal patterns (reading the headlines with your breakfast every morning). Other smart systems are also trying to tie into parts of the Internet of Things. Apple’s iOS is in the race with HomeKit, which makes different types of hardware and software compatible, and there are ecosystems on top of which developers build apps that link platforms together (such as IFTTT).
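The platform-linking idea behind services like IFTTT boils down to “if this trigger, then that action”. The sketch below is purely illustrative – the Rule class, events and device functions are invented for the example, not IFTTT’s actual API – but it shows the shape of such a recipe: once programmed, the headlines are read out when a morning alarm fires, without you asking again.

```python
# Illustrative trigger/action recipe in the spirit of IFTTT.
# Rule, the events and the actions here are invented for the example; this is not IFTTT's API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    trigger: str                    # name of the event that fires the rule
    action: Callable[[dict], None]  # what to do when it fires

def dim_lights(event: dict) -> None:
    print(f"Dimming lights to {event.get('level', 30)}%")

def read_headlines(event: dict) -> None:
    print("Reading this morning's headlines aloud...")

rules = [
    Rule(trigger="voice:dim the lights", action=dim_lights),
    Rule(trigger="alarm:07:00", action=read_headlines),
]

def dispatch(event_name: str, payload: dict) -> None:
    """Run every rule whose trigger matches the incoming event."""
    for rule in rules:
        if rule.trigger == event_name:
            rule.action(payload)

dispatch("alarm:07:00", {})   # -> headlines are read without a new request
```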

From left to right: Cortana, Watson, Amelia, Ask Google, Sophie, and Siri.

3. Software that proactively talks to you and covers everything in your life related to communication. Obviously, the smartphone is the best basis for this, as it knows more about you than your partner does. The smartphone knows where you are at any given moment – and invariably what you’re doing, such as who you’re communicating with – and provides insight into your data and software use. If your smartphone could actively listen to you and your environment and interact proactively, things might start to feel like the uncanny valley. Your smartphone could warn you when a purchase pushes you over budget, point out interesting and relevant offers at stores you’re physically walking past, and follow all your conversations so it can refer back to something you said last week to your partner: you said you were going to call your mom, didn’t you?

Software wants to be your BFF

This software could use your purchasing behavior to determine when new hardware enters your home and immediately come up with instructions or a tutorial. By listening to sensors and RFID chips, a system like this could ask whether certain groceries need ordering, remember if you always answer ‘yes’, and keep track of your agenda: are there any vacations planned? Such software is not limited to communicating; it also acts and makes independent decisions with financial and physical consequences: placing orders (having flowers delivered in time) or controlling objects (turning on the heating at the right moment, because you’re obviously on your way home). Just as with the self-driving car, complex situations will require us to allow for decision-making errors or delays.
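As a thought experiment, the reordering behavior described above could look something like the sketch below. Everything in it – the simulated stock level, the remembered answers, the agenda check – is a hypothetical simplification, meant only to show where a learned habit and your calendar would enter the decision.

```python
# Hypothetical grocery-reorder loop: remembers past answers and checks the agenda
# before acting. All data sources here are simulated, not real sensor or calendar APIs.
from datetime import date, timedelta

past_answers = {"milk": ["yes", "yes", "yes"]}          # earlier confirmations per item
vacations = [(date(2016, 7, 18), date(2016, 8, 1))]     # planned absences from the agenda

def away_soon(today: date, horizon_days: int = 3) -> bool:
    """True if a vacation overlaps the delivery horizon."""
    return any(start - today <= timedelta(days=horizon_days) and end >= today
               for start, end in vacations)

def decide_reorder(item: str, stock_low: bool, today: date) -> str:
    if not stock_low:
        return "do nothing"
    if away_soon(today):
        return "skip the order (vacation coming up)"
    answers = past_answers.get(item, [])
    if len(answers) >= 3 and all(a == "yes" for a in answers[-3:]):
        return f"order {item} automatically"            # habit learned: stop asking
    return f"ask the user whether to order {item}"

print(decide_reorder("milk", stock_low=True, today=date(2016, 7, 10)))
```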

Smart things are able to learn from engagement

Technophiles hope to make our interaction with the environment smarter by adding learning capacity to software. X.ai stays in the background but can certainly manage your agenda; with Viv, the inventors of Siri go a step further, and MindMeld wants to make it possible to operate the internet through speech recognition. The developers of ID Avatars hope to create empathic chatbots with IBM’s Watson; in the App Store you can download and install AskSophie, although the registration procedure doesn’t bode well. EasilyDo searches for relevant entries in your agenda, e-mail and social media, but beyond a certain level of convenience it offers no guarantees.

Engaging in intelligent conversation with or on behalf of objects still has a long way to go. Virtual assistants could become ‘the man in the middle’, and enthusiasm for voice control is certainly growing. But Watson (IBM) and Amelia (IPsoft) are, at their core, still number-crunchers. They create new combinations of existing information, but will never compose a dawn chorus for birds without someone giving the order – a hallmark of creativity. Nor will they ever consider, when making their decisions, how you or your partner is feeling – a hallmark of empathy.

Even handling commands is a real challenge when it comes to designing systems that learn. Tay.ai, an automated chatbot from Microsoft, was taken offline after a few days because users had trained her to make racist comments. Learning capacity can be developed, even in artificial systems. Empathy – a feeling for context – is far more problematic.
