
Excerpt

"How to Speak Machine": An Exclusive Excerpt of John Maeda's Dataful New Book

Publicis Sapient’s Chief Experience Officer explores computational thinking for the rest of us.

The following is excerpted from John Maeda’s dataful new book, “How to Speak Machine,” released by Portfolio on Nov. 19. Order your copy here.

To know you better is to serve you better.

When I lived in Japan for a few years after college, I was always perplexed by how the trains would arrive and depart at the exact times that were printed in the schedules. This was the case for both local trains and regional lines, so I came to realize it wasn’t just a Tokyo thing. In contrast, growing up in Seattle, I learned to never trust the timetables for public transportation, and found this to be the case everywhere else—except for Japan. When I asked my Japanese colleagues why, the answer that consistently came back was matter-of-fact: “Because the Japanese people wouldn’t stand for it as customers.” At the time, this response came across as a bit snobby. I wondered if they were saying that, as an American, I had lower expectations. My insecurities aside, I understood their attitude of taking care of customers because it was instilled in me at my parents’ tofu shop. This desire to care for customers was embodied by a word my father often used: omotenashi (oh-moh-tay-nah-shee).

Omotenashi roughly translates to “hospitality,” but it means much more than just making someone feel at home. It has to do with how people are greeted and sent off, how they are served, how well you can anticipate their needs and outdo their expectations. In the tofu shop, it meant the rigorous practice of giving customers their bag with two hands and opening the door for them as they were leaving. For my father, it also meant quietly picking the right firmness of tofu to give them if he knew they would be traveling far—so it would not break apart. And although my father would never admit it, it also involved the friendly banter between the customers and my radiant mother from Hawaii with her warm, addictive laughter—which I believe was often the biggest reason they would come back. Yumi always had them leaving with a smile.

Underlying omotenashi is having an idea of what the customer wants without asking, so that their needs can be anticipated. There’s a famous story that illustrates this best. “The Three Cups of Tea” tells the story of an important sixteenth-century noble warrior who returned from the hunt, deathly thirsty, as was evident to anyone who saw him. He was first served tea that was lukewarm and filled to the brim in a big cup, and he quickly consumed it. Wanting more, he was then served tea that was hotter than the last cup, and half the amount. This time he was more relaxed and took longer to enjoy the tea. When he was done and asked for yet another, he was served piping hot tea in a small cup with an exquisite design. With his thirst sated by the first two cups, the warrior could not only fully enjoy the hot tea at the end but also appreciate the beautiful teacup. The tea server, Mitsunari Ishida, was subsequently rewarded by joining the warrior’s clan, and later became one of the greatest samurai commanders of that era.

 


The tale of Ishida’s attentiveness to detail is a parable. The idea is not to just serve tea generically, but to consider the kind of tea experience someone might want depending upon their needs. In other words, if Ishida didn’t know ahead of time that the warrior was immensely thirsty, he might have just started off with a boiling hot tea in a beautiful cup that would have burned the warrior’s tongue. Not only would his thirst not be quenched, but the cup’s beauty would be wasted. Or, in blunter terms, if Ishida hadn’t made the effort to learn more about the noble warrior’s hunting trip—which essentially involved some light spying—then his needs would not have been met so perfectly. When we know about our customers, we have the opportunity to serve them the way they want to be served. But that requires us to be a little nosy, and sometimes a little lucky, to get the information that can tell us how to delight a guest.

You don’t have to be in Japan to experience omotenashi. It’s that moment when your favorite restaurant remembers you by your first name—transforming you from an anonymous customer into one who has “come home” like family. A similar thing happens all the time online when you frequent certain sites that greet you with your first name. Reading a message that says, “Welcome back, John!” feels good at first. But it may feel less good when you visit a completely unrelated site for the first time and it enthusiastically welcomes you by your name. You’d feel similarly awkward if you showed up at a restaurant you’d never been to before and a server you’d never met before addressed you by your first name, asking, “John! How’s your new job going?” Avoiding this situation—where strangers “know” you to a rightfully uncomfortable degree—is a matter of highest urgency for humanity right now. You’ll hear about it in the media regarding our privacy and how to protect ourselves, and it’s only natural to want the invasion of our privacy by machines to stop. But to ask a computing device to stop gathering information and to stop sharing it with other devices is like wishing away all the magic in your magic wand.

Computational machinery, by its very nature, can and will be instrumented in some way because this is an intrinsic benefit to the paradigm. The level of instrumentation can vary from capturing your every click and keystroke on a device to capturing your three-dimensional location information on earth at every moment. You may have already heard about how the “cookie” is the basic unit of tracking on the Web. Although they sound completely harmless, cookies are the first sin of the Web while also being one of the reasons internet advertising businesses became so successful during the rise of the internet. Cookies are little pieces of text that any programmer can “park” inside your browser to later access when you come back to it—that way, the browser remembers what you have already visited, and when. It’s a handy means for a site to remember where you last left off, and—much like Mitsunari’s tea service—it can then strive to make that third cup of tea the best one for you.
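To make the mechanism a little more concrete, here is a minimal sketch of how a page script can park a small piece of text in your browser and read it back on a later visit, using the standard document.cookie interface. The cookie name lastVisit and the thirty-day lifetime are illustrative choices, not details from the book.

// A minimal sketch of writing and reading a cookie in the browser.
// The cookie name "lastVisit" and its lifetime are illustrative choices only.

// Park a cookie recording when this visitor was last here.
function rememberVisit(): void {
  const thirtyDays = 30 * 24 * 60 * 60; // lifetime in seconds
  const timestamp = encodeURIComponent(new Date().toISOString());
  document.cookie = `lastVisit=${timestamp}; max-age=${thirtyDays}; path=/`;
}

// On a later visit, read the cookie back so the site can pick up where you left off.
function readLastVisit(): string | undefined {
  const pair = document.cookie
    .split("; ")
    .find((entry) => entry.startsWith("lastVisit="));
  return pair ? decodeURIComponent(pair.slice("lastVisit=".length)) : undefined;
}

Nothing in that little exchange is sinister by itself; the trouble begins when the same mechanism is used by parties you never meant to invite.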


Incidentally, this basic technical mechanism of leaving cookies-as-text in your browser’s cookie jar also allows services unrelated to the sites you’re visiting to park information about you. These are called “third-party cookies,” and I recommend you disable them in your browser’s settings after reading this. Doing so will let you exercise greater control over your identity and choose whom you permit to know things about you—otherwise, it becomes easier for you to appear at that random restaurant that knows all about you even before you’ve stepped through the door. It’s possible to turn off all cookies on your browser, including first-party cookies, but doing so makes the Web much more cumbersome to navigate. Cookies bring convenience, because they mean you don’t have to remember a password to a service you’ve already logged in to—a cookie gets placed on your computer to mark it fully authorized as “logged in” so you don’t have to go looking for your password somewhere in your pile of notes. And don’t worry, cookies aren’t inherently harmful.
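As a rough sketch of that “logged in” convenience, and assuming a simple Node.js server, the response below parks a first-party session cookie. The HttpOnly, Secure, and SameSite attributes are real cookie flags; the session value, lifetime, and URLs here are made up for illustration.

// Illustrative only: a server marking a browser as "logged in" with a
// first-party session cookie. The session value and lifetime are made up.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/login") {
    // HttpOnly: page scripts cannot read it; Secure: sent over HTTPS only;
    // SameSite=Lax: withheld from most cross-site (third-party) requests.
    res.setHeader(
      "Set-Cookie",
      "session=abc123; Max-Age=604800; Path=/; HttpOnly; Secure; SameSite=Lax"
    );
    res.end("Welcome back!");
  } else {
    res.end("Hello, stranger.");
  }
}).listen(3000);

The SameSite flag is the server-side cousin of the “disable third-party cookies” setting: it asks the browser not to volunteer the cookie when an unrelated site makes a request on your behalf.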

For the foreseeable future, you will be constantly trading computational convenience for digital information that you reveal about yourself. The more privacy you give up, the more convenience you get. Said differently, when you share information about yourself, you are guaranteed the pleasure of getting what you want instead of feeling the pain of being served incorrectly. For example, every hotel chain out there is aware that I don’t like to stay in a room next to the elevator. In a similar vein, every airline out there is aware that I prefer an aisle seat. Do I mind that they know this about me? Not at all, because it means my desires are more likely to be met. What I do mind is when information about me is disclosed without my permission. But these days it’s hard to know when you’ve given a company your permission, because verbose walls of text pop up asking you to accept the terms of service before you can get to work. What did you agree to?

There are moments when you explicitly opt in to be instrumented and telemetered, like when a site asks you to reveal your location to it. If every layer of technology were to do so similarly, then ultimately you would likely not be able to use the internet as you know it because there are a lot of assumed permissions that we’ve already handed over. Your new awareness of the complicated nature of the computational universe should alert you to the fact that it’s quite possible your internet service provider is storing and selling information about you—in the United States, this is currently fully legal. The same can be said about your telephone network carrier, or the cloud companies, or your favorite apps, or even the physical device you’re using—all of them could be independently telemetering and collecting information about you 24/7. The question of knowing how your data is being shared, both willingly and unwillingly, is an emerging dimension of design that I’m intensely interested in from a computational product perspective. It’s a topic that could fill many books, but I’ll just leave you with the realization that this is really big. And if you’re not fully convinced, check out the American Civil Liberties Union’s prescient piece from 2004, “Scary Pizza,” which depicts a future when an innocent call to a local pizzeria to order a pizza becomes entwined with the caller’s health records and employment history, among other bits of information. The pizzeria ends up charging extra for the caller’s attempt to order extra cheese when they’re supposed to be on a diet, and the caller is naturally alarmed by how much the pizzeria knows about them.
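That explicit opt-in for location corresponds to a real browser interface, navigator.geolocation. The sketch below shows what the opt-in looks like in code; the handler bodies and log messages are my own illustrative choices.

// Sketch: the browser shows a permission prompt before any location data flows.
navigator.geolocation.getCurrentPosition(
  (position) => {
    // Runs only if the visitor explicitly agrees to share their location.
    console.log("Opted in:", position.coords.latitude, position.coords.longitude);
  },
  (error) => {
    // The visitor declined, or the permission was already blocked.
    console.warn("Location not shared:", error.message);
  }
);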


When you’ve shared your credit card information with Amazon once, you don’t need to reenter it each time you make another purchase. That sounds awesome and feels like magic. When Gmail has processed all your emails and knows how you might respond to a message, it will automatically suggest a response. That sounds magical too. By giving the cloud companies access to all of our information, we enable them to do wondrous things for us and brew the perfect temperature and quality of tea. The only problem is, what happens if hackers break into Amazon and steal your credit card information, or manage to access all your emails from Google? How does it feel? Pretty terrible, right? Is the risk worth it? Absolutely. The way to mitigate the accepted risks is to understand, and to respect, how computational systems work and what can possibly happen when things go awry. To wish away all of the miraculous conveniences that the computational era has introduced would mean that I couldn’t easily text my mother a heart emoji at any time or work globally while managing to attend my children’s dance performances. Every time technology can save me time or do things better than I could ever do alone, I feel grateful and fulfilled while still cautiously thinking through what I’m getting versus what the cloud is taking from me.

We should be excited about how instrumentation can empower computational products to deliver extreme convenience to consumers by understanding their every want and need. A priori knowledge of customers lets the Ritz-Carlton hotels, for example, provide their legendary service with similarly attentive, low-tech methods. By noticing that a guest has left a (real) cookie uneaten from their room service meal, the staff might deliver the next service with a different dessert option, recognizing that it isn’t the guest’s dessert of choice. Having had the experience of staying at a Ritz-Carlton hotel once and enjoying its exceptional facilities and service, I would easily give away all my information to them to receive their omotenashi. The question is whether the customer’s best interest will be kept in mind vis-à-vis their data. So as you begin to work with telemetered systems and your customers’ data, fully embrace the omotenashi approach—and treat their data as you would like your own data to be treated. Explicitly knowing the data being shared allows the customer to weigh the compromise they make when losing some of their privacy versus gaining something valuable in return.

Excerpted from How to Speak Machine: Computational Thinking for the Rest of Us by John Maeda with permission of Portfolio, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © John Maeda, 2019.

John Maeda
Chief Experience Officer
Named one of the “75 most influential people of the 21st century” by Esquire, Maeda draws on his diverse background as an MIT-trained engineer, award-winning designer, and MBA community translator to bring people and ideas together at scale.
