Your various realities — virtual, augmented, X, etc — are often talked about in the realm of vision, since we humans lean on vision as our major sense. But the folks at Bose, like today’s guest Michael Ludden, know that there’s room for sound in XR too.

Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Ludden, global head of developer advocacy and principal
augmented reality advocate at Bose Technologies. Michael is a
technologist, futurist, strategist, product leader, and developer
platform expert who loves to operate on the bleeding edge of what’s
possible, and is a frequent keynote speaker at events around the
world. Michael was previously director of IBM’s Watson Developer
Lab for AR and VR, among some other career stops. To learn more
about the work he’s doing at Bose, you can visit developer.bose.com.

Michael, welcome to the show.

Michael: Wow, what an intro.
Thanks for having me.

Alan: It’s my absolute pleasure
and honor to have you on the show. I’m super excited. Just last
week, I was flying from Toronto to San Francisco, and I happened to
sit beside a guy, and we started talking about AR. I pulled out the
North Glasses; he pulled out the Bose Frames; we swapped. And we had
this kind of meeting of the minds: I had the visual, he had the
audio, and it was really cool that I got to try the Bose Frames. What
an amazing piece of technology.

Michael: Glad you liked it.

Alan: So you’ve had a storied
career here. You’ve done everything from IBM Watson, to Google, to
HTC, Samsung. How did you end up in technology, and why did you get
so fascinated with futurism?

Michael: Well, it’s sort of been
a running theme in my life. I read a lot of science fiction as a kid
and I was always interested in technology and — not to date myself
— but at a certain point in my life when I was a young adult,
technology started to really aggressively eat everything, starting
with mobile. And that was really the point of inflection in my life.
I studied musical theater in college; I went to UCLA. I thought
that’s what I was going to do. I really did. And I did get a B.A.,
so I got a little arts education, too. At the same time, I was
always tinkering with stuff, building my own PCs. At one point I
started my own web development company making websites in Flash, CS2
and CS3, in the early days; it was brutal.

Alan: There’s a conference in
Toronto called Flash in the Can; FITC.

Michael: Nice.

Alan: That’s old school.

Michael: It is very old school.
And, you know, I never really thought I’d make a career out of it,
but I needed money. I was a starving actor in L.A., and one of my
friends, who I’d made just by being nerdy, worked for a company called
HTC. They were releasing the first-ever Android phone, which was
called The Dream — or the G1 in the US. So I was in contact with
this guy; he got a promotion. He said, “you should take my old
job,” which was L.A.-based, and I was living there. And I said,
“I want to do it.” I was working on a podcasting platform
called This Week In — not This Week in Tech — but This Week In. It
was a Jason Calacanis-led network out of the old Mahalo Studios in
Santa Monica. But it paid me pennies. And when they told me what the
job paid and what I’d be doing, I said, “OK, I guess I’ll do
it.” I needed the money, and it was very flexible. It felt
really easy to me, like that’s really all you need me to do.

And so I ended up starting to go around
door to door to, like, T-Mobile shops, Verizon shops, AT&T
— carrier stores — and show them the phones. And I was
like, this is so easy. And then they started sending me to
conferences. And I started giving talks about stuff and doing the
demos at the booth. And then I got wind that they were starting a
developer relations organization. I had no business even applying
for something like that. But I really wanted to have a forcing
function to teach myself to program for Android, learn Eclipse, etc.
It was Eclipse at the time. And so I basically begged the team for
six months. I just let my enthusiasm kind of guide me, and the rest
of the stuff you mentioned in my past kind of followed from that
enthusiasm. It really actually hasn’t stopped. It’s just sort of
morphed towards different things. The latest thing is obviously
immersive technologies, but I found that they valued my time and it
didn’t feel like work. What more can you ask for from a career,
really?

Alan: Isn’t that awesome? So
let’s fast-forward: you’re now working with Bose, one of the giant
pioneers of audio in the world. What does Bose have to do with
augmented reality?

Michael: Well, that is a great
question. Yeah. So Bose is doing something really fascinating and
innovative, and that is trying to create a new lane for augmented
reality that is focusing on a different sense that’s not your sight;
that’s sound, which I guess is pretty brand-appropriate. We are
building a whole new area of Bose focused on turning the company into
a platform company and building a platform for developers to build
mobile applications that are sound-focused, that operate via a series
of gestures and spatial sound capabilities, and possibly even voice
input, and prioritize those over staring at your phone screen and
touching, so that you can maybe do things by those methods of
interaction while having your phone in your pocket, even though
there’s an app running. In that way, it actually frees up your vision.
Just like when you’re listening to a podcast passively, or music, or
you’re having a phone call actively, [you can] still engage with the
world with your eyes, our most powerful sense, and let audio give us
some other capabilities that complement that. If that makes sense.

Alan: Yeah. So let’s put this in
perspective. You guys made a pair of sunglasses with speakers built
into them. Pretty awesome. The sound is amazing, like what you would
expect from Bose. Now, not only do you have the sound in stereo,
but you also have the ability to make it spatial, correct?

Michael: Yes. The Frames are
just one of three devices that support Bose AR at the moment. The
QC35 IIs are the most popular ones. Every time you pass a
business class on an airplane, everybody’s wearing them. Those
support Bose AR as well, provided they were purchased after November
of last year, 2018. Same thing with the new 700 series we announced a
couple of months ago. I actually got the pleasure of announcing this
on stage at AWE — Augmented World Expo. They’re called the Noise
Canceling 700 series. They are a new super-premium tier of
noise-canceling headphones that’s sitting alongside the QC35s. All of
these devices have a little sensor bundle in them that’s on the right
side of the frames, and that consists of an accelerometer, a
gyroscope, and a magnetometer. And in that way, we have an SDK for
Android, an SDK for iOS, and then also an SDK for Unity, which you
can use to deploy cross-platform that will allow you to interpret and
read data from the sensors to do things like recognize gestures. For
all three platforms, we have native gesture support for, of course,
head nod. So if you’re nodding your head, yes. Shake, negative. So
we’ve changed the word from — sorry — nod to affirmative and shake
to negative, just to sort of future-proof it a little bit with an
intents system. But the idea is, if you want to make an application
that does something when somebody shakes their head, or you prompt
them to shake their head or nod their head, and something happens,
you can do that.
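
For readers who want a concrete picture of that, here is a minimal sketch, in Swift for the iOS side, of an app reacting to affirmative and negative head gestures. The HeadGestureSource protocol and every name in it are hypothetical stand-ins, not the actual Bose AR SDK interfaces (those live at developer.bose.com); the point is just the shape of a hands-free yes/no prompt.

// Hypothetical sketch: reacting to head gestures from a wearable.
// HeadGesture and HeadGestureSource are illustrative stand-ins, not the Bose AR SDK API.
enum HeadGesture {
    case affirmative   // head nod ("yes")
    case negative      // head shake ("no")
    case input         // double tap / touch-and-hold, depending on device
}

protocol HeadGestureSource: AnyObject {
    var onGesture: ((HeadGesture) -> Void)? { get set }
}

final class SurveyPrompt {
    private let gestures: HeadGestureSource

    init(gestures: HeadGestureSource) {
        self.gestures = gestures
        // The question is read aloud; the user answers hands-free, phone in pocket.
        self.gestures.onGesture = { [weak self] gesture in
            switch gesture {
            case .affirmative: self?.record(answer: true)
            case .negative:    self?.record(answer: false)
            case .input:       self?.repeatQuestion()
            }
        }
    }

    private func record(answer: Bool) { print("Answer recorded: \(answer)") }
    private func repeatQuestion() { print("Repeating the question") }
}

// Usage: wire a prompt to a (mock) gesture source and simulate a nod.
final class MockGestureSource: HeadGestureSource {
    var onGesture: ((HeadGesture) -> Void)?
}
let source = MockGestureSource()
let prompt = SurveyPrompt(gestures: source)
source.onGesture?(.affirmative)   // prints "Answer recorded: true"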

And then also there’s input, which on
Frames you double tap on the right. Same thing with QC35s. And on the
700 series, you touch and hold, because there’s a capacitive touch
panel on that device. And then in terms of spatial sound, to answer
your question, there’s a magnetometer on these devices. So you can
understand directionally with some magnetic declination if somebody
is facing north, south, east, west. You can also do arbitrary
directions. So, for example, there’s an app called Bose Radar. And if
you have a Bose AR-enabled device, you can download it from either
the Google Play store or the Apple App Store. And if you open
that up, there’s a beachscape scene that I’d like to talk about. So
if you press play and you close your eyes, you will hear a beach
scene in front of you. But if you look left, you will hear the beach
scene in your right ear. And if you look right, you’ll hear the
beach scene in your left ear.

Alan: That’s so cool.

Michael: So it’ll kind of route
the sound around you based on where you were initially looking — not
based on like the direction of the world, where you were initially
looking — and keep it there. So you could hear seagulls out on the
left side, etc.
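
As a rough illustration of that “rooted” behavior, here is a small Swift sketch that anchors a soundscape to whatever heading the listener had when playback started and works out where each source should be rendered as the head turns. The heading values are assumed to come from the device’s magnetometer by whatever means the SDK provides; the struct and its names are made up for this example.

// Hypothetical sketch: a soundscape anchored to the listener's initial heading.
struct RootedSoundScene {
    let anchorHeading: Double            // compass heading (degrees) when the user pressed play
    let sourceOffsets: [String: Double]  // source name -> degrees relative to the anchor

    // Angle of a source relative to where the listener is looking right now:
    // 0 = straight ahead, +90 = to the right, -90 = to the left.
    func relativeAzimuth(of source: String, currentHeading: Double) -> Double? {
        guard let offset = sourceOffsets[source] else { return nil }
        var angle = (anchorHeading + offset - currentHeading).truncatingRemainder(dividingBy: 360)
        if angle > 180 { angle -= 360 }
        if angle < -180 { angle += 360 }
        return angle
    }
}

// Example: seagulls placed 60 degrees to the left of the initial gaze.
let scene = RootedSoundScene(anchorHeading: 90, sourceOffsets: ["seagulls": -60])
let azimuth = scene.relativeAzimuth(of: "seagulls", currentHeading: 30)  // 0.0: now dead ahead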

Alan: That’s positional sound,
right?

Michael: Yeah, that’s like
spatial sound that’s rooted in a position.

Alan: Very cool. And then I
guess whatever direction you’re looking when it starts, that’s where
it’s anchored?

Michael: Well that’s an option.
That’s an option. You don’t have to do it that way. But that’s an
option. Yeah.

Alan: I don’t know if you’ve
tried the Magic Leap. I assume you have.

Michael: Yeah.

Alan: They can have spatial
audio built into those. And one of the demos they do is this ball of
light is 10 feet away from you on the floor, and the sound is coming
from 10 feet away from you on the floor. Is that something that is
going to be available through these Bose glasses?

Michael: Absolutely. Yeah. So
there’s a number of ways to tackle that. The advantage that something
like a Magic Leap or even an Oculus Quest has is that it’s tracking
you with cameras. We don’t have any cameras on our devices. And so
it’s very easy to present that sound accurately when there’s an
object attached to it, right? That you can look at and see the
distance, and therefore so can the computer, which is generating that
object. It’s kind of native. The way that you can do it with Bose AR,
if you’re a developer, is within Unity, you can actually do that same
thing. You can create a visual scene and attach sound to an object
that’s at a certain distance, and then present that to a user,
because you know the rough direction that a person is looking. The
ways to overcome the 3DoF versus 6DoF thing — where you can move
forward in space and have it recognize that — there’s two main ways
that we’ve discovered so far. And again, this is a third-party
platform, so I’m expecting we’ll discover new ways for people to do
this: people working with pedometers (what if I can track
somebody stepping forward?) and other ways of doing camera-less
six-degrees-of-freedom tracking. But here are the two that we found so
far. One is you’re outside: you use the phone’s GPS with the
magnetometer from a Bose AR device, and in that way, you can know
where in the world they are, and also in what direction they’re
looking. And using just those two heuristics, I could see that
someone’s looking at a restaurant — and there is actually an app
called Navigator that does this now; you can download it from the Apple
App Store — but you can double tap and it will pull in, using Yelp’s
API, information about the restaurant you’re looking at, and say
“three stars with 500 people,” and you’ve got to make a
decision about whether or not to eat there.
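
For the curious, here is a small Swift sketch of that outdoor heuristic: the phone’s GPS gives the listener’s position, the wearable’s magnetometer gives a heading, and a simple bearing calculation picks out which nearby point of interest the person appears to be facing. The place list, names, and tolerance are invented for illustration; this is not the Navigator app’s actual code.

// Hypothetical sketch: "what am I looking at?" from GPS position plus head heading.
import Foundation
import CoreLocation

// Initial great-circle bearing (degrees, 0 = north) from one coordinate to another.
func bearing(from a: CLLocationCoordinate2D, to b: CLLocationCoordinate2D) -> Double {
    let lat1 = a.latitude * .pi / 180, lat2 = b.latitude * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}

// Returns the name of the place the user appears to be facing, if any.
func placeInView(user: CLLocationCoordinate2D,
                 heading: Double,   // from the wearable's magnetometer
                 places: [(name: String, coord: CLLocationCoordinate2D)],
                 toleranceDegrees: Double = 15) -> String? {
    for place in places {
        var delta = abs(bearing(from: user, to: place.coord) - heading)
        if delta > 180 { delta = 360 - delta }
        if delta <= toleranceDegrees { return place.name }
    }
    return nil
}

// Example with made-up coordinates: facing roughly east-northeast toward a cafe.
let me = CLLocationCoordinate2D(latitude: 43.6452, longitude: -79.3806)
let cafe = CLLocationCoordinate2D(latitude: 43.6455, longitude: -79.3790)
let lookingAt = placeInView(user: me, heading: 74, places: [(name: "Hypothetical Cafe", coord: cafe)])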

So that’s one way to understand where
something is in space. There’s another app that uses that technique
to do specialized audio tours. As you walk up to a place, you’ll hear
a ping from a specific location, and you can decide if you want to
enter that audio experience. So that’s cool. There’s a lot of stuff
we’re doing to enable developers to build spatial silent discos. You
could walk up to it, have music fade in, that sort of thing. And then
the other way pertains more to indoor positioning. So we’ve got some of
our developer advocates on my team experimenting with beacons and
things like that. But if you have no infrastructure, you can always
default to something like Vuforia or ARKit or ARCore with your
phone out and the camera pointed at the ground for plane detection,
and then you can actually walk forward and around the space and then
it can also understand where you’re looking relative to that. And
there’s actually an app I’d recommend downloading called Traverse,
which is a really wonderful music experience where you can — for
example, with Elvis — arrange the band around you and then walk
through and around a performance. You can hear Elvis’s voice like a
ghost next to you, and then move behind him, and there’s the person
playing the drums. And you can actually go around a virtual space and
listen spatially to it, using ARKit to support the positioning of the
headset.
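
And for the camera-assisted indoor approach, here is a minimal Swift sketch that uses standard ARKit world tracking with plane detection to get the phone’s position in the room; in a real app the head-worn device would supply orientation, and the listener pose would feed a spatial-audio renderer. The headset-yaw source and the listener-update hook below are hypothetical placeholders.

// Sketch: ARKit supplies 6DoF position; the wearable would supply head orientation.
import ARKit
import simd

final class IndoorPositioner: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]   // detect the floor for a stable world origin
        session.delegate = self
        session.run(config)
    }

    // Called every frame; extract the phone's position in world space.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t = frame.camera.transform          // 4x4 column-major camera transform
        let position = SIMD3<Float>(t.columns.3.x, t.columns.3.y, t.columns.3.z)
        let headsetYawRadians: Float = 0        // would come from the wearable's IMU (hypothetical)
        updateListener(position: position, yaw: headsetYawRadians)
    }

    private func updateListener(position: SIMD3<Float>, yaw: Float) {
        // Feed the listener pose into whatever spatial-audio renderer the app uses.
    }
}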

Alan: Oh, my God. I can see this
for museums, for public tours or walking tours.

Michael: Absolutely.

Alan: I think it’s great that
this is available on these two headphones and one pair of glasses.
But I can imagine that, through software, you’re gonna be able to
enable people to have these types of experiences across any Bose
device eventually, because if you take away the magnetometer and
accelerometer, you can actually just run it off your phone.

Michael: There’s a couple of
things about Bose’s commitment to this platform. So we’ve already
committed to over a million devices in-market that are Bose
AR-enabled by the end of this year. I think we’ve already basically
hit that. We’ve also started putting it in every wearable we make.
The three Bose AR-enabled wearables are the three most recent ones we came out
with. The third thing is, you could default to the phone. But what’s
interesting is we know a person is wearing this on their head, and
that’s something interesting and unique. If you combine it with the
capabilities of the phone, you can actually understand where somebody
is looking, for example, and you can understand a lot of other things
as well, by the fact that it’s head-mounted.

Another thing I’d just say about the
platform, briefly, is these are devices people are using every day
anyway. So if I were to make a pitch to developers, the difference
between this and something like a Magic Leap is number one, you
didn’t purchase it just for XR. You purchased it and you’re using it
every day anyway. And then when you build an app for it, it’s not
that somebody has to pick up a device like a Magic Leap and put it on
their head — by the way, I love Magic Leap. There are just different
qualities to it, right? — You’re probably already wearing your Bose
device. And if you download an app that has some utility for you,
those are new capabilities. It’s not as much friction as, “oh, let
me go get my device and put it on my head to do this specific thing,”
if that makes sense.

Alan: Nobody wants to buy
something for a very niche thing. And I have a Magic Leap; we’ve got
them in the office here, and they get used very rarely, when
developers are either making something for them or we’re doing demos
for people. But it’s not something you pick up and put on and walk
around the office with. It just isn’t.

Michael: I think that’s the way
the world’s going. It might be generational. It might be just a few
years of habit. I’m not trying to say that that’s not viable as a
form factor. But there are some convenience aspects to what Bose is
doing that, I think, are relevant for developers, who we want to
interest in building third-party applications, if that makes sense.

Alan: Absolutely. Let’s look at
this from a business standpoint; what can brands start doing to
leverage these technologies?

Michael: That’s a question we
are answering in various ways internally, and trying to answer with
partners. Hopefully you’ll see some of those answers come to market
later this year. But there’s any number of ways to tackle these
things when it comes to brands. We’re working with a number of
existing application providers to build Bose AR-enabled features that
make sense. For example, a partner of ours is Golfshot, and Golfshot
is a popular application for golfers. And there’s a Bose AR-enabled
feature where, let’s say, you’re on the course with a pair of Frames
sunglasses. You can actually get contextual advice about where on
that green — based on where you’re looking — you should be hitting
the ball and with what. Little quality-of-life improvement features
are one aspect to existing established branded apps. There’s also
obviously marketing and promotional opportunities. Spatial sound
itself can just transport you very quickly. You just close your eyes
and you’re in the middle of some scene of your favourite superhero
movie, for example. Or whatever the case may be.

Another thing that we’ve done is we’ve
put out something called a creator tool. This is something that feeds
into our Bose Radar app — which does the soundscape that I mentioned
on the beach — and we actually do… oh, my gosh. I don’t want to
butcher his name, that would be terrible, but I think it’s BJ the
Chicago Kid — has some experiences in Bose Radar of his music laid
out spatially around you. And we are working with brands on
delivering more experiences, many of which will be musical, and some
of which will just be artistic and immersive-audio-based. But the
Bose Radar app is fed into by a tool which exists on the web. We’re
doing private invites right now and that’s also available at
developer.bose.com. And that’s going to be a kind of WYSIWYG tool
that allows brands to create experiences with spatial sound and some
gesture recognizers, and also use GPS to do location-based and
spatial sound experiences. So there’s a number of channels that we’re
working on now to engage with partners. And if you are a brand that’s
interested in this, just reach out to me or head to our web site.

Alan: Awesome. So what can we
expect in the future? And I’m going to throw this out there, because
I saw — I shouldn’t say I saw — I heard a pair of headphones called
NuraLoop at CES this year, and they were personalized headphones
that… they had some spiel… but man, they sounded amazing.
And there’s not been much development as far as R&D or real
change in headphones in a long, long time. And it sounds like you
guys are really pushing the limit. So what’s next on the
radar? That you can talk about, obviously. Is there something coming
that’s going to take it to that next level?

Michael: There’s always things
coming. We have future wearables that will be coming out that are
Bose AR-enabled. I think what businesses will need to know is that
we’re really committed to creating a viable, large-target platform,
highlighting our partners that work with us and our channels. And we
have an app called Bose Connect that has over 14 million installs
worldwide. We’re shipping our XR headsets everywhere in the world.
That’s something very few — if anyone else — can claim in this
industry. And I would just say, there’s really exciting things on the
horizon — new capabilities, et cetera. But I also want to reassure
people that this is a platform that can be built on that is
future-proof. And for where we are now, the analogy
that makes sense is we’re kind of at either Web 1.0 or the early
App Store for iOS — whatever you want to call it — but capabilities
got added to the Web. Certainly, people were able to develop better
applications as processors got better for mobile. But beyond the core
capabilities, it’s the content. So what we really want to do, what
we’re committed to doing, is working with third-party developers,
small and large. You don’t have to be a big brand. You can be a
single person. We’re trying to make the platform self-service and easy
and free, because the value to us is our customers find a use out of
Bose AR-enabled applications, and they come back to them and they use
them. And that is a win/win for us. So there’s a nice synergy in our
business model for companies, small dev shops, large brands, to come
and work with us because the content is what I think folks who own
Bose AR-enabled devices can look forward to. And that’s what we
really need to bootstrap. In addition to, obviously, future
capabilities, better refinement of the platform as we go along.

Alan: Amazing, there’s so much
coming, I don’t even know what to ask! It’s like you guys have
figured it all out.

Michael: Not at all; it’s a
process.

Alan: It is a process! But I
mean, you’ve been down this road before. You’ve been down the process
from, “we have an idea and a couple of nerds in a lab,” to
“hey, look, this is a product and it’s in the market now.”

Michael: It’s what I look for.
Yeah. It’d be boring otherwise.

Alan: Building things for the
world is pretty exciting, and I feel like we’re in this renaissance
moment of technology where it’s unleashing untold possibilities for
humanity. Let’s put our educator’s hat on for a second. How do you
think the Bose technology for spatial audio — or Bose AR — how do
you think it can be used for training or education?

Michael: We are actually already
talking to a number of interested parties on workforce training and
repair enablement scenarios. So the same sort of scenarios that
you’ve heard talked about with augmented reality, we think have some
potential uses with Frames. Let’s say, large warehouse rollouts with
maybe a safety-glasses version or something like that, where you can
get audio cues, and your eyes aren’t distracted by a screen. You’re
doing your job, but something goes wrong in one corner of the
warehouse. And you hear a ping spatially where that might be. Things
like that that can enable quality-of-life improvements for workers,
while leveraging the fact that there’s not a screen to distract
from what they’re doing. There’s something to putting something in
front of somebody’s eyeballs that’s just always going to take your
attention. Right? So we think that there’s a lot of
workforce-enablement stuff that can be done with Bose AR-enabled
features within existing applications, and then maybe dedicated
applications for things like I just mentioned, that’s new. Or even
ones we haven’t come up with yet. In addition to that, there’s
a whole host of different potential opportunities. We obviously,
like you mentioned at the top, have lots of interest from
audio tour providers. Well, there’s lots of old-school headsets with
cassette tapes that are still out there for various tours. That’s
low-hanging fruit.

Alan: I used to be a DJ. [record
scratch]

Michael: Oh! Did you?

Alan: Yeah, for 20 years.

Michael: Well, that’s exciting.

Alan: Yeah. Have you ever seen
the Emulator, the big see-through touchscreen DJ controller?

Michael: No! That sounds like–

Alan: Type in “Emulator
DJ.”

Michael: OK. Yeah.

Alan: I invented this giant
see-through glass touchscreen.

Michael: Woah! That was you?

Alan: Yeah.

Michael: I see this, and it’s
all… wow.

Alan: Yeah. It was a MIDI
controller. We actually worked with lighting controllers. We worked
with video, audio. Oh, a ton of big brands. We brought this to
Coachella one year. We made a thing called the Dream Experience, where we
took — Heineken branded it — and one of those buttons was synth,
one was bass, one was hi-hats, whatever. And we made it so that,
no matter what button you pressed, it was always in key. But
we made this really, really cool thing. People were making their own
remixes and emailing it to themselves.

Michael: That is awesome.

Alan: I love me some audio.

Michael: We did an experiment
with Bose AR at Coachella this past year, just to see what sorts of
things were viable at, like, a music festival setting. And I think
you’re going to start to see some of the fruits of those experiments
in the future. It was kind of exciting.

Alan: Did you work with Sam
Schoonover and his team there?

Michael: You know, that was
somebody on my team that took on that. I don’t know. But maybe.

Alan: I think we need to come up
with something crazy for next year, and we’ll all go to Coachella.
We’ll make it a thing.

Michael: Oh, man. I’m sold.

Alan: We’re gonna get everybody
at Coachella wearing the frames.

Michael: Yes! I mean, it’s
perfect. Like a silent disco.

Alan: It’s so perfect. Honestly.
All right. We’re gonna make that happen.

Michael: Awesome. Sounds like
fun.

Alan: I love it. And they
already have a silent disco.

Michael: Yes, I know. I’ve seen
some of these. But seamless, just walking up to it and walking away;
that to me is something I’d like to see realized.

Alan: Yeah. It would be really
cool.

Michael: There’s a lot of hard
tech under the hood that would need to work, and Wi-Fi that doesn’t
suck at a festival. So there’s also challenges.

Alan: But we’re gonna have 5G.
It’s gonna be a thing.

Michael: There you go! Problem:
solved.

Alan: We’re already working with
all the telcos; we’re gonna bring 5G, we can drop it in there. We
got this.

Michael: That would be amazing.
Yeah. If there’s no congestion problems, which I–.

Alan: That’s the promise of 5G,
is getting rid of the bandwidth problem.

Michael: Yeah, I thought it was
the ping problem, or the latency problem. If it solves the bandwidth
problem, awesome.

Alan: It’s three things. It’s
bandwidth, latency, and also capacity of the network.

Michael: But can it make me
dinner? That’s really the question.

Alan: The answer is absolutely.
Uber Eats delivers.

Michael: Awesome. What a world,
yes.

Alan: What a world. So listen;
we have plans now. We’re gonna go to Coachella. We’re gonna make sure
everybody’s got the frames. We’re gonna have a silent disco. I think
there’s a huge potential for augmented reality as it pertains to
audio, and I think Bose is perfectly situated to take advantage of
that and also bring quality audio to the world.

Michael: Yay! Thank you. I
agree.

Alan: What is one problem in the
world that you think can be solved with XR Technologies?

Michael: I believe in XR as the
empathy machine. I think that it can literally put you into the shoes
of someone experiencing the world from a different perspective. And
in that way, it’s almost not even empathy. It’s almost sympathy.
Like, I was wading through a crowd as a 4’3″-tall person and I
actually experienced what that was like. I think that sort of concept
of experientially being able to put yourself into someone else’s
situation is something that all forms of XR can and will continue to
do, even by accident. Like every time I play a game and things are
sized differently; that’s the size thing. But it could also be
scenarios. “Oh, I was in a scenario where a bunch of people were
bullying me. And how did I handle that?” I think for people who
struggle with being able to empathize with others in different
situations, VR and AR have the ability to give us that new
perspective. I think that’s one of the very many exciting things
about XR. I could go in a number of different directions. I think
education, therapy, recovery, the way we work, remote meetings,
collaboration, etc. I guess I wanted to highlight the empathy
machine aspect of it here.

Alan: You know who said that
VR is the ultimate empathy machine?

Michael: Who? Was it you?

Alan: It was not. It was Chris
Milk.

Michael: Milk. I’ve got to give
him credit next time.

Alan: Yeah.

Michael: Did he coin that
phrase, though? Or did he just give a talk on it?

Alan: I’m pretty sure he coined
it; look at the date.

Michael: 2017.

Alan: Yeah. This has been really
wonderful. Thank you so much for taking the time. And I am really
looking forward to Coachella next year.

Michael: Yes. Let’s keep the
conversation going.