diff --git "a/transcript/allocentric_c-N8Qtz_g-o.txt" "b/transcript/allocentric_c-N8Qtz_g-o.txt"
new file mode 100644
--- /dev/null
+++ "b/transcript/allocentric_c-N8Qtz_g-o.txt"
@@ -0,0 +1,929 @@
+[0.000 --> 7.400] I can be very loud.
+[7.400 --> 8.680] So thanks, yeah, so quick.
+[8.680 --> 13.720] I mean, the only thing I would add to the background there is just that in retrospect,
+[13.720 --> 17.920] in thinking about the things I've done, I've realized that there's a real theme.
+[17.920 --> 21.560] And this is probably true for many of us, to the things that I did: there is a lot
+[21.560 --> 24.800] of hardware and software and interaction and sorts of things that really attracted me
+[24.800 --> 27.360] to virtual reality long ago.
+[27.360 --> 31.200] Some of you, or most of you, don't know, there are very few of you in this room who would
+[31.200 --> 38.680] know that my life and everything I've done has been very much shaped by this sort of dual
+[38.680 --> 42.080] personality that I have from my mother and my father, my father being a very accomplished
+[42.080 --> 45.720] musician and my mother being a mathematician and computer programmer.
+[45.720 --> 51.760] And so she's German heritage and my father's sort of French Irish and yeah, here's some
+[51.760 --> 52.760] music somewhere.
+[52.760 --> 55.400] So the funny thing is, and this is what people don't know.
+[55.400 --> 58.880] When I graduated from Purdue, I had three or four job offers.
+[58.880 --> 62.880] I turned them all down and I went to New York City to work for a music production company
+[62.880 --> 67.680] to do audio stuff for live performance.
+[67.680 --> 73.980] So I worked at that point in time on the international tour of Peter Gabriel, the So Tour,
+[73.980 --> 75.180] and set up the audio.
+[75.180 --> 76.180] You'd think I'd have fixed this by now, right?
+[76.180 --> 77.180] It's been a long time.
+[77.180 --> 81.120] It's been a very long time. 
+[81.120 --> 83.960] And so we worked on the beginning of the So Tour a little bit.
+[83.960 --> 86.880] This is what Peter Gabriel looked like at that time.
+[86.880 --> 91.040] And then I also worked on the rig for The Cure when they were going out on their world
+[91.040 --> 92.840] tour at that time.
+[92.840 --> 96.720] And this is what The Cure looked like at that time.
+[96.720 --> 97.800] A little more interesting.
+[97.800 --> 98.800] All right.
+[98.800 --> 102.000] So then I decided it wasn't the life for me.
+[102.000 --> 105.120] Literally hopped in my car, drove from New York City to Los Angeles and started working
+[105.120 --> 107.040] for NASA and haven't looked back.
+[107.040 --> 108.040] All right.
+[108.040 --> 113.160] So what I want to talk about today, though, is a bunch of stuff where I want to make sure
+[113.160 --> 117.480] I give some credit to other people that I work with, including Gerd, who is here somewhere
+[117.480 --> 120.760] in the audience, Professor Gerd Bruder, my closest collaborator now.
+[120.760 --> 126.720] So a lot of the things I talk about today in some way involve contributions to the work
+[126.720 --> 131.000] from these various people and these funding agencies.
+[131.000 --> 132.480] All right.
+[132.480 --> 136.640] So this is something that I realized, I was telling Gerd before my talk.
+[136.640 --> 139.600] It's a little bit of a risk here what I'm going to talk about because I have thought
+[139.600 --> 143.120] about it for a long time, but I've never tried to put it into words.
+[143.120 --> 146.640] And now I'm going to put it in words and pictures and we'll see how it goes.
+[146.640 --> 148.160] I don't have any demonstrations for you.
+[148.160 --> 149.240] I am the only demo.
+[149.240 --> 153.080] So I hope I won't break during the talk.
+[153.080 --> 154.760] All right.
+[154.760 --> 158.360] So David Blaine, anybody here know who he is?
+[158.360 --> 159.880] A few people know who he is. 
+[159.880 --> 160.880] All right. +[160.880 --> 164.640] I want to show you just a little quick sort of an AR-ish demo from David Blaine. +[164.640 --> 168.040] And the audio on this clip is the lowest of all the clips that I have. +[168.040 --> 169.040] So hopefully it'll be okay. +[169.040 --> 170.640] You might have to listen carefully. +[171.640 --> 175.640] We're going to try something with the book. +[175.640 --> 176.640] Seems like your book. +[176.640 --> 178.640] This is kind of interesting. +[178.640 --> 182.640] Think of how old you are. +[182.640 --> 183.640] 30? +[183.640 --> 185.640] Let's try to think of Megan. +[185.640 --> 187.640] Let's not use Megan's whole name. +[187.640 --> 188.640] Let's use her initial. +[188.640 --> 189.640] And okay. +[189.640 --> 191.640] So let's just visualize that letter. +[191.640 --> 192.640] Okay. +[192.640 --> 197.640] And now I want you to, as you're turning to page 30 for one page reach here, you're just +[197.640 --> 198.640] going to hold the book out. +[198.640 --> 199.640] I just don't want to be near. +[199.640 --> 200.640] Hold it out. +[200.640 --> 201.640] It's a little bit. +[201.640 --> 206.640] Turn to page 30 and you're going to feel something and see something happen. +[206.640 --> 209.640] Oh, I see it. +[209.640 --> 211.640] Oh, my God. +[211.640 --> 212.640] Oh, my God. +[212.640 --> 213.640] Oh, my God. +[213.640 --> 216.640] So I've never seen him in parts of that. +[216.640 --> 217.640] And that's amazing. +[217.640 --> 222.760] But the thing that always fascinated me about him is the magic that he does, like most +[222.760 --> 224.160] magicians, appears to be reality. +[224.160 --> 225.640] I'm not there with them, but it's real. +[225.640 --> 227.360] That's what makes it so fascinating. +[227.360 --> 230.480] And we put on virtual reality headgear. 
+[230.480 --> 234.920] The magic that happens there is maybe less magical in some ways because we're conscious
+[234.920 --> 236.440] that we put something on.
+[236.440 --> 238.200] We do all this stuff to get ready for this experience.
+[238.200 --> 239.840] So yes, it's going to be different.
+[239.840 --> 243.400] So why, you know, what's magical about VR and AR?
+[243.400 --> 246.000] It's that we can basically do anything.
+[246.000 --> 249.560] Most of what we do, not all of it, is compatible with real world things.
+[249.560 --> 251.160] It doesn't have to be.
+[251.160 --> 257.480] But most people and most things they do are in some way compatible with it.
+[257.480 --> 261.600] We can do almost anything you want, like I said before.
+[261.600 --> 266.480] What I find really interesting, what I've been thinking a lot about,
+[266.480 --> 270.240] and I'm going to touch on this throughout my talk, is what it means to have a virtual reality
+[270.240 --> 274.560] experience and why that is so distinct from our real world experience,
+[274.560 --> 276.680] and do we want that, and why is that.
+[276.920 --> 283.200] But it really makes me think of VR in some sense as being very close to magic.
+[283.200 --> 285.280] And I'm thinking about it in several ways.
+[285.280 --> 288.680] In the way it's practiced, in the way it's done.
+[288.680 --> 293.480] So one way I've been thinking about it is every time you want to use some VR,
+[293.480 --> 297.680] typically somebody has to be given the crown and somebody has to be given the scepter
+[297.680 --> 300.440] and they're allowed to do the thing and nobody else can kind of do it.
+[300.440 --> 306.000] There are group VR experiences, but there's still the sense that you have to be chosen
+[306.120 --> 310.120] and given the opportunity to do this virtual experience. 
+[310.120 --> 316.040] And that was true in the very beginning, from the first time that Ivan Sutherland and his
+[316.040 --> 322.200] students built a head-mounted display system at MIT and in Utah in the late 60s.
+[322.200 --> 323.360] And it's still true today.
+[323.360 --> 325.560] So hopefully this video is not too loud.
+[325.560 --> 327.000] Coming in.
+[327.000 --> 329.000] So we're going to show you some virtual reality today.
+[329.000 --> 332.600] It's really hard to show people what it's like to be in virtual reality without having them
+[332.600 --> 333.960] try it for themselves.
+[333.960 --> 337.520] Filming you in the green screen studio is just the best way we found to help everyone
+[337.520 --> 339.640] else understand what it's like to be in VR.
+[339.640 --> 340.640] Any questions?
+[340.640 --> 341.640] Can I go first?
+[341.640 --> 343.640] One person gets to do it.
+[343.640 --> 344.640] All right, go crazy.
+[344.640 --> 345.640] Hi.
+[345.640 --> 350.640] And it's magical for that person; everybody else just watches.
+[350.640 --> 352.640] Oh, you know what?
+[352.640 --> 356.640] Oh, I feel like I'm just sitting there.
+[356.640 --> 357.640] That's pretty good.
+[357.640 --> 364.640] [unintelligible crosstalk]
+[364.640 --> 365.640] Come on.
+[365.640 --> 369.440] Oh, you're so cool.
+[369.440 --> 374.880] So I've been thinking now about what it is that's been sort of, I'd say, bugging me.
+[374.880 --> 375.880] Maybe bugging's a little strong. 
+[375.880 --> 379.640] But the thing I've really been thinking about is why it is in our discipline, and it
+[379.640 --> 384.880] seems like in our many disciplines, that we tend to have, and I'm going to touch on this later,
+[384.880 --> 389.440] almost tribes or groups of individuals who work together and focus on something and
+[389.440 --> 391.840] sort of exclude everybody else.
+[391.840 --> 393.720] And it's sort of my tribe and your tribe.
+[393.720 --> 396.200] And we don't necessarily work together.
+[396.200 --> 399.360] We don't necessarily think across tribes and things like that.
+[399.360 --> 402.200] And so we have this, I call it the purification of the disciplines.
+[402.200 --> 407.680] It's almost a self-fulfilling convergence of those groups of researchers over many
+[407.680 --> 409.400] years.
+[409.400 --> 414.760] So some of this special status, I think, is inherent in the way we think about
+[414.760 --> 416.680] ourselves and the things we do.
+[416.680 --> 421.560] So those of us who practice VR, do research on VR, and when I say VR and AR, I mean broadly
+[421.560 --> 425.040] HCI related things, user interaction.
+[425.040 --> 426.040] So we're sort of the wizards.
+[426.040 --> 429.120] We know how to do it, and everybody else, they're sort of muggles and they don't have any
+[429.120 --> 430.120] idea what's going on.
+[430.120 --> 432.400] And we have to kind of show them how to do it.
+[432.400 --> 433.400] And we're special.
+[433.400 --> 435.480] We think of ourselves as maybe a little special.
+[435.480 --> 439.360] We may not consciously think about it that way, but we do think about it a little bit
+[439.360 --> 440.360] that way.
+[440.360 --> 443.480] So it's also inherent in the way we think about it.
+[443.480 --> 445.400] And this is something that really struck me. 
+[445.400 --> 451.600] Many of you would know this continuum, but what's interesting to me is, here's virtual
+[451.600 --> 453.520] and it's way far away from real.
+[453.520 --> 462.520] And so people will get into arguments over where their work fits in this particular framework.
+[462.520 --> 464.840] Does it fit way down here? Is it over here? And people argue about where it is.
+[464.840 --> 470.440] So we think inherently, at least in this construct, about placing our work somewhere
+[470.440 --> 475.920] in here instead of allowing it to be necessarily many places at once, which some people would
+[475.920 --> 477.120] call mixed reality.
+[477.120 --> 480.720] But I still think we're classifying it.
+[480.720 --> 484.200] And I am not someone who loves classifying things.
+[484.200 --> 488.040] I recognize that classifying things is useful sometimes, absolutely.
+[488.040 --> 489.800] But I don't like to limit myself.
+[489.800 --> 494.760] I don't want our thinking to be limited by the way we classify things.
+[494.760 --> 500.720] So this has happened, and it continues to happen in this way.
+[500.720 --> 503.520] And again, I'm not saying it's a gloom and doom and bad thing, but I'm just thinking
+[503.520 --> 504.520] about it.
+[504.520 --> 507.280] We have all of these different research communities.
+[507.280 --> 508.320] We have our different journals.
+[508.320 --> 510.240] We have our different conferences.
+[510.240 --> 517.320] And these tend to sort of, I think, solidify and converge ideas and thinking
+[517.320 --> 519.320] in those areas.
+[519.320 --> 524.240] I'll touch on this again later, but there's a sense sometimes of, if I'm a reviewer
+[524.240 --> 525.240] for SIGCHI,
+[525.240 --> 528.360] I'd say, oh, this doesn't belong here.
+[528.360 --> 529.360] This belongs somewhere else. 
+[529.360 --> 534.560] And so we're inherently partitioning things and spreading them apart so that they fit
+[534.560 --> 536.120] in those particular domains.
+[536.120 --> 542.080] But in doing so, I'm a little concerned that we may be limiting our thinking.
+[542.080 --> 544.200] So we've seen this sort of thing before.
+[544.200 --> 546.160] This is a little silly, but I wanted to go back and think about it.
+[546.160 --> 548.880] In the beginning, at the forming of the earth, there were no humans.
+[548.880 --> 551.480] Then eventually there were humans.
+[551.480 --> 553.440] And it was just a bunch of humans walking around.
+[553.440 --> 558.600] Humans just migrated all over the earth and kind of settled down in different places.
+[558.600 --> 564.680] And at some point, they started building families, communities, larger communities,
+[564.680 --> 566.480] and building up nations.
+[566.480 --> 568.000] So now we have our individual countries.
+[568.000 --> 570.400] We each belong to a country, right?
+[570.400 --> 572.600] And it's not your country, it's my country.
+[572.600 --> 574.600] So we're all separated in that way.
+[574.600 --> 576.640] We even have our sports teams.
+[576.640 --> 580.920] I'm not too into sports, but I thought some people here might be.
+[580.920 --> 581.920] Right?
+[581.920 --> 584.480] It's our team, and the other team's evil.
+[584.480 --> 589.640] And we think about it in that sort of clannish way or that tribe-ish way.
+[589.640 --> 596.440] So in the beginning, of course, when people started doing, maybe in the 60s, VR-ish, AR-ish
+[596.440 --> 600.080] sort of things, we didn't have these distinct research communities.
+[600.080 --> 602.960] There were just people doing research in that area.
+[602.960 --> 606.600] And then over time, some people started saying, I want to work on head-mounted displays.
+[606.600 --> 609.080] Other people said, I want to work on the user interface parts. 
+[609.080 --> 610.480] Other people said, I want to work on displays.
+[610.480 --> 613.200] So they kind of divided up.
+[613.200 --> 618.080] And so now we have, and this is just a short sampling of a few conferences some of you
+[618.080 --> 622.080] might know about and how long roughly they've been going on.
+[622.080 --> 630.000] And you can see across these, when you look at them, that there is some duplication.
+[630.000 --> 636.760] I mean, for many of us who participate a lot in VR and ISMAR, and even SUI and 3DUI
+[636.760 --> 639.920] when it was around, and VR, there's certainly a lot of overlap.
+[639.920 --> 646.560] I'm not saying it's a bad thing, but there are still these distinct communities.
+[646.560 --> 651.240] And of course, we in this HCI domain aren't alone in this respect.
+[651.240 --> 655.320] There are people, if you look, for example, at the whole previous session.
+[655.320 --> 660.200] I thought it was awesome, because wearables and robotics here at SUI as a part of this is
+[660.200 --> 664.480] exactly what I'm going to be talking about at the end of my talk, about bringing those
+[664.480 --> 667.560] things in, which I think is wonderful.
+[667.560 --> 671.400] Whether they were curated or whether they just happened by chance, I still think it's
+[671.400 --> 672.400] fantastic.
+[672.400 --> 678.920] So, right, there are just a few wearable conferences, organizations.
+[678.920 --> 681.760] It's a little more hairy if you go look at, for example, robotics.
+[681.760 --> 687.840] Okay, these are current robotics conferences, just a few of them there.
+[687.840 --> 694.880] And what area, broadly, sort of roughly related to things we do, do you think is the most
+[694.880 --> 702.800] prolific, or has the most events, the most conferences?
+[702.800 --> 703.800] Christian, got it.
+[703.800 --> 704.800] Computer vision.
+[704.800 --> 706.000] I mean, look at computer vision. 
+[706.000 --> 709.640] These are 2018 computer vision and image processing related events.
+[709.640 --> 713.520] So, each one of these is their own event, their own community, they're all, you know,
+[713.520 --> 720.200] thinking about what belongs in their venue and what doesn't belong in their venue.
+[720.200 --> 724.120] A friend of mine, a University of Maryland administrator, he and I were talking about this
+[724.120 --> 726.760] once and he said, you know, disciplines are defined by their boundaries.
+[726.760 --> 729.720] That's what makes us who we are.
+[729.720 --> 734.200] We get to say what's at the edge and what's beyond, what is considered part of
+[734.200 --> 736.680] our community and what's not.
+[736.680 --> 743.080] And this is where, for those of you who have served on program committees for, you know, a conference,
+[743.080 --> 747.880] you've probably heard reviewers or seen reviewers write things like, this isn't X, or this
+[747.880 --> 751.000] doesn't belong here, it belongs somewhere else.
+[751.240 --> 756.600] I'm not saying that's wrong; sometimes that is useful and has to be done. But I do,
+[756.600 --> 764.920] again, think about the cost of moving out, of excluding some of those things.
+[764.920 --> 770.560] You know, I think about this as a double-edged sword: it's useful because
+[770.560 --> 773.480] it allows us to focus on common things.
+[773.480 --> 778.320] But I worry that it makes us a little parochial or a little nationalistic in our thinking,
+[778.320 --> 782.240] that is, we don't think beyond sort of where we're working at that moment.
+[782.240 --> 785.440] So we may miss some bigger opportunities.
+[787.360 --> 790.560] All right, so here I am, painting a picture of gloom and doom. 
+[790.560 --> 793.280] We're all off in our communities and we're converging on something and we don't want to
+[793.280 --> 796.640] talk to anybody else; we're just going to work on whatever it is, tracking, whatever it is.
+[797.440 --> 799.760] So is that going to change?
+[799.760 --> 800.560] It is changing.
+[800.560 --> 802.160] Everything changes.
+[802.160 --> 806.560] So it is changing, but I want to talk a little bit about some things I've thought about
+[806.560 --> 808.720] that could impact this.
+[810.400 --> 814.720] So what might disrupt this? And what is disruption? Of course, it's a buzzword; everybody
+[814.720 --> 816.880] talks about disruptive forces.
+[817.680 --> 822.160] One thing I've been thinking about here is that disruption can occur on many scales,
+[822.160 --> 826.240] right? There can be global disruption, a new thing, a new transistor,
+[826.240 --> 828.240] a new technology that changes the way everything's done.
+[829.120 --> 835.520] And it can be local; it can be in communities or in research communities or research groups for that matter.
+[836.880 --> 841.840] So one trend I've been focusing on for a while, thinking about, and it's reflected in a lot
+[841.840 --> 849.040] of the work I've done over the past many years, in particular work with Gerd, is how it seems to
+[849.040 --> 855.040] me that in general many things about VR, and again, I use that phrase very broadly, are becoming
+[855.760 --> 860.080] closer and closer to something that you would consider, or we might someday believe, is real.
+[860.880 --> 866.880] While at the same time I see a lot of things happening in the real world, changes, technological
+[866.880 --> 870.960] changes, things that one could think about as being virtual, because what is virtual?
+[870.960 --> 875.120] Virtual, remember, has this richness and flexibility that allows us to do anything we want. 
+[875.120 --> 879.760] So can we do that in the real world? Well, things maybe in the future are changing in a way
+[879.760 --> 885.200] that would do the same thing from the opposite direction. So I'll give you a couple of examples
+[885.920 --> 891.360] from my own work with Gerd and others and things we've been thinking about here. This is something we
+[891.360 --> 897.600] called the Wobbly Table. This is a VR paper from 2016. And you might think about this as just mixed reality,
+[897.600 --> 902.160] but that wasn't the point here. There's a real table up front and she is virtual and she's
+[902.160 --> 906.720] got a virtual table back there. And yes, when Kangsoo, sitting in front here, moves the table,
+[906.720 --> 912.080] her virtual table moves, it's sensed, and now the flip side of this is she can actually move
+[912.080 --> 923.760] his table. So here they play a game of 20 questions. The important thing is that
+[923.760 --> 929.360] she is aware of what's happening. She's aware of when the table moves. She exhibits behaviors:
+[929.360 --> 932.960] she looks down and she knows he's leaned on the table. Just like when you're in a restaurant,
+[932.960 --> 936.000] right? And the table's wobbly and it kind of bugs you and you're going to shove a napkin under it
+[936.000 --> 941.520] or something like that. So you think of that as a nuisance, but in this case we did it as a contrived
+[941.520 --> 945.840] way to see if it could establish a social connection between you and the virtual human. And what we
+[945.840 --> 951.040] learned is it really makes a difference in how the users feel about her when there's this physical
+[951.040 --> 956.160] connection. When she is aware that the table rocks, and if she leans on it and pushes the table and
+[956.160 --> 960.400] you feel it, it really changes the way people feel about it. 
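[Editor's note: the bidirectional table coupling described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual system; the sensor/actuator names and the gain parameter are invented for the sketch.]

```python
# Sketch of the "wobbly table" coupling: a tilt sensor on the real table
# drives the virtual table's rendered pose (physical -> virtual), and the
# virtual human's lean is scaled into a command for a small actuator under
# the real table (virtual -> physical). All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class TablePose:
    tilt_deg: float  # rotation about the shared hinge axis, in degrees

def couple_tables(sensed_tilt_deg: float, virtual_lean_force: float,
                  gain: float = 0.5):
    """Return (virtual table pose, actuator command) for one update tick."""
    # Physical -> virtual: mirror the sensed tilt onto the rendered table.
    virtual_pose = TablePose(tilt_deg=sensed_tilt_deg)
    # Virtual -> physical: scale the virtual human's lean into a push you can feel.
    actuator_cmd = gain * virtual_lean_force
    return virtual_pose, actuator_cmd
```

For example, `couple_tables(3.0, 2.0)` mirrors a 3-degree physical tilt onto the virtual table and issues an actuator command of 1.0, so each side of the table can affect the other.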
So that's the point here, not the mixed
+[960.400 --> 965.840] reality, but the point is about that awareness and ability to affect things. So here's another
+[965.840 --> 972.320] example where we used a wind sensor hidden back in here and we had a nice oscillating fan,
+[972.320 --> 976.960] not unlike the ones in the corners over here, that was blowing on the subjects as they sat here
+[976.960 --> 983.440] and talked to Katie. When the fan was pointed away, the paper that she has would be still; when the
+[983.440 --> 988.960] fan moved toward it, the virtual paper would flutter. And then at some
+[988.960 --> 993.520] point she would notice it; she would try to push the paper down and kind of look over at it.
+[993.520 --> 999.120] So again, same thing, totally unnecessary, totally contrived. We inserted and injected
+[999.120 --> 1004.320] that physical-virtual connection to try and cement or reinforce the
+[1004.320 --> 1010.160] relationship between those two individuals, the physical and the virtual individual.
+[1011.680 --> 1016.960] This one's a little different, but it's still the physical-virtual aspect. What we did here
+[1016.960 --> 1024.080] were experiments to see if witnessing Katie, the virtual human, having a conversation with Michael,
+[1024.080 --> 1030.560] a real human, when you walk into the room, makes you feel differently about her. So again, she
+[1030.560 --> 1035.120] appears to be aware, she's not really, obviously. She appears to be aware that he's
+[1035.120 --> 1039.600] there. He's having a high level conversation with her, they're laughing, they're telling jokes,
+[1039.600 --> 1045.360] and then he says, oh, your visitor's here, I'll see you later, and he leaves. 
So now you have this
+[1045.360 --> 1049.920] sense, without saying anything, that she is aware of what's happening in the room, she's aware of
+[1049.920 --> 1058.000] people there, and again, perhaps that she can influence it. So, no surprise, again it makes a significant
+[1058.000 --> 1065.280] difference in how people feel about her. So this is physical and virtual connecting to change the way
+[1065.280 --> 1072.240] people feel about, in this case, virtual humans. So we have little gadgets we played around with a
+[1072.240 --> 1076.480] lot in these cases, little wind sensors and other little devices, and this has led me,
+[1076.480 --> 1083.760] and I'm not the only person, to think a bit about the coming of the Internet of Things. And yes,
+[1083.760 --> 1090.400] it's a cliche and all of that, but it is happening, and in some sense it's disruptive because we're not
+[1090.400 --> 1094.640] driving that, right? It could be a little disruptive; it's happening outside of anything
+[1094.640 --> 1100.160] we do, and there are people who think they've invented the ideas of network appliances or whatever,
+[1100.160 --> 1105.440] and they're off just making all this stuff happen. So we could sit there and say,
+[1105.440 --> 1110.560] tribally, oh, that's not our field, we don't do IoT, we just do, you know,
+[1110.560 --> 1115.600] user interfaces or VR. Or we can look at this and say, wow, is this an opportunity for us?
+[1116.560 --> 1119.920] You know, I was thinking about this when Alex, let's not talk about Alexa,
+[1119.920 --> 1123.920] yesterday was talking about activating the physical world. When he said that, I was like,
+[1124.880 --> 1134.320] this is maybe a way of thinking about that. 
So a side effect is that, if you allow that IoT
+[1134.320 --> 1140.080] could become useful in our spatial user interfaces and in our daily AR and VR experiences,
+[1140.080 --> 1147.360] if you allow that, what that can start to do is transform the experiences that we're used to
+[1147.360 --> 1151.840] having in AR and VR, which are typically very egocentric. So I put on a head-mounted display,
+[1151.840 --> 1155.280] you know, Kyle doesn't get to see it, it's me, I get everything for me, I get the sounds,
+[1155.280 --> 1160.080] I get the sights, everything is for me, into something that is off my head and now more into the
+[1160.080 --> 1165.360] real world. And so the sounds that I hear or the effects that happen could be happening from things
+[1165.360 --> 1169.760] in the real world, the real world objects could be sensing me, so I don't even need to have my
+[1169.760 --> 1174.960] head-mounted display on; my dog comes in, you know, the IoT device could sense that, and my AR agents,
+[1174.960 --> 1182.720] for example, could be aware of that when something happens. So we're looking historically at a trend
+[1182.720 --> 1188.960] where, at least as somebody in this mindset, myself, I stand back and look at these things that
+[1188.960 --> 1194.160] have started that. And so one of the visions I've had for a while, that Gerd and I have been
+[1194.160 --> 1199.920] formulating for a while, is this idea of what we call augmented reality input/output devices.
+[1199.920 --> 1204.720] The idea would be that you'd have a component that's like a tracking system, but it'd be a
+[1204.720 --> 1208.880] box that you just set down in different places in the real world. And these devices would talk to
+[1208.880 --> 1216.560] each other, they would network with each other, and they would set up a sort of separate subsystem
+[1216.560 --> 1221.440] of awareness and effectiveness. 
So an AR application, for example, could ask that network and say,
+[1221.440 --> 1225.440] let me know if there's motion over here, let me know if there's a sound over here, tell me where
+[1225.440 --> 1230.240] it is, tell me what it is, listen for these words, look for these actions throughout that space.
+[1230.240 --> 1235.920] And then they could also output for you, so they could provide sound, some of them could provide
+[1235.920 --> 1240.320] liquids, some of them could provide, you know, blown air, and so on. And so imagine a training
+[1240.320 --> 1246.480] scenario like this, training nurses and physicians. One of the difficulties in using AR/VR in
+[1246.480 --> 1250.320] something on a scale like this is everybody has to wear a head-mounted display, everybody's running around.
+[1250.320 --> 1255.200] There's a bunch of real world stuff, there are actors who come in who just need to pretend to
+[1255.200 --> 1260.640] be the patient or pretend to be a paramedic; they don't need to wear a head-mounted display.
+[1260.640 --> 1267.120] So what we envision here is these ARIO units would be spread out throughout objects,
+[1267.120 --> 1270.640] so they could make a bed shake. So even though you've got a virtual human on the bed,
+[1270.640 --> 1274.400] if you had an ARIO unit somewhere sitting on this stretcher while you're rolling the stretcher,
+[1274.400 --> 1279.360] if the virtual patient is supposed to be shuddering because they're having some sort of seizure,
+[1279.360 --> 1282.560] then this unit could be shaking the bed, it could be vibrating the bed a little bit. So it could
+[1282.560 --> 1287.680] provide that haptic part while your head-mounted display provides the visual part. A little bit of
+[1287.680 --> 1293.520] sneezy moist nastiness coming from a child here who sneezes on you, some vomit over here,
+[1293.520 --> 1299.360] some blood, different things. 
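[Editor's note: the query/subscription interface described above, where an application asks the network of ARIO units to watch for events in a region and to trigger physical effects there, can be sketched as follows. This assumes nothing about the real prototype; every class, method, and region name is illustrative.]

```python
# Minimal sketch of a networked AR input/output (ARIO) broker: applications
# subscribe to sensed events ("let me know if there's motion over here") and
# request outputs ("shake the bed a little bit"). Names are hypothetical.

from collections import defaultdict

class ArioNetwork:
    def __init__(self):
        self._subs = defaultdict(list)   # (event, region) -> list of callbacks
        self.effects_log = []            # record of requested physical effects

    def subscribe(self, event, region, callback):
        """Register interest, e.g. subscribe('motion', 'bed_3', on_motion)."""
        self._subs[(event, region)].append(callback)

    def sense(self, event, region, data):
        """Called by a unit when it senses something; notifies subscribers."""
        for cb in self._subs[(event, region)]:
            cb(data)

    def output(self, region, effect, intensity):
        """Ask the unit(s) in a region to produce an effect (sound, vibration, air)."""
        self.effects_log.append((region, effect, intensity))

# Example: when the virtual patient's seizure is signaled at the bed,
# the ARIO unit under the stretcher vibrates the real bed.
net = ArioNetwork()
net.subscribe('motion', 'bed_3',
              lambda data: net.output('bed_3', 'vibrate', data['strength']))
net.sense('motion', 'bed_3', {'strength': 0.8})
```

The design point is simply that sensing and actuation live in the environment, not in the headset, so a person without a head-mounted display can still be sensed and affected.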
So an ability to sense. So again, if a real person somewhere waves
+[1299.360 --> 1303.440] their hand and says, I need help over here, but I don't have the head-mounted display, this system
+[1303.440 --> 1308.640] ought to be able to sense that and the AR/VR system ought to be aware of that. It's not right now.
+[1308.640 --> 1314.240] AR/VR systems are very closed in their sort of awareness of what's going on in the world,
+[1314.240 --> 1321.440] typically, not all but typically. So what do I mean when I talk about the collapse? I don't mean
+[1321.440 --> 1326.640] falling apart; that's not what I mean when I talk about the collapse. I mean collapse as in wrapping
+[1326.640 --> 1332.720] around, and I'm not sure wrapping around is quite the right way to think about it. It's not
+[1332.720 --> 1337.680] satisfying to think about it as becoming a big soup of stuff that doesn't have any organization;
+[1337.680 --> 1343.760] that's not satisfying either. But I am conscious again of the cost of spreading things
+[1343.760 --> 1348.320] out like this. So there are people who've thought about other continuums, other ways of thinking
+[1348.320 --> 1354.560] about this. So here's an example from Microsoft, thinking about it in more of a Venn sense,
+[1354.560 --> 1360.480] mixed reality in the middle, the real environment, the human, the computer, some other things where they
+[1360.560 --> 1369.200] overlap. Some of you in this room, I know, know Chris Stapleton. And so here is a version of that
+[1369.200 --> 1374.560] sort of notional diagram from Chris. And one of the things I think is unique and interesting
+[1374.560 --> 1381.040] here is that he includes imagination. It's maybe a little unsatisfying for some of us that it's so big,
+[1381.040 --> 1386.240] because we don't control imagination, at least we don't think about controlling it right now. 
+[1386.640 --> 1393.200] It's a whole other topic, but it's something that some people, psychologists, certain people in +[1393.200 --> 1399.520] our field, do think about when they do experiments or create experiences. They think about priming +[1400.320 --> 1405.680] the individuals for what they're about to receive before they do it. So they try to steer them +[1405.680 --> 1411.520] mentally into a particular place before they have the experience. So there is, you know, +[1411.520 --> 1417.520] some way you maybe could control the imagination, but it seems pretty hard. All right, so virtual +[1417.520 --> 1421.120] becoming real; I'm just going to go back and forth between these two things a little bit. Virtual becoming real: +[1422.160 --> 1426.320] so what, you know, what does this mean? There are so many different ways that I've been thinking about +[1426.320 --> 1432.720] this. So one is with respect to just the actual experience of doing something virtual. So has +[1432.720 --> 1436.720] anybody in this room read this book, or seen the book? A couple of hands, people +[1436.720 --> 1441.040] who've read the book. So one of the very first things Jeremy and Jim talk about at the beginning of the book +[1441.040 --> 1445.520] is how virtual experiences are real experiences. They're real for that person who's experiencing it +[1445.520 --> 1450.560] at that moment. And he also talks a lot about how, you know, in some sense, the brain doesn't really +[1450.560 --> 1454.880] care much about whether what it's processing is real or virtual; we can be influenced +[1455.760 --> 1463.040] in the same way. So there's a psychological aspect of that, the virtual influencing +[1463.040 --> 1469.440] the real. Jeremy has done, for those of you who don't know, I don't know, hundreds of experiments on +[1470.240 --> 1476.000] how virtual things can affect real behaviors.
So things like cutting down a virtual tree with a +[1476.000 --> 1483.040] haptic chainsaw device cause the subjects to conserve more paper towels after they leave +[1483.680 --> 1490.160] the experiment room. Another example is saving a child. He had this +[1490.560 --> 1494.560] experiment where you were Superman. They put you in a head-mounted display and you fly through a city +[1494.560 --> 1498.240] that's been evacuated because something bad has happened. There's one child that nobody can find; the +[1498.240 --> 1502.640] child is diabetic, needs insulin, so, like, you've got to find the kid. And so you do this virtual experience, +[1502.640 --> 1507.280] you find the kid. And then afterwards, in a, you know, contrived way, when you're filling out the +[1507.280 --> 1510.240] questionnaire after you've done the study, you think you're all done, you're sitting down at the +[1510.240 --> 1515.120] table doing a questionnaire, and the researcher accidentally knocks over a bin full of pencils. +[1515.120 --> 1519.520] And they fall all over the table and the ground. And it's amazing, statistically, how many people who +[1519.520 --> 1523.760] had the Superman experience helped pick up the pencils compared to the people who didn't. +[1525.440 --> 1533.040] Some of you know I've been trying to start some workshops +[1533.040 --> 1538.160] with some other folks called VR for Good, AR for Good, kind of mixing +[1538.160 --> 1544.240] VR and AR, and I'm involved with some other organizations looking at VR and AR for the social good. And +[1544.800 --> 1551.920] people think about, how do we, how can we use VR and AR? And the most obvious thing people think +[1551.920 --> 1557.440] about is, oh, show me the starving child, show me whatever it is. And I agree, +[1557.440 --> 1560.240] except the problem is, how do you get people to look at that? Who's going to want to look at that?
+[1560.240 --> 1564.160] Right? You can't force them to, and the thing that comes to mind for me: anybody here know the movie +[1564.160 --> 1569.040] A Clockwork Orange? Yeah. All I can think about is the scene in A Clockwork Orange, Alex strapped in +[1569.040 --> 1573.040] the theater, and their whole desire was to make him sit and watch, you know, these terrible videos +[1573.120 --> 1579.360] trying to influence him. So there's a psychological aspect of this virtual becoming real. There's of course +[1579.360 --> 1585.120] the visual, or, you know, the sort of traditional visual and appearance side. Has anybody seen this? +[1585.120 --> 1591.040] Can you read it? I'll just do this, and this is very, very new. So, a little video here +[1593.040 --> 1598.080] from Magic Leap. Let's see if this works. Nearly two years ago we had interesting progress in +[1598.080 --> 1604.480] Mica's development. After focusing on realistic eye gaze, that's eye movement and gaze, +[1604.800 --> 1610.960] we set up Mica on our current prototype. AI components were then added to track the user and +[1610.960 --> 1615.840] look them in the eye. Additional AI elements were added for body language and posture. +[1619.520 --> 1624.480] So, you know, you heard them say eye gaze, body posture; of course, keep in mind, an important thing +[1624.480 --> 1629.840] is she has to be aware of your body language and your posture and what you said in order for her +[1629.840 --> 1635.040] to exhibit things that are responsive. So again, is there a role for broader +[1636.320 --> 1642.480] allocentric sensing, as in IoT or other devices? How can they play a role in helping our agents? +[1642.480 --> 1646.480] And this is Mica, this is Magic Leap. So the idea is you put on your Magic Leap goggles and +[1646.480 --> 1650.720] you're walking through your house and, you know, she appears over here and you talk to her.
Well, +[1650.800 --> 1655.200] she has to, for it to be effective, she probably has to know what I'm doing, where I am, you know, +[1655.200 --> 1659.920] and those sorts of things. So again, that environmental allocentric sensing may play a role. +[1660.560 --> 1666.240] There is a physical side, in terms of taking the computer graphics out into the world. So Kyle mentioned it, +[1666.240 --> 1671.600] but there's some work I did years ago with Ramesh Raskar and others to develop something +[1671.600 --> 1679.840] we ended up calling spatial augmented reality. This is in the 1990s. So the idea was to take the richness +[1679.840 --> 1684.960] of computer graphics (these were just stills, like Photoshop still images) +[1684.960 --> 1689.440] carefully projected onto white wooden blocks. Suddenly the white wooden blocks look like they +[1689.440 --> 1693.600] have color. The basic idea here was: when we see color on things, we see it because white light +[1693.600 --> 1698.560] generally is reflecting off of something that's passing blue light back to me. So I see blue, and +[1698.560 --> 1701.680] it's blocking the other wavelengths. But you can do the same thing by having colored light up there +[1701.680 --> 1706.160] and having the object be white; you just transfer where the filtering is being done. So we +[1706.160 --> 1710.240] worked on this for a while, and I think one of the things I realize now looking back on +[1710.240 --> 1714.720] this, and it's really hard to articulate for those of you who've done any work in this area, +[1714.720 --> 1718.960] is that it's a very weird and special feeling to be in front of something, without any head-mounted display +[1718.960 --> 1723.680] on or anything, that is changing color and changing maybe apparent shape, and things are changing +[1723.680 --> 1728.240] about it right in front of you. It is really a compelling feeling, and I'm not saying that just because +[1728.240 --> 1733.280] I had something to do with it.
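The "transfer the filter" idea can be illustrated with a toy calculation: to a first approximation, what reaches the eye is the per-wavelength product of illumination and reflectance, and that product doesn't care which side carries the color. The numbers below are made up; real projector compensation also has to handle ambient light and non-white surfaces.

```python
# Toy illustration of the shader-lamps "filter transfer": light reaching the
# eye is roughly illumination x reflectance per channel, so a blue object
# under white light matches a white object under blue projected light.
def perceived(illumination, reflectance):
    """Per-channel (R, G, B) product of light and surface reflectance."""
    return tuple(round(i * r, 3) for i, r in zip(illumination, reflectance))

white_light   = (1.0, 1.0, 1.0)
blue_paint    = (0.1, 0.2, 0.9)   # surface that mostly reflects blue
blue_light    = (0.1, 0.2, 0.9)   # projector emitting mostly blue
white_surface = (1.0, 1.0, 1.0)

# Filtering at the surface and filtering at the light source give the same result.
assert perceived(white_light, blue_paint) == perceived(blue_light, white_surface)
```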
It just feels different, and it's very hard to convey to somebody who's +[1733.280 --> 1739.920] never felt this or experienced it what it feels like. But I tell you, that combination of +[1739.920 --> 1748.320] nothing on me, out in the field, dynamic virtual things happening, is compelling. So we're using this now for +[1750.640 --> 1754.480] a lot of other things; people use the concept or the paradigm all over the world. We're doing a +[1755.120 --> 1759.840] patient simulator, so we've got projectors and cameras and things, and touch sensing going on +[1759.920 --> 1763.920] in the patient here. So nurses, these are a couple of our, uh, Gerd's and my colleagues, +[1764.880 --> 1770.320] Desiree and Mindy, who are nursing, pediatric nurse professors, and so they can walk up and touch +[1770.320 --> 1774.320] the patient, and it's got touch sensing built into it, these curved surfaces. You can do things like +[1774.320 --> 1779.040] pull down on the lip or pull down on the eyelid, or take a limb, or do whatever you want to the kid. +[1779.040 --> 1783.040] It's got temperature control all over the body, so we can change the temperature of his head, his +[1783.040 --> 1787.120] hands, things like that, provide pulse and breathing sounds and all sorts of things. +[1787.920 --> 1792.880] Happy to talk to anybody who wants to about that; lots of exciting things along those lines. +[1793.680 --> 1798.240] I don't know how many of you know about this. Has anybody in this room ever seen this work? Do you +[1798.240 --> 1804.560] know of this work? A few people. So this idea of passive haptics: I don't know that Fred and +[1804.560 --> 1810.080] Brent Insko and Mary were the first to do this, but they're the first I'm aware of. So they went +[1810.080 --> 1815.600] out and very carefully measured Fred Brooks's kitchen in his house, very carefully modeled it in +[1815.600 --> 1820.800] the graphics model, all the surface properties; did everything they could at that time.
This is +[1820.800 --> 1827.520] in the 1990s, and then they went and took a bunch of styrofoam blocks and some masonite wood, +[1828.160 --> 1833.840] created a crude physical representation of that same space, and would walk around in it. And so you +[1833.840 --> 1838.240] would feel, if you reached out and there was a virtual countertop, you would actually feel +[1838.240 --> 1841.520] something there. So that was sort of satisfying. I don't know that you could lean on it, but you +[1841.520 --> 1846.000] would feel it there. An interesting thing from this, one of the outcomes of the research, was that +[1846.640 --> 1852.960] the visual sense of shape, in the cases that they tested, seemed to capture or overcome your +[1852.960 --> 1857.200] tactile sense of shape. Maybe not that surprising, because our hands are pretty low fidelity, but if +[1857.200 --> 1862.320] you see something that looks to you curved, and you feel it and it's not curved, your sense, at +[1862.320 --> 1867.040] least in their studies, was that it's curved, even though it didn't feel like it. If you took the +[1867.040 --> 1871.840] head mount off and you felt it, you'd say, oh, that's a sharp edge. So that was kind of a cool idea. They +[1871.840 --> 1876.560] went on and did this, and I'm sure many of you know about the pit experiments. It was originally done +[1876.560 --> 1881.920] at UNC, again in the 1990s, but they've been done all over the world. Some of you may not +[1881.920 --> 1887.760] know or may not have seen; how many in here have seen, or how many have done, a pit experiment, actually +[1887.760 --> 1892.240] done the demo? Okay, most everybody in here. So something that was really cool that they +[1892.240 --> 1896.560] did here, for those of you who don't know: so here's the virtual room. The subjects would walk into +[1896.560 --> 1900.160] the virtual room; they had a head-mounted display on. The door would open, they did some stuff.
+[1900.160 --> 1904.080] They'd step out here, and you can kind of see here there's an opening here, and you can see down. That's +[1904.080 --> 1908.240] the pit. You can see down to the room down below here, and you're given a task: you had to walk out +[1908.240 --> 1912.240] on this little ledge right here, that little, like, diving board, which is what this is, and look down, +[1912.240 --> 1917.600] and you had to drop a ball onto a target. And the vection, the, you know, motion parallax, is +[1917.600 --> 1921.200] very powerful: as you're standing there and you move your head just a little bit, this whole downstairs +[1921.200 --> 1926.000] thing is moving a lot, and for most people it pretty much gets your attention. The thing they did that +[1926.720 --> 1933.040] pushed people over the edge, so to speak, part of the fun, was to add a plywood +[1934.080 --> 1938.240] structure around here that was maybe three centimeters tall or something like that. And so the +[1938.240 --> 1943.200] people walking on it, first of all they stepped off carpet onto the wood, so it felt different to +[1943.200 --> 1948.160] their feet, and then when they got to the edge of it they could feel over the edge with their foot, +[1948.160 --> 1954.000] and it matched what they would see. So what they measured was heart rate and galvanic +[1954.000 --> 1958.320] skin response and several other things, and they could see some significant increases in people's +[1958.320 --> 1964.800] heart rate, and sweat, when they felt that edge. Again, bringing the physical +[1964.800 --> 1971.840] and the virtual together, and spreading it out into the real world a little bit. So, really +[1971.840 --> 1980.000] spreading it out: I don't know how many of you have seen these, but, you know, theme parks in general +[1980.080 --> 1985.680] are doing a lot of this, and I'll talk about that in a minute, but you can do this yourself now +[1985.680 --> 1990.960] as a consumer if you want to.
When you use virtual reality at home, you're always trying +[1990.960 --> 1996.640] not to touch anything. Your real-world surroundings, the thinking goes, break the illusion of VR, +[1997.200 --> 2002.240] but it turns out that if you merge the two things, you end up with an experience that's far more +[2002.240 --> 2008.640] immersive. At The Void, a, quote, hyper-reality facility, they're melding state-of-the-art VR tech +[2008.640 --> 2014.240] with real-world physicality, and in doing so they're leading a wave of location-based VR. +[2014.240 --> 2019.440] It's a little bit video game, a little bit laser tag, and feels more like an actual adventure +[2019.440 --> 2028.160] than anything else on the market. And we have fire. And bringing it back to magic: what is +[2028.160 --> 2032.800] the magic? There's magic in multiple respects here, but for all this tech, hyper-reality actually +[2032.800 --> 2039.600] uses some old-school magic theory. Virtual reality in its truest sense is a form of magic. +[2040.560 --> 2045.360] Magic is just creating a new reality for people using the tools that are available, whether that's +[2045.360 --> 2050.320] picking up a coin and using sleight of hand to make someone think, just for a second, +[2050.320 --> 2055.840] that that coin could vanish. That's a little reality that you were able to create for them. VR is the +[2055.840 --> 2060.400] same way: we're trying to create new realities that people believe in, so it makes sense to use +[2060.400 --> 2065.680] magic principles to take that even further and really get people embedded in an +[2065.680 --> 2071.280] immersive world. What I really love about that: I live in Orlando, so you have Disney +[2071.280 --> 2077.440] World there and Universal Studios, and one of my closest friends works for Disney. Those +[2078.560 --> 2082.240] teams of people who develop the experiences there, they don't care.
They're not, +[2083.120 --> 2087.920] what's the word, they're not bigots about, oh, it's not VR, or it's not this. They don't +[2087.920 --> 2091.120] care. They'll use any trick: they'll use magic, they'll use deception, they'll use everything they +[2091.120 --> 2094.480] can to give you the experience they're trying to give to you, to make you afraid, to make you happy, to +[2094.480 --> 2099.440] make you, you know, smile, to make your kids happy. And so I really love that, and I'm conscious of the +[2099.440 --> 2103.760] fact that a lot of times when we do things in the academic community, we frown on, you know, these +[2103.760 --> 2107.840] sorts of tricks that people would play, or something that isn't intellectually deep or +[2107.840 --> 2112.400] something like that; it doesn't fit, it's not VR, it's something else, it doesn't belong. So again, +[2112.400 --> 2117.680] is there a cost to us doing that, or is there a place for people to do that and think about it and +[2117.680 --> 2122.480] pursue those particular things? Okay, so that was virtual becoming real. I want to cover a little +[2122.480 --> 2128.400] bit of real becoming virtual. So what do I mean by this? I mean, as I said earlier, virtual I think of +[2128.400 --> 2133.600] as mostly richness and flexibility of things that can happen. We can do that with head-mounted displays, +[2133.600 --> 2138.320] right, but where do you see this happening now in the real world? One +[2138.320 --> 2143.360] example is in robotics. So I've got a little film here from Boston Dynamics. +[2178.240 --> 2181.840] That's good, because I was watching you guys when that happened. It's interesting, several people +[2181.840 --> 2185.440] were really going, oh, you know, and you feel bad. It's like you kicked a dog or something, right? +[2185.440 --> 2189.360] That's what it looks like. It looks really cruel; of course it's not. And then he's demonstrating how +[2189.360 --> 2195.280] stable it is. But, you know, it is
interesting to me that this thing that's mechanical and physical +[2195.280 --> 2199.600] is in some sense becoming more virtual, in that it can do things that it couldn't do before. It can +[2199.600 --> 2205.280] climb and go places that, you know, even 10 years ago we didn't think of robots doing those sorts of +[2205.920 --> 2209.360] things. You don't have to, you know, have a million dollars to buy something; you can go out to your +[2209.360 --> 2220.880] toy store and buy a Cozmo. +[2228.720 --> 2234.720] So you can start to think about, you know, how could this impact what I do? How could I make use of +[2234.720 --> 2239.200] these sorts of robots? Is there a place, or what problems could we solve together, if we thought about +[2239.200 --> 2243.520] this? And you can start thinking about adding, remember I showed you the spatial augmented reality a +[2243.520 --> 2253.280] little while ago, so what if you have this robotic thing, but you can change the appearance of it +[2253.280 --> 2261.440] also? Disney's been doing it for a while. Robots here, so it's animation-based, again rich and +[2261.440 --> 2267.200] complex, right? Its body is also very compliant, so it's a combination of those two things. Some of +[2267.200 --> 2273.200] you here, I know, have seen Sphero. Does anybody here remember? I remember, and, ah, Christian, anybody +[2273.200 --> 2276.960] else, when this was: Today we want to introduce you to something really +[2276.960 --> 2281.360] special we've been working on, and it's called Sharky. We've been working on augmented reality +[2281.360 --> 2286.000] technology for Sphero for over a year now. This isn't projected imagery; it's +[2286.000 --> 2290.960] camera-based, or color-based. It's different, this AR thing moving around; you don't have to +[2290.960 --> 2294.960] use a printed-out marker. It's being tracked in real time.
A lot of augmented reality has +[2294.960 --> 2301.520] been all about introducing external markers into the scene; in the case of Sphero, this marker is +[2301.520 --> 2306.080] a robot. This AR marker isn't stationary, it isn't on a piece of paper that's stuck on the ground; this +[2306.080 --> 2310.720] AR marker can actually move around and drive around, and you can walk around your entire house and +[2310.720 --> 2316.000] play an augmented reality game. The reason why Sphero is so special is because we can put this character +[2316.000 --> 2320.800] in your living room and you can move him around and interact with him in the real world, and this +[2320.800 --> 2324.880] has just never been done before. So you think about it, and it's like, if I turn off the video and just +[2324.880 --> 2329.200] listen to what he said, I might tend to think, I can do that right now with AR, I can do it with anything else. +[2329.200 --> 2332.480] So why is this different? What's different about it being a robot? I'll give you one example. One +[2332.480 --> 2338.240] example is, my cat is lying here on the floor sleeping, and Sphero comes by; my cat's going to get +[2338.240 --> 2341.920] up and jump out of the way, which is going to change how I feel about this virtual thing +[2341.920 --> 2345.040] that's flying through the room, right? Instead of the virtual thing passing right through my cat, my cat's +[2345.040 --> 2349.120] going to, like, you know, freak out and scream and run somewhere else. And I'm not trying to be +[2349.120 --> 2356.720] mean to my cat, but just saying, other things will react to this. And it will be, in a non-computer-vision way, +[2356.720 --> 2363.760] sensitive to, you know, floor height, bumps, things like that, as you go through it. So again, +[2363.760 --> 2368.080] back to some work that Gerd and I and some others have been doing that I think is related to this, +[2368.080 --> 2372.400] thinking about: why does the physical thing matter?
Does the physical thing matter? Does the +[2372.400 --> 2376.720] physical, or the shape of the environment, or the relationship to the environment, matter? +[2376.720 --> 2380.320] So I'm not going to go deeply into this, but some recent experiments we did, and this is +[2380.320 --> 2386.800] being presented at ISMAR next week, looking at, so, Amazon Echo or Google Home or +[2386.800 --> 2393.040] Apple HomePod or whatever, right? So it's just a voice. Or the movie Her, where the guy falls in +[2393.040 --> 2403.760] love with her, voiced by the actress Scarlett Johansson. So anyway, the idea here is you ask +[2403.760 --> 2411.520] your Echo: Alexa, please turn off the light. And so Alexa turns off the light, and you go, okay, fine, +[2411.520 --> 2417.520] it's great. Or, versus, you ask her to turn off the lamp and you see somebody there representing Alexa, +[2417.520 --> 2422.320] kind of like the Magic Leap video, so you see an assistant. And another condition we had was she +[2422.320 --> 2427.680] pulls out a device and she taps something, and the light goes off. Then the third condition was +[2428.480 --> 2432.640] you see her and you say, please turn off the light, and she goes, okay, and she walks over to the light, +[2432.640 --> 2436.640] and, again using IoT, right, she appears to turn it off; like, she does, she turns off the light. +[2436.640 --> 2442.080] The light goes off. So the question is, how does that affect people? Well, it turns out it doesn't +[2442.080 --> 2445.280] matter so much when the light's in your room, but think about when the thing you ask for is +[2445.280 --> 2450.560] outside, where you can't see it. How do you feel about it then? Well, as it turns out, it makes a big +[2450.560 --> 2456.080] difference if she leaves the room, appears to leave the room, even though it serves no purpose, right?
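A minimal sketch of the coupling behind that third condition: the real IoT actuation is gated to the moment the agent's visible animation reaches the lamp, so what the user sees stays consistent with what actually happens. Every class and name here is invented for illustration, not the actual study software.

```python
# Invented sketch: an embodied virtual assistant whose real smart-home
# actuation fires only when her animation reaches the "touch" moment.
class FakeLamp:
    """Stand-in for a real IoT lamp endpoint."""
    def __init__(self):
        self.on = True

    def turn_off(self):
        self.on = False

def agent_turn_off(lamp, animation_log):
    # The visible, embodied steps play out first...
    for step in ("acknowledge request", "walk to lamp", "reach for switch"):
        animation_log.append(step)
    # ...then the real actuation fires at the apparent moment of contact.
    lamp.turn_off()

lamp, log = FakeLamp(), []
agent_turn_off(lamp, log)
```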
+[2456.080 --> 2460.480] The light's been turned off, but if it looks like she leaves to do something and comes back, you trust +[2460.480 --> 2466.640] that it's done more than you did before. So it matters in this case. Here's an example +[2466.640 --> 2473.120] similar to that, where we did an experiment looking at privacy. So if it's just the voice agent, +[2473.120 --> 2478.640] right, there's a lot of privacy stuff in the news about, like, Echo and others, you know, +[2478.640 --> 2482.880] collecting all the speech all the time, processing it off somewhere in the world and stealing +[2482.880 --> 2488.960] your life secrets. Or, my life's boring anyway. So there's one thing if you don't see anything, +[2488.960 --> 2493.280] but what if you see her, and you say, I'd like a few minutes of privacy, and she says, okay, +[2493.280 --> 2497.680] sure, and she appears to put on headphones, like she's sitting there listening to music? So now do you +[2497.680 --> 2502.320] believe more that she really is not listening? Or she's there, and then she leaves the room; she says, +[2502.320 --> 2507.040] okay, I'm going to leave, yell when you're ready and I'll come back. And it turns out, again, people are +[2507.040 --> 2512.160] more trusting that she's not listening when she physically appears to leave than when +[2512.160 --> 2518.720] she just says okay. And this may be a vestige of our real world; maybe, you know, future generations +[2519.440 --> 2523.840] won't worry about it, but that's an example where it matters. So +[2524.880 --> 2534.080] this sort of melding of robots and AR and HCI and user interfaces is +[2534.080 --> 2537.760] something that I, and others, many people, have been thinking about for a long time. But in particular I +[2537.760 --> 2543.360] was thinking about it a few years ago with a group of people, gosh, 18 +[2543.360 --> 2549.280] years ago? Eight years ago. Eight years ago, in a grant proposal, a big team we put together +[2549.280 --> 2553.360] to work on this. And there's an advantage sometimes; for those of you who haven't written +[2553.360 --> 2558.160] grant proposals, students, you will be writing them, and it's a pain in the neck. It's, I think, what +[2559.120 --> 2566.080] somebody called soul-sucking, which I agree it is, a soul-sucking drain. On the other hand, it forces +[2566.080 --> 2570.880] you to think about things, articulate them, write them down, and sometimes what comes out of that +[2570.880 --> 2575.120] lives on; you go somewhere else with it. And this has lived on. This is an effort that wasn't successful +[2575.120 --> 2580.080] from a funding standpoint, but a lot of things about it lived on. One of the coolest things +[2580.080 --> 2584.960] about it that I loved was we were thinking about these primitives being, like, these robotic +[2585.040 --> 2588.560] building blocks. You've probably seen many of these, I'll show you one here in a minute, but these +[2588.560 --> 2592.080] little robotic things that can reassemble, self-assemble. What if they could self-assemble, and what +[2592.080 --> 2596.880] if they could change their appearance using shader lamps or something else in some way? So that +[2596.880 --> 2601.600] was the kind of thing we were thinking about there. Many people since this time have started; +[2601.600 --> 2609.120] UPenn, these guys were some of the first. Here's a recent clip: I think our objective is to +[2609.120 --> 2615.840] design self-assembling and self-reconfiguring robot systems. These are modular robots with the +[2615.840 --> 2622.960] ability of changing their geometry according to task, and this is exciting because a robot +[2622.960 --> 2630.160] designed for a single task has a fixed architecture, and that robot will perform the single task well, +[2630.160 --> 2635.120] but it will perform poorly on a different task in a different
environment. If we do not know +[2635.200 --> 2640.640] ahead of time what the robot will have to do, and when it will have to do it, it is better to consider +[2640.640 --> 2647.920] making modular robots that can attain whatever shape is needed for the manipulation, navigation, +[2647.920 --> 2653.120] or sensing needs of the task. So these are static; she's showing them static, in color. Imagine +[2653.120 --> 2657.600] that they could change color; at least imagine they could actually put imagery on the sides. Imagine +[2657.600 --> 2661.520] what you could have. And there are people who've done, I forget the name of it, but there was a group +[2661.520 --> 2666.720] in Japan who did, like, lenticular displays on a cube, so it was an autostereo cube, and you +[2666.720 --> 2671.120] could look at it; you couldn't see things outside the cube, but you could perceive things inside +[2671.120 --> 2675.280] the cube, and you could move it around. So there are a lot of interesting things maybe you could do here, all of it +[2675.280 --> 2679.920] changing the appearance. And think about, by the way, you need user interface elements, right? This +[2679.920 --> 2683.600] thing could reconfigure to whatever is appropriate at that time. It becomes a stick, it becomes a circle, +[2683.600 --> 2687.280] it becomes whatever it needs to be at that moment for you to do whatever it is you need to do. +[2688.240 --> 2692.160] Changing color, changing appearance: all that takes energy, it's view-dependent, it's all these +[2692.160 --> 2695.360] things. So what we were thinking about, and we're not the first people to think about this, there are +[2695.360 --> 2700.320] more groups starting to pick this up, but it turns out, and this is again the advantage of trying to +[2700.320 --> 2707.120] get outside of your silo, I hate that cliche word, but again, out of our tribe, into other groups: +[2707.120 --> 2712.960] there are people in material sciences who are studying how, for example, butterflies and cuttlefish +[2713.120 -->
2719.120] and other animals change their appearance with no apparent energy, or very low energy. And so in +[2719.120 --> 2726.560] the case of butterflies it has to do with the nanostructure of the materials in their wings, +[2726.560 --> 2730.400] and when you look at them very closely they reflect or refract certain wavelengths of light, +[2730.400 --> 2734.560] and that's how we see color. So there are people actually developing nanomaterials that you can +[2734.560 --> 2739.520] apply a current to and change the nanostructure such that it changes the light properties. So you +[2739.520 --> 2744.480] basically can make a display that's static; they're working on dynamic ones, but the ones I know of are +[2744.480 --> 2749.920] static, where you can apply a voltage and suddenly, here on the table, it's a different wood pattern. +[2749.920 --> 2755.200] And it is real; it is a real wood pattern, as real wood. Well, it's not real wood, but I can look at it with a +[2755.200 --> 2759.840] flashlight or move around in different ways, and you're going to see it's not projected; +[2760.880 --> 2766.640] the nanostructure of this has changed so that it actually has +[2766.640 --> 2771.760] those colors. It's not an emissive display creating those colors. So I love that, and I, you know, +[2771.760 --> 2777.600] still wonder if there's something there to be done. All right, so now a little silliness here. +[2778.880 --> 2783.520] Anybody been to Orlando before, besides Gerd? A few others, I know. Anybody been to Disney World +[2783.520 --> 2790.000] or Universal there? Okay. Anybody seen the Harry Potter stuff? A couple. Fantastic. All right. +[2790.000 --> 2796.800] So I'm going to show you just a little bit here. Think IoT, robotics, and think, +[2796.800 --> 2804.240] you know, making physical things virtual or virtual things real, either way. This is a really fun experience. +[2808.960 --> 2813.440] So you show up, you get this map, you've got to have a lot of money, you
get into the park, of course. +[2813.840 --> 2825.120] But you're in Diagon Alley. You go to Ollivander's wand shop, the whole ceremony; you go through +[2825.120 --> 2830.480] and they'll pick out a wand for you, just like in the movies, like Harry Potter. Lots of cool stuff +[2830.480 --> 2834.320] happens when you're in the wand shop. Now you have your wand. +[2844.400 --> 2849.840] So here, she's moving her wand, and it's causing things to happen. +[2853.440 --> 2858.000] So there's something sensing what she's doing, then there's something actuated, and +[2858.000 --> 2866.080] what's caused is in the real world. I love that. Funny story about this: if you go to Universal Studios, +[2866.080 --> 2872.000] to Diagon Alley, you will see people walking around who look like employees; they look like the +[2872.080 --> 2877.520] Universal Studios people, but they're not. They're fanboys and fangirls who are so into Harry Potter, and +[2877.520 --> 2881.600] they have annual passes, and they show up there and they're like free docents. They walk around and +[2881.600 --> 2885.760] they'll help the little kids, like, here's how you do it, you know. And I guess Universal just +[2885.760 --> 2892.720] lets them, you know, do their thing. But that's pretty fun. All right, so the last topic I want to +[2892.720 --> 2898.000] cover here is a little bit of a challenge, a little bit of thinking for us. And again, I don't +[2898.000 --> 2901.200] want you to think I think this is all new; other people have thought about some of this before, +[2901.200 --> 2905.200] but I'm thinking about it a lot more now, and so I want to bring it to your attention. +[2906.320 --> 2911.680] So one thing I've been thinking about is, you know, just VR, what we do right now, VR and AR: I'd call +[2911.680 --> 2917.840] it just pseudo-reality. I really feel like when I say that, we should all be swaying. +[2924.800 --> 2929.680] All right, so I would call that just regular, sort of, magic; magic in
quotes, magic meaning +[2929.680 --> 2934.800] that we know it's not real and it's just sort of, you know, a kind of thing that's happening. So now I +[2934.800 --> 2940.320] think if your VR and AR things are somehow aware and they're able to affect things, things can happen +[2940.320 --> 2945.840] in the world around you, doors can open and close, robots can come back together, I would call it a +[2945.840 --> 2951.920] little more real reality. And then super reality is now if you have distributed sensing and control +[2951.920 --> 2956.240] and you have all this information, like my calendar, my email, so all this comes together. That's +[2956.240 --> 2961.040] sensing, right; it's not just sensing of physical things but sensing of virtual things, of my life and +[2961.040 --> 2968.080] my data. Think about Jarvis, right, in Iron Man; that's +[2968.080 --> 2971.840] the way I think about it. So right, my agent could be telling me, Greg, your shoe's untied, you left your +[2971.840 --> 2976.320] window open, your mother-in-law's come in, I adjusted the thermostat for you. She'd be aware of all this +[2976.320 --> 2980.960] because it's in my calendar and she's looking at my shoes, your fly's undone, whatever it might be. +[2981.920 --> 2986.720] You know, she could even change the way I feel, right; says, Greg, you're awfully handsome today, so +[2986.720 --> 2991.520] she made me feel better, right, so it's important for her to influence me socially. So I believe that +[2991.520 --> 2995.440] I have to, you know, all these other things have to happen, I have to be conscious that she can +[2995.440 --> 3000.640] affect things, she's really aware of things, she's smart; when she says that, then I know she's smart. +[3001.760 --> 3007.040] So right, here's all the other stuff: my email, a coffee maker of course, everything that's going +[3007.040 --> 3014.560] on in my house, and like I said, think of Jarvis. So where do we, you know, what should we be thinking +[3014.560 
--> 3023.040] about, and again Gerd and I and others have been thinking about this, and people in CS and engineering +[3023.040 --> 3028.640] and other groups have been thinking about what could this be like. So if you have distributed +[3028.640 --> 3033.120] things in this internet of things, some of them could be AR user interface IoT objects, and +[3033.120 --> 3037.760] they could be just normal IoT objects, your coffee maker, things like that. So everything in my house, +[3037.760 --> 3042.560] every device I have, is going into the cloud, it's being analyzed, it's being analyzed locally +[3042.560 --> 3048.720] for me, right, and then I've got all these user interface side things, appearance and interaction +[3048.720 --> 3057.280] with things over here. So things are continuously being analyzed and things are +[3057.280 --> 3061.840] happening around me. My agent will follow me around the house, right, my agent's in my phone, then my +[3061.840 --> 3070.960] agent's in my refrigerator, my agent's at work with me, and I can talk with them. So convergence, I'm +[3070.960 --> 3076.560] going to talk about that word in a moment, but I think it's fun to just think about, you know, +[3076.560 --> 3081.120] IoT, robotics, all these things: is there something we could do together, is there something we're +[3081.120 --> 3088.080] missing because we're working in these individual communities, that we could have an opportunity to +[3088.560 --> 3094.640] play with, have fun, and do something meaningful, if we would only pay attention and work together on those? +[3095.440 --> 3101.680] So here's the start of my little bit of challenge. Some of you who are US professors would know +[3101.680 --> 3107.280] the National Science Foundation has this notion of what they call convergence, and this is something +[3107.280 --> 3111.920] very special; this is one of the 10 Big Ideas, as they call it, +[3112.560 --> 3119.600] for where NSF is going to
put money, the National Science Foundation. If you read this, +[3119.600 --> 3124.240] what they're saying is it's more than just multidisciplinary work, it's more than transdisciplinary work, +[3124.240 --> 3138.960] it really means in some sense... yeah, the music is going along with this, somebody should film me, +[3138.960 --> 3147.200] I'll do this, yeah. So it really is the idea that there's potential for forming a new +[3147.200 --> 3151.920] discipline, a new conference or new something that comes out of where there wasn't something +[3151.920 --> 3156.560] before, and people work together. So ironically it's sort of like you're creating a new community where +[3156.560 --> 3160.160] people are potentially going to be isolated again. So it's like, you know, we're going to take +[3160.160 --> 3162.080] something from here, something from here, we're going to put them together, and then they'll go off and do +[3162.080 --> 3166.400] their own thing and they'll be isolated. So like I said, I'm not saying that that's necessarily bad, +[3166.400 --> 3171.520] that people work together in particular areas, I just think it's interesting to think about. The +[3171.520 --> 3175.200] way I think about it, because I think a lot about signal processing things, is +[3175.200 --> 3179.840] we could get stuck in local minima, right, and so in a lot of what we do, our work, we're sort of focused +[3179.840 --> 3184.880] in on this one area, and if we don't take the opportunity to occasionally take excursions out +[3184.880 --> 3190.240] into these other disciplines or other areas, we sometimes miss opportunities to look at something +[3190.240 --> 3196.000] interesting or do something compelling. So I'm going to challenge you guys, and I think this is +[3196.080 --> 3199.920] already starting, it does happen, and it happens naturally through a lot of things we do, +[3199.920 --> 3206.400] but to take those excursions, especially for students, to try and
think about how you can go see +[3206.400 --> 3212.000] other conferences, go visit other talks that are not in your area, and see if you can learn something. +[3212.000 --> 3216.400] So these happen, this happens a lot; some of you might know about Dagstuhl seminars here in +[3216.400 --> 3222.960] Germany and Shonan meetings in Japan and other workshops. These are +[3222.960 --> 3226.560] like week-long sort of retreats where researchers go and they think about things and they talk +[3226.560 --> 3232.320] about things, and those of you who've been in them and organized them know that it is really +[3232.320 --> 3238.640] hard, but it's really important to articulate and disseminate the ideas that you have. So we +[3238.640 --> 3243.600] all owe it to everyone else, since those of us who go to those events are fortunate enough to be +[3244.320 --> 3248.960] chosen to be able to go to those events, even though we pay for them; we really owe it to the +[3248.960 --> 3255.840] rest of the community to share what it is we think about. So then co-locating conferences I think +[3255.840 --> 3263.040] really helps, so three thumbs up out of three for SUI, UIST, ISMAR, and AWE. It's awesome that +[3263.040 --> 3266.160] they're doing that; you know, it's not perfect, there is no perfect, right, if they were on top of each other +[3266.160 --> 3270.400] it'd be a bummer, if they're spread out it's a bummer, so there is no perfect, there's just something +[3270.400 --> 3276.240] that is an attempt, and it works. Joint registrations are hard, financially it's hard to cross money +[3276.320 --> 3281.680] between places, but I think it's worth it, we really need to work on it. This keynote is an +[3281.680 --> 3286.800] example, complimentary joint sessions, so people from UIST were invited to come here, so I love that. +[3288.480 --> 3293.440] Last thing is just on an individual basis, you know, I was thinking about, I'm glad Steve's here +[3293.440 --> 3297.280] 
because I didn't see him before, and I was thinking, Steve, Professor Steve +[3297.280 --> 3301.600] Feiner from Columbia for those of you who don't know him, is one of the people I know who's like at +[3301.600 --> 3307.520] every place I go. He's involved in many communities that I'm not involved in, and so he serves as a +[3307.520 --> 3313.680] bridge between them, and he doesn't just attend them, but Steve is involved in the leadership of many of +[3313.680 --> 3319.120] these different conferences, so he will cross-pollinate ideas and thinking across +[3319.120 --> 3324.640] these organizations, and I think that's really, really, really important. I do wonder about and think +[3324.640 --> 3331.040] about if conferences should make a more proactive effort to reach out to other communities and +[3331.040 --> 3335.840] bring one or two people to their conference to be a part of it and experience it and learn and see +[3335.840 --> 3341.600] if there's a potential synergy. Next week at SIGGRAPH, Steve and Christian and some others know, we have +[3342.240 --> 3346.640] somebody from SIGCHI, who's the vice president of conferences or something, who's going to come and +[3346.640 --> 3352.800] meet with us and talk to some people, so it's an example again of that. And with that I +[3352.800 --> 3360.720] am going to shut up, and I'm happy to sing a song or take questions, whatever you guys like, but I'm finished. +[3361.040 --> 3385.520] Okay, thank you. Thank you for the talk, it was very inspiring, and so do we have any questions? +[3386.080 --> 3395.200] And it's okay if you don't, I'll be around over here, we can talk over here or hang out. Steve? +[3397.360 --> 3403.600] Great talk, Greg. This is less of a question, more of a kind of a comment, okay, and that's that if +[3403.600 --> 3410.800] you think back to the days of the 1965 Ultimate Display talk that Ivan Sutherland gave, +[3410.960 --> 3416.000] a lot of this resonates with that, except
with the Ultimate Display talk, besides the fact +[3416.000 --> 3421.440] it was only one room, it was indoors, only one person, there were no virtual people, but there were +[3421.440 --> 3425.760] chairs you could sit in, and bullets that could kill you, and handcuffs that could confine you, but he +[3425.760 --> 3432.000] had no idea how to do any of that stuff, and it was just all this, like, the ultimate display would be... +[3432.000 --> 3436.800] And then years later of course there's a head-worn display, and Sutherland at least did a bunch +[3436.800 --> 3443.360] of very clunky but really cool 1960s stuff. So one thing that really impresses me here is that +[3443.360 --> 3450.160] here we are, of course, coming up in months on 50 years later, and we actually kind of know how to +[3450.160 --> 3456.640] do some of this stuff, and so it's not just this crazy wild dream, right, but it's really something that +[3456.640 --> 3461.680] together lots of folks in lots of different disciplines can turn into a reality at some time. +[3461.680 --> 3465.760] Absolutely, absolutely, and I think that's one of the things where, again for me, the evolution of +[3465.760 --> 3470.240] robotics and IoT things, because it's happening independent of me, at least, I can't speak for +[3470.240 --> 3474.320] everyone, and so, you know, the fact that these things are coming to fruition, they do play into +[3474.320 --> 3478.800] that vision, and nobody could do those at that time, and you're right, now we have the +[3478.800 --> 3482.480] luxury maybe of starting to be able to do or think about some of these things, and I think that's +[3482.480 --> 3491.280] just really cool. So yeah, it's a good observation. You guys get that? I'll just take this over for David Tar; +[3491.280 --> 3494.160] apparently he was just going to sit down, I thought he was trying to take over. +[3500.000 --> 3504.160] I wanted to ask you a question, Greg, as we were talking about sort of the social contract stuff and +[3505.040 --> 
3510.160] Alex's work, a question I asked earlier about that, and when you showed the, you know, the virtual +[3510.160 --> 3515.360] people sort of responding to you, and her leaving the room and stuff, that was really +[3515.760 --> 3521.840] an amazing example of this crossover where we've got this device that can sense us. But now do +[3521.840 --> 3527.680] we expect the same things of the virtual person? What kind of contract do we have with them, or other +[3527.680 --> 3533.680] people, right, and do we treat them similarly? So I didn't, you know, you mentioned it, but +[3533.680 --> 3539.760] Gerd and I and others, and Jeremy at Stanford, have done studies; for lack of a better term +[3539.840 --> 3545.200] we call them afterglow-related studies. So if I see a virtual human sitting in this +[3545.200 --> 3552.800] chair and I'm talking with her, and then I take off my head-mounted display, for example I just take it off, +[3552.800 --> 3558.880] do I behave now as if she's still there? Or if I shut it down, if I intentionally do something in my +[3558.880 --> 3563.440] head-mounted display to say turn off, do I behave as if she's still there? So all these things that +[3563.440 --> 3569.360] have to do with not just how she reacts to me, like you're saying, Kyle, but how I feel about her: do I +[3569.360 --> 3575.440] avoid her, do I walk around her when I leave? And surprisingly enough, people do. And so, you know, +[3575.440 --> 3580.800] it's like, you think about it, do people subconsciously have like a separate dimension in which these +[3580.800 --> 3585.280] people exist, and they're still here, and just because I've taken off these glasses, it's like heat +[3585.280 --> 3589.520] sensing glasses, like I just can't see them right now, but I put my glasses back on, they're still +[3589.520 --> 3595.040] there, take them off... So I'm not a social psychologist, which is again why I love +[3595.680 --> 3601.200] stepping outside of my area, working
with someone like Jeremy, it makes it so much fun to learn. +[3602.480 --> 3610.240] Frank? Thanks, Greg, for the excellent talk, I really enjoyed it. Um, is it on? Yeah. Um, also a question +[3610.240 --> 3614.240] related to other fields which might come into our field. I observed in the last two years or so +[3614.240 --> 3618.640] that at every SIGGRAPH conference, every computer vision conference, there was a lot of machine learning +[3618.640 --> 3623.360] and deep learning for everything they developed. So far I haven't seen so much machine +[3623.360 --> 3628.000] learning in our field; there are rarely any papers within the last two years at VR or VRST which +[3628.000 --> 3633.120] made use of machine learning. So I want to ask you, what do you think would be interesting +[3633.120 --> 3637.120] applications for machine learning, and not virtual agents, because that's pretty obvious, but what +[3637.120 --> 3642.080] are other options where we could use machine learning? I've got to tell you, before I answer the question, +[3642.080 --> 3645.440] because I have an answer for you, but I saw a funny article somewhere once, it was like almost a +[3645.440 --> 3650.080] cartoon, but it was where statisticians were sort of bitter about machine learning getting +[3650.080 --> 3653.520] all the glory and getting all the money now. There was like this little table where they said +[3653.520 --> 3658.160] things like, you know, who gets credit for inventing certain things, and it's like machine learning +[3658.160 --> 3661.920] people. It was basically sour grapes from the statisticians. It was kind of funny, it was like, you know, +[3661.920 --> 3665.360] who gets all the research funding, who has all the conferences, it's the machine learning people, +[3665.360 --> 3670.320] when we've been doing this stuff for 30 years and nobody cares, you know. So, you know, I just feel like +[3670.320 --> 3675.120] there's so many things. I mentioned earlier the spatial user
interface that's adapting to whatever +[3675.120 --> 3679.840] I need at that moment; it needs to know what I'm doing, what I need. I've told people before, it's like +[3680.880 --> 3687.760] what I'd expect from an agent: my son, one of my sons, has this, in my experience rare, ability to both +[3687.760 --> 3691.520] pay attention and be alert and be attuned to what's happening. So if I'm doing something, an example +[3691.520 --> 3695.200] comes to mind, I'm changing a light switch in my house, so, you know, there's some +[3695.200 --> 3699.040] electrical stuff there, I'm paying attention to it, I'm trying not to shock myself, I think I've turned +[3699.040 --> 3705.840] off the electricity, and I need a tool, and I kind of put my hand down and I say, Carlos, could you get... +[3706.160 --> 3711.360] and he's already got the screwdriver. He knows from the context, he knows what I'm doing, he knows I need to +[3711.360 --> 3716.640] remove this screw, so he's provided for me the tool right there before I've finished saying it. It's like, +[3716.640 --> 3723.360] it is liberating, it's so amazing. Sometimes my assistant is equally proactive and prescient +[3723.360 --> 3728.800] in that way. So machine learning, I think, at least for context understanding, for recognizing, +[3730.080 --> 3734.560] and it may be as simple as recognizing gestures. You know, one of the things that Gerd and I have +[3734.560 --> 3741.280] talked about in the awareness AR avatar thing is, think about with Amazon Echo or +[3741.280 --> 3747.200] something like that right now how clunky it is: you have to name every lamp, right? You get it +[3747.200 --> 3751.280] the first day, you get the Echo home and you program it and you set it up for this, and it says you +[3751.280 --> 3755.840] have to give it a name, and so you go, lamp, and then, you know, you go, okay, well it's pretty cool, I'm +[3755.840 --> 3759.680] gonna add this lamp too, and you have to name it, and you're like, oh, I don't know, lamp two. +[3759.680 
--> 3763.520] You know, we're not very creative and we don't think it through and we don't plan it out, and +[3763.520 --> 3768.400] so now you've got to remember, it's like, that's lamp two, that's Greg's lamp, this one's floor lamp, +[3768.400 --> 3772.000] you know, getting weird names and everything. But shouldn't it be the case that you should be able to +[3772.000 --> 3776.240] just say, turn off that lamp, and just gesture and point toward it, and it turns off, kind of like +[3776.240 --> 3781.120] X1 but with my hand; I shouldn't have to do all that. So maybe in machine learning I think there's a +[3781.120 --> 3788.000] lot to be done about context, about what's going on, and about the state of the real physical environment, +[3788.000 --> 3796.720] including my body. +[3799.200 --> 3805.360] Yeah, thank you so much for the talk. I also want to point out how humble you are when you're +[3805.360 --> 3813.360] mentioning spatial AR, which basically influenced the whole field of projection mapping, and ironically +[3813.440 --> 3818.800] in Berlin this week we have the Festival of Lights, so tons of buildings are... +[3818.800 --> 3826.960] Yeah, it might be on today. Has anybody been to Lyon? I was there for the Fête des Lumières in +[3826.960 --> 3833.280] Lyon, that is amazing too. I didn't know they had that here. Yeah, so I think you might be on tonight +[3833.280 --> 3839.680] still. Yeah, wow, okay, sorry, Alex. So for me it was like a 30 minute walk which is +[3839.680 --> 3847.520] building after building, so thank you. Thank you. So I was a little bit curious; there are a lot +[3847.520 --> 3854.880] of great examples from simulation, and I mean, you're involved in +[3854.880 --> 3861.920] so many different fields, and I was especially interested about the medical applications and how +[3861.920 --> 3867.360] you see some of these techniques being applied not only in training but actually how they could +[3867.760 --> 3877.200] improve medical
practice, and like surgery etc. in real time, if that makes sense. Yeah, so I'm not +[3877.200 --> 3881.360] sure about surgery in real time, I haven't thought about that too much, but I'll think about it. But +[3881.360 --> 3886.640] in terms of training I've thought a lot about it. It turns out there's great +[3886.640 --> 3892.640] evidence that, let me back up, in the US at least, and I think this is pretty true worldwide, +[3893.360 --> 3896.720] physicians, that is doctors, go through schooling and at some point they have an +[3896.720 --> 3902.000] internship, and then they're actually practicing in a real hospital or a real setting, +[3902.000 --> 3904.640] with somebody looking over their shoulders, so they're really working with somebody, but they're +[3904.640 --> 3909.920] actually doing stuff. Nurses don't have that, you know, for whatever reason historically, and +[3909.920 --> 3913.600] there are many more nurses, and nurses show up at a hospital the first day they get hired, +[3913.600 --> 3917.520] and hospitals don't trust them because they haven't had that experience, so their +[3917.520 --> 3922.480] internship in a sense begins right then. Of course we don't want them practicing on us, right, nobody +[3922.480 --> 3927.280] wants that, or practicing on our kids or our family members or something like that, so it's a +[3927.280 --> 3932.720] difficult situation. So there's lots of investment in training nurses in particular, and +[3933.600 --> 3938.720] there's lots of evidence that the realism of the experiential part of it plays a big role +[3938.720 --> 3944.640] in how much of the skills they retain, and therefore the outcomes once they start practicing. +[3944.640 --> 3948.480] And one of the things that I've been told by nurses and physicians that I work with over the +[3948.480 --> 3952.800] years is that many of the technical things they can train: you know, you can practice, you +[3952.800 --> 3956.320] 
can have rubber arms, you can practice doing an IV or a central line, and you can do that over and +[3956.320 --> 3961.760] over again, you make mistakes and it's okay. But it's learning about talking to other people, about +[3961.760 --> 3966.400] the chaos of, say, an emergency room where you're on call: you show up and another doctor shows +[3966.400 --> 3970.160] up and another nurse and you don't know each other, and you don't know what each other's specialty is, I don't +[3970.160 --> 3974.240] know if I can trust you, I don't know whether you're gonna sit back and let me lead, whether +[3974.240 --> 3977.840] you're gonna take over. And so there are all these dynamics, then there's chaos, you know, the kid's +[3977.840 --> 3982.160] screaming because it was a car accident, somebody, you know, the mother's upset or the father's upset, +[3982.160 --> 3985.680] and so there's all this stuff that people just aren't used to dealing with. And the same thing +[3985.680 --> 3990.240] happens within the military, training medics for example who have to deal with somebody getting +[3990.240 --> 3994.720] shot or something the first time, and you know you don't want them to freeze out in the field, +[3994.720 --> 3999.840] and so you want them to get used to that. So all of this stuff we talked about, and I showed the +[3999.840 --> 4006.640] picture there of the sort of trauma center. Gerd and I have an experiment, a project, going +[4006.640 --> 4011.040] where we're working with, actually we're very fortunate to have a relationship with, the UCF +[4011.040 --> 4017.040] police department at our university. It's a big university, it's 68,000 students, so it's +[4017.040 --> 4022.080] very big, the second largest university in the US, maybe first, so a very big police force, but very +[4022.080 --> 4027.120] forward thinking. And so we're doing some work with them to look at the use of augmented reality +[4027.920 --> 4034.160] in our student union, which is this
great big hall where students gather. So police, unlike the +[4034.160 --> 4042.000] military, have this situation where they know, quote unquote, that this could be a target for some crazy +[4042.000 --> 4046.960] person or people someday, shopping malls, things like that, because a lot of people are concentrated in one place. +[4046.960 --> 4052.960] So what we're looking at is how could the police both do real-time simulations of a +[4052.960 --> 4058.400] bad event happening in the union, in the student union, so having virtual innocent people, having +[4059.360 --> 4065.200] the paramedics come in and all sorts of stuff happen, but then also what I, for lack of a better +[4065.200 --> 4069.520] term, call pre-briefing, which is they go to the union and police officers just talk about that +[4069.520 --> 4074.240] structure and say, you know, like, imagine there's a guy with a gun over here, where would +[4074.240 --> 4078.560] you go, where would you take cover? They can look around, and they don't just learn +[4078.560 --> 4082.960] things specific to that place, but they learn about the types of things where you would take cover: tables +[4083.040 --> 4089.040] are better than this and something else is worse than that. So AR, the ability to sense in this +[4089.040 --> 4094.960] big area stuff that's happening, the ability to create specialized effects around in the environment, +[4094.960 --> 4100.800] all of that can be helpful in terms of increasing the realism of the experience, which lots of research +[4100.800 --> 4105.360] has shown increases the effectiveness of those people in medical or military or teaching or +[4105.360 --> 4111.520] anything else they do after that. I hope that kind of answers the first question. Yeah, I mean, +[4111.600 --> 4116.720] actually I was also curious a little bit more: so this is training before the actual +[4117.680 --> 4124.800] event or the skill is needed, and maybe it's easier to talk about that from a medical standpoint, but
+[4126.320 --> 4134.400] the nurse that needs to exercise these skills, is there a way, as you see it, to shorten +[4134.400 --> 4141.040] that training loop by basically having the technology as a real-time feedback loop while it's +[4141.040 --> 4146.880] happening, like a digital support system, so you're training in situ? That's a great idea. +[4148.560 --> 4152.720] Boy, I mean, it'd be interesting to talk to Nassir Navab about this a little bit; he +[4152.720 --> 4157.600] would probably have a better sense of it. For those of you who don't know, he's a professor at TU +[4157.600 --> 4162.640] Munich and at Johns Hopkins University in the U.S., he goes back and forth teaching +[4162.640 --> 4170.000] at both places, and his focus area is AR in medicine, so it happens to be AR, which is not VR, which +[4170.000 --> 4176.800] is not this, which is not that. Two things come to mind. One is, it's not exactly what you're asking, but +[4176.800 --> 4182.800] I'll say it anyway: Gerd and I have another project we're working on that I had forgotten +[4182.800 --> 4188.080] about. Do you guys know what moulage is, the word moulage? A French word, means like fake wound +[4188.080 --> 4193.840] or fake injury, so if you put ketchup on me or something else you can make it look like I'm +[4193.840 --> 4197.920] hurt, bleeding or something; that would be called moulage. It's this makeup that essentially makes +[4197.920 --> 4204.240] me look injured. So one of the things we're working on, I assume you either know or could +[4204.240 --> 4210.880] imagine what I mean when I say fake barf, a little rubberized pile of something, that could be +[4210.880 --> 4217.200] like a wound that you put on your arm, and that has sensors and actuators inside it, so that +[4217.200 --> 4222.800] if I look down with my AR headset I can lock on to that object, because +[4222.800 --> 4227.280] it's got markers on it, active or passive, and so I can see
blood coming from that. If I apply +[4227.280 --> 4231.040] pressure to it, it can sense that I'm applying the pressure, so it can reduce the blood flow, +[4231.040 --> 4235.600] which changes the physiology of the patient and changes the visual aspects of it. So that's +[4235.600 --> 4239.680] bringing some of those things into the training. The real-time thing, the only experience I have, Alex, +[4239.680 --> 4245.840] that was a surprising experience to me, and maybe the biggest challenge for anything +[4245.840 --> 4252.480] related to your last question, is trust. So I assume there are certain things that +[4253.360 --> 4258.000] are less critical and I can trust them more, but I did an experiment with Henry Fuchs and some +[4258.000 --> 4262.480] others years ago on telepresence for medical people. It wasn't that the system was going +[4262.480 --> 4266.400] to do anything, a real human was; it was 3D telepresence. The idea was you'd +[4266.400 --> 4270.960] have multiple cameras watching while, if you're a paramedic, you're doing something really sensitive, +[4270.960 --> 4274.160] it happened to be a cricothyrotomy, you're putting an airway in someone's throat, and it's something +[4274.160 --> 4278.480] that most paramedics and EMTs in the US don't do very often, and they probably don't have very much +[4278.480 --> 4283.760] training, and if they don't do it the person's going to die right there, and then they're +[4283.760 --> 4286.000] afraid to do it because they're afraid they're going to kill the person, because they don't know how +[4286.000 --> 4292.800] to do it. So the idea was you could have a 3D viewpoint, live reconstructed 3D; a person back +[4292.800 --> 4296.800] at the hospital, a nurse or a physician, could be watching and coaching you, saying, that's right, it's okay, +[4296.800 --> 4300.320] that's expected, now's the time to do it, first of all; second of all, now put your hand there, do you feel +[4300.320 --> 
4305.840] it, you're good. So kind of comforting you, and we thought, oh, this is going to be great. It turns out +[4305.920 --> 4313.040] the paramedics in the study we did, 100 paramedics, by and large didn't like the system, because they were +[4313.040 --> 4317.360] afraid, they wouldn't trust it. It wasn't that they didn't trust the doctors or the nurses; they didn't +[4317.360 --> 4323.760] trust that the system was providing the context for them for what it was they were doing. +[4323.760 --> 4327.200] Then we talked to them, and they said things like, you know, the patient's life is +[4327.200 --> 4331.120] in my hands, that's the way they think about it, this patient's going to die if I don't do the right +[4331.840 --> 4336.880] thing, you're a thousand miles away telling me what I should do, but you really don't understand +[4336.880 --> 4341.600] the whole situation, or I'm afraid you don't, and so I'm probably going to do what I think is the +[4341.600 --> 4347.360] right thing in the end anyway and not listen to you. So I wonder about those real-time aids when it +[4347.360 --> 4351.680] comes to something; it depends on the medical circumstances, if it's something really +[4351.680 --> 4358.720] critical. I do know people use it, obviously. I have a friend who's a surgeon at the children's +[4358.720 --> 4363.360] hospital in Orlando, and he's pretty forward-thinking. They do a lot of AR-based surgery, and they do +[4363.360 --> 4368.880] planning and everything of the head; he's a maxillofacial cranial surgeon, so he reconstructs +[4368.880 --> 4376.160] head bone, and it was really hard for me to watch some of it, even in pictures, but they'll +[4376.160 --> 4381.520] use it in real time, they'll watch using AR essentially, which amazed me, because again that's +[4381.520 --> 4385.760] sort of one of these things that's totally outside; it wouldn't surprise me to go to ISMAR and +[4385.760 --> 4389.600] see that presented as a paper,
but to go to this hospital and see these people just using it, +[4389.600 --> 4394.400] it's like, wow, it has arrived or something, like, you know, people are actually using it. But they're +[4394.400 --> 4399.200] doing things like, he was telling me about, like if they have to drill and they don't want to drill too far, +[4399.200 --> 4402.480] I mean, I have this problem when I'm drilling in my wall in my house, so I think it's probably more +[4402.480 --> 4408.160] important in a kid's head that you stop drilling at a certain depth, and so they'll be using AR and different +[4408.160 --> 4413.520] sensing of the drill and all sorts of things to help them go just as deep as they need to but no deeper. +[4414.240 --> 4420.000] That's the closest I'm aware of; someone else in the room might have some other knowledge. +[4421.280 --> 4422.080] Yeah, thank you. +[4425.840 --> 4431.280] Another little mundane thing is, Gerd and I got approached by Orlando Fire Department +[4431.280 --> 4435.200] paramedics, the chief of all paramedics, who's concerned about, and this is really sort of +[4435.920 --> 4440.800] an IoT-ish mundane thing, but they have boxes with medicine in them, and they keep them in the +[4440.800 --> 4445.760] ambulance all day long, and certain medicines have to remain at a certain temperature, and they don't +[4445.760 --> 4450.800] know. This is where you learn about the real world; so many things are eye-opening, you know. +[4450.800 --> 4453.760] It's like, they have these boxes, they've got to know: if the temperature goes above a certain +[4454.400 --> 4458.560] level, then they throw the medicine away, they replace it. And of course we're thinking, oh, just put +[4458.560 --> 4462.640] a wireless thermometer in there, you know, and they're thinking, we have no money +[4462.640 --> 4468.160] to do this, and so it's like, you know, how do we do this without money? So it's +[4468.160 --> 4473.920] eye-opening sometimes, and I've seen the same
thing with the military, where you go in +[4473.920 --> 4478.480] and you have this great idea for something big, and I'm sure Steve and Tobias and we all see this, +[4479.280 --> 4486.480] and they're worried about something very small, very mundane. So when getting out of your +[4486.480 --> 4489.040] field, sometimes it's hard, you have to find the right person, the right person who's under the +[4489.040 --> 4495.760] right circumstances, who's both able and willing to spend time with you, and give up sort of +[4495.760 --> 4502.560] some of their own desires, and be flexible and be patient and invest in that relationship. +[4513.280 --> 4516.480] I've talked you to death, all right, and the music stopped. +[4516.720 --> 4526.560] Like I said, I'm here, you know, I'll leave tomorrow, but I'm here tonight, so I'm very happy to chat if +[4526.560 --> 4533.920] anybody wants to at some point, I'll be around. If there are no other questions, let's all give a round of applause to Greg Welch. +[4541.120 --> 4542.000] Thank you. Thank you.