Jenell Pizarro: I'm Jenell Pizarro.

Frederick Von Weiss: And I'm Frederick Philip Von Weiss, and thank you for consuming the Thunder Nerds, a conversation with the people behind the technology that love what they do and do tech good. And we are with Josh Clark. Josh, we're so excited to have you on the show.

Josh: I'm really happy to be here between you two.

Host: Don't you remember the bread, Josh? Pumpernickel?

Josh: It's colorful and delicious, right? It's very good with tuna salad.

Host: Josh, tell our audience a little bit about you, if you don't mind.

Josh: Besides my love of pumpernickel, I'm an interaction designer and founder of a design studio in Brooklyn called Big Medium. We focus on design for what's next — trying to help organizations figure out how to adapt the emerging technology that's here right now to make meaningful change, and hopefully make the world a better place along the way.

Host: So how are you doing that? How are you making the world a better place — or what are some of the aspirations?

Josh: Well, a lot of it is: how can we make technology that focuses attention instead of distracting it, that bends to our lives instead of the reverse? I think often we find ourselves bending to technology. So part of this is: how can we create experiences that fit into a human worldview, instead of cramming humans into a technological one? For a lot of the last ten years, that has been a focus on pushing the frontiers of mobile and using all the interesting sensor-packed stuff to make appealing mobile experiences. But I would say in the last couple of years there's been a real shift toward: how do we use machine learning and artificial intelligence, where the machines can do interesting or useful things — either to take menial tasks off of our plate, or perhaps to give us insight and amplify our judgment and our abilities with the kinds of patterns that they're able to find. A lot of that can be in the healthcare realm; it's some of the stuff we've been working on, finding patterns around that. So it's a few different aspects. I'm passionate about the new kinds of interactions we're able to have, but also about looking at the problems we can now solve — hopefully without making new problems, which is often the way technology goes, for better or worse.

Host: And do you think that was a lot of your concept with writing Designing for Touch? You're essentially trying to solve a problem — or even posing solutions for machine learning growing with humans, almost like a graduation from touch devices and connected devices?

Josh: Well, if you think about it, machine learning is powering all of the emerging technologies that are really interesting right now. It's kind of the oxygen of the interaction world at the moment. All the speech recognition, natural gesture, predictive interfaces, augmented reality — all of that relies on the kind of pattern-finding that machine learning is enabling.
So it's underneath everything. And as with designing for touch — touch was interesting because it suddenly created this illusion that information was physical: you could drag it around, squeeze it, pinch it. It's this useful illusion, a piece of misdirection. Now we're at the next phase, where we can talk to information as if it were a person: "Hey Siri, I'd like to have this information." Or people barking "Alexa, Alexa." It's an interesting way that all the different platforms talk to each other — "Hey Google," "Hey Siri."

Host: And I think this goes to your talk, the new design material, right?

Josh: That's right. I'm giving a very brand-new talk — I like to talk about brand-new things — about how designers work with machine learning as a design material. And I think that's the way to think about any new technology: what's the new stuff that this material does, and how do you work with it in the way the material wants to be worked with? What does the machine learning want, and how does that marry with what people want?

Host: So how would you design, say, something small for Alexa? What's your first step for making an Alexa — what are they called, tasks?

Josh: Skills, yeah, right. You know, I think with anything — but let me back up. I think that it's okay to build some silly toys first, because we're most human at play. We tend to put aside the demands of the metrics and OKRs we need ahead of all this stuff, and instead really play with the material. So I think it's okay to first ask: what would be a fun, silly thing to play with? In part because it pulls us out of the rut of how we usually think about design problems. Because when you look at who the best interaction designers in the industry are — frankly, it's video game designers. They bring a lot, because their challenge is to make something that's novel and fun and that you don't have to read the instructions for; the thing explains itself as you use it. So making games and toys, as a side project, can be a great way to get to know how to work with a new technology as a design material. If you're new to making Alexa skills, or working with any kind of emerging technology, making something a little playful is a good way to start. It's not what you want to release as a product — we already have a lot of frivolous things being pushed out as products — so I want to make a distinction: in our practice, let's play so that we can learn how to make meaningful, useful products that don't just add to the absurdity of the world. Which is something Kate O'Neill talks about with tech humanism.

Host: You're saying create Alexa skills that repeat what you say, but in pig latin.
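A toy like that really is toy-sized. As a rough illustration — nothing discussed on the show, just a sketch of the pig latin half of such a skill — the whole transformation fits in a few lines of Python:

```python
def to_pig_latin(phrase: str) -> str:
    """Convert an English phrase to pig latin, word by word (toy rules)."""
    words = []
    for word in phrase.lower().split():
        if word[0] in "aeiou":
            # Vowel-initial words just get "yay" tacked on.
            words.append(word + "yay")
        else:
            # Move the leading consonant cluster to the end, then add "ay".
            for i, ch in enumerate(word):
                if ch in "aeiou":
                    words.append(word[i:] + word[:i] + "ay")
                    break
            else:
                words.append(word + "ay")  # no vowels at all
    return " ".join(words)

print(to_pig_latin("hello thunder nerds"))  # "ellohay underthay erdsnay"
```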
Josh: You know, one of the things I think about a lot in working with a new technology is: what is the grain of this new design material? What does it enable, what does it encourage, and what's the new thing about it? It takes a while to figure that out. When you think about speech interfaces, it's: what are the new physical contexts it enables that you wouldn't be able to do with a screen? What are its constraints? What's the emotional context of it? What does it mean that you're talking to this disembodied personality floating in the room? And what does it do to the people around you? We have an Echo at home, and it understands me and our 18-year-old daughter Veronica ninety percent of the time — pretty good. It understands my wife Liza never.

Host: Oh, interesting.

Josh: We hooked it up to some stuff in the house, to turn on the lights and whatever, and Liza will come out in the morning and say, "Alexa, turn the lamps on" — and it starts playing children's Christian music. So I'll try to help: "Alexa, turn the lamps on," and it'll work, and she's like, "Oh, your girlfriend." It's become awkward for us to talk to Alexa. So there's this real thing that happens: it is just a computer system, but it has taken on the personality of this unwelcome person in our house. That's something you have to consider in the design of these things: what goes wrong, how does it change relationships, how does it change us?

Host: Yeah — how does it change us as we grow up with it? One of my buddies, his son is three years old, and he's always been able to tell time by going, "Alexa, what time is it?" He was in a store, and his son just said, "Alexa, what time is it?" — just expecting it to be there.

Josh: When does that actually happen, yeah. Well, with kids it's really interesting, because early on there was some concern with Alexa: are we teaching kids not to say please and thank you? It's just giving orders — "Alexa, what time is it?" So Amazon has come out with Alexa for kids, which is a little nervous-making, sure — but when it's in kid mode it will prompt: "You know, you could say please," or "You could say thank you." You don't have to, but there are these prompts around it. It's cute, and I think it's useful, but it also assumes there's a one-size-fits-all parenting mode. Look, we're at the beginning of all of this, but it's the kind of thing of: do I want Amazon to do my parenting? And even if so, are Amazon's — or their designers' — parenting values the same as mine? Do I want Alexa correcting my kid, and if so, how? It's a whole set of different interests.

Host: Wow — that's a separate relationship my kid is having with Alexa over there when I'm not around. And cultural values, too. In, say, an Asian culture that would be weird as well. Or put an Alexa in a Hispanic household, and it's completely different from what it expected, right?
Josh: You bring up a whole, hugely important area, which is that the machines know what we teach them. What is the data we're providing to them, and who is gathering that data? Frankly, you look at a lot of common computer vision and face recognition stuff, and it works great on pale males.

Host: Everything works great on pale males. Alexa listens to me more than — yeah.

Josh: So many things have been cut my way because I'm a straight white middle-class educated American man. Things go my way all the time. And in a way we're embedding that bias into these systems, because that's what the data is — that's the data that's been gathered. So as we do this, are these systems just going to reinforce my power and privilege at the expense of others? Or is it going to steamroll culture? As Google suggests phrases to use to respond to email, does that mean there's going to be a very specific Silicon Valley vernacular that gets used and spread? And is that going to flatten other dialects?

Host: Would you also say — to a certain point — if you look back at computer languages over the last 30 or 40 years, most things you would write, you would write in English. You write HTML in English: no matter what country you're from, you wrote HTML in English. You still spelled "color" without a U, even if you're in the UK, because that's the HTML way to do it. Do you think there are some parallels there?

Josh: Right — in that case it became the lingua franca — sorry, French — and English is the technology language. At some point you need to choose something, especially if you're going to give instructions to computers: this is how it is. But I think you're right that it does create a bias of access to who can code, because you have to have that baseline understanding of this other language in addition to the logical language. And the risk, of course, is that we're also making those kinds of decisions inadvertently — probably with the best of intentions — about who can use these systems. Do I recognize myself in these systems? Does the system recognize me? There was a study of Google speech recognition from 2016 — only a couple of years ago — that found a bias against understanding women: 70% of the time, the system would understand the speech of a random man better than that of a random woman, despite the fact that women tend to speak more slowly and clearly, with longer vowels, which should make recognition easier for these systems. This is clearly a case of just bad data, skewed toward understanding dudes, presumably because that's who built and trained the system. So a lot of this, as we think about how we create these machine-driven systems, is: how do we give them the data? How do we make sure that we are aware of our own blind spots and actively trying to fill them?
And a lot of that goes to the teams — who do we work with? The teams can't all look like me. And I don't mean just in terms of demographics — race, ethnicity, gender, age, that kind of stuff — but also, almost, worldview. How do we get outside of the tech bubble and bring in philosophers and civil servants and artists and every walk of life? Because this stuff is affecting our entire civic life now. These systems determine how we speak to one another; they're really shaping our global culture. Let's at least do that with intention, and not let the machines decide. I like to say that the future should not be self-driving. We need to decide where we want it to go, or the machines will decide for us.

Host: We can see it in ads, too. Toward the end or the beginning of an ad, they're talking to Google, they're talking to Alexa — in the ad — so your Alexa can hear the commercial itself, and Alexa's like, "Okay, cool, I heard that, I got it down." So you're being marketed to in multiple ways, just through your TV, because there's this device that is part of your family, really. It can easily be punked.

Josh: You should Google parrots talking to Alexa. There are all these cases of parrots adding stuff to shopping lists, turning lights on and off — because they hear you talking to your Alexa, and they just pick it up. There's one that likes to order pizza for her dachshund, and Alexa can't tell the difference. I mean, they're getting better at it, but this is an interesting nuance of designing for this stuff. In terms of designing for the grain of the technology, the grain here is that the machines are weird. So the design we have to do is not designing fixed paths through information that's under our control anymore. We used to say: design for the happy path. Now it's: how do you design for error and failure? How do you recover from that gracefully, and help people understand what's going on when they get results that are bad or suspect?

Host: Right — you don't want to be "technically wrong." I mean, it's a book, but it's about how machine learning can be biased, or just completely blind to a lot of demographics.
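On speech platforms, that "design for error and failure" idea has a concrete hook. As a hedged sketch — not anything specified in the conversation — an Alexa skill can catch utterances it doesn't understand with the built-in AMAZON.FallbackIntent and reprompt, rather than bluffing an answer or failing silently. The response shape below follows the Alexa skill JSON response format:

```python
def build_response(speech: str, end_session: bool = True) -> dict:
    """Assemble a minimal response in the Alexa skill JSON format."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "reprompt": {
                "outputSpeech": {"type": "PlainText", "text": speech}
            },
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    """AWS Lambda entry point for the skill (illustrative sketch)."""
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "AMAZON.FallbackIntent"):
        # The model matched nothing we know: admit it and reprompt,
        # instead of guessing at what the person meant.
        return build_response(
            "Sorry, I didn't catch that. Try asking me to say a phrase "
            "in pig latin.", end_session=False)
    # ... handle the skill's real intents here ...
    return build_response("Goodbye.")
```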
Host: How do you go about designing a skill — or any machine learning kind of situation — for, like, a big enterprise? How can they use Alexas and Google Assistants and so on to make their jobs easier, or their product?

Josh: I think there are always two questions you ask when you're about to build one of these things: how do I build the right thing, and then how do I build the thing right? And often we don't spend enough time on how do we build the right thing — often somebody has an idea, and then it's "we're going to make this thing." I think the work of design, and especially of UX researchers, is trying to figure out where we can actually make the most impact — not just the impact for the business, but the impact on the life of the person who's using it. Whether that's the efficiency of talking to Alexa in a moment when you're hands-free and eyes-free — what are those opportunities, and where in the customer journey can you have the biggest impact by answering a question quickly and casually? But I think there's also an interesting thing with much of machine learning generally, which is: what are the moments where we find ourselves doing error-prone, joyless, repetitive tasks? The machines can do that part and bring some new insights, so that we can do the part that's not horrible. That's the story of technology — and it sometimes has unintended consequences that we can't always predict. Who knows what's going to happen in the future? But the other part — building the thing right — the great news is that all of these technologies, certainly including Alexa skills, have a really low barrier to entry. Anybody who has some basic front-end development skills can quickly get a simple hello-world Alexa skill going. So there's the hard work — I work with clients a lot on product invention exercises to figure out how we make sure we're building the right thing, and where we build it — but once you do that, you can get prototypes up really quickly on these new speech platforms.
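For a sense of how low that barrier is, here's what "hello world" can look like. This is a sketch under the same assumptions as above — the skill hosted as an AWS Lambda function, reusing the hypothetical build_response helper from the earlier snippet — not a canonical template:

```python
def lambda_handler(event, context):
    """Minimal hello-world Alexa skill: greet on launch, echo any intent."""
    request_type = event["request"]["type"]
    if request_type == "LaunchRequest":
        # "Alexa, open <invocation name>" lands here.
        return build_response("Hello, world! Ask me anything.",
                              end_session=False)
    if request_type == "IntentRequest":
        intent = event["request"]["intent"]["name"]
        return build_response(f"You invoked {intent}. Hello!")
    # SessionEndedRequest and anything else: wind the session down.
    return build_response("Goodbye.")
```

Paired with an interaction model defined in the Alexa developer console, that's roughly all a first skill needs.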
But also, for other stuff: it turns out Amazon and Microsoft and Google and IBM are giving away their machine learning APIs, at least for common uses. You don't know how to do computer vision, camera vision? No problem. The guts of what makes Alexa go — you can get at that essentially for free, or at low cost, from Amazon. They're giving this stuff away because they want your data; they want to get better at it. So at least for prototyping, that's a great environment.

Host: It's really cool, too. I know there's Azure, and there's this thing where it can tell who is who and what is what — it could literally find celebrities in a crowd, like, "that's Bernie Sanders." That's the coolest freaking thing.

Josh: Not only can it find and identify Bernie Sanders in a crowd, it will also tell you what his emotional state is and an estimated age. So there's all that stuff for analyzing faces in video or photos — it's all very accessible. I think the trickier part at the moment is the whole set of stuff you can do when you want to build your own models. But for a vast number of particularly consumer-facing technologies, the stuff is available for you to play with today, with no prior knowledge.

Host: You really can. And I think you should get on some of those silly things — I remember we were talking about how there are a lot of silly, fun things to be made. I have a friend who made a Twitter bot, because he wants to be a little more visible on Twitter. He uses Instagram, so he takes pictures of things he's doing — coding, going to meetups, et cetera — runs them through Azure, and the bot posts them on Twitter saying, "Hey, this is the thing I've made." And it'll be like, "it's twelve people in front of a bus getting a taco" — and the fun is whether that's accurate or not. I think that's the coolest thing. He just wanted to be more visible on Twitter, and he created this recognition thing just for funsies.
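That pipeline is small enough to sketch. The version below is a guess at the shape of such a bot, not the friend's actual code: it uses AWS Rekognition via boto3 for the labels rather than Azure, tweepy for the posting, and the file name and credentials are placeholders:

```python
import boto3
import tweepy

def describe_image(image_bytes: bytes) -> str:
    """Ask Rekognition what's in the photo and turn the labels into a caption."""
    rekognition = boto3.client("rekognition")
    result = rekognition.detect_labels(
        Image={"Bytes": image_bytes}, MaxLabels=5, MinConfidence=70)
    labels = [label["Name"].lower() for label in result["Labels"]]
    return "Hey, this is the thing I've made: " + ", ".join(labels)

def post_photo(path: str, twitter: tweepy.Client) -> None:
    """Caption a local photo with the vision API's guess and tweet it."""
    with open(path, "rb") as f:
        image_bytes = f.read()
    twitter.create_tweet(text=describe_image(image_bytes))

# Placeholder credentials; real keys come from the Twitter developer portal.
client = tweepy.Client(
    consumer_key="...", consumer_secret="...",
    access_token="...", access_token_secret="...")
post_photo("meetup.jpg", client)
```

Rekognition's recognize_celebrities and detect_faces calls work the same way, which is where the find-Bernie-Sanders-and-estimate-his-age trick mentioned above comes from.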
Josh: The mislabeling is hilarious — it's really sort of charming sometimes. And that's what I mean by the fact that the machines are weird: they see the world in different ways than we do. There's a lot of fear that the machines will somehow replace us — that's the Hollywood version — but they understand the world so differently, with such a different perspective, that I think the better way to think of it is that they can be really interesting and useful sidekicks and companions to us. In the best form, they help to amplify our judgment and engage it in interesting ways, so that we can be better or more interesting versions of ourselves, instead of having to slog through some of this on our own.

Host: Well said. I wanted to ask you about one of your very interesting creations from early on — Couch to 5K. You tend to be the one to organize everybody to run every time you're at a conference.

Josh: You know, I only started doing that recently, and it's something I found can go anywhere, sort of. It turns out to be a nice way for people — speakers and attendees — to do something together, and we have a really casual run with talks. It was about ten people today, which is a record — that's a lot of people to come out at 6:00 a.m., the morning after the conference party the night before. But yeah, thanks for mentioning Couch to 5K. That was a schedule I wrote a little over 20 years ago — I wrote it for my mom; my mother was the original Couch to 5K. She's still doing it. She's in her seventies — she walks more now, but she's walking half marathons.

Host: Oh my goodness.

Josh: Yeah, it's great. But since then, Couch to 5K has helped millions of people. It's been adopted by the UK's National Health Service as a national fitness plan. It's grown so far beyond me, and I really can't take credit for it — the credit is for all the people who've found some inspiration in it to make some changes or try some new things in their lives. So it's been a cool thing.

Host: Oh, that's so neat. I love that that's something you were able to make, and now it's its own kind of entity out there helping people.

Josh: It really shows that sometimes you just drop these seeds and they grow into something much bigger than you would imagine. It's been genuinely humbling to see this idea that I put out there — almost like my original open-source project. I haven't made any of the apps or podcasts, and yet this whole ecosystem of apps and podcasts and websites and communities has grown up around it. All I made was the core schedule itself.

Host: That's amazing. You'll be able to enjoy the felicity of that for a long time to come.

Josh: It is amazing just hearing people's stories, because it's not just "oh, I ran a 5K." It turns out that for a lot of people, fitness is something they've run at and been defeated by: "I tried to do this, I couldn't do it, this is not something that I can do, it's not something I have within me." And because you run into these defeats, it shapes your perception of yourself. One of the things I've heard from people who've finished it is, "Wow, I'm now able to jog for 30 minutes — I got a new job, I changed my relationship." I've heard these stories of people for whom this experience changes their life. So it's an interesting thing that I think you can think about with design in general: this thing that you are creating may have ripple effects beyond what you're doing. They could be terrible effects; they could be very positive effects. You don't necessarily know. You can do your best and sort of try to corral it.