[Music]
Welcome to Thunder Nerds. I'm Brian Hinton.
I'm Janelle Pizarro.
And I'm Frederick Philip Von Weiss. Thank you for consuming the Thunder Nerds, a conversation with the people behind the technology, who love what they do and do tech good.
Thank you everybody for joining the show, we super appreciate you spending a Saturday afternoon with us, depending on what side of the pond you're on, and we'll get to that with our guest. I just want to say, everybody, if you can, please go to the old YouTube and subscribe, hit the notification button. We're at youtube.com/c/thundernerds, C for channel. Super appreciate it. We're at, I think, 249 subscribers; if we get about another 60 subscribers, Brian will be released from his... I don't even want to talk about it.
But also, please go to the old iTunes...
Me, huh? It's always me.
There's always you. Brian's parole, I wasn't gonna bring it up. But anyway, go to the old iTunes and subscribe there, and give us a review on iTunes. Go to thundernerds.io/review and give us a review and we'll read it on the show. Speaking of which, I have a review to read, by MHerbert15, titled "Good conversations with industry leaders": "Thunder Nerds continues to have great content and a diverse selection of industry leaders on their program. Always a fun, engaging listen. We'll continue to follow them as they progress."
That's fantastic, thank you, MHerbert15.
Yeah, it's a Matt we know.
Yeah, really appreciate it, Matt reached out to us before, so thank you. Let's go ahead and welcome our guest. We have a super amazing guest from across the pond, which is in a completely different time zone. We have designer, author, and co-founder of Ind.ie, Laura Kalbag. Welcome to the show.
Thank you for having me.
We really appreciate you joining us. What is it, a five-hour difference, I believe? So you're at six?
Yes, 6 p.m.
That's a nice time. It'll probably start getting dark while we're talking, I reckon.
Yeah. How does it feel over there? You're in Ireland?
I am, I'm in the southeast of Ireland.
Oh my goodness. Okay, so you have to tell us a little bit about that, because we're super American and we have no idea, and we don't travel because we're terrible.
Well, people might be confused, because I have a very British accent, because I'm a very British person, so I'm not Irish.
But tell us about it. What is it like being over there?
I've been living here for about seven months now, and before that we were living in Sweden for two years, and before that I'd just lived in the UK my whole life. Ireland is really nice, it's really green, and the culture and the weather is mostly similar to the UK.
Except there's quite a lot of stuff here that's similar to the US, because a lot of US culture came from Ireland, so there are lots of things where I think, oh, this is really American, and then I'm like, ah, that's probably really Irish.
Like what?
The portion sizes, the kinds of food that are very popular here, very, very American. To be fair, I've only been to the US once, and that was to Vegas, so that's maybe not a typical US experience.
The quintessential US experience! It's the most extreme version of the US in a lot of ways.
Yeah, definitely.
So how did you come about, or land in, Ireland? What was your journey from, most recently, Sweden to Ireland?
So we were trying to look for a good place to live. I don't know if you folks have heard about this wonderful thing called Brexit, but essentially my partner and I run Ind.ie together, and we are together together. When the current government in the UK got elected, we kind of had an inkling that something was going to happen like Brexit; they'd been talking about it. And the thing is, I'm a British citizen and he's a French citizen, so the potential is that when Brexit happens, we end up on the wrong side of the border from each other. We also have things like a lot of the surveillance bills in the UK, which were possibly going to become a legal issue if you're working in privacy like we are, working on and building software that's private. So we had to look for a country that might be a bit more friendly to that as well. I mean, to be honest, not many countries are very friendly towards your privacy nowadays, but we tried to find somewhere we could settle in the EU, for as long as I still had the ability to choose where I wanted to live in the EU, because when Brexit happens I will no longer have that choice either. So it's tricky.
How does it work right now, then? Wherever you decide to settle, like if you're in Ireland now, can you just say, okay, when it happens, I'm going to be here? Do you become a citizen, are you going to become a citizen?
I should be able to survive as a British citizen here, because, well, the north of Ireland is part of the UK, that's Northern Ireland, and the rest of Ireland is a republic, and it has been for quite a while now. Previously it was one of the many countries that the Brits decided to take for their own, and they got it back. But it actually means that, as a British citizen, I still have pretty much the same rights here as an Irish citizen, and my partner has the same rights as a European citizen here, because it's part of the European Union.
That's all very complicated. It must be a nightmare for you guys. And you've been doing freelance together for a long time, right? Like, that's been your only gig?
Yeah, well, we're not so much freelancing, we make our own products. We've been doing that for the last few years, and prior to that we were both freelance. But yeah, we've been doing this for, I'd say, four or five years. I've been doing it full-time for about four years, because we could only really afford for one of us to be working on it full-time for a long time.
It's still touch-and-go sometimes, but right now we can sustain both of us working full-time on stuff at Ind.ie.
That's awesome, to be able to work with your partner. How cool is that? Or what are the difficulties of working with your partner?
There is never any end to work. It is very easy to work all the time, because work is at home, and so it can be quite hard to escape that. I keep a very strict schedule, so I don't work weekends and I don't work in the evenings unless there's a deadline or something like that, so that helps. I've always been very disciplined, because I've always worked for myself since I left university, so I had to learn how to be very disciplined, otherwise I'd never get any work done and I'd never get paid.
Yeah.
And otherwise, well, I don't know what my mental health would be like if I didn't take time to myself and get out and do other things rather than looking at computers.
So what was the inspiration for you guys to start Ind.ie, then?
It was when the Snowden revelations came out. When the Snowden revelations came out, a lot of people were worried because, thanks to Edward Snowden, we learned our governments are spying on us. And what we were thinking about was actually that, while a lot of people are very rightly concerned about government surveillance, what we noticed was the role of the companies on the internet, on the web, and how they were actually supplying all of the surveillance, because all of these corporations were recording what we were doing, and that's how the government got hold of it. And it's just because of how social media is architected; it's architected in a way that makes it very easy for governments to say, oh, well, now you have to hand that over to us, whether that was your original intention or not. So we said, well, this is all because corporations have such a role in it, so what can we do to try to fight that? What can we do to try to build things that don't spy on you by default?
Yeah. So what is the product?
Oh, sorry, I feel like I'm having an echo.
Fun, that's what's going on. So what is it that Ind.ie does make, and what do you guys actually have to offer people in the world?
We do three things, that's how we try to focus on it. The problem that we're trying to work on is the problem of what we call surveillance capitalism, which is a term coined by Shoshana Zuboff. What that is, essentially, is that mainstream technology's business model is to look at what we're doing, to profile us, and to use that information to either sell on to other people, to give to governments, or just to use to sell advertising, to "advertise to us better", as they say. So how do we fight that? We have to do it in three different ways. One, we have to try to encourage effective regulation against the companies that are doing this; we have to try to stop them from doing things that are very harmful and can really harm vulnerable people.
So we go and give talks about that, and try to educate people about that, and talk to the European Union about that, things like that. We don't really have much sway in the US officially. Then the other thing we do is we try to regulate it technologically, so we've got two apps, Better for iOS and Better for Mac, and what those do, if you use Safari, is block all the trackers we can find and protect you in Safari. So that's our way of doing it technologically. But one of the biggest problems is that you can tell someone, oh, don't use Facebook because they're profiling you, you can tell someone, don't use Google because they're profiling you, but what are the alternatives? There are some alternatives, but there aren't that many. So really one of our primary goals is to work on building alternatives. We've been doing lots of research in the area, building prototypes in the area, and trying to work out the best way of doing it. Because the thing is, so much of mainstream technology has been architected around the model of collecting all the information about your users on a server, and that is pretty much the way most technology works. That in itself is a risk in terms of people's privacy and in terms of security: if all that information is on one server somewhere, it is easy to access and it's vulnerable. So we've been working on ways that we can architect technology that doesn't require that, that isn't all stored on one server, and ways that we can socialize with each other and do all the things that we do now, but without having to rely on big corporations who monetize us.
I imagine that big corporations are the ones probably coming to you as well for solutions, like, hey, this is great, we need this, we're looking for any kind of privacy and security that we could get our hands on.
What they're actually wanting to do is go, shh, don't tell people what we're up to, we don't want you to talk about that stuff. They want to reframe privacy as being, oh, your direct messages are private because the only people that can see them are you, your friend, and us, and that means private. So you see things like the privacy dinosaur that Facebook has, and so many companies now saying "your privacy is important to us", and the thing is, they're not offering solutions that are private from them.
I think maybe what Frederick was getting at was businesses, like a corporation that's not gathering data but wants to protect the privacy of their employees. Do you get approached, if at all, from that angle?
I think for a lot of people that are working in the area, that's probably their main way to make money, things like trying to prevent corporate espionage, keeping their stuff private. But I'd say that with what we're working on, we're not trying to make things for companies, we're trying to make things for people, mainstream people, because the thing is, we're the ones that need protecting.
I mean, sure, I care about people who are employees, but what do I think about what those companies are doing, is that something I care about? Maybe, maybe not. What I really care about is individuals. I really care about the people in the world who are most vulnerable to losing their privacy: maybe people who are gay, people who are seeking asylum, people who are in a country where that country's government doesn't believe they have a right to be there. We've seen technology be used against these people time and time again, being used against protesters. So I want to build technology and infrastructure that people can use where they are safe, where they can communicate with their friends, they can organize protests, and they can live their lives safely, where technology isn't a risk to them.
There are businesses that want to get on board with this, though. I mean, I imagine there are some people that want to align with this movement and be part of it.
Yeah, I mean, people-wise, lots of people are interested in this, and there's a big movement around federation. There's this alternative to Twitter called Mastodon, and the whole idea behind Mastodon is essentially that you have your own Twitter, so you can post things, but it's on your own site, or hosted by someone else, although it's better if it's on your own site, and all of those different Twitters can communicate with each other; this is what they call federation. So it's a really cool technology, and it's got a lot of potential, in that we maintain ownership and control over all of our own content to begin with, and then we can build more things on top of that. Think about us all having our own little social medias: we can decide who's on our version of Facebook, who can see our messages, and who sees what we're posting. A lot of what they're doing with Mastodon is building it with the defaults that you don't get from the companies that have an interest in knowing everything about you. So they're really building in great anti-abuse tools, they're building in lots of stuff that's private by default. Like, by default, people can't search what you "toot", as they call it on Mastodon, unless you put a hashtag in it, and then they can search for that hashtag. Because they're thinking, well, by default, when people search for things on, say, Twitter, a lot of the time they're searching for a particular term that makes them angry, so that they can pile onto a person; so let's not have that in Mastodon, because then people can't pile onto each other. Maybe let's not list by default in the timeline how many likes or favorites a message has, how many retweets, or "boosts" as they call them on Mastodon, because actually that encourages us to get more addicted to technology, it encourages us to get hooked on how popular can I be, how meme-worthy am I. So there's a lot of really clever design going on. It's not perfect, no solution can be perfect, but there is a lot of thought going into how we can make things different, because it doesn't have the same business model as the rest of social media.
The business model there is that they want you there all the time, they want you to engage as much as possible, they want your attention, and that's why we hear a lot of people talking about trying to build things for attention, trying to build engagement. But actually, if we think about technology's role in our lives, we don't want it always wanting our attention, we don't want to always be engaged with it; we want to be engaged with our friends and family, we want to be able to spend time doing stuff away from screens. So I think it's a positive movement in design.
I think it's really cool that there's such a thing as Mastodon, especially when you just called it the privacy dinosaur; I just think that's pretty hilarious. Not the same era, but still kind of cool. But yeah, that also seems to be very popular right now among developers, and designers even: the whole concept of, we're completely hooked on technology, so how do we design and develop to kind of get people away from technology, but still make it the best experience possible, so that you're using it to enhance your life rather than to consume your life? What are your feelings about that?
One of the problems I have with the area is that a lot of the time we're kind of victim blaming, because a lot of us are victims of technology that's deliberately designed to be addictive; it's deliberately designed to get our eyeballs all the time. I mean, infinite scroll is a classic example of that. People know once you're in, you're stuck, and they very much did that on purpose. And we can talk all day about great ways to design things to make them less addictive, but the thing is, unless the business model underlying that technology is not built around addicting us, unless that business model is somehow designed so they can make money and sustain their business without having to have us there all the time, then very few people are going to be able to design that into their technology. Because a lot of designers and developers, as much as they want to do the right thing, don't have the ability to do so, because designing to avoid addiction is the opposite of what their bosses and their managers want them to do.
A very good point. I want to circle back: when you started the conversation about surveillance capitalism, and I love that term by the way, you said there are three things that Ind.ie addresses, and one of them is surveillance capitalism. I'm curious what the other two are.
The three things we're trying to do are all to fight surveillance capitalism. So we're trying to promote regulation, and trying to encourage people to design things that are properly private; we're trying to regulate it ourselves through what we call technological regulation, by blocking the trackers as much as we can; and then we build alternatives. Those are the three things we try to do.
Okay, great, thank you.
And I'm curious about Brave... not Brave, Better: how exactly does it work? How do you know what to block? Where are you getting that information?
One of the reasons we started Better in the first place was because we looked at a lot of the other ad blockers and we didn't like what they were doing. One of the things a lot of them were doing was blocking ads, but then offering to unblock those ads if you paid them a lot of money. That's a very common model: with one of the biggest ad blockers, the big companies like Google just pay them a load of money and they don't get blocked. And also the focus was on ads, when actually ads aren't inherently bad. A picture of a thing you might want to buy, a link: not a problem. The real problem is the underlying tracking behind those ads, the behavioral tracking that does things like make an ad follow you around all the different websites you're on.
Oh, suddenly, yeah, retargeting.
Something like that. The people behind the advertising company, by putting those ads on all those sites, are seeing what you're doing, where you're visiting, and they're creating quite a developed picture about you: where you go, what you do, what times of day you're there, your location information, your browser information, your device information. They can build up quite detailed profiles about you, which can also be wrong, which is harmful in its own way, and then they can sell that later on, or they can use it to target you with other advertising as well.
Did you have something?
Oh no, I guess I was just commenting: you know, Target, for example, can predict when a woman is pregnant by the things that she buys, which is terrifying, right? Like, three weeks before she even knows that she's pregnant.
Yes, horrifying.
They really should let her know that she's pregnant.
The example I use in my talks is when my mum died three years ago. You don't share that stuff on social media right away, because you don't want anyone to find out via social media, but the next day I started getting ads in my feed for funeral directors. It's like, why is this happening? And I know it doesn't come from nowhere. So I asked my siblings about it, and my sister said, oh yeah, sorry, I told my friend by a private message, because she's in Australia, and Facebook must have just seen that. They may not have done it that way, there are other ways. I don't think I was looking at any funeral director sites, but perhaps someone I knew was looking at sites for funeral flowers or something like that, and perhaps, if there was a Facebook like button on there, that could track them, so that could be how they found out. I don't know exactly how they did it, but it is a real example, and I knew it was very unlikely I'd get that stuff otherwise, because the rest of the time I get the things that most women in their early thirties get: makeup, dresses, washing powder, pregnancy tests. Occasionally I'll get stuff about getting married, you know, the stuff they expect us to be doing.
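To make the mechanism Laura describes a little more concrete, here is a minimal, hypothetical sketch of how an embedded third-party widget or hidden pixel can report a visit back to its operator. The page and the domains are invented for illustration, and exactly what gets sent (referrer, cookies) depends on the visitor's browser and its privacy settings:

```html
<!-- florist.example/sympathy-flowers.html: a hypothetical shop page -->
<h1>Sympathy flowers</h1>

<!-- Embedding a third-party "like"/share widget makes the browser request
     this script from the widget's domain. That request can carry the current
     page URL as the referrer, plus any cookies that domain has set before,
     letting the third party note "this browser viewed a sympathy-flowers page". -->
<script src="https://social-widget.example/like-button.js" async></script>

<!-- The same idea works with an invisible 1x1 image, a "tracking pixel". -->
<img src="https://tracker.example/pixel.gif?page=sympathy-flowers"
     alt="" width="1" height="1">
```

Content blockers like the Better apps Laura mentions work by stopping the browser from making these third-party requests in the first place.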
I'm curious, though, about the other side of the coin. I mean, we're talking about the negative aspects of this, but as a consumer there is, at a certain level, a benefit to this. As in, I'm looking for, I don't know, I want sunglasses, or I want a desk, and then I get, I guess, insight into other things as I'm browsing, like, hey, I didn't know about this company. There is value there.
There's a certain value added, as Frederick said. There are two sides to that.
Yeah, I mean, my initial response is: do you want to be defined as a consumer, and is it worth knowing that you're potentially being manipulated into buying something you don't necessarily want or need? I don't know about you, but I have no willpower whatsoever; if I see something cool I like on the web, oh, out comes the money, there it is, and I've got no money left for the rest. But on the other side, we can still have these advantages and this technology. There's no reason why we shouldn't have things that are really great ways of finding out the stuff that we want by observing our behavior; it's just that that information should all remain on our device. All of that profiling and all that information about us, if that was kept on our device, which is something that Apple says it's been doing in terms of trying to separate itself from Google, they do a lot of their AI stuff on the device, the idea being that Apple doesn't have that information about you, but you can still reap the benefits of having things watching what you're doing and going, oh, this is when you've got a calendar appointment, this is where you go every day, this is where you parked your car; knowing all that stuff about you and being able to use it in a way that is useful to you, without costing you anything, aside from obviously lots of money, because Apple devices are very expensive.
Along the same vein, sorry Frederick, but what about, you know, we did the example of Target, just sending, hey, you should get this because you're pregnant and you're gonna know in three weeks. What do you say to companies? In this world of technology where we live, we live on these phones and we live on our computers, and that's not gonna change anytime soon. What can companies do in order to reap the benefits without invading privacy? What are some approaches that you believe in?
I really think the one thing we have to do is change the business models, find alternative business models. This is the thing we've seen through Silicon Valley for a long time: a lot of these companies get a huge amount of venture capital with the idea that there will be massive, rapid growth and a big return on that, so everyone who invested will get loads of money back.
However, this often comes with the issue that the people who started out thought they were just building a cool thing; they weren't necessarily thinking about how it would be financially sustainable, aside from the fact that some angel investor might throw a ton of money at them. So they haven't necessarily thought about how to make something really long-term, financially sustainable, and so what happens is: what's the thing they all have that they can make money from? Well, it's the users. If they've grown fast enough, they've got users, and somehow they can make money from adverts or profiling or something like that. That's why we're seeing a lot of this. I don't necessarily believe that the people starting out building this technology are doing so for a nefarious purpose, that they want to be evil, but they've accidentally maneuvered themselves into a situation where they have no choice but to invade people's privacy. So if we think about different ways of funding things, one of the examples you see a lot now is things like Patreon, where the people who can afford to often help support the people producing cool content. That's a pretty cool model that doesn't require knowing what anyone's doing; it just requires people signing up and saying, oh, that's cool. So that's a pretty nice model. I mean, really, if we think about it, with the amount we're using social media, it's our new modern infrastructure now. A lot of us probably use it more than we use a road, more than we use the emergency services, more than we use water, all these things that we actually pay taxes for, that we pay local subsidies for. Could there be a way that the technology is funded from the commons in some way, whether it's funded by taxes or something like that? I don't know exactly what the options are, but we've worked on a prototype with the city of Ghent in Belgium, where they started funding it from their taxes; they started funding this social network we're building from their taxes, and the key thing being that did not mean the city got to see what everyone was doing, it just meant they paid for it for their citizens. They were otherwise not involved with it.
That's so awesome. It makes me think of this thing I heard on User Defenders with Jason Ogle; he had on Alan Cooper, and Alan was making this point of how we go down these paths where we reward CEOs for making money, we don't reward CEOs for necessarily doing the right things. And if we could find some way to shift that, if we could pivot to maybe a more beneficial road for everyone, that's when we'll start making these big leaps, these big changes that are just so much more positive.
You know, I wanted to ask you, Laura: I hear a lot of people talking about this, and maybe some people think it's kind of a silly conspiracy thing, but I do see people bring this up on the Facebooks, the YouTubes. What do you think about people who say that these things might also be listening to what we say, picking up keywords and then using that? Like, if we have a conversation about dog food, and all of a sudden on a website I get an ad for dog food. Is that a possibility, is that realistic, is that just goofy, or is that the future?
It's a bit of both. If you have an Alexa or something like that, or you have one of the Samsung TVs, then they are definitely always on, listening and transmitting back to those companies. That's what you buy it for, because you want it to hear what you're saying when you say, hey Alexa, get me this.
Why do you do that?
However, there have been some scientists recently looking into this specifically, because of people saying, oh, I was near my mobile phone and maybe its microphone picked up me talking about this thing, this perfume I was talking about with my friend, and suddenly we were seeing ads for it. And actually what they found is that it's highly unlikely that is what's happening in the majority of cases. The likelihood is that your friend will go away and search for it, or someone else who was within earshot went and searched for it, and of course your device and the sites you're on often know your location, so they'll often know your proximity to other people. So they'll be like, well, those people were in a room together, they may have been talking about this thing, and this person searched for it, so we'll send it out to that person too. That is a perfectly normal algorithm; that's the kind of thing you will see happening. So it will seem very spooky a lot of the time, but there's usually a reason behind it, and in some ways that's reassuring, and in some ways it's not reassuring at all, because it really does give you an idea of how much they know. If you think about the stuff you enter into search engines: do you want people knowing everything that you're looking up all the time? Because there's a lot of stuff that I put in there...
That's what'll get you right back in jail.
I actually recently switched to using a Google phone, an Android phone, the Pixel 2, and I was talking about something with a friend of mine that I was looking into getting, and I kid you not, two days later I was getting emails, and I got, I think, two phone calls from companies for those services. Coincidence? I don't know, but it still freaked me out.
I'm mostly suspicious of Google stuff, I have to say. I generally would not use a Google device, and I avoid using Chrome wherever possible, aside from if I'm testing my web development stuff, because everything is designed for data entry with Google; you buy that sort of phone at a subsidized price because it's a wonderful data-entry device.
Sorry, did you say Skynet? You can use Opera, by the way, since it's using the same backend as Chrome, and it's actually really nice. I highly recommend people check out Opera; their mobile app's really nice too.
They are owned by a Chinese company now.
You do have to ruin everything. They're not interested in our data, right?
Yeah. I mean, I use Safari. It's not great, Apple aren't great, they don't treat their employees particularly well, particularly in countries like China.
But they make the right noises about privacy. I use DuckDuckGo for search, and that's pretty good; it may not produce exactly the same results as Google, but it's pretty good. There's a cool new one out called Findx, and they're a very privacy-aware search engine, all about privacy. So these little alternatives are popping up.
What do you recommend for Gmail, as an alternative?
Well, given that I've always been someone who used Apple Mail, I was just using stuff that plugs into a server behind it. But actually I've been using Fastmail, and I've been using it for quite a few years now. They're based in Australia and they're really good, and their web interface is really good, but it's not going to be the same as Gmail, because it's not doing a lot of those things that Gmail is doing that are on one side very helpful and on the other side a little bit invasive. One of the big issues recently with Gmail was that they were giving app developers, if you were using extensions with Gmail, access to everything that you were sending in email. That's very useful if you want auto-generated replies and things like that, or useful adverts in the corner, but it's everything you're emailing. I mean, I get emails from my doctor; I don't necessarily want a company knowing that, especially a company that could then sell that information on to my health insurance company or something.
Yeah, it's a weird balance to me. There's a convenience value to all that that's so nice. Like when I was using my Google phone, it was so nice that, hey, oh, it knows I went here and it recommended these other places, and then a few seconds later I'm like, wait, it knows I went here and it's recommending other places.
And it's a privileged thing, I'm afraid to say. A lot of us working in tech are pretty privileged, and our information being out there isn't a massive risk to us. But actually, for lots of people, there is information they don't necessarily want to share. People with disabilities don't necessarily want to share everything about their medical status with their insurance companies, because they know they'll get penalized for things. I mean, if you look at the way the prison system in the US is using algorithmic profiling in order to decide who stays in prison, and you see the racial bias in that as well, you think, well, actually, maybe I don't want my data contributing towards that racial bias, whatever race or background you are. All of this information is being sucked up by technology and by algorithms that are designed by people who don't necessarily know what they're doing, or don't necessarily understand the impact of what they're doing, and it has very dangerous outcomes, even if it's wrong, even if the information they have about you is wrong. If you look at Facebook, you can actually see the terms it has that it thinks you're interested in, for advertising.
It's put some very strange things in those terms for me. I mean, it's got some fairly true stuff, like "lives away from family" is an example of something they have, "uses Wi-Fi", "uses a 3G connection"; it knows all this stuff about me that you can tell is fairly easy to get. But then it'll put in random things that I've no idea about, things that I've had no interest in at any time. They like to put you in boxes, because that makes you easier to advertise to, that makes sense. But if it puts you in a box because it thinks it'll make you easier to advertise to, and that classification could be used against you in the future, that's quite a dangerous thing. If it decides, oh, well, actually, you seem to talk to a lot of people who are protesters, so there's a strong likelihood that you're a protester, and then they come to pre-arrest people before a big event, which is something they did in the UK; before the Royal Wedding, the big one, they pre-arrested people because they thought they were likely to protest. And they've been using it at borders going into the US as well: the idea of, well, your social media is not looking particularly clean, we're gonna give you some extra checks. They're using all this stuff already, so we don't want that stuff being used against us.
Yeah, it's amazing. They're gonna try to say, hey, I don't know, it's the lizard Illuminati; they're gonna try to sell us on the thought of privacy being a concept, possibly, in the future. But, you know, speaking of accessibility, I feel like we're getting close to the end, and it would be a disservice if we didn't get to your book, and I'd love to talk to you about it. It's "Accessibility for Everyone", this is an amazing book, and you also have an audiobook, and you narrate it yourself. I'd like to know a little bit about it, and why you wrote this book.
Yeah, thank you. I wrote it because they asked me if I wanted to write it. I'm not the kind of person who would necessarily put myself forward for writing a book, but I wrote an article a long time ago for 24 ways, the blog that Rachel Andrew and Drew McLellan do before Christmas, they do it every year, and I wrote a little thing about why bother with accessibility, because it's something I've always cared about, ever since I started out learning about HTML and CSS. As soon as I learned that tables weren't the right way to do it, around that time a lot of people were also talking about accessibility, and I think I just read the right blogs at the right time, and I just always thought, well, this should be a priority. So I've always been picking up stuff about accessibility over time. And when they asked me, do you want to write a book about it, I was like, well, I'm not one of the experts; people don't necessarily hire me to sort out the accessibility of their site. What I do know is a lot of the introductory stuff, and the people you should be following if you want to know about the really nerdy, expert-level stuff.
So I set out trying to write a book that would bridge that, that would be an introduction to accessibility, that's why it's "for everyone", and that would work for lots of different disciplines in the industry. So it's not development-specific; there's a lot of stuff in there about design and about copy and things like that, and it could lead people on to the experts, who are often talking about things that can come across as quite dense and difficult and confusing if you don't already have an introduction to accessibility. So I want to be the bridge for them.
I love that. I love that in one of your talks you also mentioned how you were touching on universal design, and how developers are also designers, because they touch all the aspects of a project, so they also need to be, and possibly it's paramount for them to be, thinking about these aspects.
Yeah, absolutely. That's why I call myself a designer, but a lot of people will call me a developer as well, because I write code. I see that as part of trying to execute my design vision, because everything we do, every little thing we choose to do, from the language we choose to how we mark something up, that's a design decision; that's a decision that is a core part of the design of what we're building. And so developers are making these decisions all the time, but it's very easy to go, oh yeah, it's a development thing, so it's not design. It's quite a good way of letting go of responsibility, I think, by saying, oh, this is just development.
Is that something... I feel like where we clash as designers and developers is, like, but is that accessible, though? I know that's a conversation we have a lot at my own job, I don't know about you guys, but designers and developers are constantly like, but is that accessible, but can you click that, but can you use tabs? I think it's really cool that you're able to say, okay, it's everybody's job to think about accessibility.
Oh, we can all use it against each other. I think developers could turn around and say, well, I've been looking at the hex value of this color that you've chosen, and actually it's very low contrast against the background, because that's a common one that designers like to do: designers like low-contrast text and tiny little text that's very hard for most people to read, because it looks neat and tidy on the page, and so we think, oh, that's perfect, it looks nice and neat. So developers can turn around and go, well, actually...
They love #EEE on #CCC.
Exactly. And Apple used to be one: their footer used to be the tiniest, smallest grey text, and they used to have things like recruitment information in there, so if you wanted a job at Apple, there'd be a link there in the footer, and that's the only place on the page it would be. And you're thinking, well, I think they're only trying to find people that have really great eyesight, maybe people that are very young, to get that job, because they're not making it accessible to everyone, that's for sure.
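To put rough numbers on that #EEE-on-#CCC example, using the WCAG 2.x contrast definition (the luminance values below are rounded):

contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05)

where L is a color's relative luminance. #EEEEEE has L of roughly 0.85 and #CCCCCC roughly 0.60, so the ratio comes out at about (0.85 + 0.05) / (0.60 + 0.05), around 1.4:1, a long way short of the 4.5:1 that WCAG AA asks for normal-size body text (3:1 for large text).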
I'm still angry they got rid of, you know, something looking like a button.
Yeah, it's a real pain. Not distinguishing links and buttons is a big issue, it makes things hard to understand. I think some of the worst is making divs clickable. We have plenty of HTML elements you can use that are lovely and clickable; why do we have to do everything with divs and spans? It's beyond me. You get so much goodness from using a semantically relevant HTML element. That's one thing I think about a lot of developers: they're fantastic at JavaScript, fantastic at CSS, so why aren't they any good at HTML? It's a fairly straightforward language, and there's so much you can get from it, and save yourself so much time in JavaScript and CSS, if you really understand HTML.
So what would you say is the buy-in value of the book? What are the core considerations it focuses on?
The focus is first on understanding the range of people who could benefit from you making your sites more inclusive, and then broadly focusing on things that make your site easy to see, easy to hear, easy to understand, and easy to use. These are all fairly common usability goals, really, but under those brackets you can help a lot of people: people who may be using a screen reader to access your site, people using keyboard navigation, people who need to use speech input, or some kind of switch or external button to access your site. All these different things: people with dyslexia, people whose second language is English, or whatever language your site is in, it can help people like that. It can help people who are distracted, who are kind of half using your website and half doing something else. There are so many different people that can be helped by making our sites more inclusive. And then I try to go through some of the simplest things we can do in the different disciplines for those things, just things like writing good HTML, paying attention to the color combinations we're using, and making our text easy to understand and easy to read; a lot of that comes from just writing good, clear copy, which is great for everyone.
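A minimal sketch of the kind of markup those points are gesturing at, the native button over the clickable div and a couple of "just write good HTML" wins; the buy() handler, the image and the form field here are invented purely for illustration and are not examples from the book:

```html
<!-- A clickable div: not keyboard-focusable, not announced as a button by a
     screen reader, and it ignores Enter/Space, until you bolt all of that
     back on with tabindex, role and key handlers. -->
<div class="buy" onclick="buy()">Buy now</div>

<!-- The native element gives you focus, the button role and keyboard
     activation for free, plus sensible default behavior. -->
<button type="button" onclick="buy()">Buy now</button>

<!-- Two more small wins: a text alternative for an image, and a label
     programmatically tied to its input. -->
<img src="dog.jpg" alt="A golden retriever asleep on the sofa">
<label for="email">Email address</label>
<input id="email" type="email" autocomplete="email">
```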
And I love that you did an audiobook for it. I thought I saw that it took you a fair amount of time to do the audiobook, to go through the process. Wasn't it close to six months or something like that?
Yeah, that was because I kept doing it badly. It's really hard, and it takes a while, even for something like that; they're quite short books from A Book Apart, they call them brief books. But still, the recording is nearly four hours long, and it's easy to sit and have a nice chat over a podcast or something like that, your voice doesn't get too worn out, but sitting there for hours reading something is really intensive. My throat would get really sore. And also I wasn't very good to begin with, because I was trying to perform it too much, and you don't want to perform an audiobook.
You should have just said, I'm gonna do it all in one recording, and just keep the pints coming. No money, just pints.
That would have gotten increasingly more amusing.
I know, and if I did that... [Music] [Laughter]
Go Brian, your turn, I know you've got one.
Toward the end of it you'd probably have, like, the beer goggles, basically; if you're drunk you'd have beer ear goggles. That would be fantastic, I'm sure. You'd be, like, accessible, but for your ears, right?
Well, the last thing I read was the hardest thing to read, which is the list of resources in the back. It's essentially a list of links, so I'd just be reading out URLs. When I went to record it, I was thinking, how do I make it so that the audiobook is useful without the rest of the book? But there is some stuff, like images, code examples and things like that, that I can't reproduce exactly. Could you imagine me reading out even the shortest bit of code? It would be, like, "open bracket, a, space, href, equals"; it would be ridiculous. So I had to try to find ways to make the audio more accessible, make it easier to understand if you're trying to digest the book without reading it.
Oh, you know what would be fun? Giving out some small URLs, like, go here for this example.
Yeah, and I thought about that, but actually I deliberately kept the code in the book to the smallest amount possible, so that someone who was maybe a copywriter wouldn't open it and think, this looks like too much, or be scared by the code. I talk around the examples, I read around them, so I talked about them in a way that you could understand what I meant, because I don't have loads of code examples in there. And that's what we would do anyway: A Book Apart, for their books that do have a lot of code examples, they link to things like CodePens with the code examples and stuff like that, because no one wants to try and copy and paste code from a PDF of a book, or an EPUB format.
That's a nightmare. That's asking for invisible characters everywhere if you do that. Normally I have a spotlight question at this point, but I actually want to spend a little more time, since we don't have a lot of time left, talking about accessibility a little bit more, like some resources, because I think this is an excellent topic and it's something that isn't talked about enough. I mean, we hear about it more and more lately, I think, but I'd like to hear some resources that you might recommend. I'm sure you spend a lot of time researching; anything you could share would be incredible.
Well, off the top of my head, one of the best things I see is a newsletter every week by David Kennedy. It's called, well, it's a11y, A-one-one-Y, the most accessible of all the short names: it's Accessibility Weekly, or A11y Weekly.
It's a really good email. What he does every week is send out a load of resources, things that people have posted throughout the week, which is a great way of finding other blogs and other experts in the area, but it also has a specific section each week for people who are new to accessibility. So if you're just finding your way in, there'll always be something that's on your level, that you'll find easy to understand, something you can act on right away, rather than things that feel a lot more intimidating. That's one of my favorites. I'd also say, if you really want to get into the code side of stuff and you want some good examples of really great accessible code, look at Inclusive Components by Heydon Pickering. He's got a website for it, and he's recently made it into an e-book as well, which is great, and it's got really great patterns for all these different things that you do all the time on the web, and how to make them as accessible as possible. He wrote a book before about it as well, so he's got some really good stuff. He's funny as well.
Yeah, didn't he do the foreword on your book, I believe?
Yes, he did. He's a good guy, and I've learned a lot from him, so it was really nice having him do the foreword for me.
Nice. Well, hey, we're just about out of time, Laura, and we're gonna have to close out. We really appreciate you coming on, and we'd like to ask our guests one final question, and that is: do you have any final words of wisdom, anything you'd like to say in closing, any advice for our audience?
Well, I wouldn't say I'm a particularly wise person, but what I'd say is it's best to always try to do the best you can for other people. Whether that's trying to make your site more inclusive, or trying to find a business model that's ethical, it feels good to be doing good for other people, and you've got to think about what you're doing for your future self and for your children as well.
Very well said. So what's the best way people can get a hold of you? Obviously we'll add some links in the show notes, but for our audio listeners, where can they go?
Well, go to my website, laurakalbag.com, and I also have my own Mastodon instance, which you can find out about on my blog; I'm @laura@mastodon.laurakalbag.com, it's a long username, but you get used to it. And you can find our work at Ind.ie, that's ind.ie.
Nice. Laura, thank you so much for being on the show, we super appreciate it.
That was awesome, a lot of great takeaways, thank you so much. Well, hey everybody, please remember to go to iTunes and subscribe, go to thundernerds.io/review and leave us a review, subscribe on the old iTunes, and what else?
Definitely send us in some questions for next time, that would be pretty awesome. We want to read your questions live.
That's a good one. Wait, did you have anything in mind, Frederick?
Patreon! Oh, right, our Patreon. We were talking earlier and I totally was like, oh my gosh, Patreon, we should totally do it. So yeah, we have a Patreon, and please, you know, we would love your support, and love and appreciation, in the form of literally pennies and nickels if need be; if that's all you can give, we're totally cool with that. And yeah, just love on us and we'll love on you back, we'll send you some really great content, and yeah, pretty cool stuff.
Thank you everybody for listening. Thank you, Laura, we really appreciate it.
Hey Alexa, subscribe to Thunder Nerds.