TFGL2021 - S3 - Ep 2 - Trying Something New
Welcome to this episode of the Tech For Good Live podcast.
Hosting this show this time around is Tom Passmore and he’s joined by ever-present potty mouth Greg Ashton.
And we have guests! Plural!
Arda Awais & Savena Surana, co-creators of Identity 2.0, a creative studio working at the intersection of digital rights, technology and identity. For more information and to get tickets for their exhibition ‘This Machine Is Black’, head to identity20.org/thismachineisblack
Transcript
Tom: Hello and welcome to another exciting episode of the Tech for Good Live podcast. This episode is all about what is new in the news this week. This week, we have a new host, and that is me. We also have not just one, but two fantastic, wonderful, inspirational guests with us. We have a new controversy on TikTok. The UN is making new mistakes with data. And Facebook is trying something new: they're attempting to be not the finest pieces of dog excrement on the planet. So, with so much new, in so little time, without further ado, let's get cracking. So, this week on the show, our researcher is the ever-present potty mouth that is Greg Ashton. Say hi, Greg.
Greg: Hey [bleep]
Tom: Hey Greg, this week, I've got a question for you. What is new in your life? How has it impacted you? And is that a good or a bad thing?
Greg: New in my life? Not much, given lockdown. It's kind of hard to find new things. I have been finding that I have an ever-increasing sense of existential dread as I get closer to my 40th birthday next year.
Tom: And so going back to my question, Greg, could you please answer the question? How is that impacting you? And is that a good or a bad thing?
Greg: I find it really hard to watch anything about death. You know, it just raises many questions. What's going to happen to us afterwards and all that kind of stuff. So yeah, the news is pretty tough at the minute.
Tom: Wow. Okay, great. Well, thanks for that, Greg. You've certainly lifted the mood here. But thankfully, we do have two wonderful guests with us today. So Arda and Savena, can you tell us about yourselves? Who are you? Why are you here? What are you up to and why is tech so good?
Savena: Yes, I am just a vague cameraless person at the moment. But beyond that, I'm also the co-creator of Identity 2.0, alongside Arda, and we're a creative studio working at the intersection of digital rights, technology and identity. So essentially, we make art about what it means to exist online, in all sorts of ways. And we're here today because we believe in tech for good. We like to use tech, we think it can be really interesting. But also, we quite like the opposite and often question, is this actually good? Who gets to use this phrase? Why are we using that phrase? And we do that in, like, creative ways, from using memes to... using memes. We just make a lot of memes, basically. And so yeah, that's a bit about what we do.
Tom: Excellent. And so, actually, more importantly than any of that, what's new in your life? How is it impacting you, and is it a good or a bad thing?
Savena: What's new in my life? I started a new job this past month. So it's been exciting, and I guess it has impacted me in a good way.
Tom: Excellent. Excellent.
Savena: Yeah, that's, that's on my end.
Arda: I got an iPhone for the first time.
Tom: Ohhhhh.
Arda: That has impacted me because for so long, I think when I was growing up and as a teenager, or, like, kind of in my early 20s, I was like, yeah, Android. iPhones are for losers. I'm so much cooler. I was basically trying to be the opposite of a hipster by going for Androids and thinking they were customisable, so much better. But after many, many times of my phone failing when it really shouldn't have, I decided to migrate over and now I'm just, like, an Apple head. Like, for work I have a MacBook Air and I have an iPhone, and the joys of AirDrop are just so good. I can't believe I deprived myself of this for so long.
Tom: See.
Arda: I’m kinda like reverse hipster in a way.
Tom: [laughs] You’re just behind the times. It’s fine. It’s fine.
Greg: I never got people's love of MacBooks and hatred of, like, PCs and Windows, until I started working for government and realised how badly people can set up Windows. Now I'm like, I totally get why people are big MacBook fans and hate PCs. Because you give any IT department Windows and they will ruin it. They will absolutely destroy it. Yeah, there you go.
Tom: Maybe you're just working for the wrong government departments, Greg.
Greg: [laughs]
Tom: I don't think we can answer that question. It wasn't a question. It was just a fact.
Greg: I'll let you know. I wanted to ask about your latest project for Identity 2.0 because I was having a little read about it. It sounds amazing. So do you want to just fill us in on that?
Savena: Thanks! Yeah, sure. So it's called This Machine Is Black and it is an immersive exhibition that's happening in real life, which is very exciting for us.
Arda: No more Zoom.
Greg: [laughs]
Savena: No more Zoom exhibitions [laughs] So yeah, it's exploring the relationship between race and technology, and it does this in kind of like a fun, interactive way. It takes people through the ways that tech kind of battles people, tries to support them and then, like, undermines them, and also just, like, amplifies existing inequalities. And we do that using fun and puns and things like that. It's happening at the end of August in Leicester. We've been supported by the City Council. So yeah, that's our latest project. Up and coming.
Arda: Get your tickets, get your tickets.
Greg: [laughs] Yeah, do it. It looks fantastic. And I think, you know, art, particularly when it's interactive, is just such a great medium for, like, communicating complex and often, you know, difficult issues.
Tom: Yeah.
Greg: I was annoyed that it's in Leicester, because I don't go there that often.
Tom: But maybe it's a reason to go, Greg, you can broaden your horizons. And that will be your new for next week.
Greg: Very true.
Arda: Leicester is where it's at clearly, because we are there.
Tom: Indeed. I second that. I second that.
Greg: Bring it up north. Bring it up north.
Tom: Bring it up north. Indeed.
Arda: Maybe.
Savena: We'll talk after. See what we can do.
Tom: And I'm your host for this evening. So, this evening, I don't know what time you're listening to this, but it's evening for us. And I'm Tom Passmore. So what is new in my life, you ask? Well, I'm glad you did ask. I bought a new house. I bought a house. It wasn't new. Well, it's new to me. In Southport. Which means that I've moved back to the north. I was officially in the East Midlands, which is the Midlands, and now I'm back in the north. And I feel it in my heart. My heart is singing.
Greg: [laughs]
Tom: So you should definitely bring This Machine Is Black to the north. You should bring it to Southport. That's where it's at.
Greg: [laughs]
Tom: Now, because that's where I'm at.
Arda: [laughs] You have the first spot in our tour then. That's where it is.
Tom: Indeed
Greg: Change the Southport sign to say home of Tom Passmore.
Tom: I mean, I've only been here a week and they've already signed up to do it. So this is the impact I have. But anyway, talking of impact, here's a beautiful segue into stat of the week. Greg [laughs].
Greg: Nailed it.
Tom: Nailed it. Nailed the segue.
Greg: Yeah, so another interesting story from TikTok and a question around how their algorithms work. So, there are currently 168,000 videos on TikTok using Megan Thee Stallion's new song Thot Shit, but unlike previous viral songs, there is not a single trending dance to it, and this is part of an indefinite strike by black creators. What people have noticed on TikTok, and you know, we see tonnes of these viral dance videos, they use them all over their advertising, they leak onto every other social media channel, you see them everywhere now, is that by and large, the black creators that are coming up with these dances are not being credited, and they're not seeing any of the money for it. So a campaign has been started by a consultancy firm, Define and Empower, to get these creators to stop feeding content into the platform, to try and raise the profile of these creators and raise this issue. So I say good on ‘em, and I say TikTok, sort your shit out.
Tom: What can they do to sort it out, do you think, Greg?
Greg: Well, the main thing for me, and it comes back to an issue that TikTok have had for a long while, is the algorithm. There have been a number of cases where they sell this story of being inclusive and supporting people of, you know, all shapes and forms of life, and actually, when you look at the kind of content that is promoted and pushed to the front, often it is minorities that are kind of devalued, and they don't get pulled to the front. Savena, Arda, what do you think?
Savena: There's, like, a couple of things that I really like about this story. One: yes, unionise. This is the way forward. I'm all for it with big tech companies. You see it happening in Amazon, you've kind of seen it with other big tech companies, unionising and striking, the people who are actually on the floor, doing the work, making an impact on those above. Like, I think that's a really important part of fighting back against tech for bad. And I think this is a really great example of it. But also, this just reminds me of the film Bring It On, with the cheerleaders.
Greg: [laughs]
Savena: Because it's true. The white cheerleaders are stealing the dance moves from the black cheerleaders.
Arda: [laughs]
Savena: And they're obviously poorer in the film. And then it's revealed like, oh the reason that this white cheerleading group have been winning all the time is because they've been stealing all these dance moves. And that movie is old.
Arda: Love the example.
Savena: Why is this still happening?
Tom: Life imitating art or art imitating life. What’s going on?
Savena: Exactly. Are we all just cheerleaders in the Kirsten Dunst masterpiece?
Tom: [laughs] Masterpiece.
Savena: I just... like, this problem is old, and it's just frustrating that it keeps happening again. And like you said, it's even worse because, you know, that was one cheerleading competition. This is an entire digital platform which you can access from around the world and is sorted by computers. So yeah.
Tom: Well, it's just kind of this knock-on effect. Oh, sorry, Arda. Go for it. Go for it.
Arda: Oh, no, I was just gonna kind of continue the point that Savena was making. When I heard about this story, I was livid. I was like, yeah, there's nothing trending. There you go. What do you expect now? But it also got me thinking as well. Like, yes, there is something fundamentally wrong with TikTok and how they promote things. But also, is it that, you know, when black creators create a trend, it's treated as a given rather than, like, oh, this is amazing, come on Jimmy Fallon, come on Ellen. It's just, oh, this is cool, but, like, yeah, whatever. And then they see other people recreate it and they're like, oh my god, how new, how spicy, come on our shows. So is it a mixture of the platform being bad, but also just people having their biases as well? That's what it started making me think about when I read this article.
Tom: That's exactly what I was going to say, but you said it much better than I ever would have, because I would’ve fumbled my way through it. It's like, I don't want to use the word boring, but why do tech companies keep doing the same boring things? Just being douchebags. They have the power to change things, as in, the social media platforms have the power to change things, and they could do it very quickly, literally probably overnight, change the balance. And this is what I really like about this strike, like you were saying, Savena, this unionisation. It's like, okay, we're not gonna do this, and it's going to damage you. And then it's like, okay, now actually change. Hopefully this will drive change. I don't know. It gives movement for artists and it gets movement for different people in different ways. Which brings me nicely on to the charity news of the week. Talking about movement, Greg.
Greg: [laughs] Maybe a slightly inappropriate segue there? [laughs]
Tom: Definitely, definitely. But why is it inappropriate? Tell us.
Greg: So yeah, more bad, bad stuff. The UN refugee agency, UNHCR, has reportedly, not reportedly, they have shared Rohingya biometric data with the government of Myanmar. And they claim that this act is perfectly appropriate because refugees gave consent. So this started out, following the kind of war crimes and the assault on the Rohingya in Myanmar, the refugees moved to Bangladesh to get away. And the UN refugee agency came in to support them and basically collected lots of people's data, supposedly to support them. They say it's part of repatriating them, so getting them back into Myanmar, but you know, there's a serious question mark there around whether that was the right and appropriate way to use the information. Now they've pushed back and said, well, we got consent, and we made it really clear that providing this information back to their government was only with their consent, and they didn't have to consent to that in order to receive support in Bangladesh. Yeah, there are real question marks there. You've got an international agency saying give us your data and give us consent to share this with other people. And you've got these people who literally had to escape their country otherwise they would have died. And yeah, it's just, why? Who thought it was a good idea, even now at this stage, to even consider giving that data back to a government? And let's face it, it's a military, not a government, who tried to wipe them off the face of the earth. It just baffles me. It absolutely baffles me that they did this.
Tom: What's their response, like, have they refuted this? Have they come back against this? Like, what's the UN said?
Greg: They've said, yeah, we did it. But we got consent.
Savena: Oh God.
Greg: And it was all to ensure that they were repatriated. So the whole idea was, we'll give that information to the Bangladeshi government, but then obviously, at some point, you're going to want to go back to Myanmar when everything's cool and fine. So, just to be clear, this included the biometric data of 830,000 people. It also included details of their family composition, their place of origin and information on their relatives overseas. So it's literally like a genocide playbook, really. If you're wanting to wipe a whole group of people off the face of the earth, that would give you a really good start.
Savena: It's, like, frustrating on so many different levels. Because, you know, you've got the big name, the UN. What are you doing? What is going on? And the first question is, why is there a need to collect this information in the first place? As in, yeah, they have, you know, digital identity cards, but even that, in its formation, is deeply flawed, especially when you look at how India has gone about collecting biometric data and systematically leaving certain groups out, like people from different castes or religions. So this entire idea that your biometrics should be linked to your digital identity card is flawed in itself, and then you're doing this with people who are extremely vulnerable, people who have been put under immense pressure to uproot their entire lives, and have gone through so much trauma. And then you're saying that people can give affirmative consent under those circumstances? Their idea and their definition of consent is so deeply flawed here. I don't understand how that can be their rebuttal, how they can come back and be like, no, no, it's consent. Like, we all know consent across the internet is really bad. Like, GDPR put it in their wording that you need firm, active consent, and for a lot of people that's just an 'oh yeah' tick box. It doesn't mean getting rid of all the dark patterns, or, like, making a readable privacy policy. It doesn't mean any of that. It just means, like, no, no, they ticked it, they didn't read anything else, no, no, it's right, it's right. And that is not consent. You wouldn't consider that consent in any other circumstance, but online, it's fine. So it just baffles me that people who are meant to be dealing with very vulnerable communities all the time think that's okay. Ohhhh, it's just very, on so many levels, very frustrating. I don't even know the word for it.
Arda: Yeah. Reading that just made me very angry, but also very upset at the same time. There's just a mixture of emotions that you can't really explain. And definitely what Savena said, the consent here is not true consent. Because as a refugee, you have nothing, you've literally just left everything that you had. And for you to get services, even if they're saying, oh, they didn't have to do this to get services, maybe it was easier for them to get services if they gave away their data. So rather than waiting, I don't know, a week to get what they need, maybe if they gave their data they'd wait two days. I don't know the ins and outs of it. But of course they would want as much support as possible. And the UN is such a trusted name, like, everywhere they're just a trusted name. If a UN representative comes up to you and you're a refugee, you've just left everything, you're gonna want to do anything and give everything so you can get the support, because you fully trust them, just on that name they have created. So for them to do this is just, like, a violation on so many levels of trust, and just of the essence of being the UN as well. And the thing that's really, really pissed me off is that the amount of data that they took from these people is not necessary. It's actually not necessary. What was the reason? What was the reason? Because they didn't need to know this much information. So, like, why?
Tom: That's where I'm at. There's a massive power dynamic at play here. Basically, there's a group of powerless, terrified, persecuted individuals who have a commodity, their data, that they can't really put a value on. And then there's this other powerful, like, friendly, benevolent entity that comes along and says, you don't have to give us this, but you know, you could give us this, right? You're gonna be like, okay, fine. I mean, it means nothing to me, great, fine, you know my name, you know who I am, you know my lineage. I don't care. I'm hungry, I'm tired, I'm scared, I'm lonely. I don't know where my friend is. I don't know where my wife is. I don't know where whoever is. Like, just shut up. I will sign whatever you want me to sign, I will tick whatever box you want me to tick in whatever way you want me to tick it, as long as I can stop having this conversation and get some warmth, some clothing, some food, some water, and some protection. Great. And it's just like, the power dynamic is all wrong. You can't ask people for that sensitive information in that scenario. It's not right. It's not just. It's not fair.
Greg: I think, like, 99.999% of the world don't understand affirmative or active consent, like, literally in any situation. The thing that blows my mind with this, though, is that names, ages, things like that would be bad enough. But this is biometric data as well. That literally can't be changed. That is with them. So they are identifiable forever now, which is terrible.
Arda: What was the reason?
Tom: Yeah. Why? Yeah.
Greg: Yeah. The other thing that blows my mind, and bear with me here, is it made me think of the times I've worked with, like, at-risk groups, so the homeless or sex workers. The organisations that work with them will always refuse to share their information with the government or other services because, although it might provide them with access to health care or things like that, there's too much of a risk that it might lead to other things as well, like arrest. So in every case, those charities refuse to share that information. I just don't get why the UN thought it was okay.
Savena: And yeah, that's just such a tragic symptom of what we live in now, in that, you know, people who are the most vulnerable can't even access services because they're so scared of what the government are going to turn around and do with their information. And, you know, we're probably in a much more fortunate state than these people who are, you know, refugees moving to a whole different country to escape a really oppressive regime. But, like, the symptoms of using invasive tech and testing out how far you can go are always going to be focused on marginalised communities and those at risk first. They're the people that they're going to try this out with, because they are perceived to have the smallest voice and the smallest amount of power, and they're probably going to be able to get away with it. Like, you see it with how they tried to use facial recognition technology on black communities first, and how they got away with using algorithms which sorted out whether you got a loan or not in poorer communities. And they were like, yeah, we can get away with this, because one, they're probably just less likely to fight back, but two, they have less power to fight back. And so literally, as you guys were saying, what can you do in that situation? How can you say no? And, like you said, the absence of a no is not consent. And if they don't really have the option, you can't say that's actual consent. It's very, very... oh, it's so annoying.
Greg: There's a level of knowledge as well. Like, Myanmar, and that kind of area of the world, there are pockets that are literally, like, kind of going back in time. I wonder in these cases whether they would have even the slightest understanding of even the term data, and what it means and how it could affect their lives in the future.
Savena: Yes, it's a luxury to have, like, the knowledge that we do, and to be able to implement it and talk about it in a way where we feel we can actually take action from it. It's a luxury a lot of people can't afford, which is sad.
Tom: Yeah, and like, we've seen it time and time again, how this data is used incorrectly, poorly, to, like, kind of really manipulate situations. You had it with Cambridge Analytica, and you had it with, like, the Brexit vote. And that was all about Facebook and its release of its data, and, like, they're just bad guys. Which takes us nicely on to tech news of the week.
Savena: [laughs]
Greg: [laughs]
Arda: So subtle.
Savena: I know. Look how proud he is.
Greg: [laughs]
Tom: Very proud. Very proud. I wish the listeners could see my proud face.
Greg: [laughs] Am I up? Do you want me to?
Tom: Yes, yes. That, Greg, that was a segue. So I took one story and kind of manipulated it slightly to give hints about what was coming up next, so that it went seamlessly from one segment to another. So this segment is now tech news of the week. It’s all about Facebook.
Greg: I missed it because normally it’s an absolute shambles.
Tom: There you go Greg. Take it away. Seamless.
Greg: So yeah, like a stopped clock, you know, which is right twice a day, Facebook seems to have done something good around identifying deep fakes. Now, I don't get a lot of this. It's very technical. But essentially, they've looked at a way, in partnership with Michigan State University, of identifying and attributing deep fakes, using reverse engineering from a single AI-generated image. So apparently, previous ways of detecting deep fakes would need some knowledge of the model that had been used, or access to a number of different images, so that you could identify it, but they can identify a deep fake just from one image. And this is apparently not a new concept, but it is a new way of applying the concept. And not only can it identify deep fakes, but that process of attribution means that they can identify clusters of deep fakes. So they could identify a coordinated disinformation act, which is really cool.
Tom: I mean.
Arda: But why?
Tom: I don't know what you're talking about. You've gone straight to 'why?'
Arda: It's not even that. I'm thinking, okay, Facebook's done something ‘good’ but how will they use this to ruin the world later?
Greg: [laughs]
Savena: What will this bring for the shareholders?
Tom: Ahhhh, so cynical. He might be a really nice guy. Zuckerberg might be a really nice guy, that's just...
Arda: Zuckerberg looks like the type of guy who gets, like, a Boots meal deal, but gets, like, a plain cheese sandwich and thinks he's like, wow.
Tom: [laughs] I'm going to step in and say you cannot judge people on the way that they look. Just because he likes plain cheese sandwiches doesn't make him a bad guy. Like, I think maybe this is what he was going for the whole time.
Greg: [laughs]
Tom: He wanted to create a dystopian universe where he could come back and be the saviour by spotting fake robots on the internet.
Greg: That's probably quite plausible, actually.
Savena: Maybe. The thing about Facebook is, anytime they do anything with kind of facial recognition technology, I just... I will never get over the fact that they used the tonnes and tonnes of images that we uploaded to Facebook to train their own facial recognition technology. That's what they did for years, and they only got caught, they got found out, when they released that technology. And people were like, wait, how did you actually train this up? Because it's really good. And they just, you know, happened to have a bank with loads of images of, like, people from all around the world. I wonder how they got that. And that is just an insane thing. They've kind of repackaged it and gone away and come back with something that looks a bit different. And sure, this seems really great in terms of what it could potentially do for the world. But again, I'm just like, well, how did you train it? Where did you get the images from?
Tom: You know why. You know the answer to that question. What is it, Arda? What's the answer to that question?
Arda: [laughs]
Greg: It's very self-serving, I think, you know. They're covering their own asses here, because they've seen that this is becoming an increasing problem. And you could see the headlines: some deep fake happens sometime in the future, and maybe one of the far right groups in the US goes off on one and attacks something or someone, or, you know, some deep fake about, I don't know, Biden happens and he is assassinated because someone is convinced by it, and then it all gets fed back to Facebook and they would be absolutely ruined. So I think all they're doing is just, you know, covering their asses, because it's in their favour to identify something like that. Or there's, you know, a repeat of the Russia incident of interfering with the election by posting deep fakes, and then they get dragged off again. So I think they're just covering their own asses, basically.
Savena: Yeah. Okay, just wait until I get a hold of some deep fake material, and that's it.
Tom: I think you're looking at this the wrong way. [laughs] And this isn't just me being all happy and smiley because it's tech for good and everything's good. I think actually what's happening here is they're trying to figure out the ways of identifying deep fakes so that when they release their own deep fakes, they cannot be identified.
Arda: Yes, yes, I see that happening.
Tom: Yeah, that's the only reason people try to reverse engineer this shit: to see how to make it not reverse engineerable in the future. That's my opinion.
Arda: [laughs] It's a good take on it. There's just one thing about this that also rings out to me, which is the fact that Facebook are funding research projects, and you see this across loads of big tech companies. Google have their own research arm; Amazon, Facebook, all of them do. And that is just in itself sad to me, because, like, you are funding some of the cleverest people, some of the biggest thought leaders in this field, who could really make such an amazing difference, but they are under the thumb of the companies causing the most harm. Like, what could Michigan State University really go and research if they were given free-rein funding from the government to actually look into the harm that tech is causing us?
Savena: And I think that is, like, a reason that we're not seeing... I mean, we're seeing some great progress, obviously, on the big tech front. But I think there is a reason that we're not seeing as much out there as we could, because the biggest players are being paid by the people who are causing the most harm. That's just crazy. Like, imagine if Shell was still funding, like, climate change research, or, you know, happened to be funding a huge exhibition in central London about climate change. People would have things to say about that.
Arda: I think it then goes to, like, how, and I use this comparison a lot, people know about climate change so much more now. They know the ins and outs of it, they know they should care, they know what impact they can have, all these things. They know that Shell's not a nice guy, kind of stuff. And we're not there yet when it comes to, like, data privacy, or just in terms of the conversations around big tech. People have a lot of apathy towards it, and it's just like, yeah, this is what it is, there's not really much we can change, kind of stuff. That's a lot of what we hear when we create our spaces. And so when Facebook comes out with these things, or any other big tech platform comes out with, like, research scholarships and stuff, people don't have much to say about it. They're just like, oh my God, this is great. And they'll send it to people and try to get involved with it, because it's a very big name that they think will help them in their career. But actually, they don't really take a step back and have the knowledge to be like, this is a bit fishy. Like, why is this happening? And I think we just need more collective voices in this space, so more people learn about what's going on, can form their own opinions and can actually engage with it. And then when something like this happens, a lot of people can question, why are you doing this? What is your end goal? What's the endgame, kind of thing?
Savena: But no one's giving money out like Google and Facebook. That's the problem. Because even with this stuff, I'm like, oh my God, we could do that.
Tom: Yeah. Yeah.
Savena: Exactly. We'd have loads of money if we just, like, gave in to the big man.
Greg: [laughs]
Arda: No, no, but I can't blame people at the same time. Because I'm like, I get it. Like, I wish, you know, everyone was as good and pure as us, but...
Savena: No, no, no [laughs]
Arda: I’m joking. I’m joking.
Greg: [laughs]
Arda: It's just a sad dichotomy we live in.
Tom: Yeah, I don't know. It's crazy how these massive tech companies like Facebook have so much money, because, like, Facebook's free. There's no darkness around the free services that tech companies offer, none at all. In the same way that, you know, Google have their free mail service called Gmail. And that's free. And what are people using that service for? In this week's rant of the week. So that was a segue there.
Greg: [laughs]
Tom: Yeah, we have a rant of the week. So who wants to take it away?
Savena: I will take it away. Bear with me, because these rants are, like, a weekly occurrence; you just get to listen to one for once, and it's being recorded. This week, it's kind of come out that a lot of the government have been using their personal Gmail accounts to conduct a lot of business. So, for example, the now infamous Matt Hancock's Gmail has been linked to a lot of the COVID-related contracts that have come out. And it's mad, because these things aren't, like, 'you owe me a fiver from the pub' messages or anything like that. These are multimillion pound contracts which have been handed out.
Tom: Billion pounds.
Savena: Billion pound contracts that are being handed out. And it's just crazy that it's on private, personal accounts. Because, one, why isn't that relationship being questioned? As we all know, those relationships are crazy close, like it's his brother-in-law's mum or something like that who owns shares and stuff. That in itself shows a level of comfortableness with the people that you are handing out billions of pounds to, which should be questioned. Two, if it's in their private email account, how is it going to be held accountable? We elect these people to essentially run the country for us. Whether we agree with that or not, that is essentially the society that England lives in right now. And their actions are being held on private accounts, which aren't under the jurisdiction of the government which we have elected. How are we ever going to be able to hold them to account? It's not on public record. I can't do a subject access request, I can't do a Freedom of Information request either, to get that information. And if they're able to do this, it also leaks into other areas. You see, like, WhatsApps being sent; you see they're allowed to send messages which delete instantly, using Signal. And that just amplifies issues we already have about transparency and accountability with a government that a lot of us don't already trust. And there's amazing work being done against it, with, like, Foxglove Legal and the Citizens, who have teamed up to fight against the digital NHS kind of backhand moves, but also to fight against the government being able to use these instantly disappearing messaging services to conduct business. Like, you've seen it. Dominic Cummings released little screenshots that were taken on, like, a Nokia 250 or something like that, of the messages that were sent back and forth in group chats and broadcasts. This is the government in action. This isn't, like, you know, sorting out five-a-side footy for Sunday, even if that is how they do their football, whatever. This is actual policy and decisions being made in those conversations. And the public, who elected them, have a right to hold them accountable for that. But they can't, and they can backtrack after it really easily as well and be like, well, we never even said that, you've got no evidence of it. And it just erodes any sense of transparency and accountability and responsibility that they have in their office. And it's just incredibly frustrating that they keep doing it. Like, they just keep fucking up.
Arda: They have no fear.
Tom: And nothing happens to them.
Arda: That's because they have no fear. If you have no fear, then you don't feel like…
Tom: It’s not courage [laughs]
Arda: There are never any implications for what they do. They just get away with it. You know, they get a little slap on the wrist, if that, and then they just go back to their five-star, five-floor villa somewhere in South Kensington, living it up, happy, and they're just like, yeah, I did that, and what?
Savena: And then it goes back to your lovely segue about Gmail, the free product. Like, how secure is that information that you are sharing as well? If this is, like, important information about contracts, and you are sharing it using a private email that probably doesn't have the same protections that a government, like, civil service email would have, what are you doing? How are you securing that information about a multi-billion pound contract? How do you know that's not going to be leaked? There's literally a news story today about how someone's actual phone number was, like, publicly available for the past seven years or something. There was another news story today about how someone left top secret, like, nuclear army plans or something at a bus stop in Kent.
Tom: Yeah. I think this comes back to what Greg was saying before. The government doesn't know how to set up Microsoft Windows on their infrastructure. So, of course, people are going to start using Gmail. They don't know how to use Outlook.
Arda: Sort your frickin' systems out then. You have all the money. Literally get everybody Macs. Live it up.
Tom: Yeah. No, I just find it so frustrating that they keep doing all this stuff, and time and time again they get caught with their pants down, and nothing happens. Literally nothing happens. Like, the only reason that Hancock's gone is because he resigned, because he thought it would look bad. No one was gonna force him out. Maybe the only decent thing he did was, you know, resign. But yeah, it's this thing. I want politicians to be brave, and I want them to be kind of noble people. But if you have no fear, you can't be those things. And basically, if you have no fear, you've got no balls. And finally, Greg [laughs]
Savena: [laughs] It's taken me a second to understand what the segue was, but yeah.
Tom: The listeners have to catch up. Here we go, listeners. You're on this roller coaster.
Greg: It’ll make perfect sense in a second. So have you seen Cloudy with a Chance of Meatballs?
Arda: I love it.
Tom: I love the segue.
Greg: [laughs] Fantastic film. And someone's actually made it a reality. So for those who haven't seen Cloudy with a Chance of Meatballs, what have you been doing? But also, it's a fun family romp about a madcap scientist who invents a machine that creates food from nothing, from air. And someone has actually created a machine that creates food from air. Now, in the movie, it creates, like, burgers and meatballs. It's less fun in reality, but it is food. It produces bacteria that can be eaten. So it's basically like a protein alternative to crops like soya. Why this is exciting is because these kinds of crops, like soya, have huge impacts on the environment. You know, swathes of forest are cut down to make money from planting soya, and it's fed to animals in many cases. And with this machine, you would use a tenth of the land to produce the same amount of protein, bacteria, or whatever it is. So yeah, it's food from air, which is crazy.
Tom: That is amazing. That is like, yeah, I think we should do a round of applause for that person very quietly, because we don’t want to damage the mics or whatever. But yeah, well done. Well done. That's the future. Yeah. Yeah.
Savena: That's so exciting. You're seeing so many interesting, like, non-meat options now. Even just, like, 10 years ago, the only thing I could order off a menu was either, like, a mushroom risotto or just a really bad veggie patty. And now I'm like, whoa, look at all these great options.
Tom: Yeah, I mean, but I remember once going into a pub, and in the vegetarian section, there was tuna pasta bake.
All: [laughs]
Tom: This was in Dorset. Like, honestly, honestly, yeah. I just remember looking at it and thinking, right, brilliant. Tuna pasta bake in the vegetarian section. The first word there is meat.
Anyway. But yeah, that's all we have time for today. So thanks for listening, listeners. And Arda, Savena, how was that for you?
Arda: Great. Thanks for having us.
Savena: Thanks.
Tom: Excellent. And how can people find you online? Like Twitter, Facebook, LinkedIn? Do people use LinkedIn?
Savena: We're not on Facebook. You can find us on Instagram, Twitter and LinkedIn: identity two underscore zero. Or you can visit our website, which has all the links to everything and about our upcoming exhibition. All the good things. And that is identity20.org
Tom: Fantastic. And when does This Machine Is Black open?
Savena: It opens to the public on the 13th of August, up until the 30th of August, in Leicester, UK.
Tom: Fantastic. And you can get tickets from?
Savena: From our website.
Arda: Only £3.
Tom: Fantastic.
Savena: Yup. Only £3. Same as your coffee. Come on.
Tom: Cool. Well, I would love to come to the Leicester version of that, but more importantly I'm looking forward to when you come up to Southport. That would be great. So, listeners, what did you think? We'd love to hear your thoughts. What did you think of my segues? I am proud of myself. I wish we had captured my joyous face when that happened, but you can't. Get in touch with us on Twitter at Tech For Good Live, or you can email us at hello at tech for good dot live. And we'd love it if you could give us a nice iTunes review. It helps us more than you know, if you give us a nice five stars, and also tell your mates about us, because who doesn't like to listen to podcasts? Thank you to our producers for producing this fine, brilliant podcast; they do a great job every week. Also, don't forget this podcast is run by volunteers and we survive on sponsorships and donations. So right now, one of our primary goals is to make sure all of our podcast episodes are accessible, by making sure that every single episode is transcribed. Sadly, this does cost money and we desperately need your help to make this become a reality. So if you have ever tuned in to one of our podcasts or attended one of our events, please consider chipping in the price of a cup of coffee. So you've got a choice between, you know, This Machine Is Black or my caffeine intake. Just head over to tech for good dot live slash donate. And I'd just like to thank Podcast.Co for hosting us. Thank you.
Greg: Bye.
Arda: Bye.
Savena: Bye.