TFGL2021 - S2 - Ep 9 - Fighting The Media
Welcome to this episode of the Tech For Good Live podcast.
Fay Schofield is on hosting duties, and she’s joined by TFGL team members Greg Ashton and Tom Passmore
And our special guest this week is Lou Lai, Strategist, fundraiser, marketer, charity lover. Lou is the Transformation Director at Manifesto London, Founder of FemMentored and a trustee of Blood Cancer UK.
Transcript
Fay: Hello, and welcome to another Tech For Good Live podcast. If you're new here, please run for the hills. The podcast title might sound like a lovely look at adorable little technology projects doing good in the world. But more often than not, it's actually all about the really terrible stuff that's happening in the world. Yes, we are one cheery bunch. I suppose we should start every Tech For Good Live with a content warning based on the types of things we usually discuss. But today's first segment feels pretty heavy, as it's in regard to sexual assault. So we're letting you know upfront, so please feel free to skip this episode of the podcast. We're also going to be talking about newspapers being divisive, nothing new there really. And Amnesty International are launching an algorithmic accountability lab, which is a mouthful to say, but we are very intrigued by it. We do have a good news story as well to finish up with, which is obviously about alcohol, because the pubs are now open, so why not? Joining me today we have Greg Ashton, and the question for you, Greg: if you could have the answer to one unknown topic, what would it be?
Greg: I want it to be something like really impressive. Like, is there life on other planets, or where does gravity come from? Well, actually, what I really want to know is what the hell is the name of this pudding that I had as a kid. It was kinda like Angel Delight but more like a custard, and it came with like a crumb topping. And yeah, it's not around any more. And nobody, when I mention it, knows what it is. Yeah, so if I could have one answer, it would be that.
Fay: It would be that. Listeners, answers on a postcard, please. That does sound delightful, though.
Greg: It was so good. So good. But they don't make it anymore.
Fay: There we go. I love it. I love it.
Tom: Could it be like a yoghurt pot?
Greg: No, no, you kind of warmed it up in the microwave and then you sprinkled on this like crumbly top. It was delicious.
Tom: You've made that up
Greg: [laughs] Well, if I've made it up, what I want to know is who's going to pay me for the rights to make it?
Fay: Nobody. Nobody on this podcast episode. We're also joined by Tom. Tom, same question to you. What would your answer to one unknown topic be? What would it be?
Tom: The one that popped into my head, and I think this is going to tell the world about me. It's, arguably, I'd like to know what happened to the lost legions of Varus in the Teutoburg Forest during the reign of Augustus.
Fay: Love it, love it. We've gone from puddings to history. Keep it going, keeping us cultured, Tom. And I'm Fay. I'm hosting today. Won't do as good a job as Bex obviously, but I think if I was going to know the answer to one unknown topic, it's gotta be: what's in Area 51? What are they hiding? I want to know. Is it aliens? Hope so, kind of, because I think aliens are cool. Is it zombies? Probably just a load of nukes, to be honest. But yeah, what have they got in Area 51? I don't know. They don't want anyone to know. I'm intrigued. I'm a nosy bugger. I want to know the answer. And we have a guest with us today. Thank God, to steer us on the straight and narrow. Lou Lai. Strategist, fundraiser, marketer, charity lover. And you are the Transformation Director at Manifesto. You're also the Founder of FemMentored and you're a Trustee for Blood Cancer UK. Welcome, Lou. And how the hell do you have time to get anything done?
Lou: Exactly. And maybe that should have been my question to my unknown. How is it possible to do all of these things and be and still be alive? But yeah, great to be here. And I guess similar to you in terms of like, what is the answer to the unknown topic? Because naturally, I went to aliens first. I was like, cool. So life on outer space, you know, what's out there? I equally then went to Area 51. But actually, kind of in the spirit of podcasts, like, what happened to the missing crypto queen? Like, like seriously, like, I have to know with all the news of like bitcoin and stuff like, where is she?
Fay: That’s a good one.
Lou: Yeah.
Fay: See, this is what happens when you prepare for a podcast agenda. You come with a great answer, right? Greg Ashton turned up with pudding.
Greg: [laughs] I was well prepared.
Fay: To be fair, we'll find you the answer. Maybe crypto queen is the person that invented that pudding. No wonder it's disappeared. Who knows? Who knows? Lou, great to have you here. Thank you. Thank you for joining us. Greg, kick us off. What's stat of the week?
Greg: So we've got a bit of a roller coaster ride this week, but we're starting right down the bottom and we're gonna work our way up. So it's a podcast that gets jollier as it goes on, but we're starting with a fairly bleak topic of conversation, which is that fewer than one in 60 rape cases led to a charge in England and Wales in 2020. The kind of numbers around this have shown a really steep decline in the number of charges coming through. And this is on the back of an end-to-end review of rape charges in England and Wales, which was due to be published last year by the government, but it got pushed back to May and has just been pushed back again to the end of June. The number of victims dropping out due to increasingly lengthy investigations and trial processes has really gone up as well. So not only has the number of charges being pressed dropped, but the number of people continuing through to the end of a case has massively dropped off as well. So it's kind of a lose-lose situation really. It makes for pretty bleak reading at this point.
Fay: Yeah, heavy, heavy, heavy topic. Heavy, heavy topic. But it does seem as though you've linked to another article here, which is showing some good progress with victims. What's going on? What's going on there, Greg?
Greg: Well, so kind of an early taster of this review that's going on is that they're going to introduce video evidence back into the proceedings in these cases. So they're going to make it easier for victims to provide their evidence prior to a trial through video, meaning that they can get through that process earlier on, provide their evidence and not have to worry about attending, you know, a trial where the person that they're accusing is present. I don't know if you've ever been to court, but you know, as a victim, the whole process is pretty dire. And you could turn up and find that, you know, your case has been pushed back for whatever reason. Maybe the defendant didn't turn up. So you're having to sit in a room and wait to hear that and then go back. And all sorts of things like that can make it very, very stressful, and it ties back to that issue of people dropping out of these cases. So this is a really good move forward. And if this is an indication of what's gonna come out of the review next month, there are positive signs that it's really going to bring good change and have a good impact on how things are currently being carried out.
Fay: Yeah, it's mad to think, isn't it, that something, I don't know, I guess I don't want to say as simple as video, because obviously giving any kind of testimony, if you are a victim and you've been through something as horrific as this, will probably be one of the hardest things you ever do. But having video as a form for people to give evidence just seems like such a simple, humane way for victims to never have to face, you know, their attacker ever again. So it's great to see that. Yeah, it's great to see that this is coming through and will be introduced. But yeah, very heavy, very heavy topic to start off the podcast with. So as you said, Greg, starting at the bottom of the roller coaster, and hopefully working our way up a little bit. What's happening in charity news of the week, as we make our way up this roller coaster, hopefully, fingers crossed?
Greg: Yeah. So I mean, we're kind of coasting along on a plateau at this point, right?
Fay: Oh great. Oh gee. Welcome new listeners. Turn off now.
Greg: [laughs] So Stonewall have been attacked, or attached, as I wrote in the notes here, attacked by major news outlets following a new campaign that they were running around their diversity champions. It's very similar to stuff we've seen in recent years. You've got places like the Sunday Times and the Daily Mail who were doing these kinds of articles, and it's your classic kind of culture of fear, you know, calling people out. There was a quote from Stonewall, which said you look at some of the news outlets today and you'd think we're still in 1989, with the same narrative trying to drive fear and division in communities. And the whole point is to kind of stir up that culture of fear and anger that you see with the likes of Trump, to get people attacking Stonewall. But what I found really interesting about this was not the article itself, it was some supporting information. So Stonewall tweeted about this, calling it out. And somebody, off the back of that, did a little investigation to look at the kind of articles that are being written about trans people. These kinds of attacks are specifically targeting trans people. And what they found was that between 2018 and 2019, the Times wrote 158 stories on trans people. But then in 2020, it was 324. And in 2021, based on current projections (so how many they've done so far, and if they keep going at the same rate), they're going to have published 565 stories about trans people. So they're pretty much doubling every year. And, to the best of our knowledge, none of these stories are written by trans people themselves. So they're writing, you know, 500 articles this year, and not a single one has had any input from a trans person or the trans community.
Lou: And so they're not representing them, because I guess so much of the stories as well, you know, it's kind of like, the authenticity and the lived experience is such a vital part of representing a community, a group of people. So, I guess, to their point around it feeling like we're still in 1989: the progression here of actually accurately representing those experiences, to this day, isn't happening.
Tom: Is there any way that the Times could be defended, to say, oh, it's because they're normalising, like, yeah, the trans community? So the fact that it wasn't in the papers before, but it is now, is a positive growth? Or is that just what the bad people say?
Greg: Yeah, because I don't think it is a positive growth, because you'll look at the types of articles that they're posting and, you know, it really isn't, it's very, very easy to kind of say that it's negative. And the thing, the clever thing that they're doing now is they're trying to use the kind of frictions that have been occurring in the LGBTQ+ community to drive a wedge, basically. And there are the frictions within the community around muddying trans and gay rights and some nervousness around doing that. And they're trying to drive a wedge there and say, well, we're all on the side of the gay people, but keep your trans people over there. So it's really insidious, the way that they go about it.
Tom: It's quite strategic, almost like divide and conquer.
Greg: Yeah, yeah, it really is. It really is. It's like we're not reporting on the news anymore. We're trying to drive public opinion.
Fay: Yeah, it isn't, I don't want to say it's not interesting. That's not the right word for this as a topic, but it's alarming. That's what I'm looking for. Words. Brain. Come on, think. It is alarming, just the level of transphobia that does exist in the UK, there was an article, I think, by Vox last year, which really kind of dug into how transphobia has become like a product of British culture, and you've got people in the limelight, such as JK Rowling, who are kind of pushing that agenda through and it was, if you kind of dig into kind of the stats and the facts about it, and it's just, you know, it's kind of in a way what you just said, Greg, it's like, you know, the UK in general is very supportive of, you know, gay people in the gay community, but when it comes to the trans community, it's almost a completely different thing. And, you know, trans folks are treated in a completely different and unacceptable kind of way. So maybe this you know, this increase in stories, especially, as you said, if they are coming from such a negative standpoint, it leans into that, unfortunately, into that idea that yeah, Britain is just full of transphobia.
Greg: It sells papers, that's the thing. You know. Yeah. The social panic.
Fay: Oh what a cheery podcast this is [laughs] Oh God.
Greg: But I guess my question for the group, Lou especially, you know, as a trustee yourself: when you find your charity is being attacked by the national media, how do you react to that? It's hard enough campaigning for opinion as a charity anyway. But if you're then also fighting the media and, like, five fairly large national newspapers, it must be pretty disheartening, although Stonewall are well grounded in, you know, fighting the good fight. But yeah, what does everyone think?
Lou: Yeah, I mean, completely. And I think it's interesting because there's, you know, the converse of charities being targeted by the media, I guess, around the amount of money spent on non-charitable activities, versus the media driving literally extreme workloads for media teams, PR teams, you know, reputational management, where they're diverting resources to protect their reputation, because ultimately the impact of that could be on the bottom line of service delivery, because people might stop supporting. It can cause issues. So it absolutely can be a bit of a wrecking ball for organisations that aren't necessarily set up for that kind of crisis media management and don't have the resources that big brands do. So it's causing them to remove effort from some activities, and then they're potentially being told that they're also not spending enough time and money on those things. So it's kind of interesting, the balance you have to find in trying to mitigate those things. Absolutely, it can be really challenging. And I think, you know, for charities as well, so much of it is around, like, how do you advocate for and represent communities in a way that, you know, might alienate some outlets? And that's a risk you have to take, because ultimately it's about the community of people that you're supposed to be representing, and being a voice and a platform for them. And, you know, organisations like Stonewall, absolutely, they are well versed in doing this, but they also want to be delivering programmes of work that make workplaces and brands a safe place for this community. So for me, I struggle with that sort of pressure that's being put on charitable organisations to have to respond to stuff like this, when it's not necessarily accurately reported. It's not being researched.
There's no time invested in validating it: what's the purpose of this content that we're sharing with audiences and our readership?
Greg: Yeah, I think the time and money thing there is so important, because the media are doing this to make money, and the charities are having to then spend money defending against those attacks and those kinds of talking points. So yeah, it's a real lose-lose. And I guess then you create tensions internally, where it's kinda like, well, you know, charities are fairly risk averse. So if they're doing something that is causing this backlash and reducing their money, because they've had to spend time tackling it, does that then create frictions internally about, well, should we be doing this, and do we need to change our approach?
Fay: Oh, this roller coaster ride at the moment is just plateauing out from cheerful, isn't it? Good, Lord. Um, but no, super, like, you know, super important points and a lot of good things to think about.
Greg: I just want to note, Fay, that I mentioned before that somebody had done that research around identifying the number of articles. So it was Random Access Emily on Twitter, who self-reports as a snotty little trans researcher of anti-trans hate groups. So thank you, Random Access Emily, for that nice little snippet of stats.
Fay: Yeah, we'll be sure to tag in Random Access Emily, and I'm saying this as a reminder to myself since I manage the Twitter account, so I can make sure to tag Emily in when we put this podcast live. Okay, on the doom roller coaster we carry on. Greg, are we about to take an upturn with tech news of the week?
Greg: Yeah, yeah, yeah.
Fay: Oh thank God. I feel like we're on that bit, you know, when you're on Oblivion at Alton Towers and you're just, like, hanging over the edge?
Greg: Yeah.
Fay: No. Yeah, yeah.
Lou: Yeah. Don’t look down.
Fay: That's my part time job, being the voice of Oblivion, apparently.
Greg: [laughs] Yeah, this is going the other way. This is that bit where you can feel it hook onto the thing as it drags you up the hill. Although I guess then you've got a climbing sense of dread but that’s standard.
Fay: [laughs] I'll take it. It’s standard for this podcast, isn't it anyway, carry on.
Greg: So Amnesty International. They have what's called Amnesty Tech, which is kind of an arm of Amnesty International looking at topics around the world affecting people's rights in relation to tech, and they're launching an algorithmic accountability lab. So this is going to be a multidisciplinary team of data scientists, AI experts and human rights researchers. And they're all going to be doing cutting-edge research into the use of algorithms and automated decision-making systems that are used to deliver things like welfare, education, health care, etc. They're gonna have three primary objectives: to increase transparency around the use of these things; to investigate and expose the human rights impacts of these systems and seek accountability for any violations that they find; and to campaign for effective regulation over the design and use of these systems. So it's really, really cool. And you know, they're looking for staff right now. So they're looking for a lead for this lab. So if you're in the market for a job and this sounds interesting, then yeah, they're looking for a head of this. So what do we think? Will it work? How cool does it sound?
Tom: It sounds really cool. Will it work? I hope so. Like I mean, like, I mean, maybe. Maybe.
Greg: [laughs]
Lou: What will stop it, Tom, what's the thing? What are the, what are the pitfalls that you can see come in, that might stop it being a success?
Tom: Well, you can open up algorithms, that's fine. But then you can still, like, close off other functions within it, and other algorithms that it interacts with. So everything about that one algorithm is great. It's fine. It's open, you understand that it's there, and it's accessible, and everyone understands it. But then every now and again it calls this other one that's actually sensitive and hidden. You can't see that. So, like, you can still hide black boxes inside these kinds of technology, which is a real shame. And then people hide behind, like, commerciality and IP and stuff like that. So, if they can get through that, and if they can start opening this up, then first of all it will allow people to see what's actually going on and how these decisions are getting made, and also improve them. Because these are experts, like data scientists, AI experts, human rights researchers, doing this awesome work. These are experts in their field being like, okay, this could be improved here based on our knowledge and expertise. Oh, cool. Yeah, we didn't realise that. So that's when it could be real good. It's just whether people will allow the opening up of that. That's, you know, that's where it could fall down.
Lou: And I'm hoping that, because it sounds ridiculously cool, and I have recently been rewatching Spooks, the entire back catalogue, so anything that talks about a collective of hackers, I'm like, oh yeah, I think I know how that works. But actually, if at the heart of it it's the human rights piece that powers and sits behind everything, if all roads lead back to how are we furthering and bettering human rights out of this, I think that needs to be such a core part of the decision making. Every decision, or every decision not to do something, I guess, Tom, to your point, around an algorithm, etc., has to be validated against the genuine impact it's going to make on human rights. And I guess one of the things I've been thinking about is potentially: how do we address some regression that's happened in equality and human rights in the last year? Like, what does that mean for a big collective technology focus like this, in terms of what things did we think were in place that have now gone backwards and have to start again? For me, in terms of technology and that human rights piece, yeah, it's really, really interesting to see that they're investing in this, that they're talking about this as a thing. It's been springboarded off an amazing report. And actually, it's also clearly, I guess, leaning into service access, and how we will probably in the future access public services, and that's a community of people, I guess, globally.
Greg: Yeah, I agree with everything there. Yeah, it's so important because of that service access question, which, you know, GDPR has kind of softened the blow of, I think, in Europe, but there are still a lot of questions and it's still early days. So I think, you know, for organisations like Amnesty, this was where they had to go next, because there are so many. It is basically like the Wild West online. So without somebody like Amnesty, you know, watching the borders and checking how things are happening, governments have just been so slow to respond to this. And I think having Amnesty and this team doing these checks could really, really push forward some changes, maybe some, you know, legislation that could actually have a beneficial impact. And I was thinking as well, for charities, some of the deeper stuff is just, you know, way above their heads. So having somebody like Amnesty International with a crack team investigating it and saying, this is where bad things are happening, or these are the things that you need to watch out for, I feel like a lot of charities can benefit from, when they're making potentially difficult decisions about, like, what tech should we be using, where are the ethical pitfalls, should we be on Facebook? You know, can we balance that whole decision of what happens somewhere else on Facebook versus the amount of money we can make? So yeah, I feel like this is going to be a huge boon to the charitable sector.
Fay: Just to play devil's advocate, though. So obviously, Amnesty International came under fire last week. No, it was earlier, in April, wasn't it?
Greg: Yeah.
Fay: For basically, kind of, having a culture of white privilege. And looking at The Guardian article, one of the whistleblowers who came forward, you know, is quoted as saying, we joined Amnesty hoping to campaign against human rights abuses but were instead let down through realising that the organisation actually helped perpetrate them. Perpetuate them. I'm crap at pronouncing that word. Apologies.
Greg: [laughs]
Fay: So it's interesting. I think this product sounds really cool. And it sounds as though it could have a really, really positive change. And obviously, the Amnesty Tech team is possibly separate from the team, or, you know, the bit of the organisation, which came under fire. But that's something, I don't know, when I first saw Amnesty and human rights, that's where my mind immediately jumped. So it'll be interesting to see, okay, how do they actually come out of this at the other end? Is the Amnesty Tech team going to face any backlash from people being like, well, hang on a minute, you're coming in here wading in with this new kind of product, and you're going to be tracking human rights abuses, but you don't even have your own house in order? Yeah, just wanted to throw that out there, see what people thought about that.
Greg: I mean, I'm not sure many charities are going to put their neck out on the line and call Amnesty out for that, because this is a pretty big problem across the charitable sector at the minute. It seems like every week there's another story. So yeah, I'm not sure they'd put their neck out and say, we're better than you. And, yeah, it is terrible. But also, do as I say, not as I do. Just because they had these terrible injustices internally doesn't then discount the recommendations and things that they're doing externally. You know, it's very easy to spot problems externally. Much harder to spot them internally, as a person and as an organisation.
Tom: Yeah, no, definitely. I just think we even need to be able to, like, kind of move on from bad decision-making processes. In some ways, I think this is a really good step, because it's very much about transparency and about being open. And it's using those kinds of tools to be like, right, we're being transparent, we're trying to be open, we're trying to give back to the tech system, which is basically this idea of openness, which I love. And it's also creating a rulebook of what good looks like, because I've worked in the public sector, and people don't quite know what good looks like, because they don't quite fully understand every aspect that they can pull together in loads of different ways. And then being like, no, don't do that. That's bad. Yes, it's clever and it's a real good use of the data, but don't do that. Let's strip that out and move on. And if you have this rulebook, you understand the why. Because, again, data doesn't really tell you the why about these things. To then join them all together and be like, ah, we shouldn't be doing that because of this, but we should be doing this because it adds additional value and protects the people that these algorithms are actually meant to add value to. So I think that's where the use cases become really interesting.
Lou: Yeah, and I think, just picking up on that: you can't separate a global technology ambition from systemic, rooted racism in an organisation. You just can't. And I think part of the journey with this is the openness, the transparency and the reflection. And, you know, as part of the journey around what they're trying to achieve, they will need a diverse team of skills and experiences to make that a reality, and to be reflecting the different needs of human rights that occur in lots of different places. It sounds like it's a global ambition, which is a big vision and a big statement to have. So the intent behind that has to be supported by trust in an infrastructure, an organisation, who will address what is going on. And even now, you know, we've had the new Amnesty UK CEO announced, Sacha, who, hopefully, from what he's talking about, is very much addressing head-on the challenges that were surfaced within that Guardian report. And I know some of the people who came forward and were part of that, and it was very, very difficult, because a lot of them also had lived experiences. They went to Amnesty because, twenty years ago, they helped a family member who was a victim of some sort of human rights violation. So all of these things have to be addressed. And I think, Greg, to your point, unfortunately this is a huge sector-wide conversation that needs to continue to be heard. It's going to be uncomfortable. But change doesn't happen if we don't face the truth of what has been happening, what potentially we've been allowing or perpetuating as a sector, and authentically move forward to do the right thing: to actually do what we all want to do, which is have the biggest impact possible, depending on what our mission is as a charitable organisation.
Fay: Very, very, very good points. And it will be, it'll be interesting to see what happens with this product. And you know, when you read the word hackers and tech lives, it just actually just makes me think of like Angelina Jolie in Hackers, which still to this day is one of the best films ever made. And I stand by that statement. But in terms of kind of chatting about, you know, what goes on internally at charities, I think this leads us in quite nicely to our rant or nice of the week. Greg, what's going on?
Greg: Yeah, we might as well continue this. You've sparked that conversation, so it makes sense to carry it on. So yeah, we were just saying there about a bit of a sector-wide issue around racism and inequalities within organisations. Girlguiding has been the most recent revelation around that, with another internal report coming out. And this seems to be the theme. So there was an article in Third Sector, and it was kind of posing that question of, should charities be coming clean about internal reports? I think, in most cases where we've had this information come out, it's been an internal report that's been either leaked or released, either relatively shortly afterwards or, you know, some months later. And it really drives into that conversation we were just having there about transparency, and having these head-on conversations about the issues within the sector, and owning up to them. And then just saying, yeah, we haven't got it right. Let's carry on. So what does everyone feel about internal reports, and how they're being handled? You know, should they be more transparent, or do organisations need time to process these things and tackle them internally? What does everyone think?
Lou: It feels like, you know, the content of those reports was relevant to public interest, in terms of actually the safety of the teams and the staff who are working there. And there's the responsibility you have to use the pennies and the pounds to achieve, I guess, the mission that you're working towards. And I think it's probably a combination: absolutely, there's internal reflection time, there's time to think about what this means, because ultimately, like you were saying, Greg, it's about sharing information, but what people want to know is, so what next? It can't just end there. What's the definitive action that is being taken? What does accountability look like? Who's taking responsibility for this, and what does this mean for us, culturally, as an organisation, and for how our values and behaviours play out? It just so intrinsically crosses so much of an organisation, it almost feels impossible to separate that from talking about yourself as an organisation, because it is just such an intrinsic part of what's going on. So I kind of feel like it would be almost impossible not to surface this as something that needs to be viewed outside of just the internal organisation. I appreciate there may be legalities that need to be followed, but ultimately, I guess, in some ways, is this a conversation for organisations where they might need to be thinking about, actually, what are our processes and procedures for handling situations like this? Because we've all seen various different kinds of reporting of, you know, when it's handled badly, it's really bad. And actually, what does taking responsibility look like? That, for me, is the interesting conversation, about where you see the true values at the heart of an organisation and that leadership shining through.
Tom: Yeah, I think this stuff is like, yeah, this stuff is hard. Like running organisations. Working with people, like, basically, anything to do with people is hard work, and when you've got them in a working environment, or volunteering, or when you're trying to help them or whatever, it's doubly hard work. And then you've got all of these interplays that are going on. And I think it's really important to, yeah, actually share these reports. I think not straight away, like not instantaneously, because I don't think it helps anyone, like, basically just throwing mud at people. But I think being like, this happened, this was the report, these are the steps we're going to take to address it, because again, kind of linking into what I was saying before, is this idea of what does good look like, right? Again, this is hard. I don't think there are any perfect organisations out there, so I think everyone needs to learn from each other. And everyone needs to go through this journey, because this seems to be where organisations are at the moment, like, across the board. Like, for a long time, people that look like me have been in control and power, and that's not the case anymore. And basically it’s like, people that look like me have opened the door and gone, it's fine, look, have a look, and people have realised how bad it actually is. And it's like, okay, cool. This has been bad. Let's fix it. And it's up to all of us to do that. And the only way I think of doing that is an open, transparent kind of working together, to be like, oh, we tried this and it didn't work because of this. So you've got, like, startups like Zappos, where, like, everyone had access to everything and it was all great and lovely, and that didn't work, because people got in the way. Or you have, like, very strict, hierarchical organisations. That doesn't work, because people got in the way.
So we need to find a blended approach to this, which means that people can be allowed to be people, and then it all works together. And that is rock hard. Honestly, hard, hard stuff, and open transparency is the only way I can see it working. So yeah, release reports like that.
Greg: Yeah, you've got to move past the guilt.
Tom: Yeah.
Greg: Yeah, guilt is just such a damaging force, because you do, you just get caught up in it. And it's either anger or fear. And people just react badly. And it's kind of like when you do a retrospective, when you're doing, like, agile. You've got to kind of say, okay, mistakes were made, but let's focus on the solutions, and say, you know, if we just keep talking about all the problems, we're never going to get any better. So we just have to accept that we all work for organisations that were started by rich old people back in the day. So we're, you know, we're coming from a place of imperialistic, kind of colonialistic development, and assuaging that guilt of having loads more money than other people, and say, right, great. That's in the past. Let's look forward and let's just try to be better.
Fay: Oh, nice moral to the story. Nice story. But no. Yeah, you've all made, like, really, really good points. And there's a really kind of powerful quote as part of, you know, as part of this article, which is just saying it's time for our sector to embrace transformative justice, and public accountability is such an important part of that. I don't have a segue. I was just trying to…
Tom: I have a great segue. Can I?
Fay: Yeah, please!
Tom: Talking about solutions. Now let's talk about an absolute solution.
Greg: [laughs] Ohhhhhhhh.
Fay: You can tell the scientist is on this call. Not me. Cool. What's happening? What's happening for our "and finally"? That was great, Tom. Maybe you can host next time.
Greg: Just end it there. Just end it there. Oh no, I have to do the next segment, otherwise it doesn't make sense. So someone has made not only a sustainable alcohol, but an alcohol that actually removes 2.73 kilos of CO2 from the atmosphere. So this is from a company called Avalon, and they've made a Calvados that, yeah, removes the equivalent of driving 21 kilometres in an average car from the atmosphere. Which is pretty cool. And they say it's appealing to everyone, from gin drinkers, when served long with tonic, to those who enjoy their spirits brown. Not sure what that means. But yeah, there you go. You can get drunk and feel better about how you are saving the planet?
Fay: I love it. I must admit I did read like the title as like the spirit that doesn't give the planet a hangover. But I'd missed out the word planet and I was like, oh my God. Drink. Hangover. Because honestly, the older I get, the worse they become. Good Lord.
Greg: You spend more money.
Tom: Spend more money.
Greg: Yeah. Expensive booze tends to give you very little hangover I’ve found.
Tom: I've got a little bit of a hangover from Eurovision.
Fay: That’s allowed though, that's allowed, because yeah, we had to drown our sorrows on that. Anyway, so it's sustainable for the planet. It's clearing stuff up. How are they actually going to be using it? You can tell I haven't read this article apart from the headline. That might, that might be clear. But yeah, what are they actually going to do with it?
Greg: Drink it. You can buy it.
Fay: Drink it? Oh right [laughs]
Tom: Yeah, it’s Calvados.
Greg: Yeah. It’s Calvados.
Tom: Apple brandy.
Fay: This is the person that literally only drinks beer. And occasionally like, white wine. Drink it. Okay, cool.
Greg: We'll buy you a bottle of Calvados and you can…
Tom: Thank you. That's very kind. Thank you.
Lou: Yeah, I heard that Tom. Yeah, yeah, yeah. Thanks Greg.
Tom: Yeah, yeah. Greg, you're a nice guy. Nice guy. Sharing is caring.
Fay: But to be fair, Greg did say you know if it's more expensive, it doesn't give you that much of a hangover. So technically, yeah.
Tom: Yeah, if it’s more expensive.
Lou: We should test that theory.
Tom: Yeah, we should.
Lou: Like that's the hypothesis for Greg. So we need to validate this.
Tom: We need to research. We need a lot of research.
Greg: [laughs]
Tom: If anyone wants to give us some money to do a research project.
Fay: Yeah. Because I've got to ask listeners for money at the end to help make the podcast accessible. Just as a caveat, any donations received do not go towards this drinking research project. Just to sort of say that. That's all we have time for today. That's gone quick! Thanks for listening. People who like stuck around on the roller coaster of doom. Lou, how was that for you?
Lou: It was, it was a roller coaster. But that pretty much reflects daily life, right? You think you know what's going to happen on a given day, and then the world that we live in, and particularly my world, like, mainly working with the not-for-profit sector, every day is a different challenge. And COVID-19 is just throwing everything out of the water. So yeah, I feel like I strapped in and we had a good, good journey, and I'm, yeah, super happy to have been here, despite sweating, like I said, Fay, when I saw that first question. I don’t know how I missed this on the briefing document that I glanced at earlier. What the hell?
Fay: Don’t worry. I threw it in as a curveball right at the last minute and trust me you had a better answer than Greg and his pudding.
Lou: Thank God for that pudding. Now I'm intrigued and may well Google that this evening, because it's fond pudding memories of being born in the 80s. So yeah.
Fay: I love it. Where can people find you online, Lou? Like website, Twitter, wherever you want. This is your chance to plug. What do you want to plug?
Lou: Okay cool. Yeah, on Twitter at Lou Lai UK, but mainly use that to find me on FemMentored, which is my women-for-women, modern-day mentoring network, using technology like Slack and virtual events to connect women in the for-good space to lift each other up, because not everyone feels confident about achieving success in our sector. And it's about time we did something about it. So yeah.
Fay: Oh, yeah. Nice. All right. Well, Lou Lai UK. Find her. Find her online, people. Right listeners, what did you think? Am I asking that question? Do I want to know? Yes, I do want to know. We'd love to hear your thoughts. Please do get in touch with us on Twitter, which is at Tech for good live. You can email us as well at hello at tech for good dot live. And we would love it if you gave us a nice little iTunes review. It helps spread the word, and tell your mates about the podcast. Thank you to our producers for producing this podcast as well. Big shout out to them. As I kind of mentioned at the beginning, not at the beginning, five minutes ago, this podcast is run by volunteers, and we survive solely on sponsorships and donations. Right now our primary goal is to make sure that all of our podcast episodes are accessible. So we're making sure that every single episode is transcribed. You can find those on our website as well. Sadly, this does cost money, and we do desperately need your help to make this become a reality. So if you've ever tuned in to one of our podcasts, or attended our events, please do consider chipping in, even if it's just the price of a cup of coffee. Which I know doesn't really work anymore, because we've all been working at home for years, but in days past, when you used to get a Starbucks for, like, three quid, you can instead donate it to us and we can make an accessible podcast. And you can do that donation by visiting tech for good dot live forward slash donate. And finally, thank you to Podcast.co for hosting us. Great, thanks everybody. Bye.
Greg: Bye.
Tom: Bye.
Lou: Bye.